Lidar (Light Detection and Ranging) sensors are very widely used in autonomous systems. Like sonars, lidars exploit time differences; however, they may use other measuring techniques as well. Therefore, several types of lidar sensors might be used in autonomous systems:
**Pulse Lidars** use the time-of-flight principle in the same way as sonars do. Knowing the speed of light gives enough information to calculate the distance to the object hit by the laser ray. Another mechanism used in scanning lasers is a rotating prism, which enables control of the angle of the emitted laser pulse. Thereby both angle and distance might be estimated, which provides data to calculate the relative position of the object hit by the laser ray.
**Continuous-Wave Amplitude Modulated (CWAM) Lidars** exploit the phase shift of a continuous intensity-modulated laser signal. In this case, the phase shift provides in essence the same information – the difference between the time when a given phase was emitted and the time when it was observed.
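As a minimal illustration of both ranging principles, the sketch below computes a distance from a measured round-trip time for a pulse lidar and from a measured phase shift for a CWAM lidar. The numeric values and function names are illustrative assumptions, not parameters of any particular sensor.

<code python>
# Minimal sketch of the two ranging principles described above.
# All numbers are illustrative assumptions, not real sensor data.
import math

C = 299_792_458.0  # speed of light, m/s

def pulse_lidar_distance(time_of_flight_s: float) -> float:
    """Pulse lidar: the laser travels to the object and back,
    so the distance is half of the total path length."""
    return C * time_of_flight_s / 2.0

def cwam_lidar_distance(phase_shift_rad: float, modulation_freq_hz: float) -> float:
    """CWAM lidar: the phase shift of the intensity modulation encodes the
    round-trip delay (unambiguous only within one modulation period)."""
    round_trip_time = phase_shift_rad / (2.0 * math.pi * modulation_freq_hz)
    return C * round_trip_time / 2.0

if __name__ == "__main__":
    print(pulse_lidar_distance(66.7e-9))           # ~10 m for a 66.7 ns round trip
    print(cwam_lidar_distance(math.pi / 2, 10e6))  # ~3.75 m at 10 MHz modulation
</code>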
Unfortunately, cameras also have several drawbacks:
  * Motion blur – caused by motion and sensor sensitivity; the lower the sensitivity, the stronger the blur effects that might be observed. Blurred images decrease object detection and distance estimation precision;
  * Lens distortion – distorts images in an unpredictable way as a result of manufacturing imperfections (see the sketch after this list);
  * Frames per second – the fewer frames per second, the less accurate the derived estimates will be;
  * Changes of light conditions from one frame to another, which complicate the overall processing. One of the obvious results is changes in colors, which reduce the usability of the detected frames.
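Lens distortion in particular is usually compensated in software once the camera has been calibrated. The sketch below is a minimal example using OpenCV; the camera matrix, distortion coefficients, and file names are placeholder assumptions – in practice they come from a calibration procedure such as cv2.calibrateCamera.

<code python>
# Sketch: undistorting a camera frame with previously estimated calibration data.
# The camera matrix, distortion coefficients, and file names are placeholders;
# real values are obtained from a calibration procedure (e.g. cv2.calibrateCamera).
import numpy as np
import cv2

camera_matrix = np.array([[800.0,   0.0, 320.0],
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.array([-0.25, 0.10, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

frame = cv2.imread("frame.png")              # placeholder: any captured frame
undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
cv2.imwrite("frame_undistorted.png", undistorted)
</code>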
The image is taken from an experimental agriculture robot that uses multiple time-synchronized sensors – a single RGB camera (upper-left),
==== Inertial Measurement Unit (IMU) ====
Rotary encoders are widely used in ground systems, providing an additional, relatively precise and reliable source of displacement estimates. The main purpose of the sensor is to provide output data on wheel or shaft angular displacement. There are two main types of rotary encoders – absolute and incremental ((https://
As with all sensors, rotary encoders rely on one of several main technologies:
  * **Mechanical** – in essence, these are potentiometers,
  * **Optical** – uses an opto-pair to detect a signal reflected from a rotating disk (mounted on the shaft), or light passing through dedicated gaps in the disk, thus providing a series of impulses while the shaft is rotating;
  * **On- and off-axis magnetic** – these types of sensors use stationary or rotary magnets and exploit the Hall effect to sense changes in a magnetic field.
Using one of these designs, absolute encoders encode every single rotation angle with a unique code, while incremental encoders produce a series of impulses. In the case of incremental encoding, quadrature A/B phase shifts are usually used to determine both direction and displacement, as the sketch below illustrates.
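The sketch below is a minimal illustration of decoding quadrature A/B samples into a signed tick count and converting ticks into linear wheel displacement. The encoder resolution, wheel radius, and function names are assumed example values, not taken from any specific encoder.

<code python>
# Sketch of incremental (quadrature A/B) decoding.
# Resolution and wheel radius are assumed example values.
import math

TICKS_PER_REV = 1024    # encoder resolution (assumed)
WHEEL_RADIUS_M = 0.05   # wheel radius in metres (assumed)

# Valid quadrature transitions: (previous AB state, new AB state) -> +1 / -1
TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def count_ticks(ab_samples):
    """Accumulate a signed tick count from a sequence of (A, B) samples."""
    ticks = 0
    prev = ab_samples[0][0] << 1 | ab_samples[0][1]
    for a, b in ab_samples[1:]:
        state = a << 1 | b
        ticks += TRANSITIONS.get((prev, state), 0)  # 0: no change or invalid jump
        prev = state
    return ticks

def ticks_to_distance(ticks):
    """Convert ticks to linear wheel displacement (no wheel slip assumed)."""
    revolutions = ticks / TICKS_PER_REV
    return revolutions * 2.0 * math.pi * WHEEL_RADIUS_M

if __name__ == "__main__":
    forward = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]  # one full quadrature cycle
    print(count_ticks(forward))                          # -> 4 ticks
    print(ticks_to_distance(count_ticks(forward)))       # -> small forward displacement
</code>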
However, in reality, a predefined reliable map is very rarely available. Therefore, the map has to be constructed during the exploration of the environment,
The overall process might be split into several steps:
  * Sensing the environment before any action is executed. This helps to acquire the first data and possibly find some distinctive features, such as corners in an office or crossings in an open-traffic application;
  * Execution of motion, which provides motion data from the IMU, rotary encoders, or other internal sensors that report the actual motion of the system. One might imagine this step as moving for a short enough time with closed eyes;
  * Location calculation and map update – this step is the most complicated, since it combines the previously acquired map data with the sensed motion data and uses them to update the map. A simplified sketch of the whole loop follows below.
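A highly simplified sketch of this sense–move–update loop is given below. It uses plain dead reckoning for the motion step and stores detected features in a dictionary; real SLAM systems replace both with probabilistic estimators (e.g. an EKF or a particle filter), so all function names, the feature format, and the noise-free motion here are illustrative assumptions.

<code python>
# Highly simplified sense-move-update loop in the spirit of the steps above.
# Dead reckoning is used instead of a probabilistic filter; everything here
# (function names, feature format, noise-free motion) is an illustrative assumption.
import math

def predict_pose(pose, distance, d_heading):
    """Dead reckoning: update (x, y, heading) from odometry increments."""
    x, y, heading = pose
    heading += d_heading
    return (x + distance * math.cos(heading),
            y + distance * math.sin(heading),
            heading)

def update_map(world_map, pose, observations):
    """Place each observed feature (given in the robot frame) into the world frame."""
    x, y, heading = pose
    for feature_id, (rx, ry) in observations.items():
        wx = x + rx * math.cos(heading) - ry * math.sin(heading)
        wy = y + rx * math.sin(heading) + ry * math.cos(heading)
        world_map[feature_id] = (wx, wy)
    return world_map

pose, world_map = (0.0, 0.0, 0.0), {}
world_map = update_map(world_map, pose, {"corner_1": (2.0, 1.0)})    # 1. sense
pose = predict_pose(pose, distance=1.0, d_heading=math.radians(15))  # 2. move "with closed eyes"
world_map = update_map(world_map, pose, {"corner_2": (1.5, -0.5)})   # 3. locate and update the map
print(pose, world_map)
</code>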