en:av:autonomy_and_autonomous_systems:technology:sensor_technology · last modified 2021/06/16 19:11 by agrisnik
**Pulsed radars** – in the same way as sonars or pulsed Lidars, pulsed radars use the time difference between emitted and received signal pulses, enabling them to estimate the distance to the detected object.
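The pulse time-of-flight principle described above can be sketched in a few lines; the function and its parameter names are illustrative, not tied to any particular device:

```python
# Sketch of pulse time-of-flight ranging, as used by sonars,
# pulsed Lidars, and pulsed radars (illustrative only).

C = 299_792_458.0  # speed of light in vacuum, m/s (radar/Lidar case)

def range_from_time_of_flight(t_round_trip_s: float, wave_speed: float = C) -> float:
    """Distance to the target from the round-trip pulse delay.

    The pulse travels to the object and back, hence the division by 2.
    """
    return wave_speed * t_round_trip_s / 2.0

# A 1 microsecond round trip corresponds to roughly 150 m.
print(range_from_time_of_flight(1e-6))
```

For a sonar the same function applies with `wave_speed` set to the speed of sound in the medium (about 343 m/s in air).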
  
**Frequency Modulated Continuous Wave (FMCW) radars** use a frequency-modulated signal, which might vary from 30 GHz to 300 GHz. The emitted signal is mixed with the received signal to produce a so-called intermediate frequency (IF) signal. The IF signal is used to estimate object range, speed, and direction. Dedicated high-resolution FMCW radars are used to obtain radar images, enabling not only detection but also recognition of the detected objects. Such radars are sometimes called broad-band radars or imaging radars. Currently, broad-band radars are mainly used in combination with multiple receiving antennas, enabling effective operation at different frequencies.
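The range estimate hidden in the IF signal can be made concrete for the common case of a linear chirp. For a sweep of bandwidth B over time T, the beat frequency is f_IF = (B/T) · (2R/c), so R = c · f_IF · T / (2B). The sketch below assumes illustrative parameter values, not a specific radar:

```python
# Sketch of FMCW range estimation from the intermediate frequency (IF).
# Linear chirp: slope = bandwidth / sweep_time; the target delay 2R/c
# shifts the received chirp, producing a beat frequency proportional to R.

C = 299_792_458.0  # speed of light, m/s

def fmcw_range(f_if_hz: float, bandwidth_hz: float, sweep_time_s: float) -> float:
    """Target range implied by the measured IF (beat) frequency."""
    return C * f_if_hz * sweep_time_s / (2.0 * bandwidth_hz)

# Example: 1 GHz sweep over 100 us with a 100 kHz beat -> about 1.5 m.
print(fmcw_range(100e3, 1e9, 100e-6))
```

Note how a wider bandwidth shrinks the range per hertz of IF, which is exactly why broad-band (imaging) radars achieve higher range resolution.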
  
<figure>
  
Unfortunately, the mentioned systems suffer from several significant disadvantages:
  * Motion blur – caused by motion and sensor sensitivity; the lower the sensitivity, the stronger the blur effects that might be observed. Blurred images decrease the precision of object detection and distance estimation;
  * Lens distortion – distorts images in an unpredictable way as a result of manufacturing imperfections;
  * Frames per second – the fewer frames per second, the less accurate the derived estimates will be;
  * Changes of light conditions from one frame to another, which complicates the overall processing. One of the obvious results is changes in colors, which reduces the usability of the captured frames.
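Lens distortion, in particular, is usually modeled (and then undone during calibration) with a polynomial in the distance from the optical axis. A minimal sketch of the radial part of the widely used Brown–Conrady model follows; the coefficients k1 and k2 are assumed illustrative values, not measurements of a real camera:

```python
# Sketch of the radial term of the Brown-Conrady lens distortion model.
# Coefficients k1, k2 are illustrative assumptions, not calibrated values.

def distort(x: float, y: float, k1: float = -0.2, k2: float = 0.05):
    """Map an ideal normalised image point (x, y) to its distorted position."""
    r2 = x * x + y * y                      # squared distance from the axis
    factor = 1.0 + k1 * r2 + k2 * r2 * r2   # radial scaling polynomial
    return x * factor, y * factor

# The centre of the image is unaffected; off-axis points are displaced.
print(distort(0.0, 0.0))
print(distort(0.5, 0.5))
```

Camera calibration estimates k1, k2 (plus tangential terms) so that the mapping can be inverted and the image undistorted before object detection.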
  
</figure>
  
The image is taken from an experimental agriculture robot that uses multiple time-synchronized sensors – a single RGB camera (upper-left), an event-based camera (upper-right – red pixels mark increasing intensity, blue ones decreasing), a Lidar (lower-left), and a stereo vision camera (lower-right). Image and video produced in ERDF 1.1.1.2 “Post-doctoral Research Aid”, project num. FLPP Lzp-2018/1-0482.
  
==== Inertial Measurement Unit (IMU) ====
Rotary encoders are widely used in ground systems, providing an additional, relatively precise and reliable source of displacement estimates. The main purpose of the sensor is to provide output data on wheel or shaft angular displacement. There are two main types of rotary encoders – absolute and incremental ((https://www.thegeekpub.com/245407/how-rotary-encoders-work-electronics-basics/)).
As with all sensors, rotary encoders come in several main technologies:
  * **Mechanical** – in essence, these are potentiometers, enabling one or several full rotations to be encoded as a continuous output signal. Due to the building principle used, the sensor’s main disadvantage is wearing out because of internal friction;
  * **Optical** – uses an opto-pair to detect either a signal reflected from a rotating disk (mounted on the shaft) or light passing through dedicated gaps in the disk, thus providing a series of impulses while the shaft is rotating;
  * **On- and off-axis magnetic** – these types of sensor use stationary or rotating magnets and exploit the Hall effect to sense changes in a magnetic field.
Using one of these designs, absolute sensors encode every single rotation angle with a unique code, while incremental sensors produce a series of impulses. In the case of incremental encoding, quadrature A/B phase shifts are usually used to determine both direction and displacement.
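The quadrature idea can be sketched as a small state machine: channels A and B are 90° out of phase, so the order in which their edges arrive reveals the direction of rotation. The transition table and sample sequence below are illustrative:

```python
# Sketch of incremental quadrature (A/B) decoding. The four (A, B) states
# form a Gray-code cycle; each valid transition contributes +1 or -1 count
# depending on direction. Unknown transitions (noise, missed sample) give 0.

TRANSITIONS = {
    ((0, 0), (0, 1)): +1, ((0, 1), (1, 1)): +1,
    ((1, 1), (1, 0)): +1, ((1, 0), (0, 0)): +1,
    ((0, 0), (1, 0)): -1, ((1, 0), (1, 1)): -1,
    ((1, 1), (0, 1)): -1, ((0, 1), (0, 0)): -1,
}

def count_steps(samples):
    """Accumulate signed steps from a sequence of (A, B) samples."""
    position = 0
    previous = samples[0]
    for current in samples[1:]:
        position += TRANSITIONS.get((previous, current), 0)
        previous = current
    return position

# One full forward Gray-code cycle yields +4 counts.
print(count_steps([(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]))
```

Dividing the accumulated count by the encoder's counts-per-revolution gives the angular displacement, which wheel radius then converts into linear travel.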
However, in reality, a predefined reliable map is very rarely available. Therefore, the map has to be constructed during exploration of the environment, i.e. the vehicle simultaneously constructs the map and finds its position on it. This process is known as Simultaneous Localization And Mapping (SLAM). Depending on the sensors used to acquire data about the environment, as well as on the available computational resources, there is a rather rich palette of algorithms. Most of them employ some kind of data approximation to tackle the problem, which is clearly a chicken-and-egg problem (what comes first – the map or the location on the map?).
The overall process might be split into several steps:
  * Sensing the environment before any action is executed. This helps to acquire the first data and possibly to find some distinctive features, such as corners in an office or crossings in an open-traffic application;
  * Execution of motion, which provides motion data from the IMU, rotary encoders, or other internal sensors that report the actual motion of the system. One might imagine this step as moving for a short enough time with closed eyes;
  * Location calculation and map update – this step is the most complicated, since it combines the previously acquired map data with the sensed motion data and uses them to update the map.
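The three steps above can be sketched as a minimal 1-D sense–move–update loop. This is only a toy illustration of the structure: the "map" is a dictionary of landmark positions, the update is a plain average, and the noise handling that real SLAM performs with Kalman or particle filters is omitted. All names and numbers are illustrative:

```python
# Minimal 1-D sketch of the sense -> move -> update loop behind SLAM.
# Real SLAM replaces the naive averaging below with probabilistic filtering.

def slam_sketch(motions, observations):
    """motions[i]: commanded displacement for step i;
    observations[i]: dict of landmark_id -> measured distance ahead of the
    robot after that motion."""
    pose = 0.0          # believed robot position (dead reckoning)
    landmarks = {}      # the map built so far: id -> estimated position
    for move, seen in zip(motions, observations):
        pose += move                      # "closed eyes" motion step
        for lm_id, rng in seen.items():
            estimate = pose + rng         # where the landmark appears to be
            if lm_id in landmarks:        # refine the map: average estimates
                landmarks[lm_id] = (landmarks[lm_id] + estimate) / 2.0
            else:                         # extend the map with a new landmark
                landmarks[lm_id] = estimate
    return pose, landmarks

pose, world = slam_sketch([1.0, 1.0], [{"tree": 4.0}, {"tree": 3.0}])
print(pose, world)
```

Even this toy version shows the chicken-and-egg nature of the problem: the landmark estimate depends on the pose, while in a full SLAM system the pose correction in turn depends on the previously mapped landmarks.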
  
CC Attribution-Share Alike 4.0 International