The UKU's main sensor is a lidar. Lidars are very good sensors for sensing the surroundings, but they are also relatively expensive, and their price limits the use of capable lidars on personal robots. UKU uses a 3D lidar, the //Velodyne VLP-16//. It has 16 channels and a 360° viewing angle, meaning it emits 16 vertically stacked laser beams in every horizontal direction. This lidar is also used by most self-driving cars and other sophisticated self-driving robot systems.
  
{{:et:ros:simulations:vlp.jpg?300|}}
  
{{:et:ros:simulations:lidarspec.png?900|}}
  
Using its laser beams, the lidar obtains a point cloud of the surroundings, which is very useful for autonomous navigation. The more channels the lidar has and the higher its resolution, the more accurate the point cloud will be.
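To make this concrete, here is a small plain-Python sketch of how one revolution of 16-channel returns becomes a Cartesian point cloud. This is not the actual VLP-16 driver output; the fixed 5 m range below is a made-up value for illustration, and only the elevation angles come from the lidar's public specification.

```python
import math

# The VLP-16 fires 16 beams at fixed elevation angles from -15 deg to
# +15 deg, 2 deg apart (per the published spec sheet).
ELEVATIONS_DEG = [-15 + 2 * i for i in range(16)]

def return_to_point(range_m, azimuth_deg, elevation_deg):
    """Convert one laser return (spherical) to Cartesian x, y, z."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# One full revolution at 1 deg azimuth resolution: 360 * 16 points.
# The constant 5.0 m range is fake data standing in for real returns.
cloud = [return_to_point(5.0, az, el)
         for az in range(360)
         for el in ELEVATIONS_DEG]
print(len(cloud))  # 5760 points per revolution
```

Each extra channel multiplies the number of points per revolution, which is why a 16-channel lidar produces a much denser cloud than a single-channel one.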
  
{{:et:ros:simulations:lidardrive.gif|}}
  
The //Velodyne VLP-16// lidar on top of the UKU robot is expensive; simpler and cheaper lidars can be used in less demanding applications. One of the most popular cheaper lidars is [[https://www.slamtec.com/en/Lidar/A3|RPlidar]]. The //RPlidar A1// costs around 100 € and the //RPlidar A3// around 600 €. Lidars differ mainly in their resolution and measurement frequency.
  
{{:et:ros:simulations:rplidar1.jpg?400|}}
{{:et:ros:simulations:rplidar3.jpg?450|}}
  
//RPlidar// has a 360° field of view but only one channel, so its output is two-dimensional. Such lidars are a good choice for navigating indoors, where walls and other vertical objects can be reliably detected.
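As a sketch of what such a single-channel scan looks like in software, the plain-Python function below converts a 2D scan to (x, y) points. The field layout loosely mirrors a ROS //sensor_msgs/LaserScan// (angle_min, angle_increment, ranges), but the scan data itself is made up for illustration.

```python
import math

def scan_to_points(ranges, angle_min, angle_increment):
    """Convert a 2D lidar scan (ranges in metres) to (x, y) points."""
    points = []
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue  # no return on this beam (nothing in range)
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Fake scan: four beams, 90 degrees apart, starting straight ahead.
pts = scan_to_points([1.0, 2.0, float('inf'), 0.5], 0.0, math.pi / 2)
print(len(pts))  # 3 -- the beam with no return is dropped
```

Because there is only one channel, every point lies in the same horizontal plane; tall obstacles like walls show up clearly, but anything above or below the scan plane is invisible.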
  
{{:et:ros:simulations:2dlidar.gif|}}
  
  
Let's start by cloning the repository:
  
   $ sudo apt install git
   $ git clone http://gitlab.robolabor.ee/ingmar05/uku_simulation_ws

Install the required libraries and compile:
  
   $ cd uku_simulation_ws
   $ rosdep install --from-paths src --ignore-src -r -y
   $ catkin_make
   $ source devel/setup.bash

So that we don't have to source the workspace manually every time we open a new terminal window, we can add the following line to the //.bashrc// file:
  
   $ echo "source ~/uku_simulation_ws/devel/setup.bash" >> ~/.bashrc
        
To avoid having to manually launch multiple nodes each time with the correct parameters, launch files (//.launch//) have been created. Using these, you can start //roscore// and several different nodes with the correct parameters with a single command. To run all the nodes necessary to simulate the robot, a launch file called //uku_vehicle.launch// has been created.
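For orientation, a launch file is just an XML list of nodes and parameters to start. The fragment below is a hypothetical minimal example; the node, parameter, and file names are illustrative, not the actual contents of //uku_vehicle.launch//.

```xml
<launch>
  <!-- roslaunch starts roscore automatically if it is not running. -->
  <!-- Hypothetical node entry: package, executable, node name. -->
  <node pkg="ackermann_drive_teleop" type="keyop.py" name="keyop"
        output="screen"/>

  <!-- One launch file can also pull in another. -->
  <include file="$(find uku_vehicle_gazebo)/launch/uku_vehicle.launch"/>
</launch>
```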
Run the simulation:
  
   $ roslaunch uku_vehicle_gazebo uku_vehicle.launch

The //Gazebo// simulator and //Rviz// should now open. The UKU robot should be visible in the //Gazebo// virtual world.
  
Using the //Gazebo// interface, we can add objects to the virtual world.
  
{{:et:ros:simulations:giphy.gif|}}
  
A remote control node for the robot is also included in the workspace.
Launch the remote control node in another window:
  
   $ rosrun ackermann_drive_teleop keyop.py

Now we should be able to control the robot with the keyboard.
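The teleop node publishes Ackermann drive commands, i.e. a speed and a steering angle, the way a car is driven. As a back-of-the-envelope illustration of what a given steering angle means for such a robot, the turning radius follows from simple bicycle-model geometry. The 0.6 m wheelbase below is a made-up value, not the real UKU dimension.

```python
import math

def turning_radius(wheelbase_m, steering_angle_rad):
    """Turning radius of an Ackermann-steered vehicle (bicycle model)."""
    if steering_angle_rad == 0.0:
        return math.inf  # zero steering angle means driving straight
    return wheelbase_m / math.tan(steering_angle_rad)

# Hypothetical 0.6 m wheelbase, 20 degree steering command.
r = turning_radius(0.6, math.radians(20))
print(round(r, 2))  # 1.65 (metres)
```

This is why an Ackermann-steered robot, unlike a differential-drive one, cannot turn in place: its minimum turning radius is bounded by the wheelbase and the maximum steering angle.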
  
The launch file also started the //Rviz// visualization tool. When we open the //Rviz// window, we see the 3D lidar image and a robot model.
  
{{:et:ros:simulations:gazebo480.gif|}}
en/ros/simulations/uku.txt · Last modified: 2020/03/23 10:59 by tomykalm