1 of 58

SLAM & Transforms in ROS
EROS4PRO Training: Day 3

Veiko Vunder

16.06.2021, Tartu, Estonia

Copyright 2021, University of Tartu, Licence CC BY-ND-NC

2 of 58

Agenda: Day 3 (16.06)

  • 09:15 Transforms in ROS
  • 09:45 Workshop:
    • Day 2 catch up
    • ROS Android Sensors Driver
    • static TF, broadcaster programming
  • 12:00 Lunch Break
  • 12:45 Localization, Mapping, SLAM, Navigation with Path Planning
  • 13:15 Workshop
    • 2D mapping and navigation in Gazebo simulation
      • * Action Client programming
    • 2D mapping and navigation with Robotont
    • 3D mapping on Robotont
  • 16:00 End of Day 3


3 of 58

Announcements

Dinner: Thursday (17.06) at 19:15

@Raekojaplats 16 (Cafe Truffe)


4 of 58

What are transforms?


5 of 58

Terminology

  • We will make frequent use of coordinate reference frames or simply frames.

  • A coordinate reference frame i consists of an origin (Oi) and a triad of mutually orthogonal basis vectors (xi, yi, zi) that are all fixed within a particular body.

  • The pose of a body is always expressed relative to some other body, so it can be expressed as the pose of one coordinate frame relative to another.

  • Similarly, rigid-body displacements can be expressed as displacements between two coordinate frames, one of which may be referred to as moving, while the other may be referred to as fixed.


6 of 58

Position and displacement

  • The position of the origin of coordinate frame i relative to coordinate frame j can be denoted by a 3×1 vector, as written out below.


[Figure: two coordinate frames, i and j, with origins Oi and Oj]

A translation is a displacement in which no point in the rigid body remains in its initial position and all straight lines in the rigid body remain parallel to their initial orientations.

Any representation of position can be used to create a representation of displacement, and vice versa.
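In one common notation (an assumption here; conventions vary between textbooks), this vector carries a leading superscript for the reference frame:

{}^{j}\mathbf{p}_{i} = \begin{bmatrix} {}^{j}p_{i,x} & {}^{j}p_{i,y} & {}^{j}p_{i,z} \end{bmatrix}^{T}

i.e. the vector from Oj to Oi, expressed in the basis (xj, yj, zj) of frame j.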


7 of 58

Rotation and orientation


A rotation is a displacement in which at least one point of the rigid body remains in its initial position.

As in the case of position and translation, any representation of orientation can be used to create a representation of rotation, and vice versa.


8 of 58

Representing rotation and orientation

  • Rotation matrix

  • Euler angles – rotations relative to moving frame (order matters!)

  • Fixed angles, e.g., roll-pitch-yaw (RPY) – rotations relative to fixed frame (order matters!)

  • Angle-axis – a single angle θ about a unit axis w, denoted θw or (θwx, θwy, θwz)

  • Quaternions
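As a concrete bridge between two of these representations (a standard identity, written here in the x, y, z, w component order that ROS messages use), a rotation by angle θ about a unit axis w corresponds to the quaternion

q = \left( w_x \sin\tfrac{\theta}{2},\; w_y \sin\tfrac{\theta}{2},\; w_z \sin\tfrac{\theta}{2},\; \cos\tfrac{\theta}{2} \right)

so the identity rotation (θ = 0) is (0, 0, 0, 1). This is also why a zero-initialized orientation with w = 0 is not a valid quaternion.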


9 of 58

TF in ROS


10 of 58

Frames in ROS

  • Each part of the robot should have a reference coordinate frame attached to it
  • These frames are used to determine the pose of each part relative to the other frames


11 of 58

Static transforms

  • Transform should not change during operation
  • e.g. computer -> base_link


12 of 58

Dynamic transforms

  • Transform can change during operation
  • e.g. base_link -> map


13 of 58

TF tree

  • The TF tree shows all current frames and the transforms that connect them
  • Using rqt (the rqt_tf_tree plugin), we can visualize the current TF tree


14 of 58

Rotations/orientations in ROS: RPY (roll-pitch-yaw)

  • ROS uses quaternions [but RPY is also OK, as long as you know what you are doing☺]
  • RPY (roll-pitch-yaw)
    • Fairly intuitive
    • Originates from aerospace
      • ROLL – rotation about the axis from nose to tail
      • PITCH – nose up/down
      • YAW – nose left/right
    • Axes move with the aircraft, relative to Earth
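A minimal sketch of converting between RPY and quaternions with the tf2 math types (the angle values are arbitrary examples):

#include <cmath>
#include <cstdio>
#include <tf2/LinearMath/Quaternion.h>
#include <tf2/LinearMath/Matrix3x3.h>

int main()
{
  // RPY (radians) -> quaternion
  tf2::Quaternion q;
  q.setRPY(0.0, 0.0, M_PI / 2.0);   // 90 degree yaw, no roll or pitch

  // quaternion -> RPY, via a rotation matrix
  double roll, pitch, yaw;
  tf2::Matrix3x3(q).getRPY(roll, pitch, yaw);

  std::printf("quaternion: x=%.3f y=%.3f z=%.3f w=%.3f\n",
              q.x(), q.y(), q.z(), q.w());
  std::printf("rpy: roll=%.3f pitch=%.3f yaw=%.3f\n", roll, pitch, yaw);
  return 0;
}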


15 of 58

geometry_msgs/Pose

  • The position and orientation of a rigid body in space are collectively termed the pose.

  • ROS has a geometry_msgs/Pose message type that consists of
    • geometry_msgs/Point position
      • float64 x
      • float64 y
      • float64 z
    • geometry_msgs/Quaternion orientation
      • float64 x
      • float64 y
      • float64 z
      • float64 w


16 of 58

geometry_msgs/PoseStamped

  • std_msgs/Header header
    • uint32 seq
    • time stamp
    • string frame_id
  • geometry_msgs/Pose pose
    • geometry_msgs/Point position
    • geometry_msgs/Quaternion orientation
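A minimal sketch of filling and publishing a PoseStamped (the topic name, frame, and coordinates are arbitrary examples):

#include <ros/ros.h>
#include <geometry_msgs/PoseStamped.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "pose_stamped_example");
  ros::NodeHandle nh;
  ros::Publisher pub = nh.advertise<geometry_msgs::PoseStamped>("example_pose", 1);

  geometry_msgs::PoseStamped p;
  p.header.frame_id = "map";      // frame the pose is expressed in
  p.pose.position.x = 1.0;
  p.pose.position.y = 2.0;
  p.pose.position.z = 0.0;
  p.pose.orientation.w = 1.0;     // identity orientation; w must not stay 0

  ros::Rate rate(1.0);
  while (nh.ok())
  {
    p.header.stamp = ros::Time::now();  // when this pose is valid
    pub.publish(p);
    rate.sleep();
  }
  return 0;
}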


17 of 58

tf2 tutorials

  • http://wiki.ros.org/tf2/Tutorials
  • Good tutorials that cover all the basics


18 of 58

tf2_ros library examples

  • void tf2::Quaternion::setRPY (const tf2Scalar &roll, const tf2Scalar &pitch, const tf2Scalar &yaw)
  • Quaternion& tf2::Quaternion::normalize ()
  • void tf2_ros::TransformBroadcaster::sendTransform (const geometry_msgs::TransformStamped &transform)


19 of 58

Using static_transform_publisher

  • Used to quickly define a static transform between two frames
  • Located in ROS package tf2_ros
  • Syntax: static_transform_publisher x y z yaw pitch roll frame_id child_frame_id
  • Note the argument order: yaw pitch roll, not roll pitch yaw
  • e.g. rosrun tf2_ros static_transform_publisher 0 0 1 0 0 0 base_link camera


20 of 58

Using static_transform_publisher in roslaunch

<launch>

<node pkg="tf2_ros" type="static_transform_publisher" name="my_tf_broadcaster" args="0 0 1 0 0 0 base_link camera" />

</launch>


21 of 58

Timing with transforms

  • In some cases, the timing of transforms is very important
    • mapping
    • camera calibration
  • Example:
    • Camera, laser scan and odometry coming from the robot
    • Mapping software running on an external PC
    • If the clocks are not synchronized, mapping will fail
    • Use PTP or NTP (for example with chrony) to synchronize the clocks
    • ...or run time-critical tasks on the same hardware if possible


22 of 58

Writing a transform broadcaster in C++
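A minimal sketch of such a broadcaster, along the lines of the tf2 tutorials (the frame names, offsets and rate are example assumptions):

#include <ros/ros.h>
#include <tf2_ros/transform_broadcaster.h>
#include <tf2/LinearMath/Quaternion.h>
#include <geometry_msgs/TransformStamped.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "my_tf_broadcaster");
  ros::NodeHandle nh;

  tf2_ros::TransformBroadcaster br;
  ros::Rate rate(10.0);                // broadcast at 10 Hz

  while (nh.ok())
  {
    geometry_msgs::TransformStamped t;
    t.header.stamp = ros::Time::now();
    t.header.frame_id = "base_link";   // parent frame
    t.child_frame_id = "camera";       // child frame

    // Example: camera mounted 1 m above base_link, no rotation.
    t.transform.translation.x = 0.0;
    t.transform.translation.y = 0.0;
    t.transform.translation.z = 1.0;

    tf2::Quaternion q;
    q.setRPY(0.0, 0.0, 0.0);
    q.normalize();                     // keep it a unit quaternion
    t.transform.rotation.x = q.x();
    t.transform.rotation.y = q.y();
    t.transform.rotation.z = q.z();
    t.transform.rotation.w = q.w();

    br.sendTransform(t);               // publish on /tf
    rate.sleep();
  }
  return 0;
}

The result can be checked in rviz with the TF display, or with the rqt TF tree view mentioned earlier.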


23 of 58

Hands-on time ...


24 of 58


25 of 58

Localization


26 of 58

Position and sensing

  • Dead reckoning
    • e.g. wheel odometry, IMU
    • subject to cumulative error (e.g. encoder values keep increasing while the wheels slip)
  • Sensing
    • e.g. camera, LiDAR, sonar
    • the problem is perceptual aliasing: two different places can look the same


27 of 58

Localization

  • Assume a known map
  • Dead reckoning
    • Start from a known place
    • Localization error increases over time
  • Use landmarks and reference them to a known map


28 of 58

Localization illustrated


29 of 58

Localization illustrated


30 of 58

Localization illustrated


31 of 58

Localization illustrated


32 of 58

Localization TF

  • map->odom->base_link
  • Dead reckoning provides odom->base_link
  • The map->odom transform corrects the dead-reckoning drift
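A minimal listener sketch that asks tf2 for the combined map -> base_link transform (frame names follow the convention above; the node name and rate are example assumptions):

#include <ros/ros.h>
#include <tf2_ros/transform_listener.h>
#include <tf2/exceptions.h>
#include <geometry_msgs/TransformStamped.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "map_pose_listener");
  ros::NodeHandle nh;

  tf2_ros::Buffer buffer;
  tf2_ros::TransformListener listener(buffer);  // fills the buffer from /tf

  ros::Rate rate(1.0);
  while (nh.ok())
  {
    try
    {
      // Chains map->odom and odom->base_link into map->base_link.
      geometry_msgs::TransformStamped t =
          buffer.lookupTransform("map", "base_link", ros::Time(0));
      ROS_INFO("Robot in map frame: x=%.2f y=%.2f",
               t.transform.translation.x, t.transform.translation.y);
    }
    catch (tf2::TransformException& ex)
    {
      ROS_WARN("%s", ex.what());  // transform not (yet) available
    }
    rate.sleep();
  }
  return 0;
}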


33 of 58

Bayesian Filter

  • Two main steps:
    • Prediction and update
  • The prediction step uses the previous state estimate, the control input (e.g. odometry), and a physical motion model to predict where the robot will be at the next time step
  • The update step incorporates the current sensor observation to correct that prediction
  • https://github.com/rlabbe/Kalman-and-Bayesian-Filters-in-Python
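Written out in standard textbook notation (belief bel, control u_t, measurement z_t; this is the general Bayes-filter recursion, not something specific to the linked repository):

Prediction:  \overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1})\, bel(x_{t-1})\, dx_{t-1}

Update:      bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t)

where \eta is a normalizing constant. The Kalman filter and the particle filter (used by AMCL later in these slides) are special cases of this recursion.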


34 of 58

Bayesian filter in localization


35 of 58

Mapping


36 of 58

Robotont sensors

  • Depth camera
  • Wheel odometry
  • 2D scan from depth camera


37 of 58

General mapping

  • The robot knows where it is but doesn’t know where anything else is
  • As with localization, the robot searches for landmarks
  • When a landmark is found, the robot saves it in the map
  • To tie all the landmarks together, a loop closure is needed
    • “Oh, I’ve already been here!”


38 of 58

SLAM

Simultaneous Localization and Mapping


39 of 58

Example of SLAM


40 of 58

Visual SLAM

  • Uses only cameras and visual information
  • Creates a depth map using one or more cameras
  • Lidars are often combined with visual SLAM to increase accuracy
    • Tesla claims that they don’t need lidars


41 of 58

Common ROS SLAM packages


42 of 58

Selection of ROS mapping algorithms


43 of 58

Example maps


44 of 58

amcl

  • http://wiki.ros.org/amcl
  • Adaptive Monte Carlo Localization
  • Localization against a known map


45 of 58

RTABMap

  • http://wiki.ros.org/rtabmap_ros
  • Currently one of the most widely used packages for 3D mapping


46 of 58

orb_slam2_ros


47 of 58

Costmaps

  • A costmap shows where it is safe for the robot to be
  • Usually, costmaps are binary occupancy grids
  • Advanced costmaps can also be non-binary
    • Each part of the map has a different cost depending on how difficult it is to travel through
    • e.g. different terrains have different “costs” (see the toy example below)
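A toy sketch of a non-binary grid using the nav_msgs/OccupancyGrid message (the real costmap_2d implementation uses its own cost scale; the sizes, values and topic name here are arbitrary examples):

#include <ros/ros.h>
#include <nav_msgs/OccupancyGrid.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "toy_costmap");
  ros::NodeHandle nh;
  // Latched publisher: late subscribers still receive the last grid.
  ros::Publisher pub = nh.advertise<nav_msgs::OccupancyGrid>("toy_costmap", 1, true);

  nav_msgs::OccupancyGrid grid;
  grid.header.frame_id = "map";
  grid.header.stamp = ros::Time::now();
  grid.info.resolution = 0.05;            // 5 cm per cell
  grid.info.width = 10;
  grid.info.height = 10;
  grid.info.origin.orientation.w = 1.0;   // identity orientation

  // Per-cell values: 0 = free, 100 = occupied, -1 = unknown;
  // values in between can encode how difficult a cell is to traverse.
  grid.data.assign(grid.info.width * grid.info.height, 0);
  grid.data[55] = 100;   // an obstacle cell
  grid.data[56] = 60;    // e.g. rough terrain next to it
  grid.data[0]  = -1;    // an unexplored cell

  pub.publish(grid);
  ros::spin();
  return 0;
}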


48 of 58

Local planner

  • Plans short paths
  • Publishes velocity commands (cmd_vel)
  • Tries to follow the global plan
  • Can avoid obstacles unknown to the global planner
    • e.g. avoiding cars on a street vs. choosing which street to take


49 of 58

Global planner

  • Plans the whole trajectory
    • e.g. GPS navigation
  • The global planner sees the currently assumed world state
  • In a warehouse:
    • Global map could be the layout of the warehouse, not updated often
    • Local map is constantly updated from sensor data to avoid collisions

https://www.researchgate.net/publication/258163012_Spline-Based_RRT_Path_Planner_for_Non-Holonomic_Robots


50 of 58

Steering mechanisms


51 of 58

Ackermann steering

  • Cars


52 of 58

Differential steering

  • Tracked vehicles


53 of 58

Omnidirectional steering

  • Robotont


54 of 58

ROS navigation


55 of 58

Requirements for navigation

  • ROS navigation stack currently supports only differential and omnidirectional steering.
  • For navigation, the robot must:
    • Accept velocity commands (geometry_msgs/Twist on the cmd_vel topic)
    • Publish odometry messages (nav_msgs/Odometry), as sketched below
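A skeleton of what this means on the robot side (topic and frame names follow common conventions; the actual motor control and odometry computation are left out and marked as placeholders):

#include <ros/ros.h>
#include <geometry_msgs/Twist.h>
#include <nav_msgs/Odometry.h>

// Last velocity command received from the navigation stack.
geometry_msgs::Twist last_cmd;

void cmdVelCallback(const geometry_msgs::Twist::ConstPtr& msg)
{
  last_cmd = *msg;  // a real driver would forward this to the motor controllers
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "base_driver_skeleton");
  ros::NodeHandle nh;

  ros::Subscriber cmd_sub = nh.subscribe("cmd_vel", 1, cmdVelCallback);
  ros::Publisher odom_pub = nh.advertise<nav_msgs::Odometry>("odom", 10);

  ros::Rate rate(20.0);
  while (nh.ok())
  {
    nav_msgs::Odometry odom;
    odom.header.stamp = ros::Time::now();
    odom.header.frame_id = "odom";       // odometry frame
    odom.child_frame_id = "base_link";   // robot body frame
    odom.pose.pose.orientation.w = 1.0;  // placeholder pose; real values come from wheel encoders
    odom.twist.twist = last_cmd;         // placeholder twist; echoes the last command

    odom_pub.publish(odom);
    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}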


56 of 58

Common parameters

  • Goal tolerance
  • Maximum and minimum speed
  • Local and global map size
  • Numerous others


57 of 58

ROS navigation packages

  • http://wiki.ros.org/navigation
  • ROS navigation stack
  • If your robot meets the navigation requirements, it can be configured to navigate autonomously with relatively little effort (see the example goal client below)
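Related to the “Action Client programming” task in the agenda, a minimal sketch of sending one navigation goal to move_base through its action interface (the goal coordinates and frame are arbitrary examples):

#include <ros/ros.h>
#include <actionlib/client/simple_action_client.h>
#include <move_base_msgs/MoveBaseAction.h>

typedef actionlib::SimpleActionClient<move_base_msgs::MoveBaseAction> MoveBaseClient;

int main(int argc, char** argv)
{
  ros::init(argc, argv, "simple_navigation_goal");

  // Connect to the move_base action server (true: spin a thread for the client).
  MoveBaseClient ac("move_base", true);
  ac.waitForServer();

  move_base_msgs::MoveBaseGoal goal;
  goal.target_pose.header.frame_id = "map";
  goal.target_pose.header.stamp = ros::Time::now();
  goal.target_pose.pose.position.x = 1.0;   // example goal: 1 m along the map x axis
  goal.target_pose.pose.orientation.w = 1.0;

  ac.sendGoal(goal);
  ac.waitForResult();

  if (ac.getState() == actionlib::SimpleClientGoalState::SUCCEEDED)
    ROS_INFO("Goal reached");
  else
    ROS_WARN("Navigation failed");
  return 0;
}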


58 of 58

3D navigation
