A differential drive robot controlled with ROS2 Humble running on a Raspberry Pi 4 (running Ubuntu Server 22.04). The vehicle is equipped with a Raspberry Pi camera for visual feedback and an RPLIDAR A1 sensor used for Simultaneous Localization and Mapping (SLAM) and autonomous navigation with the Nav2 stack. Additionally, an MPU6050 inertial measurement unit (IMU) is employed by the `robot_localization` package on the robot: an Extended Kalman Filter (EKF) node fuses the IMU data with the wheel encoder data to provide more accurate robot odometry estimates.

Hardware components are written for the Waveshare Motor Driver HAT and the MPU6050 sensor so that they can be accessed by the `ros2_control` differential drive controller and IMU sensor broadcaster respectively, via the `ros2_control` resource manager.
A preprint of this work is available here.
The lidarbot repository contains the following ROS2 packages:

- `lidarbot_aruco`: Contains configuration, launch and node files to use ArUco markers with lidarbot.
- `lidarbot_base`: Contains the ROS2 control hardware component for the lidarbot with low-level code for the Waveshare Motor Driver HAT.
- `lidarbot_bringup`: Contains the hardware component for the MPU6050 module, and launch files to bring up the camera, lidar and the real lidarbot.
- `lidarbot_description`: Contains the URDF description files for lidarbot, sensors and `ros2_control`.
- `lidarbot_gazebo`: Contains configuration, launch and world files needed to simulate lidarbot in Gazebo Classic.
- `lidarbot_gz`: Contains URDF, launch and world files needed to simulate lidarbot in Gazebo Fortress.
- `lidarbot_navigation`: Contains launch, configuration and map files used for lidarbot navigation.
- `lidarbot_slam`: Contains configuration files for slam_toolbox and RViz, and a launch file to generate maps using SLAM.
- `lidarbot_teleop`: Contains configuration and launch files used to enable joystick control of the lidarbot in simulation and physically.
The following components were used in this project:
No. | Part |
---|---|
1 | Raspberry Pi 4 (4 GB) |
2 | SanDisk 32 GB SD Card (minimum) |
3 | Two wheel drive robot chassis kit |
4 | Waveshare Motor Driver HAT |
5 | 2 x Motors with encoders and wire harness |
6 | MPU6050 board |
7 | RPlidar A1 |
8 | Raspberry Pi camera v1.3 |
9 | 3D printed stands for RPlidar A1 and RPi 4 |
10 | Mount for Raspberry Pi camera |
11 | Powerbank for RPi 4 (minimum output: 5V 3A) |
12 | Gamepad |
13 | Mini Travel Router (optional) |
14 | 3 Slot 18650 battery holder |
15 | 3 x 18650 batteries to power Motor Driver HAT |
16 | Female to Female Dupont jumper cables |
17 | Spare wires |
18 | Logitech C270 webcam (optional) |
Some other tools or parts used in the project are as follows:
No. | Tool/Part |
---|---|
1 | Soldering iron |
2 | 3D printer |
3 | Screwdriver set |
4 | Double-sided tape |
Additionally, some nylon stand-offs were used between the Raspberry Pi 4 and its 3D printed stand to make it easier to plug in the USB power cable from the powerbank.
The electronic components of the lidarbot are connected as shown below.
The MPU6050 board pins were connected to the Raspberry Pi 4 GPIO pins as follows for use with the I2C communication protocol:
MPU6050 board | GPIO.BOARD | GPIO.BCM |
---|---|---|
VCC | 3.3V | 3.3V |
GND | GND | GND |
SCL | 05 | GPIO03 |
SDA | 03 | GPIO02 |
In a previous hardware version of lidarbot, photo interrupters were used with the 20-slot encoder disks included in the robot chassis kit. However, this setup proved too restrictive to yield satisfactory navigation results because of the low encoder resolution --- only 20 ticks for a 360 degree turn of the motor. The robot chassis kit motors were therefore replaced with the ones below, which have built-in encoders yielding approximately 1084 ticks per 360 degree turn of the motor, calculated using this guide from Automatic Addison.
These are the new motors used for lidarbot:
The pins on the right and left motors are connected to the GPIO pins as follows:
Right motor pins | GPIO.BOARD | GPIO.BCM |
---|---|---|
C1 (or Encoder A) | 22 | GPIO25 |
C2 (or Encoder B) | 16 | GPIO23 |
VCC | 5V | 5V |
GND | GND | GND |
Left motor pins | GPIO.BOARD | GPIO.BCM |
---|---|---|
C1 (or Encoder A) | 18 | GPIO24 |
C2 (or Encoder B) | 15 | GPIO22 |
VCC | 5V | 5V |
GND | GND | GND |
where:

- C1 counts the respective motor's pulses, and
- C2 checks the direction of motion of the respective motor, that is, forward or reverse.
The screw terminal blocks on the Motor Driver HAT (shown below) are connected to the motor pins, M+ and M-, and battery holder cables as follows:
Motor Driver HAT pin | Connected to |
---|---|
MA1 | Red wire (Left motor) |
MA2 | Black wire (Left motor) |
GND | Black wire (battery holder) |
VIN | Red wire (battery holder) |
MB1 | Red wire (Right motor) |
MB2 | Black wire (Right motor) |
Solder the cables (provided) to the motors. Spare wires might be needed if the provided ones are too short to reach the Motor Driver HAT. Should a wheel move in the direction opposite to what is expected, swap the respective motor cables screwed into the terminal blocks.
Finally, the Raspberry Pi camera is connected to the ribbon slot on the Raspberry Pi 4 and the RPlidar A1 sensor is plugged into one of the RPi 4's USB ports.
A development machine or PC (laptop or desktop) is used to run more computationally intensive applications like Gazebo and Rviz. Additionally, the PC can be used to remotely control lidarbot.
Ubuntu 22.04 LTS is required for this project to work with ROS2 Humble. Ubuntu 22.04 LTS can be installed on a PC by following this guide. The ROS2 Humble installation procedure is available here. The Desktop version is installed on the PC (which includes RViz):
```bash
sudo apt install ros-humble-desktop
```
Then install the ROS development tools:
```bash
sudo apt install ros-dev-tools
```
After the ROS2 Humble installation, create a workspace on the PC/development machine and clone this repository:
```bash
mkdir -p ~/dev_ws/src
cd ~/dev_ws/src
git clone https://github.com/TheNoobInventor/lidarbot.git .
```
Next, install all the ROS dependencies for the lidarbot packages:
```bash
cd ~/dev_ws
sudo rosdep init
rosdep update
rosdep install --from-paths src --ignore-src --rosdistro humble -r -y
```
Any ROS packages referred to subsequently are assumed to be installed with the `rosdep install` command above, unless explicitly stated otherwise.
Two more dependencies need to be met before building the workspace: the WiringPi i2c library, used to access the Raspberry Pi 4 GPIO pins, and the dependencies for the MPU6050 RPi 4 C++ library.

To utilize the GPIO pins of the Raspberry Pi 4 and program them in C/C++, an unofficial WiringPi fork was installed. This is required because the hardware interfaces used by `ros2_control` are currently written only in C++, and low-level communication between Waveshare's Motor Driver HAT and `ros2_control` is needed.
The library is installed by executing the following commands in a terminal:
```bash
cd ~/Downloads
git clone https://github.com/wbeebe/WiringPi.git
cd WiringPi/
./build
```
To check the current gpio version, run:

```bash
gpio -v
```
The reference article for the WiringPi library can be found here.
Alex Mous' C/C++ MPU6050 library for Raspberry Pi 4, with modifications to incorporate quaternions, was used to set up the `ros2_control` IMU sensor broadcaster in the `lidarbot_bringup` package.
Recall that the MPU6050 module uses the I2C communication protocol; the I2C dependencies for this library are installed with:

```bash
sudo apt install libi2c-dev i2c-tools libi2c0
```
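With the module wired as shown earlier and `i2c-tools` installed, the connection can be verified from a terminal; the MPU6050 normally reports at address `0x68` on I2C bus 1 of the Raspberry Pi 4:

```bash
sudo i2cdetect -y 1
```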
To avoid manually sourcing the ROS installation (or underlay) in each terminal window opened, and if ROS2 Humble is the only distribution on the PC, the command to source the underlay is added to the respective shell configuration file.
Using bash:

```bash
echo "source /opt/ros/humble/setup.bash" >> $HOME/.bashrc
```

Using zsh:

```zsh
echo "source /opt/ros/humble/setup.zsh" >> $HOME/.zshrc
```
Additionally, to avoid manually sourcing our workspace (or overlay), add the command to source the workspace to the respective configuration file.
Using bash:

```bash
echo "source ~/dev_ws/install/setup.bash" >> $HOME/.bashrc
source $HOME/.bashrc
```

Using zsh:

```zsh
echo "source ~/dev_ws/install/setup.zsh" >> $HOME/.zshrc
source $HOME/.zshrc
```
The `source $HOME/.zshrc` command sources the configuration file for use in the current terminal; this step is not necessary for terminal windows opened hereafter.
Finally, navigate to the workspace directory and run the build command:
```bash
cd ~/dev_ws
colcon build --symlink-install
```
The `--symlink-install` argument uses symlinks instead of copies, which saves you from having to rebuild every time you tweak certain files.
UPDATE: Previously, Gazebo Classic was used as the simulator for this project, but Gazebo Classic has been deprecated. The Gazebo Classic lidarbot package, `lidarbot_gazebo`, is still available here; however, you might encounter installation issues with Classic. It is recommended to migrate to Gazebo (formerly called "Ignition"), choosing the Fortress release for use with Ubuntu 22.04.
Gazebo Fortress is used as the simulator for lidarbot. It can be installed with ROS following the steps outlined in this guide.
The installed xacro tool dependency is used to process the lidarbot URDF files and combine them into a single complete URDF file.
The `description_launch.py` launch file displays the model in RViz:

```bash
ros2 launch lidarbot_description description_launch.py
```
The joint_state_publisher_gui package is used to bring up a window with sliders to move non-static links in RViz. Set the `use_gui` argument to `true` to turn the left and right wheels of lidarbot:

```bash
ros2 launch lidarbot_description description_launch.py use_gui:=true
```
The different arguments for the launch file, and their default values, can be viewed by adding `--show-args` to the end of the launch command:

```bash
ros2 launch lidarbot_description description_launch.py --show-args
```
A wireless gamepad, like the one shown below, is used to control lidarbot both in simulation and physically.
The joy_tester package is used to test and map the gamepad (joystick) keys that control lidarbot. To use it, plug the USB dongle into the PC, then run:

```bash
ros2 run joy joy_node
```

And the following, in a new terminal:

```bash
ros2 run joy_tester test_joy
```
This opens a GUI window like the one shown below.

Click each button and move each stick of the gamepad to confirm that the actions are registered in the GUI. The numbers correspond to the axes of the buttons and joysticks (sticks) that will be used to map the movements of lidarbot.
The gamepad configuration for this project is in `joystick.yaml`, where:
Button/stick | Button/stick axis | Function |
---|---|---|
L1 button | 4 | Hold this enable button to move robot at normal speed |
Left stick | 2 | Move stick forward or backward for linear motion of the robot |
Right stick | 1 | Move stick left or right for angular motion of the robot |
Setting `require_enable_button` to `true` ensures that L1 has to be held down before the sticks can move the robot, and stops the robot once L1 is released.

To enable turbo mode for faster speed, the `enable_turbo_button` option in the config file can be set to an unused button axis.
The `joy_node` parameter `deadzone` specifies the amount a joystick has to be moved for it to be considered away from the center. This parameter is normalized between `-1` and `1`; a value of `0.25` indicates that the joystick has to be moved 25% of the way to the edge of an axis's range before that axis outputs a non-zero value. The `deadzone` parameter should be tuned to suit the performance of the user's game controller; a sketch of a full joystick configuration is shown below.
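The sketch below shows the shape of such a configuration for the `joy` and `teleop_twist_joy` nodes. It is a minimal sketch only: the axis numbers follow the table above, but the node names and scale values are illustrative and not necessarily those in the project's `joystick.yaml`:

```yaml
joy_node:
  ros__parameters:
    device_id: 0
    deadzone: 0.25          # ignore stick motion within 25% of center

teleop_node:
  ros__parameters:
    axis_linear:
      x: 2                  # left stick: forward/backward
    scale_linear:
      x: 0.5                # illustrative linear speed scaling
    axis_angular:
      yaw: 1                # right stick: left/right
    scale_angular:
      yaw: 1.0              # illustrative angular speed scaling
    enable_button: 4        # L1: hold to move at normal speed
    require_enable_button: true
```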
The `twist_mux` package is used to multiplex several velocity command sources --- each moving the robot with an unstamped `geometry_msgs/Twist` message --- into a single one. The sources are assigned priority values that allow a velocity source to take precedence or be disabled. In this project, the command velocity sources are the joystick and navigation.

The `twist_mux` configuration file is `twist_mux.yaml`, and it is used in the Gazebo and lidarbot bringup launch files, `gz_launch.py` and `lidarbot_bringup_launch.py` respectively. As the configuration file shows, the joystick command velocity source has a higher priority, with an assigned value of `100`, than the navigation velocity source, which is assigned a value of `10`.
```yaml
twist_mux:
  ros__parameters:
    topics:
      navigation:
        topic   : cmd_vel
        timeout : 0.5
        priority: 10
      joystick:
        topic   : cmd_vel_joy
        timeout : 0.5
        priority: 100
```
To install ROS2 Humble on the Raspberry Pi, Ubuntu Server 22.04 was first flashed on a 32 GB micro SD card; this process is detailed in this guide.
After inserting the SD card and booting up the Pi, the environment for ROS2 Humble is set up by following this guide. Afterwards, ROS-Base (Bare Bones) and the ROS development tools are installed:
```bash
sudo apt install ros-humble-ros-base ros-dev-tools
```
Similarly, after the ROS2 Humble installation, create a workspace on the Raspberry Pi and clone this repository:
```bash
mkdir -p ~/robot_ws/src
cd ~/robot_ws/src
git clone https://github.com/TheNoobInventor/lidarbot.git .
```
Install ROS dependencies:
```bash
cd ~/robot_ws
sudo rosdep init
rosdep update
rosdep install --from-paths src --ignore-src -r -y \
  --skip-keys "rviz2 gazebo_ros2_control gazebo_ros_pkgs" --rosdistro humble
```
`rviz2` and the Gazebo related packages are skipped in the ROS dependency installation process as they are only run on the PC and not on the robot --- the `rosdep` keys for the ROS2 Humble distribution are available here.
The WiringPi i2c library and the MPU6050 RPi 4 C++ library are also installed before building the workspace --- a `Downloads` directory will need to be created to clone the WiringPi files.

Likewise, to avoid manually sourcing the underlay and overlay, the same steps employed in the development machine setup are followed, replacing `dev_ws` with `robot_ws` where necessary.
Afterwards, navigate to the workspace directory, then run the build command:

```bash
cd ~/robot_ws
colcon build --symlink-install
```
Waveshare's Motor Driver HAT is used to control the motors of lidarbot. The relevant files are found in the `include` and `src` directories of the `lidarbot_base` package; the file tree of these directories is shown below:
```
├── CMakeLists.txt
├── include
│   └── lidarbot_base
│       ├── Debug.h
│       ├── DEV_Config.h
│       ├── lidarbot_hardware.hpp
│       ├── MotorDriver.h
│       ├── motor_encoder.h
│       ├── PCA9685.h
│       └── wheel.hpp
├── lidarbot_hardware.xml
├── package.xml
└── src
    ├── DEV_Config.c
    ├── lidarbot_hardware.cpp
    ├── motor_checks_client.cpp
    ├── motor_checks_server.cpp
    ├── MotorDriver.c
    ├── motor_encoder.c
    ├── PCA9685.c
    └── wheel.cpp
```
The following table summarizes the package files, specifying which were made available by Waveshare, which were newly added, and the functions they serve.

File(s) | New | Modified | Function |
---|---|---|---|
`Debug.h` | No | No | Output debug information with `printf`. |
`DEV_Config.h`, `DEV_Config.c` | No | Yes | Configure underlying device interfaces and protocols. |
`lidarbot_hardware.hpp`, `lidarbot_hardware.cpp` | Yes | No | Configure the Waveshare Motor Driver HAT system hardware component. It reads the motor encoder values to calculate the wheel positions and velocities, and sends commands to drive the motors of each wheel proportional to the velocity commands received from the `ros2_control` differential drive controller. |
`motor_checks_client.cpp` | Yes | No | Request motor tests to confirm the motor connections and operation. |
`motor_checks_server.cpp` | Yes | No | Run requested motor tests. |
`MotorDriver.h`, `MotorDriver.c` | No | Yes | Program the TB6612FNG motor Integrated Circuit (IC) to drive the motors. |
`motor_encoder.h`, `motor_encoder.c` | Yes | No | Calculate encoder ticks and set the speed each motor should run at according to the commands received from the hardware component. |
`PCA9685.h`, `PCA9685.c` | No | No | Program the PCA9685 chip to adjust the motor speed using Pulse Width Modulation (PWM). |
`wheel.hpp`, `wheel.cpp` | Yes | No | Configure motor wheel parameters. |
The hierarchy of motor control is shown in the image below.
The main reasons for modifying the Waveshare files were to enable the use of the WiringPi library and to find a workaround for determining the motors' direction of rotation --- the latter was made relatively easy because the motor encoders used here have hall effect sensors affixed to them, as shown below.
Hall effect sensors in motor encoders detect changes in the magnetic field associated with the motor's rotation, and these changes can be used by the motor encoder to determine the position of the motor. As the motor rotates, it generates alternating electrical signals of high and low voltage, and each time the signal goes from low to high (that is, a rising edge), that counts as a single pulse. The image below, from this guide from Automatic Addison, visualizes this.

The encoder pulses, sometimes referred to as ticks, counted over a 360 degree turn of the motor are used as the wheel odometry data of the robot. This data is fused with the MPU6050 IMU sensor data by an Extended Kalman Filter from the robot_localization package to provide more accurate robot odometry estimates. Recall that the motors used here produce 1084 encoder ticks per 360 degree turn of the motor.
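As an illustration of this scheme, the sketch below counts rising edges on C1 with WiringPi and reads C2 to infer direction. It is a minimal sketch only: the pin numbers follow the wiring tables above, but the sign convention and program structure are illustrative rather than the exact `motor_encoder.c` implementation.

```cpp
// Count encoder ticks on each rising edge of C1; C2's level at that instant
// gives the direction of rotation. Compile with: g++ encoder_demo.cpp -lwiringPi
#include <wiringPi.h>
#include <cstdio>

constexpr int LEFT_C1 = 24;  // BCM GPIO24 (physical pin 18), left encoder A
constexpr int LEFT_C2 = 22;  // BCM GPIO22 (physical pin 15), left encoder B

volatile long left_ticks = 0;

void leftPulse()
{
    // Rising edge on C1: C2 high => forward, C2 low => reverse (assumed convention)
    if (digitalRead(LEFT_C2) == HIGH)
        left_ticks++;
    else
        left_ticks--;
}

int main()
{
    wiringPiSetupGpio();  // use BCM pin numbering
    pinMode(LEFT_C1, INPUT);
    pinMode(LEFT_C2, INPUT);
    wiringPiISR(LEFT_C1, INT_EDGE_RISING, &leftPulse);

    while (true) {
        delay(1000);
        // ~1084 ticks correspond to one 360 degree turn of the motor
        printf("left ticks: %ld\n", left_ticks);
    }
}
```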
The library dependencies for using the Motor Driver HAT were met after installing the MPU6050 library.
The following packages are installed to use the Raspberry Pi Camera v1.3:

```bash
sudo apt install libraspberrypi-bin v4l-utils raspi-config
```

Support for the RPi camera v1.3 will need to be enabled by navigating the menu options after running the following command:

```bash
sudo raspi-config
```

Confirm the RPi camera is connected by running this command:

```bash
vcgencmd get_camera
```

This should output the following result:

```
supported=1 detected=1, libcamera interfaces=0
```
Prior to using the IMU sensor broadcaster, the MPU6050 module needs to be calibrated to filter out its sensor noise/offsets. This is done in the following steps:
- Place lidarbot on a flat and level surface and unplug the RPlidar.

- Generate the MPU6050 offsets. A Cpp executable is created in the CMakeLists.txt file of the `lidarbot_bringup` package before generating the MPU6050 offsets. This section of the CMakeLists.txt file is shown below:

  ```cmake
  # Create Cpp executable
  add_executable(mpu6050_offsets src/mpu6050_lib.cpp src/mpu6050_offsets.cpp)

  # Install Cpp executables
  install(TARGETS mpu6050_offsets
    DESTINATION lib/${PROJECT_NAME}
  )
  ```

  Build the `lidarbot_bringup` package:

  ```bash
  colcon build --symlink-install --packages-select lidarbot_bringup
  ```

  Run the executable:

  ```bash
  ros2 run lidarbot_bringup mpu6050_offsets
  ```

  Which outputs something like this:

  ```
  Please keep the MPU6050 module level and still. This could take a few minutes.

  Calculating offsets ...

  Gyroscope offsets:
  ------------------
  X: -104.689
  Y: 651.005
  Z: -158.596

  Accelerometer offsets:
  ----------------------
  X: -13408.8
  Y: 2742.39
  Z: -14648.9
  ```

  Include the obtained offsets in the respective macros of the mpu6050_lib.h file.

- Calibrate the MPU6050 module. Substitute the generated offsets into this section of the `mpu6050_lib.h` file:

  ```cpp
  //Offsets - supply your own here (calculate offsets with getOffsets function)
  // Gyroscope
  #define G_OFF_X -105
  #define G_OFF_Y 651
  #define G_OFF_Z -159
  // Accelerometer
  #define A_OFF_X -13409
  #define A_OFF_Y 2742
  #define A_OFF_Z -14649
  ```

- Afterwards, the `lidarbot_bringup` package is rebuilt to reflect any changes made to the `mpu6050_lib.h` file.
The MPU6050 module is set to its most sensitive gyroscope and accelerometer ranges, which can be confirmed (or changed) at the top of the `mpu6050_lib.h` file.
Both the development machine and lidarbot need to be connected to the same local network as a precursor to bidirectional communication between the two systems. This guide by Robotics Backend was used in configuring the network communication.
To ensure communication between the dev machine and lidarbot, the firewall on the development machine first had to be disabled (the firewall on the Ubuntu server is disabled by default):

```bash
sudo ufw disable
```
The next step is to export a `ROS_DOMAIN_ID` number --- between 0 (the default) and 101 --- in the shell configurations of both systems:

```bash
echo "export ROS_DOMAIN_ID=31" >> ~/.zshrc
```

This step is particularly necessary if there are other robots and development machines on the same local network. This YouTube video and article go into more detail about `ROS_DOMAIN_ID`.
Then source the shell configuration file:

```bash
source $HOME/.zshrc
```
Both systems might need to be rebooted to effect these changes.
A static IP address was assigned to lidarbot on the router for easy discoverability on the network. Furthermore, it is advisable to use a router that supports at least the WiFi 5 wireless standard to avoid excessive data lag in RViz and terminal crashes when recording a ros2 bag, for instance; in relation to ros2 bags, changing the default data storage format to MCAP should improve read and write performance.
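For example, assuming the `rosbag2` MCAP storage plugin is installed, a bag can be recorded with MCAP as the storage format (the topic list here is illustrative):

```bash
sudo apt install ros-humble-rosbag2-storage-mcap
ros2 bag record -s mcap -o lidarbot_run /scan /imu_broadcaster/imu
```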
A dedicated travel router, following this setup by Articulated Robotics, was used in this project to establish a reliable connection between the development machine and lidarbot.
Additional network troubleshooting information can be found here.
To control the motion of lidarbot both in simulation and physically, the `ros2_control` framework is used. It is defined as a hardware-agnostic control framework focused on the modularity of control systems for robots, the sharing of controllers, and real-time performance. It provides hardware abstraction and low-level management utilities --- for instance, hardware lifecycle, communication and access control --- which are useful to other frameworks such as MoveIt2 and Nav2, the manipulation path planning and autonomous navigation frameworks respectively.

The `ros2_control` framework can be thought of as a middleman between the robot hardware and the robot software, and is comprised of the following parts:
- Hardware Components
- Resource Manager
- Controllers
- Controller Manager
Hardware components are drivers for physical hardware and represent its abstraction in the `ros2_control` framework. These components are usually provided by robot or hardware manufacturers; otherwise, they either have to be sourced from the ROS community or written from scratch using this guide made available by the `ros2_control` development team. Hardware components are written in C++ and exported as plugins using the `pluginlib` library; these plugins are dynamically loaded at runtime by the Resource Manager.
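To make the structure of such a plugin concrete, the sketch below shows a skeleton of a Humble-era system hardware component. It follows the `hardware_interface` API, but the joint names, members and empty `read()`/`write()` bodies are illustrative, not the actual `lidarbot_base` implementation.

```cpp
// Skeleton of a ros2_control system hardware component (ROS2 Humble).
#include <vector>
#include "hardware_interface/system_interface.hpp"
#include "hardware_interface/types/hardware_interface_type_values.hpp"
#include "pluginlib/class_list_macros.hpp"

namespace lidarbot_base
{
class LidarbotHardware : public hardware_interface::SystemInterface
{
public:
  hardware_interface::CallbackReturn on_init(
    const hardware_interface::HardwareInfo & info) override
  {
    if (SystemInterface::on_init(info) != hardware_interface::CallbackReturn::SUCCESS)
      return hardware_interface::CallbackReturn::ERROR;
    // <param> entries from the <ros2_control> URDF tag are available in
    // info_.hardware_parameters, e.g. info_.hardware_parameters["enc_ticks_per_rev"]
    return hardware_interface::CallbackReturn::SUCCESS;
  }

  std::vector<hardware_interface::StateInterface> export_state_interfaces() override
  {
    std::vector<hardware_interface::StateInterface> state;
    state.emplace_back("left_wheel_joint", hardware_interface::HW_IF_POSITION, &pos_[0]);
    state.emplace_back("left_wheel_joint", hardware_interface::HW_IF_VELOCITY, &vel_[0]);
    state.emplace_back("right_wheel_joint", hardware_interface::HW_IF_POSITION, &pos_[1]);
    state.emplace_back("right_wheel_joint", hardware_interface::HW_IF_VELOCITY, &vel_[1]);
    return state;
  }

  std::vector<hardware_interface::CommandInterface> export_command_interfaces() override
  {
    std::vector<hardware_interface::CommandInterface> cmd;
    cmd.emplace_back("left_wheel_joint", hardware_interface::HW_IF_VELOCITY, &cmd_[0]);
    cmd.emplace_back("right_wheel_joint", hardware_interface::HW_IF_VELOCITY, &cmd_[1]);
    return cmd;
  }

  hardware_interface::return_type read(const rclcpp::Time &, const rclcpp::Duration &) override
  {
    // Read encoder ticks here and update pos_/vel_
    return hardware_interface::return_type::OK;
  }

  hardware_interface::return_type write(const rclcpp::Time &, const rclcpp::Duration &) override
  {
    // Send the cmd_ velocities to the Motor Driver HAT here
    return hardware_interface::return_type::OK;
  }

private:
  double pos_[2] = {0, 0}, vel_[2] = {0, 0}, cmd_[2] = {0, 0};
};
}  // namespace lidarbot_base

// Export the component as a plugin so the Resource Manager can load it at runtime
PLUGINLIB_EXPORT_CLASS(lidarbot_base::LidarbotHardware, hardware_interface::SystemInterface)
```

The `lidarbot_hardware.xml` file in the package tree shown earlier is the plugin description that `pluginlib` uses to locate such a class.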
Hardware components are configured in the robot's Unified Robot Description Format (URDF) file --- an XML file that describes the model of the robot --- using the `<ros2_control>` tag and the recommended `xacro` (XML macro) files.
There are 3 basic types of components: system, sensor and actuator.

A system component refers to a hardware setup made up of more than one Degree of Freedom (DOF), containing multiple joints and sensors. This component type offers a lot of flexibility in handling hardware setups, which can range from mobile robots to humanoid robot hands and more. The Waveshare Motor Driver HAT controls the two motors (or two wheel joints) on lidarbot; hence the robot base is classified as a system component.
A system component has reading and writing capabilities, which are expressed as state interfaces and command interfaces respectively. The system component configuration for the robot base, with a velocity command interface and state interfaces for both velocity and position, is shown in the code snippet below, available in this xacro file.
```xml
<ros2_control name='RealRobot' type='system'>
    <hardware>
        <plugin>lidarbot_base/LidarbotHardware</plugin>
        <param name='left_wheel_name'>left_wheel_joint</param>
        <param name='right_wheel_name'>right_wheel_joint</param>
        <param name='enc_ticks_per_rev'>1084</param>
        <param name='loop_rate'>30.0</param>
    </hardware>
    <joint name='left_wheel_joint'>
        <command_interface name='velocity'>
            <param name='min'>-10</param>
            <param name='max'>10</param>
        </command_interface>
        <state_interface name='velocity'/>
        <state_interface name='position'/>
    </joint>
    <joint name='right_wheel_joint'>
        <command_interface name='velocity'>
            <param name='min'>-10</param>
            <param name='max'>10</param>
        </command_interface>
        <state_interface name='velocity'/>
        <state_interface name='position'/>
    </joint>
</ros2_control>
```
A similar system hardware component is configured for Gazebo Fortress in this URDF file.
A sensor component provides information about changes to a physical phenomenon it is observing. It has only reading capabilities and is related to a joint or a link in the URDF file. This component type can be configured as an externally connected sensor, with its own `<ros2_control>` tags, or as an integrated sensor embedded within a system component's `<ros2_control>` tags. The MPU6050 module is configured as an externally connected sensor component and is associated with a fixed link named `imu_link`. This configuration is shown in the code snippet below, which is available in the MPU6050 section of this xacro file.
```xml
<ros2_control name='MPU6050' type='sensor'>
    <hardware>
        <plugin>lidarbot_bringup/MPU6050Hardware</plugin>
        <param name='sensor_name'>mpu6050</param>
        <param name='frame_id'>imu_link</param>
    </hardware>
    <sensor name='mpu6050'>
        <state_interface name='orientation.x'/>
        <state_interface name='orientation.y'/>
        <state_interface name='orientation.z'/>
        <state_interface name='orientation.w'/>
        <state_interface name='angular_velocity.x'/>
        <state_interface name='angular_velocity.y'/>
        <state_interface name='angular_velocity.z'/>
        <state_interface name='linear_acceleration.x'/>
        <state_interface name='linear_acceleration.y'/>
        <state_interface name='linear_acceleration.z'/>
    </sensor>
</ros2_control>
```
However, a standalone IMU sensor plugin was used to simulate an IMU sensor in Gazebo Fortress.
An actuator component has only one DOF; examples include motors and valves. Actuator components are related to a single joint and have both reading and writing capabilities; however, obtaining state feedback is not mandatory if it is inaccessible, for instance, when controlling a DC motor that has no encoders.
The image below (adapted from Denis Štogl (CC-BY)) shows the system and sensor components used for the lidarbot base and the MPU6050 module respectively — custom hardware components were written for the Waveshare Motor Driver HAT and the MPU6050 module using the MPU6050 library.
The coloured icons represent the state and command interfaces of the hardware components. The state interfaces for the respective cartesian axes, associated with the MPU6050 module, were merged into one for clarity.
The Resource Manager (RM) abstracts the hardware components for the `ros2_control` framework. The RM is responsible for loading the components using the `pluginlib` library, and for managing their lifecycle and the components' state and command interfaces. In the control loop execution, which is managed by the Controller Manager, the RM's `read()` and `write()` methods handle the communication with the hardware components.
The abstraction provided by RM allows reuse of implemented hardware components. A sensor component, like the MPU6050 module, has only reading capabilities, thus it can be shared by multiple controllers simultaneously. However, components with command interfaces require exclusive access by only one controller.
The bidirectional arrows in the figure below (adapted from Denis Štogl (CC-BY)) indicate interfaces that have both read and write capabilities, while interfaces that only provide state feedback are represented with unidirectional arrows.
The controllers in the `ros2_control` framework are based on control theory: they compare the reference value with the measured output and, based on this error, calculate a system's input. Controllers access the latest hardware states when the Controller Manager's `update()` method is called, which also enables them to write to the hardware command interfaces.

There are a number of controllers available for specific hardware configurations, such as the Ackermann steering controller, bicycle steering controller, gripper controller and differential drive controller (which is used on lidarbot). The full list of controllers is available on the `ros2_controllers` page. Similar to hardware components, a guide is provided on how to write a custom controller if a particular hardware configuration is missing.
The differential drive controller receives `Twist` velocity command messages from the Nav2 stack, which are written to the velocity command interfaces of the robot base system hardware component. This in turn drives the motors in accordance with the received velocity commands. This sequence of commands is illustrated in the next image.
Broadcasters only have reading abilities and are used to publish sensor data from hardware components to ROS topics. The joint state broadcaster reads all state interfaces and reports them on the `/joint_states` and `/dynamic_joint_states` topics. These topics are used by RViz to display the current state of the robot joints.

The IMU sensor broadcaster on lidarbot publishes `Imu` messages from the MPU6050 IMU hardware sensor component on the `imu_broadcaster/imu` topic. The IMU sensor data is utilized by the `robot_localization` package, which fuses it with the wheel odometry data in an Extended Kalman Filter (EKF) to improve the robot's localization for autonomous navigation with the Nav2 framework. The topic connections between Nav2, the relevant packages and the lidarbot controllers are presented in the navigation subsection.
The lidarbot differential drive controller and IMU sensor broadcaster are configured in the `controllers.yaml` file in the bringup package; the joint state broadcaster uses the default parameters, so no configuration is needed.
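A minimal sketch of the shape of this configuration is shown below. The controller names match the topics mentioned in this document, but the update rate and wheel dimensions are illustrative placeholders rather than the project's actual values:

```yaml
controller_manager:
  ros__parameters:
    update_rate: 30   # Hz, illustrative

    diff_controller:
      type: diff_drive_controller/DiffDriveController
    imu_broadcaster:
      type: imu_sensor_broadcaster/IMUSensorBroadcaster
    joint_broadcaster:
      type: joint_state_broadcaster/JointStateBroadcaster

diff_controller:
  ros__parameters:
    left_wheel_names: ["left_wheel_joint"]
    right_wheel_names: ["right_wheel_joint"]
    wheel_separation: 0.14   # m, illustrative
    wheel_radius: 0.034      # m, illustrative

imu_broadcaster:
  ros__parameters:
    sensor_name: mpu6050
    frame_id: imu_link
    # the static covariance arrays discussed below are added here
```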
In configuring the IMU sensor broadcaster, the static orientation, angular velocity and linear acceleration covariances --- arranged in row-major order about the x, y and z axes --- need to be calculated. A covariance matrix, sometimes referred to as a variance-covariance matrix, is a square matrix whose diagonal elements represent variances and whose off-diagonal elements represent covariances.

Visualizing the static velocity covariance, for instance, with the matrix below: the diagonal elements represent the variance of a chosen sample of velocity measurements for the respective axes, while the off-diagonal elements represent the covariance between the velocities of two axes over the same sample.
```
┌──────────┬──────────┬──────────┐
│ Var(x,x) │ Cov(x,y) │ Cov(x,z) │
├──────────┼──────────┼──────────┤
│ Cov(y,x) │ Var(y,y) │ Cov(y,z) │
├──────────┼──────────┼──────────┤
│ Cov(z,x) │ Cov(z,y) │ Var(z,z) │
└──────────┴──────────┴──────────┘
```
The matrix above is represented as a 9 element array in the IMU sensor broadcaster configuration of the `controllers.yaml` file for each of the respective covariances.

However, only the variance (diagonal) elements are calculated; it is assumed that the different axes do not vary together, so the off-diagonal elements/covariances are set to zero. The `mpu6050_covariances.cpp` file can be modified to incorporate the covariance calculations later on.
The variances were calculated with a sample size of 500 data points, with a delay of 0.25 s between readings so that non-consecutive values are read from the MPU6050 module.
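The per-axis variance is the standard sample statistic; the helper below sketches the computation and is hypothetical, not the exact `mpu6050_covariances.cpp` code:

```cpp
// Unbiased sample variance: Var(x) = sum((x_i - mean)^2) / (N - 1)
#include <vector>

double sampleVariance(const std::vector<double> & samples)
{
    double mean = 0.0;
    for (double s : samples) mean += s;
    mean /= samples.size();

    double sum_sq = 0.0;
    for (double s : samples) sum_sq += (s - mean) * (s - mean);
    return sum_sq / (samples.size() - 1);
}
```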
Keep the module still, then run this node to generate the variances:

```bash
ros2 run lidarbot_bringup mpu6050_covariances
```
An output similar to the following will be shown in the terminal window:

```
Please keep the MPU6050 module level and still.

Reading and appending 500 sensor data points to respective arrays, it may take a while ...

Calculating variances ...

static_covariance_orientation: [5.46409e-06, 0.0, 0.0, 5.72254e-06, 0.0, 0.0, 4.22433e-07, 0.0, 0.0]
static_covariance_angular_velocity: [2.01015e-06, 0.0, 0.0, 1.9657e-06, 0.0, 0.0, 8.13776e-07, 0.0, 0.0]
static_covariance_linear_acceleration: [0.000632951, 0.0, 0.0, 0.000801987, 0.0, 0.0, 0.00117363, 0.0, 0.0]

Paste covariance arrays in the imu_broadcaster ros__parameters section in lidarbot_bringup/config/controllers.yaml.
```
These arrays are then pasted in the `imu_broadcaster` `ros__parameters` section of the `controllers.yaml` file to complete the IMU sensor broadcaster configuration.
The Controller Manager (CM) is responsible for managing the lifecycle of controllers --- for instance, loading, activating, deactivating and unloading them --- and the interfaces they require. Through the RM, the CM has access to the hardware component interfaces. The CM connects the controllers and the hardware-abstraction sides of the `ros2_control` framework, and also serves as the entry point for users via ROS services using the ros2 CLI.

The Controller Manager matches required and provided interfaces, grants controllers access to hardware when enabled, or reports an error if there is an access conflict. The execution of the control loop is managed by the CM's `update()` method, which reads data from the hardware components, updates the outputs of all active controllers, and writes the results to the components.
In essence, the CM manages the controllers, which calculate the commands needed to make the robot move as desired.
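For instance, once lidarbot is brought up, the CM can be inspected from the command line (assuming the `ros2controlcli` package is installed alongside `ros2_control`):

```bash
ros2 control list_hardware_interfaces   # state/command interfaces known to the RM
ros2 control list_controllers           # loaded controllers and their states
```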
The following image (adapted from Denis Štogl (CC-BY)) illustrates all the parts of the `ros2_control` framework as used on lidarbot.

The `ros2_control` framework can be explored in more detail by reading the official documentation, this article from Masum, this tutorial from Articulated Robotics and the preprint for this project.
Before setting up lidarbot for autonomous navigation with the Nav2 stack, its wheel encoder and IMU sensor data are fused. Slipping between the wheels and the ground is common in differential drive robots and can lead to substantial deviation of a wheeled mobile robot from its desired position. Fusing different sensor data helps correct for this variation: an Extended Kalman Filter from the `robot_localization` package fuses the wheel odometry and IMU sensor data to obtain more accurate robot odometry estimates.

This results in improved robot navigation, as the Nav2 stack makes use of odometry data to estimate the pose (position and orientation) of the robot. The wheel odometry and IMU sensor data are obtained from the differential drive controller topic, `diff_controller/odom`, and the IMU sensor broadcaster topic, `imu_broadcaster/imu`, respectively.
The EKF is configured using this guide from Nav2; more information about the `robot_localization` package can be found in the official documentation. The EKF configuration file, `ekf.yaml`, is available here.
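The sketch below shows the general shape of such a `robot_localization` EKF configuration, fusing the two topics named above; the selected state-vector entries and rates are illustrative, not necessarily the project's exact `ekf.yaml`:

```yaml
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true              # planar robot: ignore z, roll and pitch
    publish_tf: true

    odom_frame: odom
    base_link_frame: base_footprint
    world_frame: odom

    # Wheel odometry from the differential drive controller
    odom0: diff_controller/odom
    odom0_config: [false, false, false,   # x, y, z
                   false, false, false,   # roll, pitch, yaw
                   true,  true,  false,   # vx, vy, vz
                   false, false, true,    # vroll, vpitch, vyaw
                   false, false, false]   # ax, ay, az

    # IMU data from the IMU sensor broadcaster
    imu0: imu_broadcaster/imu
    imu0_config: [false, false, false,
                  false, false, true,
                  false, false, false,
                  false, false, true,
                  true,  false, false]
```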
By default, the Gazebo and real-world lidarbot bringup launch files have the `use_robot_localization` argument set to `true`, but it can be set to `false` if one chooses to do so:

```bash
ros2 launch lidarbot_bringup lidarbot_bringup_launch.py use_robot_localization:=false
```
To drive lidarbot around in Gazebo Fortress, first ensure that the USB dongle of the gamepad is plugged into the development machine, then run this command, which brings up lidarbot and its sensors and also integrates `ros2_control`, `twist_mux`, `robot_localization` and gamepad/joystick control:

```bash
ros2 launch lidarbot_gz gz_launch.py
```
The GIF below shows lidarbot being driven around by the gamepad (with the gamepad configuration presented here) in a world made up of obstacles.
A ROS service was written to test the connections of the motor(s) and, by extension, to know if a motor is faulty. Before running the tests, ensure that the 18650 batteries are charged, then prop the robot on a box or similar so that it cannot fall off an edge, for instance.
Run the client node to request the motor checks:

```bash
ros2 run lidarbot_base motor_checks_client
```

Then run the server node to check the motors:

```bash
ros2 run lidarbot_base motor_checks_server
```
These are the steps followed by the server node:

- The server initializes the left and right motor pulse counts to 0.

- It then runs each motor in the forward direction at 50% speed for 2 seconds. The terminal output is shown below:

  ```
  [INFO] [1699988646.067449176] [rclcpp]: Ready to check motors
  [INFO] [1699988648.190017279] [rclcpp]: Received request to check motors...
  [INFO] [1699988648.190117076] [rclcpp]: Checking left motor...
  [INFO] [1699988652.196330587] [rclcpp]: Checking right motor...
  [INFO] [1699988656.202619229] [rclcpp]: Sending back response...
  ```

- The current pulse counts for each motor are checked to confirm that they are above 0.

- If both motors have pulse counts above 0, a success message is sent to the client:

  ```
  [INFO] [1699991078.643028687] [rclcpp]: service not available, waiting again...
  [INFO] [1699991087.233641544] [rclcpp]: The checks were successful!
  ```

- If one or both of the motors do not have pulse counts above zero, a warning message is meant to be sent from the server to the client identifying the faulty motor(s). At the moment, however, the message sent to the client does not identify the faulty motor but instead outputs this error:

  ```
  terminate called after throwing an instance of 'std::future_error'
    what():  std::future_error: No associated state
  [ros2run]: Aborted
  ```

  This section will be updated once the issue has been fixed.
If a motor moves backward instead of forward, swap the cables for the specific motor to change the direction.
After it is confirmed that both motors move forward, lidarbot can be driven around with the gamepad after running this command:

```bash
ros2 launch lidarbot_bringup lidarbot_bringup_launch.py
```

This launch file brings up the physical lidarbot, the Raspberry Pi camera and the RPLIDAR A1, and also integrates `ros2_control`, `twist_mux`, `robot_localization` and gamepad/joystick control.
Note: Some warning and error messages related to the camera are output to the terminal. These are mostly related to calibrating the camera and can be ignored.
The Nav2 stack uses laser scan data to build a map of an environment and to localize the robot in the map while in motion. The SLAM package, `slam_toolbox`, is used to generate a map of lidarbot's surroundings with the RPLIDAR A1 by driving the robot around using the `teleop_twist_joy` package with the gamepad.
Before starting the mapping operation, ensure that the `mode` key, under `ROS Parameters` in the `mapper_params_online_async.yaml` file, is set to `mapping`, and also that the `map_file_name`, `map_start_pose` and `map_start_at_dock` keys are commented out:
```yaml
# ROS Parameters
odom_frame: odom
map_frame: map
base_frame: base_footprint
scan_topic: /scan
use_map_saver: true
mode: mapping #localization

# if you'd like to immediately start continuing a map at a given pose
# or at the dock, but they are mutually exclusive, if pose is given
# will use pose
#map_file_name: /path/to/map_file
#map_start_pose: [0.0, 0.0, 0.0]
#map_start_at_dock: true
```
To start mapping in a simulation environment, launch the Gazebo Fortress simulation of lidarbot on the development machine (which includes the joystick node for teleoperation):

```bash
ros2 launch lidarbot_gz gz_launch.py
```

For Gazebo Classic, run:

```bash
ros2 launch lidarbot_gazebo gazebo_launch.py
```
In a separate terminal, navigate to the workspace directory, `lidarbot_ws` for example, and launch `slam_toolbox` with the `online_async_launch.py` file:

```bash
ros2 launch lidarbot_slam online_async_launch.py \
  slam_params_file:=src/lidarbot_slam/config/mapper_params_online_async.yaml \
  use_sim_time:=true
```
Note: The online asynchronous mode in `slam_toolbox` works on live data and always processes the most recent scan to avoid lagging; as a result, some scans may be skipped.
In another terminal, navigate to the workspace directory again and start `rviz2` with the `lidarbot_slam.rviz` config file:

```bash
rviz2 -d src/lidarbot_slam/rviz/lidarbot_slam.rviz
```
Drive around the obstacles to generate a map of the environment:
After generating the map, go to the SlamToolboxPlugin in RViz, type a name for the map in the field beside the Save Map button, then click the button.

The saved map can be found in the workspace directory and will be used by the Nav2 stack for navigation.
To begin mapping in the real world, first run the bringup command:

```bash
ros2 launch lidarbot_bringup lidarbot_bringup_launch.py
```
Ensure that the `mapper_params_online_async.yaml` file is configured for mapping (refer to the previous subsection). Then open a new terminal on the development machine, navigate to the workspace directory, and launch `slam_toolbox` with the `use_sim_time` parameter set to `false`:

```bash
ros2 launch lidarbot_slam online_async_launch.py \
  slam_params_file:=src/lidarbot_slam/config/mapper_params_online_async.yaml \
  use_sim_time:=false
```
In a new terminal, also on the development machine, navigate to the workspace directory again and start `rviz2`:

```bash
rviz2 -d src/lidarbot_slam/rviz/lidarbot_slam.rviz
```
Drive around the environment to generate a map:
Then save the generated map.
To localize a robot in a given map, Nav2 makes use of the Adaptive Monte Carlo Localization (AMCL) package, `nav2_amcl`. This is a probabilistic localization module that outputs the pose estimate of a robot, taking LIDAR scans, a map (from the Nav2 `map_server`) and transform messages as inputs.
It was noted in the controllers subsection that the Nav2 stack outputs command velocity messages to the differential drive controller; for lidarbot these are sent on the `diff_controller/cmd_vel_unstamped` topic. The velocity messages are sent to the robot base system hardware component to drive the motors. The updated robot odometry data from this motion is sent to the EKF on the `/diff_controller/odom` topic and fused with the IMU data from the `imu_broadcaster/imu` topic. The fused odometry data is input into Nav2, in addition to the laser scan data, occupancy grid map and transform messages, to output new velocity messages. This process keeps iterating as the robot moves towards the goal position.
The Nav2 topic graph is shown in the following image adapted from Linorobot2.
Lidarbot will be localized and navigated within the map previously generated with `slam_toolbox`.
Bring up lidarbot in Gazebo Fortress:

```bash
ros2 launch lidarbot_gz gz_launch.py
```

Or bring up lidarbot in Gazebo Classic:

```bash
ros2 launch lidarbot_gazebo gazebo_launch.py
```
Navigate to the development workspace and open the following terminals.

To open RViz:

```bash
rviz2 -d src/lidarbot_navigation/rviz/lidarbot_nav.rviz
```

To localize lidarbot in the map with `nav2_amcl`:

```bash
ros2 launch lidarbot_navigation localization_launch.py \
  map:=./src/lidarbot_navigation/maps/sim_map.yaml use_sim_time:=true
```

To launch navigation:

```bash
ros2 launch lidarbot_navigation navigation_launch.py use_sim_time:=true \
  map_subscribe_transient_local:=true
```
Make sure to set the `use_sim_time` argument to `true`.
In RViz, set the initial pose using the `2D Pose Estimate` button in the toolbar so that lidarbot is aligned correctly in both RViz and Gazebo. Afterwards, click on `2D Goal Pose` and choose a place on the map for lidarbot to navigate to:
The blue arrows indicate unfiltered odometry, while the green ones are the filtered odometry.
TODO: add gif of navigation in Gazebo Fortress
Run a similar set of commands on the physical lidarbot for localization and navigation. Bring up lidarbot:

```bash
ros2 launch lidarbot_bringup lidarbot_bringup_launch.py
```
Launch RViz:

```bash
rviz2 -d src/lidarbot_navigation/rviz/lidarbot_nav.rviz
```

To localize lidarbot in the map with `nav2_amcl`:

```bash
ros2 launch lidarbot_navigation localization_launch.py map:=./real_map.yaml use_sim_time:=false
```

To launch navigation:

```bash
ros2 launch lidarbot_navigation navigation_launch.py use_sim_time:=false \
  map_subscribe_transient_local:=true
```
Make sure to set the `use_sim_time` argument to `false`.
Similarly, set the initial pose using the `2D Pose Estimate` button in the RViz toolbar so that lidarbot is aligned correctly both in RViz and in its environment. Afterwards, click on `2D Goal Pose` and choose a place on the map for lidarbot to navigate to; a goal can also be sent from the command line, as sketched below.
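As referenced above, a goal can be sent without RViz through the Nav2 `navigate_to_pose` action from the ros2 CLI; the coordinates below are illustrative:

```bash
ros2 action send_goal /navigate_to_pose nav2_msgs/action/NavigateToPose \
  "{pose: {header: {frame_id: map}, pose: {position: {x: 1.0, y: 0.5}, orientation: {w: 1.0}}}}"
```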
To save time, all of these commands can be called from one launch file, with delays added where necessary to ensure that certain processes are launched before others, to avoid errors. Additional information on setting up navigation with Nav2 is available in this setup guide.
An ArUco marker was placed on top of lidarbot to track its trajectory during navigation; a Logitech C270 webcam was utilized in tracking the marker.
The `opencv-contrib-python` module needs to be installed, not `opencv-python`:

```bash
pip uninstall opencv-python
pip install opencv-contrib-python
```
The computer might need to be restarted for the install to take effect.
Next, navigate to the following path in the `lidarbot_aruco` directory:

```bash
cd ~/dev_ws/lidarbot_aruco/lidarbot_aruco
```
Then run the following script:

```bash
python generate_aruco_marker.py --id 24 --type DICT_4X4_50 \
  --output ../tags/DICT_4X4_50_id24.png
```
The script arguments:

- `--id`: the unique identifier of the ArUco tag --- this is a required argument, and the ID must be a valid ID in the ArUco dictionary used to generate the tag
- `--type`: the name of the ArUco dictionary used to generate the tag; the default type is `DICT_4X4_50`
- `--output`: the path to the output image where the generated ArUco tag will be saved; this is a required argument
Running the previous script opens a window displaying the generated ArUco tag. To close the window, press the q key while the window is in focus.
The Logitech C270 HD webcam used in this project needs to be calibrated. First, install the ROS usb camera driver package:

```bash
sudo apt install ros-humble-usb-cam
```
Camera calibration was done following the steps outlined in this guide.
Execute the command below to run the usb_cam driver node:

```bash
ros2 run usb_cam usb_cam_node_exe --ros-args --params-file ~/dev_ws/src/lidarbot_aruco/config/params_1.yaml
```

Then, in a separate terminal, run the ArUco trajectory visualizer node:

```bash
ros2 run lidarbot_aruco aruco_trajectory_visualizer_node
```
Alternatively, use the launch file to bring up both the usb camera driver and the ArUco trajectory visualizer node:

```bash
ros2 launch lidarbot_aruco trajectory_visualizer_launch.py
```
The following images show the start and end positions of navigation with lidarbot.