RAE ROS

Note: The RAE project is under active development, so you can expect changes to the API and performance improvements in the near future. Please report any problems on GitHub issues, as this is important for the development effort.

Setting up procedure

SSH

  1. Connect via USB cable or Wi-Fi rae-<ID>, password wifiwifi@ (see the RAE getting started documentation).
  2. To use SSH without typing the password each time, run ssh-copy-id root@192.168.11.1. An easier solution for USB-connected devices is to append the following to ~/.ssh/config, which lets you connect with just ssh ku:
Plain Text
Host ku 192.168.197.55
  HostName 192.168.197.55
  User root
  StrictHostKeyChecking no
  UserKnownHostsFile /dev/null
  3. The date currently resets after each startup. To set it to the current time, run ssh root@192.168.11.1 sudo date -s @$(date -u +"%s").
  4. If you want to run ROS packages while bypassing Luxonis Hub, it is advised to stop the RH agent (robothub-ctl stop) before starting Docker containers; otherwise you can easily run into conflicts, as they would be competing for the same hardware resources. Keep in mind that since wpa_supplicant is a subprocess of the RH agent, the Wi-Fi connection will be killed along with the agent. To resolve this, we recommend manually setting up the Wi-Fi connection as described in this guide.

Generating docker image

You can download prebuilt images from Docker Hub, in which case you can skip the first two steps in the guide below. We recommend using the image tagged humble, as all other images are generally experimental. You can download the Docker image with docker pull luxonis/rae-ros-robot:humble. Downloading a prebuilt image is recommended if you are not planning to considerably change the source code.
  1. Clone the repository: git clone git@github.com:luxonis/rae-ros.git
  2. Build the Docker image: cd rae && docker buildx build --platform arm64 --build-arg USE_RVIZ=0 --build-arg SIM=0 --build-arg ROS_DISTRO=humble --build-arg CORE_NUM=10 -f Dockerfile --squash -t <docker-image-name>:<tag> --load . Alongside buildx, you need to install the necessary tools: sudo docker run --rm --privileged multiarch/qemu-user-static --reset -p yes
  3. Upload the Docker image to the robot. Connect the robot to your PC via USB so the image transfers faster. Note that space on the robot is currently limited, so you need 7-8 GB of free space in the /data directory: docker save <docker-image-name>:<tag> | ssh -C root@192.168.197.55 docker load
  4. SSH into the robot and run the Docker image with one of the commands below (if you pulled a prebuilt image and skipped the first three steps, use the second one):
    • docker run -it --restart=unless-stopped -v /dev/:/dev/ -v /sys/:/sys/ --privileged --net=host <docker-image-name>:<tag>
    • docker run -it -v /dev/:/dev/ -v /sys/:/sys/ --privileged -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix -v /dev/bus/usb:/dev/bus/usb --device-cgroup-rule='c 189:* rmw' --network host <docker-image-name>:<tag>
  5. Find the Docker container name with docker ps
  6. Attach to the shell with docker attach <container_name>, or, if you want to create a separate session, docker exec -it <container_name> zsh
  7. To launch the robot hardware: ros2 launch rae_bringup robot.launch.py. This launches:
    • Motor drivers and differential controller
    • Camera driver, currently set up to provide depth and streams from the left and right cameras. Note that you have to calibrate the cameras (see the steps below). Currently a default calibration file is loaded, located in rae_camera/config/cal.json. To use the calibration stored on the device or one from another path, change the i_external_calibration_path parameter in rae_camera/config/camera.yaml
    • Depth image -> LaserScan conversion node used for SLAM
  8. To launch the whole stack: ros2 launch rae_bringup bringup.launch.py. It has the following arguments for enabling parts of the stack:
    • enable_slam_toolbox (true)
    • enable_rosbridge (false)
    • enable_rtabmap (false)
    • enable_nav (false)
Example launch with an argument: ros2 launch rae_bringup bringup.launch.py enable_nav:=false. If you want to build on top of the stack, see the launch-file sketch below.
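If you want to add your own nodes on top of the stack, one option is to include bringup.launch.py from your own launch file. Below is a minimal sketch using the standard ROS 2 launch API; it assumes the launch files are installed under launch/ in the rae_bringup share directory, and it only overrides the enable_nav argument from the list above.
Python
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource


def generate_launch_description():
    bringup_dir = get_package_share_directory('rae_bringup')
    return LaunchDescription([
        # Include the RAE bringup launch file and override one of its arguments.
        IncludeLaunchDescription(
            PythonLaunchDescriptionSource(
                os.path.join(bringup_dir, 'launch', 'bringup.launch.py')),
            launch_arguments={'enable_nav': 'false'}.items(),
        ),
        # Add your own nodes here.
    ])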

Developing on docker

You can rebuild rae_hw-related packages directly on the robot if you limit colcon's resource usage (for example: MAKEFLAGS="-j1 -l1" colcon build --symlink-install --packages-select rae_hw). Camera-related packages should be built locally on a PC due to RAE's hardware limitations, and using Docker is recommended for quickly rebuilding broken images. You can also test, develop, and build on a PC with a connected RAE for rapid prototyping. This is recommended for rebuilding and testing camera-related code; you also need to comment out the following in rae_camera.launch.py:
Python
RegisterEventHandler(
    OnProcessStart(
        target_action=perception,
        on_start=[
            TimerAction(
                period=15.0,
                actions=[reset_pwm, LogInfo(msg='Resetting PWM.'),],
            )
        ]
    )
),
  1. Open the rae-ros workspace in VS Code
  2. Create a .devcontainer directory in the workspace
  3. In it, create devcontainer.json
  4. You'll need to add the Remote Containers extension to VS Code if you haven't already
  5. After that, a window should pop up asking if you want to reopen the directory in a container; select yes. If nothing pops up, press CTRL+SHIFT+P and select the Rebuild and reopen option
devcontainer.json
// See https://aka.ms/vscode-remote/devcontainer.json for format details.
{
    "dockerFile": "../src/rae-ros/Dockerfile",
    "build": {
        "args": {
            "USE_RVIZ": "1",
            "SIM": "1",
            "CORE_NUM": "10",
            "--ssh": "default=$HOME/.ssh/id_rsa.pub ."
        }
    },
    "remoteUser": "root",
    "runArgs": [
        "--device=/dev/ttyUSB0",
        "--privileged",
        "--network=host",
        "--cap-add=SYS_PTRACE",
        "--security-opt=seccomp:unconfined",
        "--security-opt=apparmor:unconfined",
        "--volume=/dev:/dev",
        "--volume=/tmp/.X11-unix:/tmp/.X11-unix",
        "--volume=${env:HOME}/.ssh:/${HOME}/.ssh",
        // "--gpus=all"
    ],
    "containerEnv": {
        "DISPLAY": "${localEnv:DISPLAY}",
        "QT_X11_NO_MITSHM": "1",
        // "LIBGL_ALWAYS_SOFTWARE": "1" // Needed for software rendering of opengl
    },
    // Set *default* container specific settings.json values on container create.
    "settings": {
        "terminal.integrated.profiles.linux": {
            "zsh": {
                "path": "zsh"
            },
            "bash": {
                "path": "bash"
            }
        },
        "terminal.integrated.defaultProfile.linux": "zsh"
    },
    "extensions": [
        "dotjoshjohnson.xml",
        "ms-azuretools.vscode-docker",
        "ms-iot.vscode-ros",
        "ms-python.python",
        "ms-vscode.cpptools",
        "redhat.vscode-yaml",
        "smilerobotics.urdf",
        "streetsidesoftware.code-spell-checker",
        "twxs.cmake",
        "yzhang.markdown-all-in-one",
        "augustocdias.tasks-shell-input",
        "eamodio.gitlens"
    ]
}
Note: For displaying RViz from Docker you need to run xhost +local:docker on your PC.

Calibration

Every shipped RAE has already been factory calibrated, so this step is rarely needed. Besides the section below, the Calibration documentation is also a good source of information. With your PC connected to the RAE via SSH, stop the RH agent (it uses depthai_gate):
Command Line
robothub-ctl stop
On your PC:
Command Line
git clone --branch rvc3_calibration https://github.com/luxonis/depthai.git
cd depthai/
python3 install_requirements.py
# To calibrate rae's front cameras - for back cameras we would change the board name to "RAE-D-E"
python3 calibrate.py -s <size> -brd RAE-A-B-C -cd 1 -c 3
Some tips:
  1. Try to fill the stereo-pair matrices (the color camera preview can be out of FOV, but the stereo pairs must not be)
  2. Put the ChArUco board on a flat surface (the bigger the board, the better)
  3. When taking a frame, keep the board still (motion is not OK)

Configuration

Feature Tracker

Each sensor node (and the rectified streams from the Stereo node) has the option to add a FeatureTracker node, which publishes depthai_ros_msgs/msg/TrackedFeatures messages. To enable feature tracking on, for example, the rgb node, set rgb: i_enable_feature_tracker: true. To enable publishing on rectified streams, set, for example, stereo: i_left_rect_enable_feature_tracker to true.
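To quickly verify that features are being published, a minimal rclpy subscriber sketch is shown below. The topic name /rae/rgb/features is an assumption (check ros2 topic list on your setup), and the features field name follows depthai_ros_msgs/msg/TrackedFeatures.
Python
import rclpy
from rclpy.node import Node

from depthai_ros_msgs.msg import TrackedFeatures


class FeatureListener(Node):
    def __init__(self):
        super().__init__('feature_listener')
        # Topic name is an assumption - verify with `ros2 topic list`.
        self.create_subscription(TrackedFeatures, '/rae/rgb/features', self.callback, 10)

    def callback(self, msg):
        # Each message carries a list of tracked features with image coordinates.
        self.get_logger().info(f'Received {len(msg.features)} tracked features')


def main():
    rclpy.init()
    rclpy.spin(FeatureListener())
    rclpy.shutdown()


if __name__ == '__main__':
    main()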

Setting Camera and IMU parameters

You can set parameters for the camera and IMU in rae_camera/config/camera.yaml. For all available settings, refer to depthai-ros.

Robot Localization

The config file for the robot_localization package can be found at rae_hw/config/ekf.yaml. For more information, refer to this link.

Sensors and sockets

  • Socket 0 - IMX214
  • Socket 1 - OV9782
  • Socket 2 - OV9782
  • Socket 3 - OV9782
  • Socket 4 - OV9782

Peripherals

  • LCD node - accepts a BGR8 image (best if already resized to 160x80 px) on the /lcd Image topic
  • LED node - subscribes to the /leds topic; the message type is LEDControl (refer to rae_msgs/msg/LEDControl)
  • Mic node - publishes audio_msgs/msg/Audio (from the gst_bridge package) on /audio_in; the configuration is S32_LE, 48 kHz, 2-channel interleaved
  • Speakers node - subscribes to /audio_out with the same message type as the Mic node; the configuration is S16_LE, 41 kHz, 2-channel interleaved

Testing motors

In rae_hw/test you can find three scripts that will help you verify that the motors are running correctly. If you want to change arguments, you need to provide all of them.
  1. To find out if an encoder is working accurately, execute ros2 run rae_hw test_encoders and rotate the wheel by 360 degrees. After the rotation, the encoder readout should be ~2π. If not, adjust the encoder ticks-per-revolution parameter. Script arguments: [encRatioL encRatioR]. Full-argument version: ros2 run rae_hw test_encoders 756 756
  2. Finding the max speed: ros2 run rae_hw test_max_speed. Script arguments: [duration encRatioL encRatioR]. Full-argument version: ros2 run rae_hw test_max_speed 1.0 756 756
  3. Motor verification: ros2 run rae_hw test_motors. Script arguments: [duration speedL speedR encRatioL encRatioR maxVelL maxVelR]. Full-argument version: ros2 run rae_hw test_motors 5.0 16.0 16.0 756 756 32 32
Motor configuration parameters are provided in the rae_description/urdf/rae_ros2_control.urdf.xacro file. Pin numbers shouldn't change between devices, but if they do, you can edit that file to set new ones.
  • PWM pins (speed control):
Xml
<param name="pwmL">2</param>
<param name="pwmR">1</param>
  • Phase pins (direction control)
Xml
<param name="phL">41</param>
<param name="phR">45</param>
  • Encoder pins - each motor has A and B pins for encoders.
Xml
<param name="enLA">42</param>
<param name="enLB">43</param>
<param name="enRA">46</param>
<param name="enRB">47</param>
  • How many encoder ticks there are per revolution - this might vary from setup to setup. To verify it, run the controller and rotate a wheel manually. You can see the current positions/velocities by listening on the /joint_states topic: ros2 topic echo /joint_states.
Xml
<param name="encTicsPerRevL">756</param>
<param name="encTicsPerRevR">756</param>
  • Max motor speed in rads/s
Xml
<param name="maxVelL">32</param>
<param name="maxVelR">32</param>
  • Both wheels have PID control parameters set in that file; those values might need some tuning:
Xml
<param name="closed_loopR">1</param>
<param name="PID_P_R">0.2</param>
<param name="PID_I_R">0.1</param>
<param name="PID_D_R">0.0005</param>
Parameters for the differential drive controller are present in rae_hw/config/controller.yaml. The wheel_separation and wheel_radius parameters might also need tuning depending on the setup. The implementation of motor control is found in the rae_hw package. You can make the motors ready to receive Twist commands on the /cmd_vel topic by running ros2 launch rae_hw control.launch.py. You can then control the robot via keyboard teleop from your PC (assuming you are connected to the same network as the robot):
  1. sudo apt-get install ros-humble-teleop-twist-keyboard
  2. ros2 run teleop_twist_keyboard teleop_twist_keyboard
If the keyboard is too limiting for your tests, you could also use ros-humble-teleop-twist-joy and connect a joystick, or publish Twist messages programmatically as in the sketch below.
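The following is a minimal sketch of driving the robot programmatically by publishing geometry_msgs/msg/Twist on /cmd_vel (with control.launch.py running on the robot); the velocity values are arbitrary examples.
Python
import rclpy
from rclpy.node import Node

from geometry_msgs.msg import Twist


class SimpleDriver(Node):
    def __init__(self):
        super().__init__('simple_driver')
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)
        # Publish a constant command at 10 Hz.
        self.timer = self.create_timer(0.1, self.send_cmd)

    def send_cmd(self):
        msg = Twist()
        msg.linear.x = 0.1   # m/s forward
        msg.angular.z = 0.3  # rad/s turn
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = SimpleDriver()
    try:
        rclpy.spin(node)
    finally:
        # Stop the robot before shutting down.
        node.pub.publish(Twist())
        rclpy.shutdown()


if __name__ == '__main__':
    main()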

LED node

In the rae_msgs package you can find custom messages that let you control the LED lights around the robot. These messages support 3 control types:
  1. Control all (control_type set to 0) gives all LEDs the same color
  2. Control single (control_type set to 1) lets you control a single LED by setting the single_led_n variable
  3. Custom control (control_type set to 2), where you send a list that defines the value of every LED at once.
A useful example of how to work with the LEDs can be found in rae_bringup/scripts/led_test.py, where this for loop populates the LED values (with custom control) for each individual LED:
Python
for i in range(40):
    led_msg.single_led_n = 0
    led_msg.control_type = 2
    if i < 8:
        color = "white"
        led_msg.data[i] = colors[color]
    if i > 9 and i < 14 and angular_speed > 0.0 and blinking == True:
        color = "yellow"
        led_msg.data[i] = colors[color]
    if i > 20 and i < 29 and linear_speed < 0.0:
        color = "red"
        led_msg.data[i] = colors[color]
    if i > 34 and i < 39 and angular_speed < 0.0 and blinking == True:
        color = "yellow"
        led_msg.data[i] = colors[color]
We can then send that message to the topic the LED node listens to; by default that is /leds. The easiest way to run the LED node (and all the other peripheral nodes) is to run one of the following launch files:
Command Line
ros2 launch rae_hw peripherals.launch.py
ros2 launch rae_hw control.launch.py
ros2 launch rae_bringup robot.launch.py
The first file runs only the peripherals, the second runs the peripherals and motors, and the third runs both of those along with the cameras. It can be useful to create your own launch file that mimics the peripherals launch file, so you can run your own examples alongside those nodes. A standalone LED publisher sketch is shown below.
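As a standalone starting point, here is a minimal sketch that publishes a single LEDControl message in custom-control mode, mirroring the loop above. The field names (control_type, single_led_n, data) and the /leds topic come from this section; using std_msgs/ColorRGBA for the entries of data and a total of 40 LEDs are assumptions based on led_test.py - check rae_msgs/msg/LEDControl for the exact definition.
Python
import rclpy
from rclpy.node import Node

from rae_msgs.msg import LEDControl
from std_msgs.msg import ColorRGBA  # assumed element type of LEDControl.data


class LedDemo(Node):
    def __init__(self):
        super().__init__('led_demo')
        self.pub = self.create_publisher(LEDControl, '/leds', 10)
        self.timer = self.create_timer(1.0, self.send)

    def send(self):
        msg = LEDControl()
        msg.control_type = 2   # custom control: one entry per LED
        msg.single_led_n = 0   # unused in custom-control mode
        # 40 LEDs assumed, as in led_test.py; light the first 8 white, leave the rest off.
        msg.data = [ColorRGBA(r=1.0, g=1.0, b=1.0, a=1.0) if i < 8 else ColorRGBA()
                    for i in range(40)]
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(LedDemo())
    rclpy.shutdown()


if __name__ == '__main__':
    main()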

LCD node

The LCD node listens for messages on the /lcd topic and shows them on the screen. A useful demo for this node is in rae_bringup/scripts/battery_status.py, where we subscribe to the battery status topic and, based on that, draw a battery status image on the LCD screen. This node expects a ROS (sensor_msgs/Image) image, so you will generally have to convert an OpenCV image to a ROS image: img_msg = self.bridge.cv2_to_imgmsg(img_cv, encoding="bgr8")
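A minimal sketch of pushing an image to the LCD (with the peripherals launch file running so the LCD node is up) could look like this; the /lcd topic, the 160x80 BGR8 format, and the cv_bridge conversion are taken from the description above.
Python
import cv2
import numpy as np
import rclpy
from cv_bridge import CvBridge
from rclpy.node import Node

from sensor_msgs.msg import Image


class LcdDemo(Node):
    def __init__(self):
        super().__init__('lcd_demo')
        self.pub = self.create_publisher(Image, '/lcd', 10)
        self.bridge = CvBridge()
        self.timer = self.create_timer(1.0, self.send)

    def send(self):
        # Draw a simple 160x80 BGR image with some text.
        img = np.zeros((80, 160, 3), dtype=np.uint8)
        cv2.putText(img, 'Hello RAE', (5, 45),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 1)
        self.pub.publish(self.bridge.cv2_to_imgmsg(img, encoding='bgr8'))


def main():
    rclpy.init()
    rclpy.spin(LcdDemo())
    rclpy.shutdown()


if __name__ == '__main__':
    main()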

Microphone and speakers

The microphone node publishes audio messages; an example of how to use that data (along with some other peripherals) can be found in rae_bringup/scripts/audio_spectrum.py. For a fair number of use cases, you will need to decode the incoming data as shown in the example below.
Python
if msg.encoding == "S32LE":
    audio_data = np.frombuffer(msg.data, dtype=np.int32)
elif msg.encoding == "S16LE":
    audio_data = np.frombuffer(msg.data, dtype=np.int16)
if msg.layout == Audio.LAYOUT_INTERLEAVED:
    # Deinterleave channels
    audio_data = audio_data.reshape((msg.frames, msg.channels))
GST-ROS bridge: You can use gst_bridge for testing, for example to play audio on a ROS topic:
  • gst-launch-1.0 --gst-plugin-path=install/gst_bridge/lib/gst_bridge/ filesrc location=sample.mp3 ! decodebin ! audioconvert ! rosaudiosink ros-topic="/audio_out"
  • gst-launch-1.0 --gst-plugin-path=install/gst_bridge/lib/gst_bridge/ rosaudiosrc ros-topic="audio_out" ! audioconvert ! wavenc ! filesink location=mic1.wav
  • gst-launch-1.0 --gst-plugin-path=install/gst_bridge/lib/gst_bridge/ rosimagesrc ros-topic="/rae/right_front/image_raw" ! videoconvert ! videoscale ! video/x-raw,width=160,height=80 ! fbdevsink
  • gst-launch-1.0 alsasrc device="hw:0,1" ! audio/x-raw,rate=48000,format=S32LE ! audioconvert ! spectrascope ! videoconvert ! video/x-raw,width=160,height=80 ! fbdevsink
Speakers operate similarly, except that they consume audio messages. In the rae_bringup package's scripts folder, sound_test.py offers a decent example of how you can create audio messages. We will shortly create more demos for the speakers and microphone. A rough sketch of publishing a tone to the speakers is shown below.
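The sketch below publishes a short sine tone to /audio_out, using the field names that appear in the decode example above (encoding, layout, frames, channels, data) and the speaker configuration listed earlier (S16_LE, 2-channel interleaved). The audio_msgs/msg/Audio definition from gst_bridge likely has additional fields (for example a sample-rate field) that also need to be filled in, so treat this as a starting point rather than a complete example; sound_test.py is the authoritative reference.
Python
import numpy as np
import rclpy
from rclpy.node import Node

from audio_msgs.msg import Audio  # from the gst_bridge package

SAMPLE_RATE = 41000  # per the speaker configuration listed above


class ToneDemo(Node):
    def __init__(self):
        super().__init__('tone_demo')
        self.pub = self.create_publisher(Audio, '/audio_out', 10)
        self.timer = self.create_timer(1.0, self.send)

    def send(self):
        # One second of a 440 Hz sine wave, S16_LE, duplicated into 2 interleaved channels.
        t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
        mono = (0.3 * np.iinfo(np.int16).max * np.sin(2 * np.pi * 440 * t)).astype(np.int16)
        stereo = np.repeat(mono[:, None], 2, axis=1)  # shape (frames, channels)

        msg = Audio()
        msg.encoding = 'S16LE'
        msg.layout = Audio.LAYOUT_INTERLEAVED
        msg.frames = stereo.shape[0]
        msg.channels = stereo.shape[1]
        msg.data = stereo.tobytes()
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(ToneDemo())
    rclpy.shutdown()


if __name__ == '__main__':
    main()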

USB ports

To use the top USB port, for example to connect it to an external device, you need to disable the communication-side USB. To do that, execute:
Command Line
gpioset gpiochip0 44=1
echo host > /sys/kernel/debug/usb/34000000.dwc3/mode
This will be reverted after a reboot. To revert it manually, execute:
Command Line
gpioset gpiochip0 44=0
echo device > /sys/kernel/debug/usb/34000000.dwc3/mode