
Lab 07

During this lab, you should learn how to visualize data within ROS and how to debug when something goes wrong.

Simulation Time

An important concept in robotics is the so-called simulation time. On real robots, time always advances according to the rules of the universe1). However, in simulation, you can (or have to) slow down or speed up the pace of time2).

Working with simulation time in ROS is easy - just be sure to never use time.time() in your code and to always use rospy.Time.now() instead. That's it. ROS manages the time internals automatically so that on a real robot you get the system clock time (what time.time() would give), while in simulation you get the properly slowed-down time. All other time-related features like rospy.Rate or rospy.Timer work with this simulation time as well.
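As a minimal sketch (the node and topic names are made up for illustration), a rate-limited publisher that respects simulation time could look like this:

#!/usr/bin/env python3
# Minimal sketch of a sim-time-aware node; node and topic names are illustrative.
import rospy
from std_msgs.msg import Float64

rospy.init_node("sim_time_demo")
pub = rospy.Publisher("elapsed_time", Float64, queue_size=1)

start = rospy.Time.now()   # follows /clock when /use_sim_time is true
rate = rospy.Rate(10)      # 10 Hz in simulation time as well
while not rospy.is_shutdown():
    elapsed = (rospy.Time.now() - start).to_sec()
    pub.publish(Float64(elapsed))
    rate.sleep()

With /use_sim_time set to true, this loop runs at 10 Hz of simulation time, i.e., slower or faster in wall-clock terms depending on the real-time factor.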

Internally, ROS decides between the real-time clock and simulation time based on the value of the global parameter /use_sim_time. If it is set to True, every node creates a subscriber to the /clock topic and updates the simulation time based on whatever is published there. If the parameter is False, the system clock is used. The Gazebo simulator or rosbag play --clock are publishers of the /clock topic and publish the time at 1 kHz, so that is the clock resolution you get when running from simulation. You can add the real_time_factor argument to any *_sim.launch file from ARO to slow down the simulation. By default, it runs with RTF (real-time factor) 1.0.

For example:

rosparam set /use_sim_time true
roslaunch aro_exploration aro_localization_sim.launch real_time_factor:=0.3
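You can then verify that simulation time is really being used with the standard rosparam and rostopic tools, for instance:

rosparam get /use_sim_time   # should print "true"
rostopic echo -n 1 /clock    # current simulation time in seconds
rostopic hz /clock           # publishing rate of the simulation clock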

It is also usually good practice to use simulation time when replaying bag files. Just run rosparam set /use_sim_time true before playing the bag file and add the --clock option to rosbag play. The play node will then publish the /clock topic at 1 kHz. If you also use the --rate parameter of rosbag play, all nodes listening to the messages from the bag file will be properly slowed down, e.g.

rosparam set /use_sim_time true
rosbag play --clock --rate 0.25 data.bag

Visualization and debugging

There are several well-working programs within ROS that you can use to visualize the data measured by the sensors or produced by your nodes, most notably RViz, rqt and PlotJuggler.

There are several new key tools/buzzwords to know from today's lab:

  • RViz - GUI program to visualize data such as cameras, maps, robot positions.
  • rqt - GUI program with many plugins for, e.g., plotting, calling services, publishing messages, or showing the graph of nodes.
  • PlotJuggler - GUI program to visualize data online from a certain topic or data loaded from a rosbag.

RViz

RViz is mainly useful for visualization of the 2D and 3D parts of the current state of your robotic system, i.e., for showing camera images, lidar scans, the mapped environment, or the robot's position. You can start RViz by simply calling rviz in your terminal once you are inside the Singularity image. By default, you should see a 3D plot area in the middle and the list of currently visualized data in the left bar. You can add new data to the visualization by pressing Add in the bottom left and then choosing the data either by topic or by message type. If you choose to add the data by message type, you will have to specify the topic later in the list of displayed messages in the left bar. Also check the Views tab, which allows you to specify parameters of the GUI camera (e.g. top-down or 3D view).
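If you want to display something of your own, a minimal sketch of a marker publisher (the node, topic and frame names are made up for illustration) could look like the following. Add a Marker display in RViz on the demo_marker topic and set the Fixed Frame to map (assuming no other TF source is running):

#!/usr/bin/env python3
# Minimal sketch: publish a sphere marker that RViz can display; names are illustrative.
import rospy
from visualization_msgs.msg import Marker

rospy.init_node("marker_demo")
pub = rospy.Publisher("demo_marker", Marker, queue_size=1)
rate = rospy.Rate(1)
while not rospy.is_shutdown():
    m = Marker()
    m.header.frame_id = "map"   # must match an existing frame or the RViz Fixed Frame
    m.header.stamp = rospy.Time.now()
    m.type = Marker.SPHERE
    m.action = Marker.ADD
    m.pose.orientation.w = 1.0
    m.scale.x = m.scale.y = m.scale.z = 0.5
    m.color.r = 1.0
    m.color.a = 1.0             # alpha must be non-zero, otherwise the marker is invisible
    pub.publish(m)
    rate.sleep()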

rqt

rqt has many plugins to visualize various information about your robotic system and its data. You can start rqt by running the rqt command in your terminal and then selecting the desired plugin in the menu. Try the following rqt plugins (a small test node you can use with some of them is sketched after the list):

  • Plugins → Visualization → Image View, or standalone as rosrun rqt_image_view rqt_image_view (view image data)
  • Plugins → Introspection → Node Graph, or standalone as rosrun rqt_graph rqt_graph (see connections between nodes)
  • Plugins → Visualization → TF Tree, or standalone as rosrun rqt_tf_tree rqt_tf_tree (view TF tree)
  • Plugins → Visualization → Plot or standalone as rosrun rqt_plot rqt_plot (plot 1D data from some topic)
  • Plugins → Logging → Bag or standalone as rosrun rqt_bag rqt_bag (visualize the contents of a bagfile)
  • Plugins → Logging → Console or standalone as rosrun rqt_console rqt_console (interactive GUI for the printed console messages; allows filtering)
  • Plugins → Robot Tools → Diagnostics viewer or standalone as rosrun rqt_runtime_monitor rqt_runtime_monitor or rosrun rqt_robot_monitor rqt_robot_monitor (view diagnostic reports)
  • Plugins → Topics → Topic monitor or standalone as rosrun rqt_topic rqt_topic (see frequencies and bandwidth of topics)
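As something to look at in the Console and Plot plugins, here is a small sketch of a node (names and values are made up for illustration) that publishes a value and logs at different severities:

#!/usr/bin/env python3
# Minimal sketch: produce log messages for rqt_console and a value for rqt_plot; names are illustrative.
import math
import rospy
from std_msgs.msg import Float32

rospy.init_node("rqt_demo")
pub = rospy.Publisher("demo_value", Float32, queue_size=1)
rate = rospy.Rate(5)
while not rospy.is_shutdown():
    value = math.sin(rospy.Time.now().to_sec())
    pub.publish(Float32(value))
    rospy.loginfo("demo value is %.2f", value)
    if value > 0.95:
        rospy.logwarn("demo value close to its maximum")   # try filtering by severity in rqt_console
    rate.sleep()

Run it, add /demo_value/data in rqt_plot, and try filtering the warnings by severity in rqt_console.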

PlotJuggler

PlotJuggler is a tool for plotting timeseries data, either online from a certain topic or loaded from a rosbag. The plot can be either a 1D timeseries (time + value dimensions) or 2D (one topic on the x axis, a second topic on the y axis). You can start PlotJuggler using rosrun plotjuggler plotjuggler. When started, you can load data from files such as rosbags in the File tab (top left). Alternatively, you can select ROS Topic Subscriber in the Streaming tab and choose which topics should be visualized. Afterward, you select the timeseries you want to plot (particular fields of a message, e.g., pose.position.x of an Odometry message) by dragging them over the plot area (drag two selected timeseries with the right mouse button to get a 2D plot). Test the functionality, e.g., by:

  • running the simulator (roslaunch aro_sim robocop_sim.launch) and trying to plot the live published data (or use the test publisher sketched below if the simulator is not available).
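If the simulator is not at hand, the following sketch (node, topic and frame names are made up for illustration) publishes a circular trajectory as nav_msgs/Odometry; drag the pose.pose.position.x and pose.pose.position.y timeseries of demo_odom into one plot with the right mouse button to get a 2D plot of the circle:

#!/usr/bin/env python3
# Minimal sketch: publish a circular trajectory for PlotJuggler; names are illustrative.
import math
import rospy
from nav_msgs.msg import Odometry

rospy.init_node("plotjuggler_demo")
pub = rospy.Publisher("demo_odom", Odometry, queue_size=1)
rate = rospy.Rate(20)
while not rospy.is_shutdown():
    t = rospy.Time.now().to_sec()
    msg = Odometry()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = "odom"
    msg.pose.pose.position.x = math.cos(0.5 * t)
    msg.pose.pose.position.y = math.sin(0.5 * t)
    msg.pose.pose.orientation.w = 1.0
    pub.publish(msg)
    rate.sleep()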

A Complex ROS-based system example

A good example of a complex robotic system is the Unmanned Ground Vehicle (UGV) and Unmanned Aerial Vehicle (UAV) system developed for the DARPA SubT challenge by CTU.

You can take a look at the TF tree of CTU's Unmanned Ground Vehicles in the DARPA SubT challenge. A zoomable version is available here.

The node graph has similar complexity; here it is shown for CTU's Unmanned Aerial Vehicles. You can find the full zoomable image here.

If you are more interested in aerial robotics, consider the Multi-robot Aerial Systems course next semester.

Lab Task: Doctor Turtlehouse (Polámal se mraveneček)

During this lab, you will become an investigator who has received a suspicious recording from a Turtlebot. Your task is to diagnose the robot and figure out what bad or weird things happened to the poor (non)being.

You're looking for anything. Write down a list of findings indexed by timestamp in the UNIX format (e.g. 1712143327). For each finding, write down what you think was wrong and what tools you used to find or prove it, and add a screenshot or image demonstrating the problem.
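Header stamps of the recorded messages are already such UNIX timestamps (to_sec() of a ROS time gives the value in seconds). To turn them into something human-readable, a plain-Python snippet is enough:

# Convert a UNIX timestamp (e.g. a message header stamp) to human-readable UTC time.
from datetime import datetime, timezone
print(datetime.fromtimestamp(1712143327, tz=timezone.utc))   # 2024年04月03日 11:22:07+00:00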

The bad actors also left you a few textual messages. Can you find them? Don't rely only on what the eye can see.

The recording can be downloaded here: Full file (1.4 GB), Small file (0.05 GB) (please ignore the security warnings from your browser). The full file also contains cameras and lidar, but you will probably have problems downloading it over wifi during the lab. If you haven't downloaded it in advance, use the small file so you have at least something to investigate. There is still a lot you can discover with just the small file!

When replaying the bag file and visualizing in RViz or rqt_tf_tree, make sure to first launch the visualization tools and only then start rosbag play; otherwise, some transforms may be missing. A workaround allowing you to run these in any order is to download and unzip static_transform_mux.zip. Before rosbag play, run python3 static_transform_mux in a console with Singularity and keep it running the whole time. With this node running, you can start the visualization tools at any time and they will see the correct transforms.
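A possible order of the commands, each in its own terminal inside the Singularity image (the bag file name is a placeholder for whichever file you downloaded):

roscore                             # if one is not already running
rosparam set /use_sim_time true
python3 static_transform_mux        # keep this running the whole time
rviz                                # visualization tools can now be started at any time
rosbag play --clock recording.bag   # replace recording.bag with the downloaded file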

1)
That's actually not entirely true: time on a computer advances according to its crystal oscillator, and each PC has a different crystal. If you need to sync time on multiple computers inside one robot, you need specialized hardware and software (PTP) to get at least microsecond accuracy. If you need to synchronize time on multiple robots, you will probably use the NTP protocol and get millisecond accuracy.
2)
Slowing down is useful if you have a compute-intensive node and you want to test how it works, but you don't yet have the resources to refactor it into a faster, realtime-capable version. Sometimes you are forced to slow down - e.g., if you simulate exteroceptive sensors (lidars, cameras), it often happens that the computer is not powerful enough to render all the sensor data in real time.