Semestral project on mobile robot exploration.

Deadline                (early submission) 13. December 2020, 23:59 PDT
                        (final submission) 20. December 2020, 23:59 PDT
Points                  Up to 30
Required minimum points 10
Label in BRUTE          project
Files to submit         The .zip archive of the project files
                        A short report on the used solution in pdf
Resources               Project resource pack
                        Multi-robot blocks scene

The aim of the project is to leverage the steps implemented in T1a-ctrl, T1c-map, T1d-plan, and T1e-expl and to create an integrated solution for autonomous robotic exploration. The exploration strategy is implemented in a runnable Python script file (and other files described below).

The implemented solution is expected to connect to the simulated robot in the CoppeliaSim simulator and autonomously navigate the robot (or a team of robots) through the environment while avoiding collisions and building a metric map of the environment. The success metric of mobile (multi-)robot exploration is the environment coverage (percentage of the explored area) over time. Hence, the task is to collect as much spatial information about the environment as possible in the shortest time. For that purpose, intelligent decision making in the problem of navigating towards frontier goals is necessary.
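As a purely illustrative reading of the coverage metric, the explored fraction of an occupancy grid can be computed as the share of cells that have moved away from the unknown prior. The 0.5 prior below is an assumption, not a value prescribed by the assignment:

```python
import numpy as np

def coverage(grid, unknown_prior=0.5):
    """Explored fraction of an occupancy grid, in [0, 1].

    Assumes never-observed cells keep the unknown prior (0.5 here);
    every cell that differs from it counts as explored.
    """
    grid = np.asarray(grid, dtype=float)
    return float(np.mean(grid != unknown_prior))
```

Tracking this value over time gives the coverage-versus-time curve by which the exploration run is judged.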

Assignment requirements

  • The solution is implemented in a runnable script file; additional implementation files are allowed and encouraged (e.g., the implementation of navigation and exploration-related functions in separate helper files) - the expected directory structure of the submitted solution is detailed in section Submission requirements.
  • The correct operation of the submitted solution is demonstrated in the realistic robotic simulator CoppeliaSim (formerly V-REP).
  • During runtime, the implementation shall continuously plot the current occupancy grid map of the environment and the current navigation goal of the robot together with the path towards the goal.
  • The script finishes correctly when there are no more accessible frontier goals.
  • The submission is accompanied by a brief report of the used solution in pdf. There are no specific requirements on the formatting of the report. Its purpose is to briefly summarize the student's approach to the assigned problem.
  • The scoring is based on the quality and complexity of the submitted solution. Expected scoring of the implementation features is provided in section Scoring. Implementation features beyond the listed ones are encouraged; however, consult your approach with the lab teachers prior to spending a lot of time implementing something that might not be worthwhile.
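One way to implement the "no more accessible frontier goals" stopping rule from the requirements above, as a minimal sketch: `plan_path(start, goal)` is a hypothetical stand-in for your own grid planner and is assumed to return None when the goal is unreachable.

```python
def no_accessible_frontiers(robot_pose, frontiers, plan_path):
    """True when no detected frontier can be reached by the planner.

    An empty frontier list also counts as finished, matching the
    requirement that the script terminates cleanly at full coverage.
    """
    return all(plan_path(robot_pose, f) is None for f in frontiers)
```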

An example behavior of the Explorer is shown in the following videos.


The proposed scoring presents possible extensions of the base implementation that consolidates the implementation of the T1a-ctrl, T1c-map, T1d-plan, and T1e-expl tasks. You may combine the following features in pursuing your own exploration system or come up with your own features (in that case, please consult the features with your lab teacher). The implementation features include, but are not limited to: scalable mapping, different methods of frontier detection and navigation goal selection, and multi-robot exploration with simple or advanced task allocation, including full or only limited communication, with known or unknown base robot odometry, etc.

| Designation | Feature | Description | Points | Notes |
| report | Report | Short pdf summary of the used solution. | 1 point | |
| code_style | Coding style | The quality of the submitted code. | 2 points | |
| m1 | Map static size | The OccupancyGrid map container has a fixed size from the beginning of the run. | 1 point | |
| m2 | Map dynamic size | The OccupancyGrid map container dynamically grows with the explored area. | +2 points | |
| Frontier detection |||||
| f1 | Free-edge cluster frontiers | Similar to T1e-expl - frontiers are selected as centroids of segmented free-edge cells. | 1 point | |
| f2 | Multiple-representative free-edge cluster frontiers | Possible extension of f1. Divide free edges into multiple clusters and select their representatives when appropriate. | +2 points | |
| f3 | Mutual information at frontiers | Possible extension of f1 and f2. Compute the mutual information that can be gained by observing the environment from a frontier (computed in f1 or f2), thus assigning each frontier a utility value based on the information to be gained about occupancy. | +7 points | |
| p1 | Planning towards the closest frontier | The closest frontier is selected as the exploration goal. | 1 point | |
| p2 | Planning with highest utility | The goals are selected as the places with the highest mutual information gain. | 2 points | mutually exclusive with p1 and p3 |
| p3 | Planning considering visitation of all goals | The goals are selected as a solution of the TSP. | 4 points | mutually exclusive with p1 and p2 |
| Additional extensions |||||
| a1 | Multi-robot simple task allocation | Multiple robots explore the environment and create a fused map. Greedy goal assignment - fully centralized system. | 5 points | |
| a2 | Multi-robot advanced task allocation | Min-pos decentralized multi-robot exploration strategy. | +8 points | |
| a3 | Multi-robot without interprocess communication | All robots are initialized in a single script (baseline). | +0 points | mutually exclusive with a4 |
| a4 | Multi-robot with interprocess communication | Each robot is run as a separate instance of the script; data must be transferred between the robots using interprocess communication. | +4 points | mutually exclusive with a3 |
| a5 | Multi-robot without common coordinate frame | The robots have different coordinate frames given by their initial positions. They first need to synchronize their coordinate frames before they can merge their maps of the environment. | +10 points | Read more |

The table lists the possible extension ideas. The detailed description of the individual considered features follows in section Approach.
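For the m2 feature, one possible way to let the map container grow is to pad the occupancy array whenever a measured point falls outside it. This is a minimal sketch under assumed conventions (a 2D numpy array, an unknown prior of 0.5, and `origin` as the world coordinate of cell (0, 0)), not a full OccupancyGrid implementation:

```python
import math
import numpy as np

UNKNOWN = 0.5  # assumed prior occupancy of never-observed cells

def grow_grid(grid, origin, point, resolution=0.1):
    """Grow a 2D occupancy array so that world `point` falls inside it.

    Returns the (possibly padded) grid and the updated origin. Padding
    is filled with the unknown prior so coverage statistics stay valid.
    """
    row = math.floor((point[1] - origin[1]) / resolution)
    col = math.floor((point[0] - origin[0]) / resolution)
    pad_top = max(0, -row)
    pad_left = max(0, -col)
    pad_bottom = max(0, row - grid.shape[0] + 1)
    pad_right = max(0, col - grid.shape[1] + 1)
    if pad_top or pad_left or pad_bottom or pad_right:
        grid = np.pad(grid, ((pad_top, pad_bottom), (pad_left, pad_right)),
                      constant_values=UNKNOWN)
        origin = (origin[0] - pad_left * resolution,
                  origin[1] - pad_top * resolution)
    return grid, origin
```

Calling this for every endpoint of a fused laser scan keeps the map just large enough for the explored area.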

Submission requirements

The exploration strategy is implemented in a runnable Python script file. The file is expected to fit into the current project files with the following directory structure (also check the Project resource pack):

  • Project - root project directory
    • - [required for submission] main runnable project script
    • hexapod_explorer - directory with exploration helper functions
      • - [recommended for submission] implementation of the map building, processing and planning functions
    • hexapod_robot - directory with simulated robot interface
      • - [recommended for submission] implementation of the robot locomotion-related functions
      • - main robot interface and locomotion control script
      • cpg - directory with the locomotion controlling CPG implementation
      • hexapod_sim - directory for simulator interface

Only the main script file is mandatory for submission; the other files are optional, with the exploration and locomotion helper files being recommended. The projects are evaluated by the lab teachers; the automated evaluation script in BRUTE only checks the existence and runnability of the main script. The submitted files are expected to be copied into the provided resource pack. Therefore, submit only the files that you have changed for the purpose of your project, i.e., leave out, e.g., the simulator library files.

Submission evaluation

The project files are submitted to the BRUTE system, which performs only a simple sanity check that the solution is runnable and does not contain any major errors regarding the used libraries and/or obvious syntax issues. Afterwards, a demonstration of the working robotic exploration in the CoppeliaSim simulator is evaluated by the lab instructors after the submission deadline. If you hand in the project before the early deadline (13. December 2020, 23:59 PDT), the lab instructors will provide you with feedback as soon as possible, which allows for correcting possible issues prior to the hard deadline (20. December 2020, 23:59 PDT).


Implementation of the mobile robot exploration consists of building blocks that correspond to a large extent to the assignments of the T1a-ctrl, T1c-map, T1d-plan, and T1e-expl tasks. In general, the system has the following inputs given by the environment sensing:

  1. odometry - Odometry message - pose.position.x, pose.position.y and pose.orientation.quaternion encode the current absolute robot pose (position and heading) in the environment
  2. collision - boolean - True if the robot collides with some obstacle, False otherwise
  3. laser_scan - LaserScan message - contains the current laser scan data perceived by the robot.

And the following outputs:

  1. velocity command cmd_msg = Twist() for steering of the robot
  2. the OccupancyGrid message with the grid map of the environment

Hence, the robot has to be able to collect the laser scan measurements, fuse them into the map, select navigation goals in this map and navigate towards these temporary goals to discover the unknown parts of the environment.
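For navigating towards goals, the robot's heading usually has to be recovered from the odometry quaternion. A standard yaw extraction (assuming the quaternion exposes x, y, z, w components; for a planar robot only the yaw is needed) looks like:

```python
import math

def quaternion_to_yaw(x, y, z, w):
    """Heading (yaw) angle in radians from a unit quaternion.

    Standard ZYX Euler extraction of the rotation about the z-axis.
    """
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
```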


Here you will find the recommendations and guidelines for implementation of the basic exploration pipeline and for each of the scored items described above.

Basic implementation

To kickstart your awesome multi-robot, information-theoretic exploration project, a few guidelines on how to organize your code and connect the t1 tasks into a working exploration pipeline follow.
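The overall pipeline can be sketched as a single sense-map-plan-act loop. Every helper name here (`fuse_laser_scan`, `find_frontiers`, `plan_path`, `follow_path`) and the `robot` interface are hypothetical placeholders for your own T1 implementations, not the assignment's actual API:

```python
def explore(robot, fuse_laser_scan, find_frontiers, plan_path, follow_path):
    """Minimal exploration loop sketch: map, detect frontiers, plan, move.

    Terminates when no detected frontier has an admissible path,
    i.e. when there are no more accessible frontier goals.
    """
    grid = None
    while True:
        # fuse the current laser scan into the occupancy grid
        grid = fuse_laser_scan(grid, robot.laser_scan(), robot.odometry())
        frontiers = find_frontiers(grid)
        paths = [plan_path(grid, robot.odometry(), f) for f in frontiers]
        paths = [p for p in paths if p is not None]
        if not paths:
            break                                # exploration finished
        follow_path(robot, min(paths, key=len))  # e.g. closest goal (p1)
    return grid
```

In practice you would interleave this loop with the continuous map/goal/path plotting required by the assignment, or run the plotting in a separate thread.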


  • report — Read more
  • code_style — Read more

  • m1 — Read more
  • m2 — Read more

Frontier detection

  • f1 — Read more
  • f2 — Read more
  • f3 — Read more
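For reference, the f1-style detection (centroids of segmented free-edge cells) can be sketched as follows. The occupancy thresholds, the exact-match test for unknown cells, and the 4-connectivity are illustrative assumptions:

```python
import numpy as np
from collections import deque

def free_edge_frontiers(grid, free_max=0.4, unknown=0.5):
    """Centroids of connected free-edge clusters in an occupancy grid.

    A free-edge cell is a free cell (p < free_max) with at least one
    unknown 4-neighbour; clusters are found by BFS over those cells.
    """
    rows, cols = grid.shape

    def neighbours(r, c):
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= r + dr < rows and 0 <= c + dc < cols:
                yield r + dr, c + dc

    edge = {(r, c) for r in range(rows) for c in range(cols)
            if grid[r, c] < free_max
            and any(grid[nr, nc] == unknown for nr, nc in neighbours(r, c))}

    centroids, seen = [], set()
    for cell in edge:                      # BFS over free-edge clusters
        if cell in seen:
            continue
        queue, cluster = deque([cell]), []
        seen.add(cell)
        while queue:
            r, c = queue.popleft()
            cluster.append((r, c))
            for n in neighbours(r, c):
                if n in edge and n not in seen:
                    seen.add(n)
                    queue.append(n)
        centroids.append(tuple(np.mean(cluster, axis=0)))
    return centroids
```

The f2 extension would split large clusters and return several representatives per cluster instead of a single centroid.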


  • p1 — Read more
  • p2 — Read more
  • p3 — Read more
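The p1 strategy reduces to picking the frontier with the shortest planned path. A minimal sketch, with `plan_path` as a hypothetical planner that returns a list of waypoints or None when the goal is blocked:

```python
def closest_frontier(robot_pose, frontiers, plan_path):
    """Return the (goal, path) pair with the shortest admissible path.

    Returns (None, None) when no frontier is reachable, which doubles
    as the exploration termination condition.
    """
    best_goal, best_path = None, None
    for goal in frontiers:
        path = plan_path(robot_pose, goal)
        if path is not None and (best_path is None or len(path) < len(best_path)):
            best_goal, best_path = goal, path
    return best_goal, best_path
```

Note that path length, not straight-line distance, decides closeness; a frontier behind a wall may be Euclidean-near but path-far.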

Additional extensions

With multi-robot systems, you should distinguish between decentralized and distributed systems. Read more

  • a1 — Read more
  • a2 — Read more
  • a3 — Read more
  • a4 — Read more
  • a5 — Read more
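For the a1 feature, greedy centralized goal assignment can be sketched as repeatedly committing the cheapest remaining robot-frontier pair. The cost matrix (e.g., planned path lengths) is a hypothetical input computed by your own planner:

```python
def greedy_assignment(cost):
    """Assign distinct frontiers to robots, cheapest pairs first.

    cost[i][j] is robot i's cost to frontier j; returns a dict
    mapping robot index to its assigned frontier index.
    """
    pairs = sorted((c, i, j) for i, row in enumerate(cost)
                   for j, c in enumerate(row))
    assignment, used_robots, used_goals = {}, set(), set()
    for c, i, j in pairs:
        if i not in used_robots and j not in used_goals:
            assignment[i] = j
            used_robots.add(i)
            used_goals.add(j)
    return assignment
```

The a2 (min-pos) strategy is decentralized instead: each robot ranks itself against the others per frontier and commits only to frontiers where it holds the best rank.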

Known issues and FAQ

  1. [15.11.2020] There is a known issue in the multi-robot setup where sometimes the first robot gets the laser scan of the second robot and vice versa. The issue happens randomly, and it is inherent to the CoppeliaSim simulator. This issue is not critical in pursuing the project goals because, when using a probabilistic representation of the map, the wrong laser scans will be filtered out. It actually represents a common situation in real robotics where the sensory data gets corrupted and the system has to be able to deal with it. Nevertheless, we are working on a patch that will fix this issue.
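The claim that a probabilistic map filters out occasional corrupted scans can be illustrated with a log-odds cell update; the measurement probabilities below are made up for illustration:

```python
import math

def logodds_update(l, p_occupied):
    """One Bayesian log-odds update of a single cell's occupancy."""
    return l + math.log(p_occupied / (1.0 - p_occupied))

l = 0.0                          # unknown cell: p = 0.5
for _ in range(10):
    l = logodds_update(l, 0.3)   # ten consistent "free" measurements
l = logodds_update(l, 0.7)       # one corrupted "occupied" measurement
p = 1.0 - 1.0 / (1.0 + math.exp(l))  # back to a probability
```

After the single corrupted reading the cell still ends up confidently free, because the accumulated evidence from the correct scans dominates.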
courses/b4m36uir/projects/start.txt · Last modified: 2020/12/01 13:18 by cizekpe6