Search
Responsible lecturer: Tomáš Musil (musilto8@fel.cvut.cz)
Your task will be to finish a prepared node for finding potential goal positions that should lead the robot to explore an unknown environment.
Where should the robot go to expand its map? The vast majority of approaches to this problem are based on the idea of “frontiers” introduced in [1] in 1997. A frontier is usually defined as the boundary between space that is known and free (containing no obstacles) and the space that is unknown. When dealing with a lidar-equipped robot using an occupancy grid in an indoor setting, simply moving the robot close to a frontier will nearly always uncover a new part of the environment and expand the occupancy grid. In this assignment, you will be implementing a node for finding these frontiers on a 2D occupancy grid (see HW06 for detailed info about the data type). We will use these specific definitions:
As motivation, here is how more sophisticated exploration approaches, which still use the concept of frontiers, can work in 3D:
Searching for frontier cells and navigating to them on the input occupancy grid would work nicely if your robot were smaller than a single grid cell. However, the robot is much larger than that, so the detected frontier cells could lie in places where the robot would collide with obstacles! For this reason, you need to inflate obstacles and unknown space (since it also might contain obstacles), similarly to what you did in Homework 06 - Path Planning. Use the provided get_circular_dilation_footprint() kernel with the following function to perform the inflation:
https://docs.scipy.org/doc/scipy/reference/generated/scipy.ndimage.grey_dilation.html
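As a rough illustration of what the inflation step does (a sketch only - in your node you should use the provided get_circular_dilation_footprint() kernel with scipy.ndimage.grey_dilation as linked above), the following pure-NumPy snippet dilates the union of obstacle and unknown cells with a circular footprint. The `circular_footprint()` helper here is a hypothetical stand-in for the provided one:

```python
import numpy as np

def circular_footprint(radius_cells):
    """Boolean disk mask - a stand-in for the provided
    get_circular_dilation_footprint() helper."""
    r = radius_cells
    y, x = np.ogrid[-r:r + 1, -r:r + 1]
    return (x * x + y * y) <= r * r

def inflate(grid, footprint):
    """Mark every cell within the footprint of an obstacle (100) or
    unknown (-1) cell as an obstacle. Dilating the union of both masks
    at once sidesteps any merge-order pitfalls."""
    blocked = (grid == 100) | (grid == -1)
    r = footprint.shape[0] // 2
    padded = np.pad(blocked, r, constant_values=False)
    out = np.zeros_like(blocked)
    h, w = blocked.shape
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if footprint[dy + r, dx + r]:
                out |= padded[r + dy:r + dy + h, r + dx:r + dx + w]
    inflated = grid.copy()
    inflated[out] = 100
    return inflated

grid = np.zeros((7, 7), dtype=np.int8)
grid[3, 3] = 100                     # a single obstacle cell
inflated = inflate(grid, circular_footprint(1))
```

With a radius-1 disk, the obstacle spreads to its four edge-neighbors but not to the diagonals.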
You should then run the WFD algorithm on the modified grid. Make sure you merge the inflated obstacles and unknown space in the correct order!
Also, as in Homework 06 - Path Planning, you should handle the case when the robot is too close to obstacles. If that happens, the start position can appear untraversable in the inflated grid, and the first BFS of the WFD will immediately stop and return no frontiers. Setting the area around the robot as traversable, as in HW06, is the simplest solution, but beware - depending on the implementation, this can lead to finding deformed frontiers. We suggest starting the first BFS of the WFD with a larger starting 'open set': instead of just the one cell corresponding to the robot's center position, put all the cells within the robot's dilation footprint into the starting open set.
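A minimal sketch of seeding the open set with the robot's footprint (the function name and the cross-shaped example footprint are illustrative, not the provided helpers):

```python
import numpy as np

def seed_open_set(robot_cell, footprint, grid_shape):
    """All in-bounds cells covered by the dilation footprint centered on
    the robot's cell; use these to seed the WFD's first BFS so the robot
    is not 'trapped' by its own inflated surroundings."""
    r = footprint.shape[0] // 2
    ry, rx = robot_cell
    h, w = grid_shape
    return [(ry + dy, rx + dx)
            for dy in range(-r, r + 1)
            for dx in range(-r, r + 1)
            if footprint[dy + r, dx + r]
            and 0 <= ry + dy < h and 0 <= rx + dx < w]

# Example: robot in a grid corner with a 3x3 cross-shaped footprint.
footprint = np.array([[0, 1, 0],
                      [1, 1, 1],
                      [0, 1, 0]], dtype=bool)
seeds = seed_open_set((0, 0), footprint, (5, 5))
```

Out-of-bounds footprint cells are simply dropped, so the seed set stays valid near the grid border.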
To find the frontiers in the grid, we will use the WFD algorithm introduced in the lectures. In a nutshell, it is a two-stage BFS on a 2D grid: an outer BFS that expands through reachable free cells from the robot's position, and an inner BFS that groups connected frontier cells into individual frontiers.
For the pseudocode and more details, see the ARO lectures and [2]. Make sure to only accept frontier cells that are reachable, and filter out frontiers that have fewer cells than the value of the min_frontier_size parameter. An example result can be seen below. Green = frontier cells, Red = frontier centers, Blue = robot's position.
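The two stages can be sketched roughly as follows. This is an illustrative simplification on an integer grid with 0 = free, 100 = obstacle, -1 = unknown; it does not use the provided Frontier class and omits the min_frontier_size filtering - follow the lecture pseudocode for the real implementation:

```python
from collections import deque
import numpy as np

FREE, OBSTACLE, UNKNOWN = 0, 100, -1
NBRS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
        (0, 1), (1, -1), (1, 0), (1, 1)]

def is_frontier(grid, y, x):
    """A frontier cell: free, with at least one unknown 8-neighbor."""
    if grid[y, x] != FREE:
        return False
    h, w = grid.shape
    return any(0 <= y + dy < h and 0 <= x + dx < w
               and grid[y + dy, x + dx] == UNKNOWN
               for dy, dx in NBRS)

def wfd(grid, start):
    """Outer BFS over reachable free cells; when a frontier cell is hit,
    an inner BFS collects its whole connected frontier component."""
    h, w = grid.shape
    visited = np.zeros((h, w), bool)   # outer-BFS marks
    grouped = np.zeros((h, w), bool)   # cells already assigned to a frontier
    frontiers = []
    queue = deque([start])
    visited[start] = True
    while queue:
        y, x = queue.popleft()
        if is_frontier(grid, y, x) and not grouped[y, x]:
            component, fq = [], deque([(y, x)])
            grouped[y, x] = True
            while fq:
                fy, fx = fq.popleft()
                component.append((fy, fx))
                for dy, dx in NBRS:
                    ny, nx = fy + dy, fx + dx
                    if (0 <= ny < h and 0 <= nx < w and not grouped[ny, nx]
                            and is_frontier(grid, ny, nx)):
                        grouped[ny, nx] = True
                        fq.append((ny, nx))
            frontiers.append(component)
        for dy, dx in NBRS:
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and not visited[ny, nx]
                    and grid[ny, nx] == FREE):
                visited[ny, nx] = True
                queue.append((ny, nx))
    return frontiers

# Example: the top row is unknown, everything else free.
grid = np.zeros((5, 5), dtype=int)
grid[0, :] = UNKNOWN
frontiers = wfd(grid, (3, 2))
```

Because the outer BFS only expands through free cells, any frontier it reaches is reachable by construction; unreachable pockets of the map never produce frontiers.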
Your task will be to implement functionality for the detection of frontiers in an occupancy grid. You will need to finish parts of code in aro_exploration/nodes/frontier.py and aro_exploration/src/aro_exploration/utils.py.
1) If you haven't already done so in the Path Planning homework, implement the map_to_grid_coordinates() and grid_to_map_coordinates() functions in aro_exploration/src/aro_exploration/utils.py. These transform between a position in the 'icp_map' frame (e.g. the robot's position) and the discrete coordinates of a cell in the occupancy grid (see above for an image explaining the indexing we will use in this assignment).
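A possible sketch of the two conversions, assuming the common convention that the grid origin gives the map-frame position of cell (0, 0) and that grid_to_map_coordinates() returns cell centers - adapt it to the exact indexing shown in the assignment's image:

```python
import numpy as np

def map_to_grid_coordinates(position, origin, resolution):
    """Map-frame position (x, y) -> integer grid cell, assuming the
    grid origin is the map-frame position of cell (0, 0)."""
    return np.floor((np.asarray(position) - np.asarray(origin))
                    / resolution).astype(int)

def grid_to_map_coordinates(cell, origin, resolution):
    """Integer grid cell -> map-frame position of the cell's center."""
    return np.asarray(origin) + (np.asarray(cell) + 0.5) * resolution
```

Note the round trip is not an identity: map_to_grid_coordinates() snaps to a cell, and grid_to_map_coordinates() returns that cell's center, so positions move by up to half a cell.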
2) In the find_frontiers function or when receiving an occupancy grid message, construct a modified grid where the obstacles and unknown space are inflated. Use the get_circular_dilation_footprint() in utils.py as the kernel for inflating the obstacles with the provided dilation function: https://docs.scipy.org/doc/scipy/reference/generated/scipy.ndimage.grey_dilation.html. For this assignment (you can change it for the semestral work), do not use a different kernel or a different function, because the evaluation uses these, and a mismatch might cause your solution to find frontiers at positions that could be unsafe according to the evaluation's kernel.
3) Implement the Wavefront Frontier Detection algorithm and run it on the modified occupancy grid to obtain a list of reachable frontiers. We suggest you use the provided Frontier class.
4) From all found frontiers, select the one with the center closest to the robot and return its center in map coordinates in the get_closest_frontier service.
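Selecting the closest frontier center can be as simple as the following sketch (Euclidean distance in grid coordinates; in the actual service you would convert the chosen center back to map coordinates with grid_to_map_coordinates()):

```python
import numpy as np

def closest_frontier_center(frontiers, robot_cell):
    """Return the center (mean cell) of the frontier closest to the robot.
    frontiers: list of lists of (row, col) cells; robot_cell in grid coords."""
    centers = [np.mean(f, axis=0) for f in frontiers]
    dists = [np.linalg.norm(c - np.asarray(robot_cell)) for c in centers]
    return centers[int(np.argmin(dists))]

# Example: two frontiers; the first one's center (0, 1) is nearer.
frontiers = [[(0, 0), (0, 2)], [(10, 10), (10, 11)]]
center = closest_frontier_center(frontiers, (1, 1))
```

The mean of a frontier's cells can land on a cell that is not itself part of the frontier (e.g. for an L-shaped frontier); whether that matters depends on how the center is used downstream.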
Make sure to:
Run the script aro_exploration/homeworks/create_hw07_package.sh to create an archive containing your implementation (frontier.py and utils.py) and upload it to the upload system. Your solution will be evaluated based on the output of your get_closest_frontier service. The output position of this service will be compared to the reference solution at several positions of the robot in the map. You can use the provided frontier_evaluate.py file for testing your solution locally before submitting to BRUTE. The expected output is in the same coordinates as the robot's position, i.e. the map frame, not the discrete grid cell coordinates.
For visualization, use the following launchfile:
roslaunch aro_exploration aro_frontier_planning_test_local.launch
This launchfile, as in the previous homework, starts the frontier detection and planning nodes, plays a repeating rosbag with OccupancyGrid messages, and opens RViz. Since this is not a simulation - only these two nodes running on data from a rosbag - you need to publish the robot's position yourself using a static transform. You can do this with the following command:
rosrun tf2_ros static_transform_publisher x y z r p y icp_map base_footprint
To detect the frontiers and get the closest one, call the service using the command line as
rosservice call /get_closest_frontier
For evaluating your solution locally, run the provided evaluation script. It is self-contained: it initializes ROS, publishes the transformations automatically, and compares the outputs of your service to the reference solution.
rosrun aro_exploration frontier_evaluate.py
[1] YAMAUCHI, Brian. A frontier-based approach for autonomous exploration. In: Proceedings 1997 IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA'97) 'Towards New Computational Principles for Robotics and Automation'. 1997, pp. 146-151.
[2] TOPIWALA, Anirudh; INANI, Pranav; KATHPAL, Abhishek. Frontier Based Exploration for Autonomous Robot. arXiv preprint arXiv:1806.03581, 2018.