Humanoid robot programming

The first part of the subject aims to get you acquainted with programming a humanoid robot. We will use a Python simulator of the humanoid robot iCub.

**Lab 11.3.2025 is CANCELED**

Lab 1

PyBullet documentation can be useful and can be found here.

Solutions to common running problems

  • Use Python 3.8-3.11
  • if you are using conda (or similar), install the packages using 'conda install …' and not 'python -m pip install …'. These two are not the same, even though it may look like it
  • do not use conda (use python venv or install into the global environment). Be sure to do 'conda deactivate' first
  • if you use an IDE (PyCharm, VS Code), try to run the code also in a normal terminal outside the IDE. IDEs sometimes replace system variables, which can cause problems with graphical output
  • use the Open3D web visualizer. Put 'o3d.visualization.webrtc_server.enable_webrtc()' into icub_pybullet/visualizer.py directly under 'self.client = client'. Open http://localhost:8888/ in your browser (you need to refresh this page after you run a script again)
  • in the config for the program you are trying to run, set gui=False. In icub_pybullet/pycub.py, change 'super().__init__(p.DIRECT)' on line 47 to 'super().__init__(p.GUI)'
    • be careful here: if you installed with 'setup.py install', the changes will most probably not take effect. Run 'setup.py install' again or, ideally, uninstall the package with 'pip uninstall icub_pybullet' and use the non-installed version while testing
  • use Gitpod. Open https://gitpod.io/#github.com/rustlluk/B3M33HRO-gitpod (you will need a GitHub account). The first run can take some time, but after that you should see an environment inside a virtualized system. The performance is not great, but if you change the simulation speed (the time argument of client.update_simulation(time)), it is fine; t = 0.025 seems to work okay. The code is already downloaded and installed in /home/gitpod/pycub
    • you will be asked to select the web interface or to open the environment using plugins for VS Code or PyCharm
  • use lab computers

iCub Kinematics

This picture shows the joints of iCub with their names and indexes that can be used to control the robot in joint space in the PyCub simulator.

PyCub

The simulator is written in Python3 and uses PyBullet as the physics engine and Open3D for visualization. It should run without problems on any system with Python3 (it is tested with Python 3.11; anything above 3.8 should be fine; the theoretical lower limit is 3.6 because of f-strings).
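As a quick sanity check before installing, you can verify your interpreter against the range above; `version_supported` is just an illustrative helper, not part of PyCub:

```python
import sys

def version_supported(major, minor):
    """True if the interpreter version is in the recommended 3.8-3.11 range."""
    return (3, 8) <= (major, minor) <= (3, 11)

if __name__ == "__main__":
    ok = version_supported(*sys.version_info[:2])
    print(f"Python {sys.version_info[0]}.{sys.version_info[1]} "
          f"{'is' if ok else 'is NOT'} in the recommended range")
```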

The whole documentation can be found here.

A presentation with a description of basic functionality can be found here.

Installation

Installation instructions can be found on GitHub or in this presentation. Basically, you need to clone the repository and install the dependencies: numpy, scipy, pybullet, open3d (and their dependencies).

Alternative installation (Lab computers)

Python only
  • clone the repository to your home directory git clone https://github.com/rustlluk/pycub
  • create venv and install dependencies cd pyCub && mkdir venv && cd venv && virtualenv . && source bin/activate && pip3 install numpy scipy pybullet open3d roboticstoolbox-python
  • install using python3 setup.py install or add export PYTHONPATH=$PYTHONPATH:/home.nfs/your_ctu_username/pyCub/icub_pybullet to ~/.bashrc

You will need to run 'source /home.nfs/your_ctu_username/pyCub/venv/bin/activate' every time you open a new terminal.

Lab 2

This lab is focused on smooth line and circle movements. You can use this template to experiment together with the lab tutor and later implement the function move(args) to test your solution in the test environment.

The goal of the function move() is to perform smooth line and circle movements. The function has to return the start and end poses of the trajectory (type icub_pybullet.utils.Pose; can be obtained by pycub.end_effector.get_position()).
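One possible way to start inside move() is to sample waypoints of the desired shape and then drive the end-effector through them with the simulator's Cartesian control. The helper below only generates circle waypoints with numpy (already a dependency); its name and the sampling density are illustrative choices, not part of the PyCub API:

```python
import numpy as np

def circle_waypoints(center, r, axis=0, n=100):
    """Sample n points of a circle with radius r (metres) lying in the plane
    perpendicular to the given world axis, centred at `center`."""
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    # the circle's plane is spanned by the two axes other than `axis`
    u, v = [i for i in range(3) if i != axis]
    pts = np.tile(np.asarray(center, dtype=float), (n, 1))
    pts[:, u] += r * np.cos(theta)
    pts[:, v] += r * np.sin(theta)
    return pts
```

Every generated point has distance r from the centre and a constant coordinate along the chosen axis, which is what the evaluation criteria measure.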

Evaluation

  • both circle and line movements will be evaluated with three sub-criteria each:
    • circle:
      • standard deviation of the distances of the circle points from the centre; must be < r*0.125
      • difference between expected and real radius; must be < r*0.1
      • mean distance of all points from the expected plane; must be < 0.1
    • line:
      • difference from real length to expected length; must be < 0.0075
      • angle between the expected and real line; must be <0.1
      • mean distance of all points from the expected line; must be < 0.01
  • points will be awarded if all the sub-criteria are met
  • BRUTE also shows an image of your trajectory (black) vs the expected trajectory (blue). It may not always be 100% correct.
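For self-testing before submission, the circle sub-criteria above can be checked offline on recorded end-effector positions. This is a sketch with the thresholds copied from the list; `check_circle` is an illustrative helper, not the actual BRUTE evaluator:

```python
import numpy as np

def check_circle(points, r, axis=0):
    """Check the three circle sub-criteria on an (N, 3) array of positions."""
    points = np.asarray(points, dtype=float)
    center = points.mean(axis=0)
    dists = np.linalg.norm(points - center, axis=1)
    std_ok = dists.std() < r * 0.125             # spread of point-to-centre distances
    radius_ok = abs(dists.mean() - r) < r * 0.1  # expected vs. real radius
    # mean distance from the expected plane (plane normal parallel to the axis)
    plane_ok = np.abs(points[:, axis] - center[axis]).mean() < 0.1
    return std_ok and radius_ok and plane_ok
```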

Possible inputs

  • action: string, either “line” or “circle”
  • axis: list of ints. For “circle” the length is always 1. For “line” the length can be 1-3. The individual numbers in the list are the axes along which the robot should move. For example, [0, 1] means that the robot should move in the x- and y-axes.
  • r: list of floats, the same length as axis. Numbers are in metres. For “circle” it is the radius of the circle. For “line” it is the length of the line in the given axis.
  • Examples:
    • action=“circle”, axis=[0], r=[0.01] - the end-effector should move in a circle around the x-axis with a radius of 0.01m
    • action=“line”, axis=[1], r=[-0.05] - the end-effector should follow a line in the y-axis with a distance of -0.05m
    • action=“line”, axis=[0, 1], r=[0.05, -0.05] - the end-effector should follow a line in the x-axis for 0.05m and in the y-axis for -0.05m
      • it should be simultaneous movement in both axes, i.e., it will be one line
  • The movements can be anywhere in space. E.g., when you are asked to do a circle around the X-axis, it can be anywhere in space, around any axis that is parallel to the world X-axis
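The line inputs combine into a single straight-line displacement. A minimal sketch of turning axis and r into a target position (the function name is illustrative; reaching the target smoothly is the actual task):

```python
import numpy as np

def line_target(start, axis, r):
    """Target position for the 'line' action: displace the start position
    along each listed world axis by the corresponding signed distance (m)."""
    target = np.asarray(start, dtype=float).copy()
    for a, d in zip(axis, r):
        target[a] += d
    return target
```

For example, axis=[0, 1] with r=[0.05, -0.05] gives a single diagonal line in the xy-plane, matching the last example above.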

Lab 4

Resolved-Rate Motion Control (RRMC)

The goal of this lab is to use the skin of the robot to detect collisions and utilize RRMC to move away from the collision. Please use the attached template to complete the task. There is no automatic evaluation, but if you submit a correct solution to BRUTE, you will get bonus points from the lab tutor.

The template contains a few lines of code, but you do not have to use them; they are there just for your convenience and to guide you towards the solution.

The task is to detect the biggest cluster of activated skin points and then move the body part that contains the activated skin part away from the collision (away means against the normal of the contact) using RRMC. See the video below.
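The core of RRMC is mapping a desired Cartesian velocity to joint velocities through the Jacobian of the touched body part. Below is a generic numpy sketch of one control step, with both variants the lab asks you to compare; obtaining the Jacobian and the contact normal from the simulator is left out, and `rrmc_step` is an illustrative name:

```python
import numpy as np

def rrmc_step(J, v, use_transpose=False, alpha=1.0):
    """One resolved-rate step: joint velocities dq realising the desired twist v.

    J : (6, n) Jacobian of the body part with the activated skin
    v : (6,) desired twist; for collision avoidance the linear part points
        against the contact normal, the angular part can stay zero
    """
    if use_transpose:
        return alpha * (J.T @ v)       # transpose variant: no inversion, only approximate
    return np.linalg.pinv(J) @ v       # pseudoinverse variant: least-squares exact
```

Integrating dq over one control period and commanding the result in joint space moves the body part away from the contact.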

Try to play with both the Jacobian inverse and the Jacobian transpose. Together with the code, you can also submit a text file that contains your reasoning about the following questions:

  1. Which approach worked better? Why do you think it is like that?
  2. Do you have any evidence (a slide from the lectures, a scientific paper, etc.) to support your answer to the previous question?

courses/hro/tutorials/1icubtraining.txt · Last modified: 2025/03/16 16:15 by rustlluk