In this context, we would like to implement immersive whole-body tele-operation of the iCub, so that a human operator can control the humanoid robot in real time and provide demonstrations of the desired robot behavior during these tasks. The human operator will be equipped with an XSens MVN suit (a wearable motion-capture system), a virtual-reality headset with joysticks (e.g., HTC Vive), and sensorized gloves and shoes. Our team has proposed a preliminary theoretical model for offline synthesis of the robot's movement from XSens data, and we already have a working online kinematics-retargeting implementation. The goal of the engineer will be to retarget the movements and actions of the human operator onto the iCub robot in real time, while ensuring the safety and stability of the robot's motion.
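To give a flavor of the task, the joint-space core of kinematics retargeting can be sketched as below. This is a much simplified illustration, not our actual implementation: the joint limits and rate bound are invented placeholder values, not real iCub parameters.

```python
import numpy as np

# Illustrative joint limits for three hypothetical joints (radians);
# NOT the actual iCub values.
Q_MIN = np.array([-1.6, 0.0, -0.6])
Q_MAX = np.array([0.1, 2.8, 1.7])
MAX_STEP = 0.05  # max joint change per control cycle (rad), a simple safety bound

def retarget(q_human, q_robot_prev):
    """Map human joint angles onto the robot: clamp the target to the
    robot's joint limits, then rate-limit the step toward that target
    so the commanded motion stays safe and smooth."""
    q_target = np.clip(q_human, Q_MIN, Q_MAX)
    step = np.clip(q_target - q_robot_prev, -MAX_STEP, MAX_STEP)
    return q_robot_prev + step
```

In practice, real-time retargeting must also account for the robot's dynamics and balance, which is where the safety and stability challenges of this position lie.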
MSc or PhD in software engineering or robotics, and/or experience in collaborative software development
Excellent teamwork skills and experience in collaborative software development (e.g., Agile methodologies)
Excellent knowledge of robotics
Expertise in software development and software architecture
Previous experience with collaborative projects, real robots, or tele-operation is a plus
The position is for one year; extensions are possible. Salary depends on experience.