Daniel Fernández Villanueva
Escuela Politécnica de Ingeniería de Gijón
Universidad de Oviedo
33204 Gijón, Asturias
This work presents a human-robot interface (HRI) for controlling both the arm and the base of a youBot robot. It is designed as a tool for robot teaching through a mix of demonstration and imitation. The youBot mobile base is tele-operated (demonstration). In contrast, imitation learning requires a specific recording of the teacher's actions and a subsequent mapping between the recorded execution and the capabilities of the robot learner (the embodiment mapping).
In our HRI system the embodiment mapping consists of direct anthropomorphism, using a network of inertial sensors located directly on the human teacher's arm. An independent inertial sensor is used as a joystick for direct motion of the youBot wheeled platform. The system allows us to record human motions for robot teaching purposes, but also to move the robot in real time with the arm acting as an anthropomorphic interface.
A human operator wears three separate inertial measurement units on the arm. The system records the state of the arm and translates it to the youBot in order to mimic the configuration of the human limb. The platform is controlled with a fourth inertial measurement unit, which works as a joystick.
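One way to realize this arm mimicry can be sketched as follows: treat the orientation reported by each instrumented arm segment as an absolute pitch angle and command each youBot arm joint with the difference between consecutive segments, saturated at the joint limits. The joint limits, segment convention and function names below are illustrative assumptions, not values taken from the youBot specification.

```python
# Hypothetical limits (radians) for the first three youBot arm joints;
# the real values come from the youBot datasheet and are assumptions
# here, as is the whole mapping convention.
JOINT_LIMITS = [(-2.95, 2.95), (-1.13, 1.57), (-2.55, 2.55)]

def clamp(value, lo, hi):
    """Saturate a commanded angle at the joint limits."""
    return max(lo, min(hi, value))

def arm_joint_commands(segment_pitches):
    """Map absolute pitch angles of the shoulder, upper-arm and
    forearm sensors to relative joint commands: each joint receives
    the angle between its own segment and the previous one."""
    commands = []
    previous = 0.0
    for pitch, (lo, hi) in zip(segment_pitches, JOINT_LIMITS):
        commands.append(clamp(pitch - previous, lo, hi))
        previous = pitch
    return commands
```

With this convention, raising the whole arm rigidly moves mainly the first joint, while bending only the elbow moves only the corresponding joint, which is the qualitative behaviour an anthropomorphic interface needs.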
The inertial measurement units used for this system are four Xsens MTx units attached to an Xbus Master. The MTx is a small and accurate 3-DOF inertial orientation tracker that provides drift-free 3D orientation as well as kinematic data: 3D acceleration, 3D rate of turn (rate gyro) and 3D Earth magnetic field. For this system, the operator wears three of these sensors on the arm, while the fourth one serves as a joystick to operate the platform.
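A plausible joystick mapping for the fourth sensor, sketched below under assumed conventions, converts the roll and pitch of the hand-held MTx into planar velocities for the omnidirectional base, with a dead zone so the platform stays still while the hand is roughly level. The tilt-to-speed convention, dead-zone width and speed cap are illustrative assumptions, not parameters from the described system.

```python
import math

def tilt_to_twist(roll, pitch, dead_zone=0.1, max_speed=0.3):
    """Convert hand tilt (radians) into planar base velocities (m/s).
    Tilts inside the dead zone are ignored so small hand tremors do
    not move the platform; beyond it, speed grows linearly and
    saturates at max_speed once the tilt reaches 45 degrees."""
    def axis(angle):
        if abs(angle) < dead_zone:
            return 0.0
        sign = 1.0 if angle > 0.0 else -1.0
        scale = (abs(angle) - dead_zone) / (math.pi / 4.0)
        return sign * min(max_speed, scale * max_speed)
    # Assumed convention: pitching the hand forward drives the base
    # forward (vx); rolling it sideways makes the omnidirectional
    # base strafe (vy).
    return axis(pitch), axis(roll)
```

In a ROS implementation, the resulting pair would typically be published as the linear components of a velocity command for the base at the sensor's sampling rate.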
All the software created for this project was implemented using the ROS (Robot Operating System) framework.