We present a method for odometric localization of humanoid robots using standard sensing equipment, i.e., a monocular camera, an inertial measurement unit (IMU), joint encoders and foot pressure sensors. Data from all these sources are integrated using the prediction-correction paradigm of the Extended Kalman Filter. The position and orientation of the torso, taken as the representative body of the robot, are predicted through kinematic computations based on joint encoder readings; an asynchronous mechanism triggered by the pressure sensors updates the placement of the support foot. The correction step of the filter uses the torso orientation, provided by the IMU, and the head pose, reconstructed by a VSLAM algorithm, as measurements. The proposed method is validated on the humanoid NAO through two sets of experiments: open-loop motions aimed at assessing the localization accuracy against a ground truth, and closed-loop motions where the humanoid pose estimates are used in real time as feedback signals for trajectory control.
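The prediction-correction structure described above can be sketched with a generic Extended Kalman Filter skeleton. This is a minimal illustration, not the paper's implementation: the state, models, and function names (`ekf_predict`, `ekf_correct`) are hypothetical, and the kinematic prediction and IMU/VSLAM measurement models are stood in for by placeholder functions `f` and `h` with Jacobians `F` and `H`.

```python
import numpy as np

def ekf_predict(x, P, f, F, Q):
    """Prediction step: propagate the state (e.g., torso pose) through the
    kinematic model f built from joint encoder readings; F is its Jacobian,
    Q the process noise covariance."""
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_correct(x, P, z, h, H, R):
    """Correction step: fuse a measurement z (e.g., torso orientation from
    the IMU, or head pose from VSLAM) via measurement model h with Jacobian
    H and noise covariance R."""
    y = z - h(x)                        # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```

As a toy 1-D example, a linear odometric step followed by a direct pose measurement shows the covariance shrinking after the correction, which is the qualitative effect the IMU and VSLAM updates have on the drifting kinematic estimate.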
Publication details
2016, AUTONOMOUS ROBOTS, pages 867-879 (volume: 40)
Humanoid odometric localization integrating kinematic, inertial and visual information (01a Journal article)
Oriolo Giuseppe, Paolillo Antonio, Rosa Lorenzo, Vendittelli Marilena
Research group: Robotics