Dear Okasa-san,
As for mobile robots, every frame above `base_link` actively relies on the localization / SLAM node, so in these particular cases the result you obtain will depend on how that node behaves.
In my case, I use `robot_pose_ekf` (on HRP2) to combine both the integration of the command *and* the IMU data. The `odom` frame is published by this node, as on the PR2.
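To make the idea concrete, here is a toy sketch of that kind of fusion: integrate the commanded velocity for position while trusting the IMU for heading. This is *not* the actual `robot_pose_ekf` implementation (which is a full EKF); the function name and the `alpha` blending weight are hypothetical.

```python
import math

def fuse_step(pose, v_cmd, imu_yaw, dt, alpha=0.9):
    """One fusion step: integrate the commanded forward velocity,
    but blend the dead-reckoned heading with the IMU yaw reading.
    alpha weights the IMU (hypothetical tuning parameter)."""
    x, y, yaw = pose
    yaw = (1 - alpha) * yaw + alpha * imu_yaw  # IMU corrects the heading
    x += v_cmd * math.cos(yaw) * dt
    y += v_cmd * math.sin(yaw) * dt
    return (x, y, yaw)

# Drive straight along +x for 1 s with an ideal IMU reporting yaw = 0.
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = fuse_step(pose, v_cmd=1.0, imu_yaw=0.0, dt=0.1)
print(pose)  # x ≈ 1.0, y ≈ 0.0, yaw = 0.0
```

The real node does this jointly in 6D with covariances, but the principle is the same: the command integration provides the pose prediction and the IMU corrects it.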
For `base_footprint`, I use the point at the center of the robot's soles. If you walk on a non-flat floor, you of course have to project this point to the right height depending on your perception. This is not an issue on a flat floor, though.
This definition works fine for any number of legs.
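A minimal sketch of that definition, assuming you already have the sole-center positions in some common frame (the function name and argument layout are mine, not from any ROS API):

```python
def base_footprint(sole_centers, floor_height=0.0):
    """Project the mean of the sole-center positions onto the floor.
    Works for any number of legs; on a flat floor, floor_height = 0."""
    n = len(sole_centers)
    cx = sum(p[0] for p in sole_centers) / n
    cy = sum(p[1] for p in sole_centers) / n
    return (cx, cy, floor_height)

# Two feet of a biped, sole centers 1 cm above the floor (sole thickness):
print(base_footprint([(0.0, 0.1, 0.01), (0.0, -0.1, 0.01)]))
# → (0.0, 0.0, 0.0)
```

On uneven terrain you would replace `floor_height=0.0` with the perceived ground height under that point.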
If the robot is falling, then of course the "footprint" is not really a footprint anymore, but it is still well defined (mathematically speaking). As for the `odom` frame, by tuning the covariance matrices appropriately you can integrate only the IMU data during a fall to keep an estimate of your position.
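The covariance trick boils down to this: in a (1-D, scalar) Kalman-style update, inflating one source's variance drives its weight toward zero, so the fused estimate follows the other source. A toy illustration (the `fuse` helper is hypothetical, not the node's API):

```python
def fuse(z_odom, var_odom, z_imu, var_imu):
    """Variance-weighted fusion of two scalar estimates
    (the 1-D Kalman update): the lower-variance source dominates."""
    w = var_imu / (var_odom + var_imu)  # weight given to the odometry term
    return w * z_odom + (1 - w) * z_imu

# Normal walking: both sources equally trusted -> estimate sits midway.
print(fuse(1.0, 0.01, 1.2, 0.01))   # ≈ 1.1
# During a fall: inflate the odometry covariance -> only the IMU counts.
print(fuse(1.0, 1e6, 1.2, 0.01))    # ≈ 1.2
```

In practice you do the same thing per-axis on the 6x6 covariance of the odometry message before feeding it to the filter.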
P.S. FYI, the input of the `robot_pose_ekf` node is an odometry message sent by the robot controller (hrpsys in my case).