Channel: ROS Answers: Open Source Q&A Forum - RSS feed

Answer by Thomas for: Hi, I have a question on `odom` and `base_footprint` in REP 120, especially when the robot is falling down or walking on four legs. I assume a humanoid robot does not know its current waist position, but can estimate its orientation from IMU sensors. Does `odom` correspond to the sensor-based orientation and the world-reference-based position of the waist? Does `base_footprint` reflect the sensor-based orientation and a model-based calculation of the waist height? If so, the questions are: how do you estimate the position of `odom` from the walking controllers, and does that include any sensor data? Does `base_footprint` not reflect any sensor data? What happens if the robot falls down or is in four-legged walking mode?

Dear Okasa-san,

As for mobile robots, every frame on top of `base_link` actively relies on the localization / SLAM node, so in these particular cases the result you obtain will depend on how that node behaves. In my case, I use `robot_pose_ekf` (on HRP2) so that I combine both the integration of the command *and* the IMU data. The `odom` frame is published by this node, as on the PR2.

For `base_footprint`, I use the point defined as the midpoint of the robot's sole centers. If you walk on a non-flat floor, you of course have to project this point to the right height depending on your perception. This is not an issue on a flat floor, though. This definition works fine for any number of legs.

If the robot is falling, the "footprint" is of course not really a footprint anymore, but it is still defined (mathematically speaking). As for the `odom` frame, by changing the covariance matrices appropriately, you can integrate only the IMU data while falling to get an estimate of your position.

P.S. FYI, the `robot_pose_ekf` node input is an odometry message sent by the robot controller (hrpsys in my case).
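The `base_footprint` definition described above (midpoint of the sole centers, projected onto the floor) can be sketched as a small helper. This is only an illustrative sketch, not the actual HRP2 code: the function name and tuple-based interface are made up here, and a real node would broadcast the resulting `odom` → `base_footprint` transform over tf (e.g. with a `tf2_ros.TransformBroadcaster`).

```python
def base_footprint_from_soles(left_sole, right_sole, floor_height=0.0):
    """Hypothetical helper: given the (x, y, z) positions of the two
    sole centers expressed in the odom frame, return the base_footprint
    origin as their midpoint, projected onto the floor.

    On a flat floor, floor_height is simply 0.0; on uneven terrain it
    would come from perception, as noted in the answer above.
    """
    x = (left_sole[0] + right_sole[0]) / 2.0
    y = (left_sole[1] + right_sole[1]) / 2.0
    return (x, y, floor_height)

# Example: feet 20 cm apart, soles 2 cm above the nominal floor plane.
print(base_footprint_from_soles((0.3, 0.1, 0.02), (0.3, -0.1, 0.02)))
# → (0.3, 0.0, 0.0)
```

Because the helper only averages whatever sole positions it is given, the same computation extends to any number of legs (average all sole centers), matching the point made above that the definition is not specific to bipeds.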

