
RViz and gazebo inconsistency: robot moves through dock #75

Open
alsora opened this issue Oct 7, 2021 · 12 comments
Labels
bug Something isn't working

Comments

@alsora
Contributor

alsora commented Oct 7, 2021

Describe the bug
Commanding the robot to move against the dock shows the robot correctly blocked in gazebo, but in rviz the robot moves through the dock.

To Reproduce
Start empty world (robot will start on the dock)

ros2 launch irobot_create_gazebo create3.launch.py

Command forward velocity

ros2 topic pub -r 20 /cmd_vel geometry_msgs/msg/Twist "{linear: {x: 0.2, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}"

Expected behavior
The robot should "push" against the dock, without making progress forward.

Actual behavior
The expected behavior happens in gazebo, but in rviz the robot moves through the dock as if it were immaterial.

@alsora alsora added the bug Something isn't working label Oct 7, 2021
@apojomovsky
Collaborator

apojomovsky commented Oct 12, 2021

This sounds expected as long as we rely on pure odometry for RViz, because of wheel slipping.
We should likely publish/consume ground-truth information to prevent this from happening.
I'm curious what approach the real robot follows for RViz?

@alsora
Contributor Author

alsora commented Oct 12, 2021

If the robot is pushing against an obstacle (let's treat it as a wall, i.e. it can't be moved), the wheels indeed will be slipping or even stuck.
However, the robot will use the mouse to compute a dead-reckoning estimate.

In this case, the mouse differential motion would be null, so the robot would appear as not moving.
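The difference can be sketched with a toy integration loop: wheel-encoder dead reckoning keeps advancing while the wheels spin against the dock, whereas mouse-based dead reckoning sees zero ground motion and stays put. This is a minimal illustration, not the robot's actual estimator; the wheel speeds, wheel base, and update rate are made-up values.

```python
import math

def integrate_wheel_odometry(pose, v_left, v_right, wheel_base, dt):
    """Dead reckoning from wheel speeds (differential-drive model)."""
    x, y, theta = pose
    v = 0.5 * (v_left + v_right)            # forward speed
    w = (v_right - v_left) / wheel_base     # yaw rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return (x, y, theta)

def integrate_mouse_odometry(pose, dx_body, dy_body, dtheta):
    """Dead reckoning from optical-mouse deltas measured in the body frame."""
    x, y, theta = pose
    x += dx_body * math.cos(theta) - dy_body * math.sin(theta)
    y += dx_body * math.sin(theta) + dy_body * math.cos(theta)
    theta += dtheta
    return (x, y, theta)

# Robot pushes against the dock: wheels spin at 0.2 m/s but the body is stuck.
wheel_pose = (0.0, 0.0, 0.0)
mouse_pose = (0.0, 0.0, 0.0)
for _ in range(100):  # 5 s at 20 Hz (illustrative)
    wheel_pose = integrate_wheel_odometry(wheel_pose, 0.2, 0.2, 0.233, 0.05)
    mouse_pose = integrate_mouse_odometry(mouse_pose, 0.0, 0.0, 0.0)  # no ground motion

print(wheel_pose)  # x has advanced ~1 m: wheel odometry sails through the dock
print(mouse_pose)  # still at the origin: the mouse sees no motion
```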

@alsora
Contributor Author

alsora commented Oct 12, 2021

For the sake of the simulated robot, I would say that dead reckoning pose and ground truth pose should coincide.

@eborghi10
Collaborator

As far as I remember, the dead reckoning pose is calculated automatically by ros2_control. We could instead publish our own odom TF from ground truth data.

@apojomovsky
Collaborator

apojomovsky commented Oct 12, 2021

That's certainly interesting, though I kinda agree with @alsora we could simplify things for the simulator and rely on simulated ground truth.

@alsora
Contributor Author

alsora commented Oct 12, 2021

Besides this particular problem, do you see any issue for users who want to test a SLAM system in the simulator?

By using the ground truth they would get a perfect odometry that would also never drift. This would be different from the odometry obtained from perfect sensors (i.e. no noise), which would still drift due to sampling and approximations in the integration procedure.

Maybe the best long-term solution would be to have the odom TF computed from mouse data rather than from the wheels, under the assumption that the mouse always tracks correctly (i.e. I assume the mouse delta is obtained from ground truth).

@eborghi10
Collaborator

Isn't it better to use robot_localization or another Kalman filter to fuse dead reckoning, IMU, and mouse data? I don't know exactly how the real robot does this, but it would solve the issue without adding any problem for SLAM testing.
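For reference, a fusion along these lines would typically be configured through robot_localization's `ekf_node`. The fragment below is a hypothetical sketch, not something this repo provides: the topic names (especially `/mouse_odom`) and the choice of fused state variables are assumptions. Each `_config` matrix selects, row by row, position, orientation, linear velocity, angular velocity, and linear acceleration components.

```yaml
# Hypothetical robot_localization configuration fusing wheel odometry,
# IMU yaw/yaw-rate, and a mouse-based odometry topic (assumed names).
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    odom0: /odom                      # wheel-encoder odometry
    odom0_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,   # vx, vy
                   false, false, true,    # yaw rate
                   false, false, false]

    imu0: /imu                        # simulated IMU
    imu0_config: [false, false, false,
                  false, false, true,     # yaw
                  false, false, false,
                  false, false, true,     # yaw rate
                  false, false, false]

    odom1: /mouse_odom                # hypothetical mouse-delta odometry
    odom1_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,   # vx, vy from mouse deltas
                   false, false, false,
                   false, false, false]
```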

@alsora
Contributor Author

alsora commented Oct 12, 2021

At the moment the robot is not using a Kalman filter.

@apojomovsky
Collaborator

apojomovsky commented Oct 12, 2021

Returning to the simulated robot discussion, my understanding is that we would like to have a frame of reference in RViz that behaves better than pure odometry. I don't think there's a better option than the absolute gazebo pose published as ground-truth data for this particular use case (the RViz reference frame).
Estimators are great, but I suspect it should be up to the robot user (developers, researchers, robot tinkerers) to implement such algorithms. Even for those applications, having ground-truth data would come in handy for benchmarking, etc.

@eborghi10
Collaborator

Sounds good to me if we modify the ground truth plugin to publish the odom <-> base_link TF. But it should be revertible with a flag.

@shuhaowu

shuhaowu commented Jan 5, 2022

Should rviz be showing the "perfect" position of the robot? For a real robot it definitely doesn't, as some sort of estimation is always present (SLAM, odometry, whatever).

It seems to me there are a few things at play here; let me know if my assessment is accurate, as I'm just getting started working on this. Sorry if I get anything wrong:

  1. The Gazebo sim_ground_truth_pose emitted by p3d, which should coincide perfectly with the pose of the robot in Gazebo. This represents the "actual" pose of the robot as if it were in the physical world. On a real robot, this would be provided by a system like Optitrack, so what's emitted seems correct to me.
    • This topic is useful if the robot user wants to evaluate the accuracy of SLAM/localization algorithms in a downstream project. I don't think this should be eliminated.
    • A minor note about this is that the current sim_ground_truth_pose message appears to represent the transform world -> base_link, but nothing is broadcasting this transform, as p3d can't broadcast TFs in ros2. This is probably a good thing, to avoid conflicting with the below...
  2. Right now it seems like /odom and the odom -> base_link transform are calculated by ros2_control via its Gazebo plugin. On a real robot, this would come from the wheel encoders, perhaps the mouse, or a fusion of both. While I'm not quite sure how ros2_control and Gazebo calculate this, I assume it's based on the simulated wheel turns or some other simulation mechanism. This feels like it correctly "simulates" the real world.
    • Downstream projects might want to make this odometry value noisier, to evaluate the performance of various algorithms in the presence of noisy odometry. So it would be nice to be able to rename this topic when launching create3_nodes.launch, so downstream simulations can implement their own noise filter on the odometry information.
    • The above comments mentioned that maybe we should modify P3D to publish odom -> base_link. This would imply that the odometry of the robot would match ground truth exactly and never drift.
  3. Right now, there's no world/map -> odom transform published by this repo. This is probably the correct thing to do, as this transform is usually published by the localization/SLAM algorithm, which probably shouldn't be part of this repo.

Most of the behaviour here actually seems OK to me, so is there a problem here? I might be misunderstanding something, though. One minor problem is that the world frame is technically undefined due to the lack of a TF broadcast. This makes it difficult to plot sim_ground_truth against the actual odometry value in rviz to compare them, as there's no world frame (or maybe I'm doing it wrong somehow, which is always possible). EDIT: my mistake. The odometry coming from /sim_ground_truth_pose with the aws small house (which I'm using) starts at a non-zero X/Y/Z position, while the position for odom obviously starts at 0, 0, 0. I just didn't zoom out far enough to see the arrow for the ground truth... 😅
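The frame offset described in the EDIT above is just transform composition: the ground-truth pose lives in the world frame, while odometry starts from the origin of odom. If world -> odom is latched at spawn time, the two can be compared in a shared frame via odom -> base_link = (world -> odom)⁻¹ ∘ (world -> base_link). A minimal 2D sketch (the spawn pose values are made up, not taken from the aws small house world):

```python
import math

def compose(a, b):
    """Compose two 2D poses (x, y, yaw): apply b in a's frame."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + bx * math.cos(ath) - by * math.sin(ath),
            ay + bx * math.sin(ath) + by * math.cos(ath),
            ath + bth)

def invert(p):
    """Inverse of a 2D pose, so that compose(invert(p), p) is the identity."""
    x, y, th = p
    c, s = math.cos(th), math.sin(th)
    return (-x * c - y * s, x * s - y * c, -th)

# The robot spawns at a non-zero ground-truth pose in the world frame,
# while its odometry starts at the origin of the odom frame.
world_to_odom = (3.5, -1.2, 0.0)   # latched at spawn time (made-up values)
world_to_base = (4.5, -1.2, 0.0)   # current ground-truth pose, e.g. from p3d

# odom -> base_link, as ground truth would express it:
odom_to_base = compose(invert(world_to_odom), world_to_base)
print(odom_to_base)  # the robot has moved ~1 m forward in the odom frame
```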

@shuhaowu

shuhaowu commented Jan 5, 2022

After playing around with it a bit, I do see that if I hit an object, the rviz position continues to increase while the ground-truth position doesn't. This also kind of makes sense? The wheels would be slipping in this case, so the encoders would still register the ticks. Solving this seems like a downstream project's problem? Since /odom is expected to behave like that when the wheels slip, a downstream project should "fix" this by fusing or using alternative odometry sources (like the mouse sensor).
