- Robot club TB3 [Track and push ball to goal]
- Tracks a ball using machine vision
- The robot is tasked with moving the ball into the goal
- Pretty cool, reminiscent of Robo-Soccer
- Turtlebot 3 Maze Explorer
- Seems to use a bug type algorithm
- Smooth navigation of really confined rooms
- No code explanation, though
- Multi Robot Exploration
- Really neat for swarm robotics
- Multiple robots mapping an environment
- Multi-robot collaboration (broadcast and request help)
- Kadyn Martinez
- Contributions
- Wataru Oshima
- Contributions
Visual Odometry
Implement it and combine it with indoor navigation.
Implement it ourselves or extend an existing package
Visual Odometry Demo from package
Visual Odometry Survey https://github.com/klintan/vo-survey
Learning VI-ORB repo https://github.com/jingpang/LearnVIORB
Open VINS (Visual-Inertial Navigation) https://github.com/rpng/open_vins
Simple Visual Monocular Odometry https://github.com/avisingh599/mono-vo
ROS Simple Visual Monocular Odometry (based on the previous) https://github.com/atomoclast/ros_mono_vo
Run the following command to test out optical flow and trajectory estimation.
The calibration file is the one generated by the ROS image pipeline; for more information see CALIB/README.md.
Depending on what is entered for the second argument:
- If the value is an integer, it is treated as a device index and that camera is opened for live optical flow.
- If the value is a filepath, the file is opened and optical flow is run on it.
A minimal sketch of this pipeline is included after the command below.
python prototype/optiflow_w_essential.py [CALIBRATION.YAML] [DEVICE INDEX or FILEPATH]
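
For orientation, here is a minimal sketch of the kind of pipeline this command exercises: load a ROS image-pipeline calibration YAML, track features with Lucas-Kanade optical flow, and chain the relative poses recovered from the essential matrix. This is not the actual prototype/optiflow_w_essential.py; the calibration keys (camera_matrix, distortion_coefficients) and the pose-accumulation convention are assumptions to verify against the real script.

```python
import sys

import cv2
import numpy as np
import yaml


def load_calibration(path):
    # Keys assumed from the ROS image_pipeline calibration YAML format.
    with open(path, "r") as f:
        calib = yaml.safe_load(f)
    K = np.array(calib["camera_matrix"]["data"], dtype=np.float64).reshape(3, 3)
    dist = np.array(calib["distortion_coefficients"]["data"], dtype=np.float64)
    return K, dist


def open_source(arg):
    # Integer -> live camera device index; anything else -> video file path.
    return cv2.VideoCapture(int(arg)) if arg.isdigit() else cv2.VideoCapture(arg)


def main():
    K, dist = load_calibration(sys.argv[1])
    cap = open_source(sys.argv[2])

    ok, prev = cap.read()
    if not ok:
        sys.exit("Could not read from source: %s" % sys.argv[2])
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    pose_R, pose_t = np.eye(3), np.zeros((3, 1))  # accumulated orientation + unscaled position

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Sparse Lucas-Kanade optical flow between consecutive frames.
        p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500, qualityLevel=0.01, minDistance=8)
        if p0 is not None:
            p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None)
            good0 = p0[status.ravel() == 1]
            good1 = p1[status.ravel() == 1]

            if len(good0) >= 8:
                # Undistort tracked points, then recover relative motion from the essential matrix.
                good0 = cv2.undistortPoints(good0, K, dist, P=K)
                good1 = cv2.undistortPoints(good1, K, dist, P=K)
                E, _ = cv2.findEssentialMat(good1, good0, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
                if E is not None and E.shape == (3, 3):
                    _, R, t, _ = cv2.recoverPose(E, good1, good0, K)
                    # Chain the relative motion; ordering/sign conventions differ between implementations.
                    pose_t = pose_t + pose_R @ t
                    pose_R = pose_R @ R

        print("estimated position (unscaled):", pose_t.ravel())
        prev_gray = gray

    cap.release()


if __name__ == "__main__":
    main()
```

Monocular VO only recovers the direction of translation, not its scale, so the printed positions are unscaled.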
Run the following, or your camera's native ROS package, to get a video stream.
ros2 run usb_cam usb_cam_node_exe --ros-args --params-file camera_params.yaml
NOTE: camera_params.yaml should be copied from
/opt/ros/humble/share/usb_cam/config/params.yaml
and modified accordingly (camera dimensions and device path are the most common changes)
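
To check that frames are coming through (and to hand them to the VO code), a minimal rclpy subscriber like the sketch below can be used. The /image_raw topic name is assumed from the usb_cam defaults and should be confirmed with `ros2 topic list`; cv_bridge must be installed. The node and file names are hypothetical.

```python
import cv2
import rclpy
from cv_bridge import CvBridge
from rclpy.node import Node
from sensor_msgs.msg import Image


class CameraViewer(Node):
    """Subscribes to the camera stream published by usb_cam and displays each frame."""

    def __init__(self):
        super().__init__("camera_viewer")
        self.bridge = CvBridge()
        # Topic name assumed to be the usb_cam default; check with `ros2 topic list`.
        self.subscription = self.create_subscription(Image, "/image_raw", self.on_image, 10)

    def on_image(self, msg):
        # Convert the ROS Image message to an OpenCV BGR array for the VO pipeline.
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        cv2.imshow("usb_cam stream", frame)
        cv2.waitKey(1)


def main():
    rclpy.init()
    node = CameraViewer()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
```

Run it in a second terminal while the usb_cam node is running, e.g. python3 camera_viewer.py (hypothetical filename).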