Abstract: Autonomous rovers are mobile robots designed to operate in varied conditions without human intervention. Equipped with sensors, actuators, and control systems, they can navigate and interact with their environment, and they are widely employed in fields such as exploration, search and rescue, environmental monitoring, and surveillance. This research presents the design and development of an autonomous rover that integrates object tracking, videography, and collision avoidance in parallel. The proposed system is intended for real-world scenarios in which the rover navigates safely while capturing high-quality visual data of a targeted moving object, whether living or non-living. The rover's control system combines classical controllers, such as a PID controller, with computer vision techniques, enabling it to detect and track objects in its environment, avoid obstacles and collisions, and record video footage concurrently, all with minimal hardware requirements. The system's performance will be evaluated through extensive testing that assesses the rover's ability to navigate complex environments and capture high-quality visual data.
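The abstract's pairing of a classical PID controller with computer vision can be illustrated with a minimal sketch: the tracked object's horizontal offset from the camera frame's centre is fed to a PID loop whose output becomes a differential-drive steering command. All names, gains, and frame dimensions below are illustrative assumptions, not values taken from this work.

```python
class PID:
    """Discrete PID controller with a fixed time step."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        # Accumulate the integral term and approximate the derivative
        # by finite differences over one control-loop period.
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def steering_from_bbox(bbox_center_x, frame_width, pid):
    """Map the tracked object's pixel offset from the frame centre to a
    steering value in [-1, 1] (negative = turn left, positive = turn right)."""
    # Normalise the offset so the error is -1 at the left edge, +1 at the right.
    error = (bbox_center_x - frame_width / 2) / (frame_width / 2)
    return max(-1.0, min(1.0, pid.update(error)))


if __name__ == "__main__":
    pid = PID(kp=0.8, ki=0.05, kd=0.1, dt=1 / 30)  # assumed 30 fps camera loop
    # Simulated object drifting right of centre in a 640-px-wide frame:
    for cx in (320, 360, 400, 380, 340, 320):
        print(round(steering_from_bbox(cx, 640, pid), 3))
```

In a real deployment the bounding-box centre would come from a vision tracker running on the Jetson Nano, and the steering output would be sent to the Arduino-driven motor controller; this sketch only shows the control-law step in isolation.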
The TrackerBot Prototype (STL files available here)
(a) Arduino Uno (2x), (b) Nvidia Jetson Nano, (c) L298 Motor Driver, (d) LM393 IR Speed Sensor (2x), (e) 128x32 OLED I2C Display, (f) HC-SR04 Ultrasonic Distance Sensor (2x), (g) HP W100 480p/30 fps Webcam, (h) 802.11n Wi-Fi Adapter, (i) 2-Wheel Car Robot Chassis with Motors, (j) 0.5 m Power-Sharing USB-A to USB-B Cable for Arduino Uno (2x), (k) AA Battery (8x), (l) 5V ~3A Raspberry Pi 3 Power Adapter (micro-USB charging)
(a) Detailed diagram of the basic navigation module (left); (b) overall schematic architecture of the navigation module (right).