This project is a Gesture Recognition application that aims to simulate a completely touchless device: only a camera and a hand are required to control it.
- First, the hand must be detected and its position and landmarks extracted. This is handled by a Hand Tracking Module that uses MediaPipe to detect the hand's landmarks (illustrated in the figure below) together with their coordinates. The module is reusable in any Computer Vision task that requires hand tracking; a sketch of how it might look is given after this list.
- The second functionality is to adjust the device's volume (increase, decrease, mute, and unmute the sound) with the help of the PyCaw library, using the device's camera as a distance-based Volume Controller (see the second sketch below).
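A minimal sketch of such a Hand Tracking Module, assuming the standard `mediapipe` and `opencv-python` packages; the class and method names here are illustrative, not necessarily the project's exact API:

```python
import cv2
import mediapipe as mp


class HandDetector:
    """Illustrative wrapper around MediaPipe Hands for landmark detection."""

    def __init__(self, max_hands=1, detection_conf=0.7, tracking_conf=0.5):
        self.hands = mp.solutions.hands.Hands(
            max_num_hands=max_hands,
            min_detection_confidence=detection_conf,
            min_tracking_confidence=tracking_conf,
        )
        self.drawer = mp.solutions.drawing_utils

    def find_landmarks(self, frame):
        """Return (id, x, y) pixel coordinates for the first detected hand."""
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        results = self.hands.process(rgb)
        landmarks = []
        if results.multi_hand_landmarks:
            hand = results.multi_hand_landmarks[0]
            h, w, _ = frame.shape
            for idx, lm in enumerate(hand.landmark):
                landmarks.append((idx, int(lm.x * w), int(lm.y * h)))
            # Draw the 21 landmarks and their connections on the frame.
            self.drawer.draw_landmarks(
                frame, hand, mp.solutions.hands.HAND_CONNECTIONS
            )
        return landmarks
```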
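A rough sketch of the distance-based volume control, assuming the `HandDetector` sketch above and the `pycaw`/`comtypes` packages (Windows only); the pixel-distance range used for interpolation is an assumed value, not the project's tuned setting. MediaPipe numbers the thumb tip as landmark 4 and the index finger tip as landmark 8, and the distance between them drives the system volume:

```python
import math
from ctypes import POINTER, cast

import cv2
import numpy as np
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume

# Get the default speaker endpoint and its volume range via PyCaw.
devices = AudioUtilities.GetSpeakers()
interface = devices.Activate(IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
volume = cast(interface, POINTER(IAudioEndpointVolume))
min_vol, max_vol = volume.GetVolumeRange()[:2]

detector = HandDetector()
cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    lms = detector.find_landmarks(frame)
    if lms:
        # Distance between thumb tip (4) and index tip (8).
        x1, y1 = lms[4][1], lms[4][2]
        x2, y2 = lms[8][1], lms[8][2]
        length = math.hypot(x2 - x1, y2 - y1)
        # Map an assumed 30-250 px finger distance onto the volume range.
        vol = np.interp(length, [30, 250], [min_vol, max_vol])
        volume.SetMasterVolumeLevel(vol, None)
    cv2.imshow("Volume Controller", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```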
The code is continuously being optimized, and more functionalities will be added soon.