A Rock-Paper-Scissors game using Machine Learning and Mediapipe’s hand tracking.
- Designed and implemented a Rock-Paper-Scissors game using machine learning and Mediapipe’s hand tracking.
- Collected a dataset of 1,329 samples using OpenCV and preprocessed it to our requirements.
- Trained a classification model to 99.32% validation accuracy using TensorFlow and deployed it on the website with TensorFlow.js.
- Running the project locally
- How to collect data
- How to train a new model
- How to convert the model to TensorFlow.js format
- Results
## Running the project locally

```shell
git clone https://github.com/dev-DTECH/vision-rps.git
cd ./vision-rps
python -m http.server 8080
```

The project should be live at http://localhost:8080.
## How to collect data

```shell
pip install -r requirements.txt
python collect_data_manual.py
```

Press a key from 0 to 9 to label the current gesture while showing your hand to the camera. The script generates a dataset and stores it in data.csv.
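collect_data_manual.py is not reproduced here, but the idea is that each keypress turns Mediapipe's 21 hand landmarks into one labeled row of data.csv. A minimal sketch of that step — the function names and the wrist-relative normalization are assumptions, not the repository's exact code:

```python
import csv

def landmarks_to_row(landmarks, label):
    """Flatten 21 (x, y) hand landmarks into one CSV row.

    Coordinates are made relative to the wrist (landmark 0) so the
    features do not depend on where the hand sits in the frame.
    This normalization is an assumption about the preprocessing.
    """
    wrist_x, wrist_y = landmarks[0]
    row = []
    for x, y in landmarks:
        row.extend([x - wrist_x, y - wrist_y])
    row.append(label)  # the pressed key (0-9) becomes the class label
    return row

def append_row(path, row):
    """Append one labeled sample to the dataset file."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(row)

# Example with synthetic landmarks labeled by key "0":
fake_landmarks = [(0.5 + 0.01 * i, 0.5 - 0.01 * i) for i in range(21)]
row = landmarks_to_row(fake_landmarks, 0)
# 21 landmarks * 2 coordinates + 1 label = 43 columns per row
```

In the real script the landmark list would come from Mediapipe's hand-tracking result for the current camera frame rather than from synthetic data.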
## How to train a new model

Open train.ipynb in Jupyter Notebook and execute all the cells in order.
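train.ipynb is not shown here; as a hedged sketch, a small dense network of the following kind could be trained on the 42 wrist-relative coordinate features (21 landmarks × x, y). The layer sizes and class count are illustrative assumptions, not the notebook's actual architecture:

```python
import tensorflow as tf

def build_model(num_features=42, num_classes=10):
    """Small dense classifier over flattened hand-landmark coordinates.

    Sizes are illustrative; the real train.ipynb may differ.
    """
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(num_features,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dropout(0.2),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model()
# After model.fit(...) on data.csv, save the Keras H5 file that the
# converter step below expects as input:
# model.save("models/v1.h5")
```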
## How to convert the model to TensorFlow.js format

```shell
pip install tensorflowjs

tensorflowjs_converter \
    --input_format=keras \
    /models/v1.h5 \
    /models/v1_tfjs_model
```
## Results

Validation accuracy = 0.9932279909706546 (≈ 99.32%)
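The classifier only predicts a gesture; the game still has to score the player's gesture against the computer's. A minimal sketch of that rule, assuming the label names "rock", "paper", and "scissors" (the repository's actual label names may differ):

```python
# Each gesture beats exactly one other gesture.
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def judge(player, computer):
    """Return 'win', 'lose', or 'draw' from the player's point of view."""
    if player == computer:
        return "draw"
    return "win" if BEATS[player] == computer else "lose"

print(judge("rock", "scissors"))  # rock beats scissors, so 'win'
```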