DJCodesStuff/HandSign

HandSign - Hand Sign Language Recognition API

This project is an API designed to receive frame data (such as hand images) and return the corresponding hand sign language using a machine learning model. The API leverages a pre-built model to interpret hand signs and can be integrated into various applications such as sign language translators or educational tools. The project is still under development, and further improvements are being made.

Features

  • Hand Sign Language Detection: The API processes input frames (images) to recognize and return corresponding hand signs.
  • Model Integration: The API uses a trained model for recognizing hand gestures.
  • Extensible: The project is designed to be expanded with more features in the future, including improving model accuracy and performance.

Project Structure

  • API_Hands.py: Contains the main API functionality to receive frame data and return hand sign predictions.
  • API_Hands_req.py: Handles API requests and manages the required inputs and outputs for the hand sign recognition.
  • buildapimodel.py: Used to build and configure the machine learning model for hand sign recognition.
  • hands_package/: This package contains the model building logic (Build_Model_nn.py) and initialization scripts.
  • working/: Stores the serialized data and model files, including data.pickle and model.h5.
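The `working/` directory holds a pickled dataset (`data.pickle`) alongside the trained model (`model.h5`). As a minimal sketch of how such a dataset file is typically written and read back (the dictionary layout here is a toy stand-in, not necessarily the repo's actual schema):

```python
import os
import pickle
import tempfile

# Toy stand-in for the captured dataset: feature rows plus class labels.
dataset = {"data": [[0.1, 0.2], [0.3, 0.4]], "labels": [0, 1]}

path = os.path.join(tempfile.mkdtemp(), "data.pickle")
with open(path, "wb") as f:
    pickle.dump(dataset, f)          # serialize, as buildapimodel.py would
with open(path, "rb") as f:
    loaded = pickle.load(f)          # deserialize, as the API would at startup

print(loaded["labels"])  # [0, 1]
```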

Installation

To set up this project locally, follow the steps below:

  1. Clone the repository:

    git clone https://github.com/your-username/HandSign.git
  2. Navigate to the project directory:

    cd HandSign
  3. Install the required dependencies:

    pip install -r requirements.txt
  4. Ensure that you have the necessary environment set up for running the API (Python version >= 3.8 is recommended).
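The repository's `requirements.txt` is the authoritative dependency list; based on the project's description (a Flask API, a Keras `model.h5`, and webcam capture), it plausibly includes packages along these lines. This is an assumption, not the file's actual contents:

```text
flask
tensorflow
opencv-python
numpy
```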

Usage

You can start the API and begin sending requests to recognize hand sign language from images. Here's how to run the API locally:

  1. Create your own dataset with the buildapimodel.py script. When the script starts, a webcam window opens with a prompt to press Q whenever you are ready. Hold up the hand sign you want recorded as label 0 and wait while the script captures frames of it. After the frames are captured, it prompts you again for the next label, and this repeats for the number of classes set in Build_Model_nn.py (line 17, variable "number_of_classes").

    python buildapimodel.py
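The per-label capture loop described above can be sketched as follows. This is a simplified illustration, not the repo's actual code: the function names, the `frames_per_class` count, and the synthetic `fake_camera` frame source are all assumptions, and the real script pulls frames from a webcam via OpenCV instead.

```python
import numpy as np

def collect_frames(frame_source, number_of_classes=3, frames_per_class=100):
    """For each class label in turn, pull frames_per_class frames from
    frame_source and tag them with that label index."""
    data, labels = [], []
    for label in range(number_of_classes):
        # In the real script, this is where the webcam window prompts
        # you to press Q before capture of the next sign begins.
        for _ in range(frames_per_class):
            data.append(next(frame_source))
            labels.append(label)
    return np.array(data), np.array(labels)

def fake_camera(shape=(64, 64, 3)):
    """Synthetic stand-in for a webcam: yields blank frames forever."""
    while True:
        yield np.zeros(shape, dtype=np.uint8)

X, y = collect_frames(fake_camera(), number_of_classes=2, frames_per_class=5)
print(X.shape, y.shape)  # (10, 64, 64, 3) (10,)
```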
  2. After this script finishes, your model is trained on your dataset. You can now start the API_Hands.py script to deploy the Flask API.

    python API_Hands.py
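A minimal sketch of what a Flask endpoint like the one in API_Hands.py might look like. The route name `/predict`, the base64-JSON wire format, and the label-to-sign mapping are all assumptions for illustration; `predict_sign` is a placeholder where the real model inference would run.

```python
import base64
from flask import Flask, request, jsonify

app = Flask(__name__)

LABELS = {0: "A", 1: "B"}  # hypothetical label-to-sign mapping

def predict_sign(frame_bytes):
    # Placeholder: the real API would decode the frame and run model.h5 here.
    return 0

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(silent=True) or {}
    if "frame" not in payload:
        return jsonify(error="missing 'frame' field"), 400
    frame_bytes = base64.b64decode(payload["frame"])
    label = predict_sign(frame_bytes)
    return jsonify(label=label, sign=LABELS.get(label, "unknown"))

# To serve the API for real: app.run(port=5000)
```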
  3. A test script is provided that runs locally, letting you test the same signs you used to create your dataset.

    python API_Hands_req.py
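The client side can be sketched as packaging a captured frame and POSTing it to the API. The endpoint URL and the base64-JSON format here are assumptions, not necessarily what API_Hands_req.py actually sends:

```python
import base64
import json

API_URL = "http://127.0.0.1:5000/predict"  # assumed local endpoint

def build_payload(frame_bytes):
    """Package raw frame bytes as base64 inside JSON, one plausible wire format."""
    return json.dumps({"frame": base64.b64encode(frame_bytes).decode("ascii")})

def decode_payload(payload):
    """Inverse of build_payload, as the server side would decode it."""
    return base64.b64decode(json.loads(payload)["frame"])

# With the API running, the payload could be sent e.g. with urllib:
#   req = urllib.request.Request(API_URL, data=build_payload(frame).encode(),
#                                headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read())

print(decode_payload(build_payload(b"frame-bytes")))  # b'frame-bytes'
```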

About

Hand sign prediction repo.
