
Custom Yolov8x-cls edge model deployment and training to classify trash vs recycling.

Contributors Forks Stargazers Issues MIT License LinkedIn


Logo

Trash Classification - Edge Deployment, Training, & Testing

For GEEN-1400 (First-Year Engineering Projects) at the University of Colorado Boulder, we, Team-51, were tasked with creating a product that has a positive impact on sustainability, especially on our campus. We came up with the idea of a smart trash can that uses computer vision to sort your trash, helping address the problem of waste being disposed of improperly.

Below you will find all of the source code used to train and run inference with our AI model. The Usage section contains many examples and helpful visuals.
Explore the docs »

View Demo · Report Bug · Request Feature

Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Simple Documentation
  5. Roadmap
  6. Contributing
  7. License
  8. Contact
  9. Acknowledgments

About The Project

Logo

This is our project that we presented at the ITLL Design Expo.

Product Name Screen Shot

Here is a depiction of a very rough prototype of our project.

(back to top)

Built With

(back to top)

Getting Started

This repository contains the triton_server submodule, which you can use if you wish to run inference against a remote model. In this project, a remote Triton server is used.

If you would like to initialize the Git LFS-tracked tc-triton-server submodule after cloning the repo, follow these steps.

  • Make sure git lfs is installed
    git lfs install
  • cd into submodule directory
    cd triton_server
  • Initialize local configuration file & clone
    git submodule init
    git submodule update

Or, to make it easy, just run:

git clone --recurse-submodules https://github.com/Eilliw/trash-classification-public.git

Edge Prerequisites

This is intended to be run on a Raspberry Pi running Debian Bookworm.

  • Pi updates
    sudo apt update
    sudo apt upgrade
  • Pi required packages
    sudo apt install -y python3-picamera2 python3-rpi.gpio
  • libcamera Make sure libcamera is up to date and shows a preview with one of the following commands
    libcamera-hello
    libcamera-hello --qt-preview
  • pigpiod This package is highly recommended since it controls servos with minimal jitter (see the sketch after this list)
    sudo apt install pigpiod
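
For reference, here is a minimal Python sketch of driving a door servo through the pigpio daemon; the GPIO pin and pulse widths are placeholders rather than values from this project.

    # Minimal servo sketch using the pigpio daemon (start it first, e.g. "sudo systemctl start pigpiod").
    # SERVO_PIN and the pulse widths are placeholders, not values from this project.
    import time

    import pigpio

    SERVO_PIN = 18                               # placeholder BCM pin for the servo signal wire

    pi = pigpio.pi()                             # connect to the local pigpiod daemon
    if not pi.connected:
        raise RuntimeError("pigpiod is not running")

    pi.set_servo_pulsewidth(SERVO_PIN, 1500)     # roughly centre position (pulse width in microseconds)
    time.sleep(1)
    pi.set_servo_pulsewidth(SERVO_PIN, 2000)     # swing towards one end, e.g. "door open"
    time.sleep(1)
    pi.set_servo_pulsewidth(SERVO_PIN, 0)        # stop sending pulses
    pi.stop()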

Edge Installation

Follow these instructions to get a local copy up and running. Much of the code will have to be changed to fix path issues.

  1. Clone the repo
    git clone https://github.com/Eilliw/trash-classification-public.git
  2. cd into repo directory
  3. Create python venv
    python3 -m venv venv --system-site-packages
  4. Install python dependencies
    source venv/bin/activate
    pip3 install -r edge_requirements.txt
  5. Edit bin/run_edge_on_startup.sh paths
  6. Set Roboflow API key in .env (see the loading sketch after these steps)
    ROBOFLOW_API_KEY="YOUR API KEY HERE"
  7. Run either the testing script or the startup script
    bash bin/run_testing.sh
    bash bin/run_on_startup.sh
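
As a small sketch of how the key from step 6 reaches the Python code, it could be read from .env with python-dotenv; this assumes python-dotenv and the roboflow package are among the installed dependencies, so treat it as illustrative rather than the project's exact code.

    # Sketch: read ROBOFLOW_API_KEY from .env and hand it to the Roboflow client.
    # Assumes python-dotenv (and the roboflow package) are installed in the venv.
    import os

    from dotenv import load_dotenv
    from roboflow import Roboflow

    load_dotenv()                                # pulls key=value pairs from .env into the environment
    api_key = os.environ["ROBOFLOW_API_KEY"]     # raises KeyError if the key was never set

    rf = Roboflow(api_key=api_key)               # client used later for dataset uploads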

Train Prerequisites

Train Installation
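
As a rough sketch, fine-tuning a YOLOv8x-cls classifier with the Ultralytics package generally looks like the following; the dataset path and hyperparameters below are placeholders, not this project's actual settings.

    # Sketch: fine-tune a YOLOv8x-cls classifier with the Ultralytics package.
    # The dataset path and hyperparameters are placeholders, not this project's settings.
    from ultralytics import YOLO

    model = YOLO("yolov8x-cls.pt")               # pretrained classification checkpoint
    model.train(
        data="path/to/classification-dataset",   # folder containing train/ and val/ class subfolders
        epochs=50,
        imgsz=224,
    )
    model.val()                                  # evaluate on the validation split
    model.export(format="torchscript")           # export for serving, e.g. on a Triton server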

(back to top)

Usage

Testing GUI examples

Trash

The Doritos bag and the receipt are classified as trash at 97.6% and 94.6% confidence, respectively.

Recycle

The Celsius can above is classified as recycling at 97% confidence.

Triton Inference Server

Of the models above, trash-classification is used as our main model, as it is trained on our dataset; the current version of the model is version 3. All models in the tc-triton-server are stored in TorchScript format using Git LFS.
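
As a rough illustration of how an edge client might query this model remotely, here is a minimal sketch using the tritonclient HTTP API. The server URL and the input__0/output__0 tensor names are assumptions and would need to match the deployed model's config.pbtxt.

    # Sketch: query the trash-classification model (version 3) on a remote Triton server.
    # The URL and the input__0/output__0 tensor names are assumptions and must match
    # the deployed model's config.pbtxt.
    import numpy as np
    import tritonclient.http as httpclient

    client = httpclient.InferenceServerClient(url="your-triton-host:8000")

    image = np.random.rand(1, 3, 224, 224).astype(np.float32)   # stand-in for a preprocessed camera frame

    infer_input = httpclient.InferInput("input__0", list(image.shape), "FP32")
    infer_input.set_data_from_numpy(image)
    infer_output = httpclient.InferRequestedOutput("output__0")

    response = client.infer(
        model_name="trash-classification",
        model_version="3",
        inputs=[infer_input],
        outputs=[infer_output],
    )
    scores = response.as_numpy("output__0")      # per-class scores
    print("predicted class index:", int(scores.argmax()))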

For more examples, please refer to the Documentation

(back to top)

Simple Documentation

Below you will find some simple documentation, including flowcharts and high-level explanations of the code.

Button Callback Flowchart

During our expo, we controlled when the trash can would open using a button. This button has a callback that takes an image, runs a model inference, and, depending on the output, opens one of the trash can doors.
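
A heavily simplified sketch of that callback flow is below; the button pin is a placeholder, and classify_image()/open_door() are hypothetical stand-ins for this project's actual inference and servo code.

    # Sketch of the button callback flow: capture a frame, classify it, open the matching door.
    # BUTTON_PIN is a placeholder, and classify_image()/open_door() are hypothetical stand-ins.
    import signal

    import RPi.GPIO as GPIO
    from picamera2 import Picamera2

    BUTTON_PIN = 17                              # placeholder BCM pin for the push button

    picam2 = Picamera2()
    picam2.start()

    def classify_image(frame):
        # Hypothetical stand-in for the model call (local model or the Triton sketch above).
        return "trash"

    def open_door(label):
        # Hypothetical stand-in for the pigpio servo code sketched under Edge Prerequisites.
        print(f"opening the {label} door")

    def on_button(channel):
        frame = picam2.capture_array()           # grab the most recent camera frame
        open_door(classify_image(frame))

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(BUTTON_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
    GPIO.add_event_detect(BUTTON_PIN, GPIO.FALLING, callback=on_button, bouncetime=500)

    try:
        signal.pause()                           # wait for button presses
    finally:
        GPIO.cleanup()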

Testing GUI Flowchart

To test our AI model's accuracy, see the image it was working with, and have a more accessible way to test our product, we created a GUI. The GUI was built with tkinter and automatically captures the most recent camera image. You click a button corresponding to the image's correct class, and the image is saved to disk to later be uploaded to our dataset on Roboflow. The model's classification of the image is also shown, so you can tell whether the model already recognizes the object.
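
A stripped-down sketch of that GUI idea is shown below; get_latest_frame() and predict() are hypothetical stand-ins for the project's camera and inference code, and the output folder layout is illustrative rather than the actual upload path.

    # Stripped-down sketch of the testing GUI: shows the model's prediction for the latest
    # frame and saves the frame under whichever class the user picks. get_latest_frame()
    # and predict() are hypothetical stand-ins for the project's camera and inference code.
    import time
    import tkinter as tk
    from pathlib import Path

    import numpy as np
    from PIL import Image

    def get_latest_frame():
        # Stand-in for a Picamera2 capture; returns an RGB uint8 array.
        return (np.random.rand(480, 640, 3) * 255).astype(np.uint8)

    def predict(frame):
        # Stand-in for model inference; returns (label, confidence).
        return "trash", 0.97

    def save_labelled(label):
        frame = get_latest_frame()
        out_dir = Path("captures") / label       # e.g. captures/trash/, captures/recycle/
        out_dir.mkdir(parents=True, exist_ok=True)
        Image.fromarray(frame).save(out_dir / f"{int(time.time())}.jpg")
        pred, conf = predict(frame)
        status.set(f"model says {pred} ({conf:.0%}); image saved as {label}")

    root = tk.Tk()
    root.title("Trash Classification - Testing GUI")
    status = tk.StringVar(value="press a button to label the current frame")
    tk.Label(root, textvariable=status).pack(padx=10, pady=10)
    tk.Button(root, text="Trash", command=lambda: save_labelled("trash")).pack(side="left", padx=10, pady=10)
    tk.Button(root, text="Recycle", command=lambda: save_labelled("recycle")).pack(side="right", padx=10, pady=10)
    root.mainloop()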

Roadmap

  • Simple training notebook
  • Edge inference
    • Via Triton Server
    • Via local TFLite model
  • Testing
    • Pi camera data collection
    • Auto dataset upload
    • GUI
    • Voxel51 FiftyOne integration
  • Triton server container script
  • Post Expo
    • Docker training container
    • Auto training

See the open issues for a full list of proposed features (and known issues).

(back to top)

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)

License

Distributed under the MIT License. See LICENSE for more information.

(back to top)

Contact

Willie Chuter - willie.chuter@colorado.edu

Project Link: https://github.com/Eilliw/trash-classification-public

(back to top)

Acknowledgments

(back to top)