Rooaa


An experimental web application that helps blind users navigate their surroundings through voice-generated messages describing detected objects and their estimated distances, using only a mobile camera or webcam.
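To make the idea concrete, here is a minimal sketch (not the project's actual pipeline) of how detected objects and estimated distances could be turned into a spoken message. The Detection format and the use of pyttsx3 for text-to-speech are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List

import pyttsx3  # offline text-to-speech; an assumption, not necessarily what Rooaa uses


@dataclass
class Detection:
    label: str         # e.g. "person", "chair"
    distance_m: float   # estimated distance in meters


def build_message(detections: List[Detection]) -> str:
    """Turn detections into a short spoken sentence, nearest objects first."""
    if not detections:
        return "No obstacles detected."
    nearest = sorted(detections, key=lambda d: d.distance_m)
    parts = [f"{d.label} about {d.distance_m:.0f} meters ahead" for d in nearest[:3]]
    return ". ".join(parts) + "."


def speak(text: str) -> None:
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()


if __name__ == "__main__":
    # Example with dummy detections; a real run would feed camera frames
    # through an object detector and a distance estimator first.
    speak(build_message([Detection("person", 2.0), Detection("chair", 4.5)]))
```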

Getting Started

  • Clone the repo to your local machine.

  • Install Docker and ensure it is running

  • Install Make if you're on Windows. macOS already has it installed; on Linux, install it with your package manager (e.g., sudo apt-get install make).

  • Run make run and then navigate to https://[YOUR IPV4 ADDRESS]:5000/

    • To get your IPv4 address, run ipconfig in cmd if you're on Windows, or ifconfig if you're on Linux.
  • Click Advanced, then proceed to the unsafe site. The warning appears because we use a dummy SSL context to serve the app over HTTPS, which is required to access the camera feed (see the sketch after this list).
    Note: The Dense service has a fairly high initial loading time (around 60 seconds) because we are currently running on the CPU, but it should function normally after that.
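For reference, this is a minimal sketch of serving a web app over HTTPS with a throwaway ("dummy") SSL context, assuming a Flask app; it is an illustration of the idea, not the project's actual server code.

```python
from flask import Flask

app = Flask(__name__)


@app.route("/")
def index():
    return "Camera feed requires HTTPS, hence the ad hoc certificate."


if __name__ == "__main__":
    # ssl_context="adhoc" generates a self-signed certificate on the fly
    # (requires pyOpenSSL), which is why browsers show a security warning.
    app.run(host="0.0.0.0", port=5000, ssl_context="adhoc")
```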

Demo:

Acknowledgements: