# tf-pose-estimation for ROS

Human pose estimation is expected to be used on mobile robots that need human interaction.

## Installation

Clone this repository under the `src` folder of your ROS workspace, then carry out the same setup steps as described in README.md.

```bash
$ cd $(ros-workspace)
$ cd src
$ git clone https://github.com/ildoonet/tf-pose-estimation
$ pip install -r tf-pose-estimation/requirements.txt
```
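After cloning, the workspace typically has to be built and sourced so that ROS can find the `tfpose_ros` package. This is a minimal sketch assuming a standard `catkin_make` workspace (use `catkin build` instead if your workspace was created with catkin_tools); it is not spelled out in the original instructions:

```bash
# Build the workspace so the tfpose_ros package is registered with ROS
$ cd $(ros-workspace)
$ catkin_make
# Make the freshly built packages visible in the current shell
$ source devel/setup.bash
```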

There are additional dependencies required to launch the demo.
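One common way to pull in missing ROS package dependencies is `rosdep`, assuming they are declared in each package's `package.xml`; this is a general ROS workflow, not a step taken from this repository's instructions:

```bash
# Resolve and install the dependencies declared by packages under src/
$ cd $(ros-workspace)
$ rosdep install --from-paths src --ignore-src -r -y
```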

## Video/camera demo

| CMU (640x360) | Mobilenet_Thin (432x368) |
|:--------------|:-------------------------|
| (demo GIF)    | (demo GIF)               |

The tests above were run on a P40 GPU. The latency between the current video frame and the processed frame is much lower with the Mobilenet version.

Source : https://www.youtube.com/watch?v=rSZnyBuc6tc

```bash
$ roslaunch tfpose_ros demo_video.launch
```
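Once the demo is running, standard ROS tools can be used to verify that pose messages are being published. The topic name below is only an assumption for illustration; check the actual names with `rostopic list`:

```bash
# List the topics published by the demo
$ rostopic list
# Print incoming pose messages (replace the topic name with one found above)
$ rostopic echo /pose_estimator/pose
# Measure the publishing rate, which reflects the processing latency discussed above
$ rostopic hz /pose_estimator/pose
```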

You can specify the 'video' argument to launch a realtime video demo using your camera. See the [ros launch file](./launch/demo_video.launch).
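For example, a video file could be passed via the standard `roslaunch` argument syntax. The argument name 'video' is taken from the sentence above; confirm the exact name and default in ./launch/demo_video.launch:

```bash
# Run the demo on a local video file instead of the camera (path is a placeholder)
$ roslaunch tfpose_ros demo_video.launch video:=/path/to/your_video.mp4
```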