Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

video_objects.py

idata
Employee

How can I use this script with a USB webcam? It currently only reads a recorded .mp4 video file.

idata
Employee

@VictoryKnocks Edit the video_objects.py Python file and change line 338 from cap = cv2.VideoCapture(input_video_file) to cap = cv2.VideoCapture(0) for video device index 0. Also try index 1 if you have a USB camera in addition to your laptop's built-in webcam.
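A slightly more defensive version of that edit could try both indices automatically. This is only a sketch; open_capture is my own name and is not part of video_objects.py:

```python
def open_capture(candidates=(0, 1)):
    """Try each camera index in turn; return the first cv2.VideoCapture that opens."""
    if not candidates:
        raise RuntimeError("no camera indices to try")
    import cv2  # imported here; assumes opencv-python is installed when capturing
    for index in candidates:
        cap = cv2.VideoCapture(index)
        if cap.isOpened():
            return cap
        cap.release()  # this index didn't open; clean up and try the next one
    raise RuntimeError("no usable camera among %r" % (candidates,))

# In video_objects.py, line 338 would then become:
#   cap = open_capture()
```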

idata
Employee

oooh, you're a clever man!

idata
Employee

Right then, Mr Clever Pants, let's see if you can help with this one.

 

What are the FPS differences between SSD MobileNet and Tiny YOLO on the NCS? Which do you recommend for the best performance/FPS?

 

And

 

I used stream_infer.py to stream GStreamer UDP video from my RPi to my laptop the other day, and it runs just great. I am eventually hoping to stream the same setup from one RPi to another RPi with a screen. Have you tried this setup, and do you have any advice for setting up the GStreamer pipeline? At the moment I'm using this pipeline on my laptop, and it works nicely:

 

gst-launch-1.0 udpsrc port=9000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink
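As an aside, the same receive pipeline can be fed straight into OpenCV by terminating it in appsink instead of a display sink, assuming an OpenCV build compiled with GStreamer support. The helper name below is my own, not from stream_infer.py:

```python
def appsink_pipeline(port=9000):
    """Receive-side pipeline string ending in appsink, so frames land in OpenCV."""
    return (
        "udpsrc port=%d ! application/x-rtp,encoding-name=H264,payload=96"
        " ! rtph264depay ! avdec_h264 ! videoconvert ! appsink" % port
    )

# Usage (requires OpenCV compiled with GStreamer support):
#   import cv2
#   cap = cv2.VideoCapture(appsink_pipeline(), cv2.CAP_GSTREAMER)
```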

 

But I notice that stream_infer.py suggests using:

 

(Line 82) #SINK_NAME="glimagesink" # use for Raspbian Jessie platforms

 

How come glimagesink and not autovideosink or xvimagesink?

idata
Employee

@VictoryKnocks I haven't tried streaming from one RPi to another. If you can share the details on how to do this, I'm sure other members of the NCS dev community would be interested in your work.

 

As for the stream_infer.py application, glimagesink was what worked for us at the time, so that's what we went with; the xvideo sink was not working on Jessie. I haven't checked lately, so I can't say whether it works on Raspbian Stretch at the time of writing.

 

As for FPS, you can use the benchmark_ncs app to run some benchmarks and compare the two models yourself.

 

Run this command in the benchmarkncs folder for Tiny Yolo v1: python3 benchmarkncs.py ../../caffe/TinyYolo/ ../../data/images 448 448

 

and use this command for SSD Mobilenet: python3 benchmarkncs.py ../../caffe/SSD_MobileNet ../../data/images 300 300
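For comparing the numbers the two runs produce, note that per-frame inference time and FPS are just reciprocals. fps_from_ms below is my own throwaway helper, not part of benchmarkncs:

```python
def fps_from_ms(ms_per_frame):
    """Convert a per-frame inference time in milliseconds to frames per second."""
    return 1000.0 / ms_per_frame

# For example, 40 ms/frame corresponds to 25 FPS.
```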
idata
Employee

I created this post on the RPi forum to discuss UDP streaming from an RPi (with Pi camera) to an RPi (with a portable HDMI screen):

 

https://www.raspberrypi.org/forums/viewtopic.php?f=63&t=225955&p=1386513#p1386513

 

It seems that the RPi doesn't perform video decoding very well using only the CPU.

 

The camera-enabled Pi uses GPU hardware to stream.

 

On the HDMI RPi, so far I have managed to get 24 fps streaming using hardware decoding with omxplayer and ffmpeg/avconv. However, the latency is around 6 seconds between the sending RPi and the receiving RPi.

 

Maybe another H.264 plugin, such as omxh264dec, might be better, but I haven't been able to figure it out yet.
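If omxh264dec does pan out, the change would just be swapping the decoder element in the receive pipeline. Here is a minimal Python sketch for building both variants; the helper name is mine, and the assumption that omxh264dec is available (via the gst-omx plugins) is untested on a Pi:

```python
def receive_pipeline(decoder="avdec_h264", port=9000, sink="autovideosink"):
    """Build the gst-launch-1.0 argument string for the receiving end."""
    return (
        "udpsrc port=%d ! application/x-rtp,encoding-name=H264,payload=96"
        " ! rtph264depay ! %s ! videoconvert ! %s" % (port, decoder, sink)
    )

# Software decode (the pipeline quoted earlier in the thread):
print("gst-launch-1.0 " + receive_pipeline())
# Hardware decode on the Pi (assumption: requires the gst-omx omxh264dec element):
print("gst-launch-1.0 " + receive_pipeline(decoder="omxh264dec"))
```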

 

My goals are:

- Low-latency digital video
- Streaming between one SoC board and another (I just happen to have 2 x RPi and a Jetson TK1)
- The receiving SoC board will connect to an NCS
- 12 V or 5 V SoC boards
- Using SSD MobileNet with the video stream (a mix between stream_infer.py and video_objects.py)

 

I'm not sure about the performance of the RPi + NCS with stream_infer.py and video_objects.py yet; I have only been using them on my laptop, where they run just fine.

 

Robotics project :smiley:
