Community Manager

How can I use this script with a USB webcam? It currently only uses a recorded .mp4 video file.

5 Replies
Community Manager

@VictoryKnocks Edit the Python file and change line 338 from cap = cv2.VideoCapture(input_video_file) to cap = cv2.VideoCapture(0) to use video device index 0. Also try 1 if you have a USB camera in addition to your laptop's built-in webcam.
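To make that change a bit more flexible, here is a minimal sketch of the idea. The helper function, its name, and the use_webcam flag are made up for illustration; only the argument passed to cv2.VideoCapture matters (an integer selects a camera device, a string opens a file).

```python
# Hypothetical helper illustrating the one-line change: an integer argument
# to cv2.VideoCapture selects a local camera device, a string opens a file.
def capture_source(use_webcam, device_index=0, video_file="input.mp4"):
    """Return the value to pass to cv2.VideoCapture()."""
    return device_index if use_webcam else video_file

# Line 338 of the script would then read, for the default webcam:
#   cap = cv2.VideoCapture(capture_source(use_webcam=True))
# or, for a second USB camera:
#   cap = cv2.VideoCapture(capture_source(use_webcam=True, device_index=1))
print(capture_source(use_webcam=True))
```

Switching back to the original file-based behaviour is then just capture_source(use_webcam=False).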

Community Manager

oooh, you're a clever man!

Community Manager

Right then, Mr Clever Pants, let's see if you can help with this one.


What are the fps differences between SSD MobileNet and Tiny YOLO on the NCS, and which do you recommend for the best performance/fps?




I streamed gstreamer UDP video from my rpi to my laptop the other day and it ran just great, but I am eventually hoping to stream the same setup from one rpi to another rpi with a screen. Have you tried this setup, and do you have any good advice for setting up the gstreamer pipeline? At the moment I'm using this pipeline on my laptop and it works nicely:


gst-launch-1.0 udpsrc port=9000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink


But I am wondering why it suggests using:


(Line 82) #SINK_NAME="glimagesink" # use for Raspbian Jessie platforms


How come glimagesink and not autovideosink or xvimagesink?
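For reference, a sender-side pipeline on the camera Pi that would match my receiver pipeline above might look like the sketch below. This is a hedged, untested example: it assumes raspivid is available for hardware H.264 encoding, and RECEIVER_IP is a placeholder for the receiving machine's address.

```shell
# Hedged sketch: pipe raspivid's hardware-encoded H.264 into gst-launch.
# RECEIVER_IP is a placeholder; port 9000 matches the receiver pipeline above.
raspivid -t 0 -w 1280 -h 720 -fps 30 -b 2000000 -o - | \
  gst-launch-1.0 fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! \
  udpsink host=RECEIVER_IP port=9000
```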

Community Manager

@VictoryKnocks I haven't tried streaming from rpi to another rpi. If you can share the details on how to do this, I'm sure other members of the NCS dev community would be interested in your work.


As far as the application goes, glimagesink was what was working for us, so that's what we went with at the time; xvideo was not working on Jessie. I haven't checked lately, so I can't say whether it works on Raspbian Stretch at the time of writing.


As far as FPS, you can use the benchmark_ncs app to run some benchmarks and compare them yourself.


Run this command in the benchmarkncs folder for Tiny Yolo v1:

python3 ../../caffe/TinyYolo/ ../../data/images 448 448


and use this command for SSD MobileNet:

python3 ../../caffe/SSD_MobileNet ../../data/images 300 300
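If you want to measure fps from any inference loop yourself, a minimal timing sketch follows. The measure_fps helper is hypothetical (not part of the benchmark_ncs app), and the infer callable stands in for whatever inference function you use.

```python
import time

# Hypothetical timing helper: measures frames-per-second for any inference
# callable run over a list of frames, excluding a few warm-up runs.
def measure_fps(infer, frames, warmup=2):
    """Return frames per second over the full frame list."""
    for f in frames[:warmup]:   # warm-up runs are not timed
        infer(f)
    start = time.perf_counter()
    for f in frames:
        infer(f)
    elapsed = time.perf_counter() - start
    return len(frames) / elapsed

# Demo with a dummy "inference" that just sleeps 10 ms per frame:
print(measure_fps(lambda f: time.sleep(0.01), list(range(20))))
```

Running the same helper against both networks on identical input frames gives a like-for-like comparison.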
Community Manager

I created this post on the rpi forum to discuss UDP streaming from one rpi (with a Pi camera) to another rpi (with a portable HDMI screen):


It seems that the rpi doesn't perform video decoding very well using only the CPU; the camera-enabled Pi uses its GPU hardware to encode the stream.


On the HDMI rpi, I have so far managed to get 24fps streaming using hardware decoding with omxplayer and ffmpeg/avconv. However, the latency is around 6 seconds between the sending rpi and the receiving rpi.


Maybe another h264 plugin, such as omxh264dec, might be better, but I haven't been able to figure it out yet.
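One hedged, untested idea along those lines: swap avdec_h264 for omxh264dec in the receiver pipeline so the Pi's hardware decoder does the work, and disable sink synchronization to trim latency. This assumes the gstreamer omx plugin is installed on the Pi.

```shell
# Hedged sketch: hardware H.264 decode on the receiving Pi via omxh264dec.
# sync=false tells the sink not to wait on buffer timestamps, cutting latency.
gst-launch-1.0 udpsrc port=9000 ! \
  "application/x-rtp,encoding-name=H264,payload=96" ! \
  rtph264depay ! h264parse ! omxh264dec ! videoconvert ! autovideosink sync=false
```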


I have a few goals:

- Low-latency digital video
- Between one SoC board and another (I just happen to have 2 x rpi & a Jetson TK1)
- The receiving SoC board will connect to an NCS
- 12v or 5v SoC boards
- Using SSD MobileNets with the video stream (a mix between &


I'm not sure about the performance of the rpi + NCS with & yet; I have only been using them with my laptop, and they run just fine there.


Robotics project :smiley: