Intel® Makers
Intel® Edison, Intel® Joule™, Intel® Curie™, Intel® Galileo
Welcome - This is a Peer-to-Peer Forum only. Intel has discontinued these products but you may find support from other customers on this Forum

Capture Video Stream from Edi-Cam Server using openCV or urllib in Python?

New Contributor I

Greetings, I am trying to do some processing on video being streamed from an Edi-Cam server, which is written in Node.js. I have been trying to wrap my head around its structure and syntax, hoping to modify it, but it's a bit too much at the moment. My main goal is to capture the video stream in Python and use the OpenCV library. Looking online, there are two main solutions that should work, but they are not working for me. I think my URL is incomplete.

1) Use OpenCV's VideoCapture class.

import cv2

cap = cv2.VideoCapture('')  # stream URL removed by the forum

while True:
    ret, frame = cap.read()
    cv2.imshow('Video', frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

2) Using urllib

import cv2
import numpy as np
import urllib2

stream = urllib2.urlopen('')  # stream URL removed by the forum
bytes = ''
while True:
    bytes += stream.read(1024)
    a = bytes.find('\xff\xd8')  # JPEG start-of-image marker
    b = bytes.find('\xff\xd9')  # JPEG end-of-image marker
    print bytes
    if a != -1 and b != -1:
        jpg = bytes[a:b+2]
        bytes = bytes[b+2:]
        # cv2.IMREAD_COLOR (cv2.CV_LOAD_IMAGE_COLOR was removed in OpenCV 3)
        i = cv2.imdecode(np.fromstring(jpg, dtype=np.uint8), cv2.IMREAD_COLOR)
        cv2.imshow('Video', i)
    if cv2.waitKey(1) == 27:
        break

Looking at what bytes contains, it is just the source code of the web page, and it doesn't change; there is no video stream found in it. Other examples have addresses like..

cap = cv2.VideoCapture('')

cap = cv2.VideoCapture('')
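The marker-scanning logic in the urllib example above can be pulled into a small helper and checked offline with synthetic bytes. This is just a sketch of the parsing step (not the network part); the sample stream below is fabricated, not real camera output.

```python
# Illustrative helper: extract one complete JPEG frame from a growing
# byte buffer by scanning for the SOI (0xFFD8) and EOI (0xFFD9) markers.
# Works under both Python 2.7 and Python 3.

def extract_jpeg(buf):
    """Return (jpeg_bytes_or_None, remaining_buffer)."""
    a = buf.find(b'\xff\xd8')   # start-of-image marker
    b = buf.find(b'\xff\xd9')   # end-of-image marker
    if a != -1 and b != -1 and b > a:
        return buf[a:b + 2], buf[b + 2:]
    return None, buf

# Synthetic stream: junk bytes, one fake "JPEG", then the start of another.
stream = b'junk' + b'\xff\xd8payload\xff\xd9' + b'\xff\xd8partial'
frame, rest = extract_jpeg(stream)
```

The leftover `rest` buffer is carried into the next read, which is why the original loop does `bytes = bytes[b+2:]` after each decoded frame.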

From my limited understanding of Edi-Cam, it uses ffmpeg to encode the video and then sets up a server on the Edison and a client in the browser, written in Node.js so as to be non-blocking. I see that three ports are used: one to listen to the hardware port for video (8020), one to listen to clients (8040), and one to serve clients the video page (8080). It is not clear to me what other entries are needed in the URL to access the MPEG stream. Is there a way to scan the URL for any stream URLs?
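One way to answer the "scan the URL for stream URLs" question: since the edi-cam browser client opens its stream connection from JavaScript, the served page source usually contains the stream endpoint. Below is a hedged sketch that scans fetched HTML for candidate URLs; `sample_html`, the hostname `myedison.local`, and the port are hypothetical placeholders. In practice you would fetch the real page (for example from port 8080) with urllib2 and scan that instead.

```python
# Sketch: scan a page's HTML source for candidate stream URLs
# (ws:// WebSocket endpoints or http(s):// URLs). The sample_html below
# is a fabricated stand-in for the page the Edison serves.
import re

def find_stream_urls(html):
    # Match ws://, wss://, http:// or https:// URLs, stopping at the
    # next quote, whitespace, or angle bracket.
    pattern = r'(?:wss?|https?)://[^\s"\'<>]+'
    return re.findall(pattern, html)

sample_html = '''
<script>
  var client = new WebSocket('ws://myedison.local:8040/');
</script>
'''
urls = find_stream_urls(sample_html)
```

Note that if the stream really is delivered over a WebSocket, `cv2.VideoCapture` will not open it directly; that would explain why plain HTTP URLs fail.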

I am using Python 2.7 and OpenCV 3.0. I am not sure if I have ffmpeg on my PC. Running Windows 10.

I am quite new to this stuff; the only similar thing I have done was setting up a simple TCP server/client using the Edison to send sensor data.

I have seen a clean solution to this problem on the Raspberry Pi, which uses its raspivid tool to read the camera and encode the video, and netcat to set up a TCP pipe on the Pi: Wirelessly streaming a video from a Raspberry to a remote laptop - YouTube

I could imagine all that is needed is to capture video from the camera, encode it, send it over TCP, receive it, and read a frame. Any advice on getting the Edi-Cam to work, or on making another solution work? I like the Edi-Cam solution because it performs very well.
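The "send it over TCP, receive it" step described above can be sketched independently of the camera. The snippet below shows one common design: length-prefix each encoded frame so the receiver can reassemble it from the TCP byte stream. The frame bytes here are a placeholder, not real Edi-Cam output, and the demo runs over the loopback interface; on the Edison the sender would push real ffmpeg/JPEG frames.

```python
# Sketch of a length-prefixed frame transport over TCP.
# TCP is a byte stream, so we prepend a 4-byte length header to each
# frame; the receiver reads exactly that many bytes back.
import socket
import struct
import threading

def send_frame(sock, frame_bytes):
    # 4-byte big-endian length header, then the frame payload.
    sock.sendall(struct.pack('>I', len(frame_bytes)) + frame_bytes)

def recv_exact(sock, n):
    # Keep reading until exactly n bytes have arrived.
    data = b''
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise RuntimeError('connection closed early')
        data += chunk
    return data

def recv_frame(sock):
    (length,) = struct.unpack('>I', recv_exact(sock, 4))
    return recv_exact(sock, length)

# Demo over the loopback interface with a placeholder "frame".
server = socket.socket()
server.bind(('127.0.0.1', 0))          # port 0: let the OS pick one
server.listen(1)
port = server.getsockname()[1]

def serve():
    conn, _ = server.accept()
    send_frame(conn, b'\xff\xd8fake-jpeg\xff\xd9')
    conn.close()

t = threading.Thread(target=serve)
t.start()

client = socket.socket()
client.connect(('127.0.0.1', port))
frame = recv_frame(client)
t.join()
client.close()
server.close()
```

On the receiving side, each reassembled frame could then be handed to cv2.imdecode exactly as in the urllib example earlier in the thread.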

4 Replies
Community Manager




To know the necessary parameters for the OpenCV functions, I'd recommend looking at the OpenCV documentation. Here you can see the VideoCapture class and the options you have to use the command. Was this what you were referring to?



There are some guides that illustrate live video streaming on Edison. The approach might not be exactly the one you're looking for, but they are helpful as an alternative. You can check .





Community Manager

Do you still need assistance with this thread? Were you able to use OpenCV?






Hi intel_corpintel_admin,

I need help on this. I am using C++ with OpenCV. The video is being streamed from an Edi-Cam server (GitHub - drejkim/edi-cam: Video streaming on Intel Edison).

The C++ code that I wrote is as follows (the post is truncated):

#include <opencv2/opencv.hpp>  // original include list was stripped by the forum
#include <iostream>

using namespace std;
using namespace cv;

int main()
{
    Mat frame;
    namedWindow("video", 1);
    VideoCapture cap("");  // stream URL removed by the forum
    ...
Community Manager

Hi Yeong,



This reply seems to be a duplicate of the thread you posted here: /thread/106958 . We'll post a reply to you in the aforementioned link soon.