Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision-related on Intel® platforms.

OpenVINO Raspberry Pi - Movidius Support


Dear Team,

I trained an object detection pipeline on a custom food dataset, then used the Model Optimizer from the OpenVINO toolkit to convert my MobileNet-SSD TensorFlow model to .xml and .bin files. The resulting Intermediate Representation was tested and validated on the Movidius stick with the Inference Engine from OpenVINO, in a Windows environment on my laptop, using -d MYRIAD. Now I want to run the optimized model (the Intermediate Representation from OpenVINO) on a Raspberry Pi and Movidius combo. Please help me out with this.

When I use mvNCCompile, I get the following error: ValueError: NonMaxSuppressionV3 not found. Could you please help me generate a graph without the post-processing layers, so that it is compatible with and deployable on the Movidius Neural Compute Stick from a Raspberry Pi?




Dear Neeraj,

With our current release R4, we do not yet support the Raspberry Pi with Movidius.

To cut the graph in OpenVINO, you can pass the argument --input <start_cut_node> to the Model Optimizer.
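For a MobileNet-SSD graph, cutting can also be done from the output side so that the NonMaxSuppression post-processing never reaches the converter. The invocation below is a sketch only: the model file name and the node names after --input/--output are hypothetical placeholders, and you would need to inspect your own frozen graph (e.g. with TensorBoard or the summarize_graph tool) to find the real cut points.

```shell
# Hypothetical Model Optimizer invocation that cuts off the SSD
# post-processing (including NonMaxSuppression) before conversion.
# File and node names are placeholders - substitute your own.
python mo_tf.py \
    --input_model frozen_inference_graph.pb \
    --input image_tensor \
    --output concat,concat_1 \
    --data_type FP16 \
    --reverse_input_channels
```

FP16 is used because the Myriad VPU runs half-precision; --reverse_input_channels is only needed if the model was trained on RGB input while the inference application feeds BGR frames.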




Thanks, Habert. I was able to use the Model Optimizer to convert the file into an Intermediate Representation, and everything worked fine. Later I used the object_detection_sample_ssd from the Inference Engine samples and, with -d MYRIAD, was even able to validate the model on the Movidius Neural Compute Stick. So the model works on the Movidius; is there any way I can add the Inference Engine so that I can deploy the model from the Raspberry Pi to the Movidius NCS?

I couldn't compile the model using mvNCCompile, as it gives me an error saying NonMaxSuppression not found.

Please help me solve this.
Thank you


Interesting... the OpenVINO "Getting Started" examples work equally well with the original NCS and the NCS 2, which suggests that they are pretty similar.

I presume all of the magic in translating Intermediate Representations takes place in a Myriad-specific implementation of the InferenceEngine::InferencePlugin abstract class, and that the demo applications "know" to use that implementation because of the -d MYRIAD argument passed to them.
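The dispatch described above can be sketched against the 2018-era (R4/R5) Inference Engine API. This is an assumption-laden illustration, not the samples' actual source: the model file paths are placeholders, and the exact sample code may differ, but the device string passed via -d is resolved to a plugin shared library in essentially this way.

```cpp
// Sketch of device-name-to-plugin dispatch in the 2018 R4/R5
// Inference Engine API; model file names are placeholders.
#include <inference_engine.hpp>

int main() {
    using namespace InferenceEngine;

    // "MYRIAD" is the string the samples receive via -d; the
    // dispatcher maps it to the Myriad VPU plugin library.
    InferencePlugin plugin =
        PluginDispatcher({""}).getPluginByDevice("MYRIAD");

    // Load the Intermediate Representation produced by the
    // Model Optimizer (placeholder paths).
    CNNNetReader reader;
    reader.ReadNetwork("model.xml");
    reader.ReadWeights("model.bin");

    // Compiling and uploading the network to the stick happens
    // inside the plugin's LoadNetwork implementation.
    ExecutableNetwork exec = plugin.LoadNetwork(reader.getNetwork(), {});
    return 0;
}
```

If that picture is right, hosting the NCS 2 from a Raspberry Pi is "only" a matter of having an ARM build of that plugin and its XLink transport, which is exactly what the source-availability question below is about.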

In any case, the fact that the older NCS works just fine with OpenVINO using the -d MYRIAD argument suggests that the two devices are pretty similar under the hood. Both devices appear to use virtual serial communications for interfacing with the host. For whatever reason, the older XLink functions in the NCSDK 2 are unable to bind to the NCS 2. I'm not enthused about diving into XLink (and maybe some underlying modules as well) to understand why this might be.

In any case, since the OpenVINO system is hypothetically "open", is there any chance of seeing the source for the VPU implementation of InferenceEngine::InferencePlugin? The apparent similarity of the devices suggests that it might not be all that difficult to make a library that would allow an RPi (or something similar) to load and manage a net on the NCS 2.

Until the NCS 2 can actually be hosted by something that could properly be considered an "Edge" device, it's not especially useful.


Dear Habert,

Now that we have R5, does OpenVINO support the Raspberry Pi with Movidius? Is there any documentation or tutorial on how to set it up?