Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

How to run own Python-code on Intel Neural Compute Stick2?

fw0001
Beginner

Hello,

I have the Intel Neural Compute Stick 2 and just followed the steps from the Intel website (https://software.intel.com/en-us/articles/get-started-with-neural-compute-stick) to get started, and the demos all work properly. My question is: how can I run my own Python code on the device? I just have some simple Python examples — how can I run them on the NCS device? What are the steps to compile my own code for the device?

Would you mind helping me? I can't find step-by-step instructions on how to run my own Python code on the NCS device.

Ubuntu 16.04 / Python 3.6 / Intel Neural Compute Stick 2

Thank you in advance!

3 Replies
David_C_Intel
Employee

Hi fw0001,

 

Thank you for reaching out.

Regarding the Python code: it runs on your development platform, and using the Inference Engine from the OpenVINO™ toolkit, an inference request can be made to the NCS2.

You can check the Python demos here, which use our pre-trained models.

 

Let us know if you have more questions.

 

Best regards,

 

David C.

Intel Customer Support Technician

A Contingent Worker at Intel

fw0001
Beginner

Hello David,

Thank you for the quick response!

I have a further question: is it possible to run my own Python program on the stick itself?

For example, something like:

>>> a=1

>>> b=2

>>> c=a+b

>>> print(c)

Can you please explain step by step how to run this code on the NCS2 device? Or is this not possible?

I'm an absolute beginner with the NCS stick…

Many thanks in advance!

Best regards

David_C_Intel
Employee

Hi fw0001,

 

Thanks for your reply.

It is not possible to run arbitrary Python code on the Intel® NCS2. The Intel® NCS2 is used only for inference on trained neural network models. This can be done with the Inference Engine from the OpenVINO™ toolkit. You can check the documentation here for installation, samples, and the tools available.
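As a rough sketch of what this looks like in practice (assuming the OpenVINO™ toolkit is installed, an NCS2 is plugged in, and you have a model already converted to IR format — the file names `model.xml`/`model.bin` below are placeholders):

```python
# Sketch only: requires an installed OpenVINO toolkit and a connected NCS2.
# The Python script itself runs on the host; only the inference request
# is dispatched to the stick via the "MYRIAD" device plugin.
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()
# Read an IR model previously produced by the Model Optimizer
net = ie.read_network(model="model.xml", weights="model.bin")
# "MYRIAD" is the device name for the NCS2
exec_net = ie.load_network(network=net, device_name="MYRIAD")

# Build a dummy input matching the network's expected shape,
# just to illustrate the inference call
input_name = next(iter(net.input_info))
shape = net.input_info[input_name].input_data.shape
result = exec_net.infer(inputs={input_name: np.zeros(shape, dtype=np.float32)})
```

So the simple arithmetic example above would just run on your host CPU as ordinary Python; only the `infer()` call is offloaded to the stick.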

 

Regards,

 

David C.

Intel Customer Support Technician

A Contingent Worker at Intel

 
