Hello,
I have the Intel Neural Compute Stick 2 and just followed the steps from the Intel website (https://software.intel.com/en-us/articles/get-started-with-neural-compute-stick) to get started, and the demos all work properly. My question is: how can I run my own Python code on the device? I just have some simple Python examples; how can I run them on the NCS device? What are the steps to compile my own code for the device?
Would you mind helping me? I can't find step-by-step instructions on how to run my own Python code on the NCS device.
Ubuntu 16.04 / Python 3.6 / Intel Neural Compute Stick 2
Thank you in advance!
Hi fw0001,
Thank you for reaching out.
Regarding the Python code: it runs on your development platform, and an inference request can then be made to the NCS2 using the Inference Engine from the OpenVINO™ toolkit.
You can check the Python demos here, which use our pre-trained models.
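For illustration, a minimal sketch of that workflow could look like the following (this is not an official sample: the model file names are placeholders for a model already converted to the OpenVINO IR format, the input is dummy data, and exact attribute names can differ slightly between OpenVINO releases):

import numpy as np
from openvino.inference_engine import IECore  # OpenVINO Inference Engine Python API

# The script itself runs on the host CPU; only the neural network inference
# is offloaded to the NCS2, which the Inference Engine exposes as "MYRIAD".
ie = IECore()

# Placeholder paths to an IR model produced by the Model Optimizer
net = ie.read_network(model="model.xml", weights="model.bin")
exec_net = ie.load_network(network=net, device_name="MYRIAD")

# Look up the input/output blob names and feed dummy data of the expected shape
input_blob = next(iter(net.input_info))
output_blob = next(iter(net.outputs))
shape = net.input_info[input_blob].input_data.shape
result = exec_net.infer(inputs={input_blob: np.zeros(shape, dtype=np.float32)})
print(result[output_blob])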
Let us know if you have more questions.
Best regards,
David C.
Intel Customer Support Technician
A Contingent Worker at Intel
Hello David,
Thank you for the quick response!
I have a further question: is it possible to run my own Python program on the stick itself?
For example, something like:
>>> a=1
>>> b=2
>>> c=a+b
>>> print(c)
Can you please explain step by step how I would run this code on the NCS2 device? Or is this not possible?
I'm a complete beginner with the NCS stick…
Many thanks in advance!
Best regards
Hi fw0001,
Thanks for your reply.
It is not possible to run arbitrary Python code on the Intel® NCS2. The Intel® NCS2 is only used for inference on trained neural network models, which is done through the Inference Engine from the OpenVINO™ toolkit: your Python script runs on the host, and only the network inference is executed on the stick. You can check the documentation here for installation instructions and the samples and tools available.
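If it helps, here is a quick sketch of a sanity check you can run from your own Python script to confirm that the Inference Engine can see the NCS2 (it is listed under the device name "MYRIAD"):

from openvino.inference_engine import IECore

# Lists the inference devices OpenVINO can use on this machine;
# a plugged-in NCS2 normally shows up as "MYRIAD".
ie = IECore()
print(ie.available_devices)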
Regards,
David C.
Intel Customer Support Technician
A Contingent Worker at Intel
