Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

MATLAB and OpenVINO toolkit connectivity

Nazim__Sadia
Beginner
1,460 Views

I want to connect MATLAB with the OpenVINO toolkit in order to train a custom deep learning network on the Intel Neural Compute Stick 2.

6 Replies
Sahira_Intel
Moderator

Hi Sadia,

Sure, here's a brief overview of how to do this:

  1. Train your model in MATLAB
  2. Export the trained model to ONNX
  3. Convert the ONNX model to IR using OpenVINO's Model Optimizer
  4. Run inference on the Neural Compute Stick 2
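For reference, steps 2-4 might look roughly like the following. This is only a sketch: the file names (`model.onnx`, `image.jpg`), output directory, and sample script are placeholders, and the exact script locations and flags vary between OpenVINO releases, so check the documentation for your installed version.

```shell
# Step 2 is done inside MATLAB, e.g.:
#   exportONNXNetwork(trainedNet, "model.onnx")

# Step 3: convert the ONNX model to OpenVINO IR with the Model Optimizer.
# FP16 is the precision the Myriad VPU on the NCS2 expects.
python3 mo.py --input_model model.onnx --data_type FP16 --output_dir ir/

# Step 4: run inference on the Neural Compute Stick 2 by selecting the
# MYRIAD device plugin, e.g. with one of the bundled sample applications.
python3 classification_sample.py -m ir/model.xml -i image.jpg -d MYRIAD
```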

I hope this information is helpful.

Best Regards,

Sahira 

Nazim__Sadia
Beginner

Respected Sahira,

Actually, you didn't get my point. I want to train my deep learning model on a huge dataset, and training it is impossible on my laptop. Therefore, I want to train my network through the Intel Neural Compute Stick 2 connected to MATLAB 2019. Please assist me with this scenario.

Regards,

Sadia Nazim

JesusE_Intel
Moderator

Hi Sadia,

The Intel Neural Compute Stick 2 and OpenVINO toolkit are only used for inference, not training. As Sahira mentioned, you will need to train in MATLAB on a system with enough resources and export the model to ONNX for inference.

Regards,

Jesus

Nazim__Sadia
Beginner

Hello,

But according to the Intel documentation it can be used for training as well; otherwise, what is its purpose? If this Compute Stick 2 cannot be used for training, can anyone suggest how I can train my deep learning model on a huge dataset, since I don't want to use my laptop for this purpose?

Regards,

Sadia Nazim

JesusE_Intel
Moderator

Hi Sadia,

Could you please point me to the documentation where you read that the Intel Neural Compute Stick 2 can be used for training? The Intel Neural Compute Stick 2 product page states the following:

A Plug and Play Development Kit for AI Inferencing

  • Build and scale with exceptional performance per watt per dollar on the Intel® Movidius™ Myriad™ X Vision Processing Unit (VPU)
  • Start developing quickly on Windows® 10, Ubuntu*, or macOS*
  • Develop on common frameworks and out-of-the-box sample applications
  • Operate without cloud compute dependence
  • Prototype with low-cost edge devices such as Raspberry Pi* 3 and other ARM* host devices

Regards,

Jesus 

Nazim__Sadia
Beginner

Hello,

Could anyone please help me with my problem? I am having difficulty training on my laptop. Is there any way I can develop the deep learning model in MATLAB on my laptop but perform the training on some other platform with my huge dataset?


Regards,

Sadia Nazim
