Sure, here's a brief overview of how to do this:
- Train your model in MATLAB
- Export your model to ONNX
- Convert the ONNX model to IR using OpenVINO's Model Optimizer
- Run inference on the Neural Compute Stick
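To make the steps above concrete, here is a minimal sketch. It assumes your trained network is a MATLAB variable called `trainedNet`, that the ONNX converter support package is installed, and a default Linux install of the 2019-era OpenVINO toolkit; all paths and names are placeholders you should adjust:

```shell
# Export step: run inside MATLAB (requires the "Deep Learning Toolbox
# Converter for ONNX Model Format" support package):
#     exportONNXNetwork(trainedNet, 'model.onnx')
#
# Conversion step: turn the ONNX file into OpenVINO IR (.xml/.bin) with
# the Model Optimizer. FP16 is the data type used by the Myriad X VPU
# inside the Neural Compute Stick 2.
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo.py \
    --input_model model.onnx \
    --data_type FP16 \
    --output_dir ir/
```

The Model Optimizer path above is the default install location; it will differ if you installed the toolkit elsewhere.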
I hope this information is helpful.
Actually, you didn't get my point. I want to train my deep learning model on a huge dataset, and training it is impossible on my laptop. Therefore, I want to train my network through the Intel Neural Compute Stick 2 together with MATLAB 2019. Please assist me with this scenario.
The Intel Neural Compute Stick 2 and the OpenVINO Toolkit are only used for inference, not training. As Sahira mentioned, you will need to train in MATLAB on a system with enough resources and export the model to ONNX for inference.
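Once you have IR files from the Model Optimizer, targeting the stick is just a matter of selecting the "MYRIAD" device plugin. As a hedged sketch, using the benchmark tool that ships with OpenVINO (the path assumes a default Linux install of the 2019-era toolkit; `model.xml` is a placeholder for your converted model):

```shell
# Run the converted IR on the Neural Compute Stick 2 by selecting the
# MYRIAD device plugin. benchmark_app reports latency and throughput,
# which is a quick way to confirm the model loads on the stick at all.
python3 /opt/intel/openvino/deployment_tools/tools/benchmark_tool/benchmark_app.py \
    -m model.xml \
    -d MYRIAD
```

The same `-d MYRIAD` device name is what you would pass when loading the network in your own Inference Engine application.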
But according to the Intel documentation, it can be used for training as well; otherwise, what is its purpose? If the Compute Stick 2 cannot be used for training, can anyone suggest how I can train my deep learning model on a huge dataset? I don't want to use my laptop for this purpose.
Could you please point me to the documentation where you read that the Intel Neural Compute Stick 2 can be used for training? The Intel Neural Compute Stick 2 product page states the following:
A Plug and Play Development Kit for AI Inferencing
- Build and scale with exceptional performance per watt per dollar on the Intel® Movidius™ Myriad™ X Vision Processing Unit (VPU)
- Start developing quickly on Windows® 10, Ubuntu*, or macOS*
- Develop on common frameworks and out-of-the-box sample applications
- Operate without cloud compute dependence
- Prototype with low-cost edge devices such as Raspberry Pi* 3 and other ARM* host devices
Could anyone please help me out with my problem? I am having difficulty training on my laptop. Is there any way I can develop my deep learning model in MATLAB on my laptop, but perform the training on some other platform with my huge dataset?