Hi,
I ran into an issue while running inference with the R5 version of the OpenVINO toolkit on an Intel Movidius Myriad X VPU. I used the following command to generate the IR for my model:
python mo.py --input_model E:\frcnn_resnet101\frozen_inference_graph.pb --tensorflow_use_custom_operations_config C:\Intel\computer_vision_sdk_2018.5.445\deployment_tools\model_optimizer\extensions\front\tf\faster_rcnn_support_api_v1.7.json --tensorflow_object_detection_api_pipeline_config E:\models-resnet\pipeline.config --data_type FP16
After generating the IR, when I run inference on the Intel Movidius Myriad X VPU, the program gets stuck at the following step:
[ INFO ] Loading model to the plugin
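For reference, the step it hangs at corresponds roughly to the point where the network is loaded onto the device in the R5 Python Inference Engine API. Below is a minimal sketch, assuming the standard IEPlugin/IENetwork flow; the IR paths are placeholders, not my exact script:

```python
from openvino.inference_engine import IENetwork, IEPlugin

# Placeholder paths to the FP16 IR produced by the mo.py command above
model_xml = r"E:\frcnn_resnet101\frozen_inference_graph.xml"
model_bin = r"E:\frcnn_resnet101\frozen_inference_graph.bin"

# Target the Myriad X VPU
plugin = IEPlugin(device="MYRIAD")

# Read the IR into an IENetwork object
net = IENetwork(model=model_xml, weights=model_bin)

print("[ INFO ] Loading model to the plugin")
# This is the call that never returns in my case
exec_net = plugin.load(network=net)
```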
More details are shown in the attachment.
Could anyone help me figure out what is causing this problem?
Thanks,