V_B__Anakha
Beginner

Inference on Mobilenet v1 SSD using the converted model fails with error

Hi,

I am trying to run inference on the MobileNet v1 SSD (COCO) model downloaded from the TensorFlow model zoo. I converted the frozen model to IR using the following command.

sudo /opt/intel/openvino/deployment_tools/model_optimizer/mo.py --input_meta_graph /home/user1/Desktop/IntelNCS/Mobilenetv1_SSD/model.ckpt.meta --tensorflow_use_custom_operations_config /opt/intel/openvino_2019.2.242/deployment_tools/model_optimizer/extensions/front/tf/ssd_support.json --tensorflow_object_detection_api_pipeline_config /home/user1/Desktop/IntelNCS/Mobilenetv1_SSD/pipeline.config  --data_type half --output_dir /home/user1/Desktop/Anakha/Mobilenetv1_SSD --model_name ssd_mobilenet_v1
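A note on the conversion flags: the command above passes `--input_meta_graph model.ckpt.meta`, which converts from the checkpoint files rather than from the frozen graph. If the intent is to convert the frozen graph shipped in the model-zoo archive (typically named `frozen_inference_graph.pb`), the usual Model Optimizer invocation would look along these lines. This is only a sketch: the `.pb` file name and all paths are assumptions based on the standard archive layout and the paths used above.

```shell
# Sketch: convert the frozen graph (not the checkpoint meta graph) to IR.
# The frozen_inference_graph.pb name is assumed from the standard TF model-zoo archive.
sudo /opt/intel/openvino/deployment_tools/model_optimizer/mo.py \
    --input_model /home/user1/Desktop/IntelNCS/Mobilenetv1_SSD/frozen_inference_graph.pb \
    --tensorflow_use_custom_operations_config /opt/intel/openvino_2019.2.242/deployment_tools/model_optimizer/extensions/front/tf/ssd_support.json \
    --tensorflow_object_detection_api_pipeline_config /home/user1/Desktop/IntelNCS/Mobilenetv1_SSD/pipeline.config \
    --data_type FP16 \
    --output_dir /home/user1/Desktop/Anakha/Mobilenetv1_SSD \
    --model_name ssd_mobilenet_v1
```

`--data_type FP16` is equivalent to the `half` alias used above and is the precision the MYRIAD plugin expects.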

 

This command generated the .xml and .bin files successfully.

However, when running inference I get the following error.

[ INFO ] InferenceEngine:
    API version ............ 2.0
    Build .................. custom_releases/2019/R2_f5827d4773ebbe727c9acac5f007f7d94dd4be4e
    Description ....... API
Parsing input parameters
[ INFO ] Files were added: 1
[ INFO ]     /home/user1/Desktop/IntelNCS/car_1.bmp
[ INFO ] Loading Inference Engine
[ INFO ] Device info:
    MYRIAD
    myriadPlugin version ......... 2.0
    Build ........... 27579
[ INFO ] Loading network files:
    /home/user1/Desktop/Anakha/Mobilenetv1_SSD/ssd_mobilenet_v1.xml
    /home/user1/Desktop/Anakha/Mobilenetv1_SSD/ssd_mobilenet_v1.bin
[ INFO ] Preparing input blobs
[ INFO ] Batch size is 1
[ INFO ] Preparing output blobs
[ INFO ] Loading model to the device
[ ERROR ] AssertionFailed: !ieDims.empty()

 

======================================================

The command used for inference:

./object_detection_sample_ssd -i /home/user1/Desktop/IntelNCS/car_1.bmp -m /home/user1/Desktop/Anakha/Mobilenetv1_SSD/ssd_mobilenet_v1.xml -d MYRIAD

The object_detection_sample_ssd binary is located in the /home/user1/inference_engine_samples_build/intel64/Release folder.

Kindly let me know if I am doing something wrong with the inference command. Looking forward to your help.

1 Reply
Shubha_R_Intel
Employee

Dear V B, Anakha, I believe I have answered your question in your previous post.

Thanks,

Shubha
