Intel® Distribution of OpenVINO™ Toolkit
Community assistance with the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Is there any sample for fast_rcnn_resnet101_coco?

kovrigin__vladimir

Hello everyone.

I successfully converted the faster_rcnn_resnet101_coco model to IR using the command below:

<path_to_file>/mo_tf.py \
  --input_model=<path_to_file>/faster_rcnn_resnet101_coco_2018_01_28/frozen_inference_graph.pb \
  --tensorflow_use_custom_operations_config <path_to_file>/faster_rcnn_support.json \
  --tensorflow_object_detection_api_pipeline_config <path_to_file>/faster_rcnn_resnet101_coco_2018_01_28/pipeline.config \
  --reverse_input_channels \
  --data_type FP16 \
  --output=detection_boxes,detection_scores,num_detections

When I try to run object_detection_demo_ssd_async.py on the NCS 2 (MYRIAD):

python3 <path_to_file>/object_detection_demo_ssd_async/object_detection_demo_ssd_async.py \
  -i cam \
  -m <path_to_file>/frozen_inference_graph.xml \
  --labels <path_to_file>/frozen_inference_graph.mapping \
  -d MYRIAD

I get an error:

[ INFO ] Reading IR...
Traceback (most recent call last):
  File "/opt/intel/openvino_2019.1.144/deployment_tools/inference_engine/samples/python_samples/object_detection_demo_ssd_async/object_detection_demo_ssd_async.py", line 185, in <module>
    sys.exit(main() or 0)
  File "/opt/intel/openvino_2019.1.144/deployment_tools/inference_engine/samples/python_samples/object_detection_demo_ssd_async/object_detection_demo_ssd_async.py", line 75, in main
    assert len(net.inputs.keys()) == 1, "Demo supports only single input topologies"
AssertionError: Demo supports only single input topologies

Is it possible to run faster_rcnn_resnet101_coco with the Python sample/demo?
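
From what I can tell, the assertion fires because a Faster R-CNN IR exposes two inputs (the image tensor plus an image-info input), while the SSD demo only accepts single-input topologies. Below is a minimal sketch of feeding both inputs with the 2019.x Python API; the input names "image_tensor" and "image_info" and the [height, width, 1] info values are assumptions, so check net.inputs on your own IR.

import cv2
import numpy as np
from openvino.inference_engine import IENetwork, IECore

# Placeholder paths, as in the commands above
model_xml = "<path_to_file>/frozen_inference_graph.xml"
model_bin = "<path_to_file>/frozen_inference_graph.bin"

ie = IECore()
net = IENetwork(model=model_xml, weights=model_bin)
print(net.inputs.keys())            # a Faster R-CNN IR should list two inputs

image_input = "image_tensor"        # assumed name of the image input
info_input = "image_info"           # assumed name of the image-info input

n, c, h, w = net.inputs[image_input].shape
exec_net = ie.load_network(network=net, device_name="MYRIAD")

# Resize one frame to the network resolution and build the NCHW blob
frame = cv2.imread("<path_to_file>/input.jpg")
blob = cv2.resize(frame, (w, h)).transpose((2, 0, 1)).reshape((n, c, h, w))
info = np.array([[h, w, 1]], dtype=np.float32)   # assumed [height, width, scale]

res = exec_net.infer(inputs={image_input: blob, info_input: info})
print(res.keys())                   # detection output blob(s)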

RDeBo
Novice

Same issue here - anyone have a solution?

Ranchhod__Shiresh

Hi,

I have successfully run the demo with the ssd_mobilenet_v2_coco model on CPU, but I am unable to run inference with the faster_rcnn models (both resnet101 and inception). Running inference with either model variant returns the same error you posted above.

Have you had any success in resolving this?

Shubha_R_Intel
Employee

Dear kovrigin, vladimir and others - the non-SSD Object Detection demo should work with faster_rcnn: http://docs.openvinotoolkit.org/latest/_inference_engine_samples_object_detection_demo_README.html
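
For reference, an invocation of that demo would look roughly like the command below; the flag names are assumptions based on the usual Inference Engine demo conventions, so please check the linked README for the exact options:

./object_detection_demo -i <path_to_image> -m <path_to_file>/frozen_inference_graph.xml -d MYRIAD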

Unfortunately there is a bug which will be fixed in 2019R2 - which should be released "Any Day Now".

Thanks,

Shubha

 

Maffei__Davide
Beginner

Shubha R. (Intel) wrote:

Dear kovrigin, vladimir and others - the non-SSD Object Detection demo should work with faster_rcnn: http://docs.openvinotoolkit.org/latest/_inference_engine_samples_object_detection_demo_README.html

Unfortunately there is a bug which will be fixed in 2019R2 - which should be released "Any Day Now".

Thanks,

Shubha

 

Hi Shubha,

Same problem here. Neither the OpenVINO nor the OpenCV Python library can run inference with faster rcnn models converted with the Model Optimizer.

I have also tested with the latest OpenVINO version, R2, but the problem is the same: the network has more than one input.

After many issues, I successfully converted the model using TF 1.13.1 and the latest OV 2019.R1.1, because conversion with the R2 version gives me the same error as here: https://software.intel.com/en-us/forums/computer-vision/topic/813854#comment-1942022

Do you have a workaround for this issue?

I found this possible solution: https://github.com/opencv/cvat/pull/545#issuecomment-508151724, but I don't understand whether it works or what the snippet does. Is it also valid for the IE library?

Thank you

Davide

Shubha_R_Intel
Employee

Dear Maffei, Davide,

I apologize for all of the confusion, but as of OpenVINO 2019 R2, the Object Detection Sample SSD works with faster rcnn. Other customers have tried it and succeeded. Please read this IDZ post for a detailed explanation.

Also, 2019R2.01 is actually available now. Please try it!
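
For reference, a minimal run on 2019 R2 should look roughly like this (paths are placeholders, and -i, -m and -d follow the usual Inference Engine sample conventions):

./object_detection_sample_ssd -i <path_to_image> -m <path_to_file>/frozen_inference_graph.xml -d MYRIAD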

I hope it helps.

Shubha

 
