Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Deep Learning Deployment Toolkit - YOLO

Ashim_Prasad
Beginner
556 Views

The sample programs packaged with the Deep Learning Deployment Toolkit include YOLO as an example, but it requires an IR .xml file. Where can I find a pre-trained .xml file for it? Alternatively, how can I convert the weights and model files of the Darknet implementation of YOLO into the IR .xml file this toolkit requires?

0 Kudos
3 Replies
Anna_B_Intel
Employee

Hi Ashim, 

First of all, you need to convert the YOLO model from Darknet to Caffe format. For that purpose you can find several such scripts on the Internet. Next, run the Model Optimizer to create an Intermediate Representation (.xml and .bin files) from the .prototxt and .caffemodel files. Please take a look at the documentation and the video tutorial that explains how to generate an IR with the Model Optimizer.

Best wishes, 

Anna
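
The two-step workflow described above can be sketched as shell commands. Everything here is illustrative: the converter script name and its argument order are placeholders (the community Darknet-to-Caffe scripts each have their own usage), and the Model Optimizer invocation follows its documented Caffe flags, so check both against the versions you actually have.

```shell
# Step 1: convert the Darknet cfg/weights to Caffe format with one of the
# community conversion scripts (script name and argument order are
# placeholders -- check the usage of the script you download):
python darknet2caffe.py yolo.cfg yolo.weights yolo.prototxt yolo.caffemodel

# Step 2: run the Model Optimizer on the resulting Caffe files to produce
# the Intermediate Representation (.xml topology + .bin weights):
python mo.py --framework caffe \
             --input_proto yolo.prototxt \
             --input_model yolo.caffemodel \
             --output_dir ./ir
```
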

Mohanty__Sampad
Beginner
I am exploring Intel's Deep Learning Deployment Toolkit, following this guide: https://software.intel.com/en-us/inference-engine-devguide-introduction
I am using Python 3.5, TensorFlow 1.2, and Bazel 0.4.5.
 
I completed the following actions:
1. Configuring the deep learning framework (TensorFlow)
2. Configuring the Intel Deep Learning Deployment Toolkit
 
After that, I am facing the following issues when converting a model to IR:
 
1. Using summarize_graph to examine a model
$ bazel build tensorflow/tools/graph_transforms:summarize_graph    -> works fine
$ bazel-bin/tensorflow/tools/graph_transforms/summarize_graph --in_graph=/tmp/inception_v1_inf_graph.pb    -> core dumped, as shown below
 
 tensorflow/core/framework/tensor_shape.cc:169] Check failed: size >= 0 (-1 vs. 0)
Found 1 possible inputs: Aborted (core dumped)
 
2. Freezing the inference graph
$ bazel build tensorflow/python/tools:freeze_graph            -> Works fine
$ bazel-bin/tensorflow/python/tools/freeze_graph \
    --input_graph=/tmp/inception_v1_inf_graph.pb \
    --input_checkpoint=../checkpoint/inception_v1.ckpt \
    --input_binary=true \
    --output_graph=/tmp/frozen_inception_v1.pb \
    --output_node_names=InceptionV1/Logits/Predictions/Reshape_1        -> ImportError: No module named 'backports'. 
 
I installed backports using pip, but it still gives the same error. Please let me know how I can proceed.
 
Thanks 
sampad
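
A common cause of "installed it with pip, but the import still fails" is that pip and the python that runs the bazel-bin script belong to two different interpreters. A few diagnostic commands may help narrow that down (the package name backports.weakref is an assumption — it is the backports module TensorFlow 1.x commonly pulls in, but your traceback may point to a different one):

```shell
# List every python/pip on PATH -- pip may belong to a different
# interpreter than the one that executes the freeze_graph script.
which -a python python3 pip || true

# Show which interpreter actually runs and where it searches for modules.
python3 -c 'import sys; print(sys.executable); print(sys.path)'

# Install the missing package for that exact interpreter, rather than
# relying on whichever bare 'pip' comes first on PATH:
#   python3 -m pip install backports.weakref
```
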
dharmadi__richard

Anna B. (Intel) wrote:

Hi Ashim, 

First of all, you need to convert the YOLO model from Darknet to Caffe format. For that purpose you can find several such scripts on the Internet. Next, run the Model Optimizer to create an Intermediate Representation (.xml and .bin files) from the .prototxt and .caffemodel files. Please take a look at the documentation and the video tutorial that explains how to generate an IR with the Model Optimizer.

Best wishes, 

Anna

Hi Anna,

I have been trying your proposed solutions, but when I tried to convert my caffe model to IR using Intel's Model Optimizer, I got the following error:

./ModelOptimizer -p FP32 -w $TINY_YOLO_DIR/tiny-yolo-voc.caffemodel -d $TINY_YOLO_DIR/tiny-yolo-voc.prototxt -i -b 1
Start working...

Framework plugin: CAFFE
Network type: CLASSIFICATION
Batch size: 1
Precision: FP32
Layer fusion: false
Horizontal layer fusion: NONE
Output directory: Artifacts
Custom kernels directory: 
Network input normalization: 1
[libprotobuf ERROR google/protobuf/text_format.cc:274] Error parsing text-format caffe.NetParameter: 412:18: Message type "caffe.LayerParameter" has no field named "region_param".
F0207 14:03:09.373576 32167 upgrade_proto.cpp:88] Check failed: ReadProtoFromTextFile(param_file, param) Failed to parse NetParameter file: /home/nodeflux/sandbox-richad/tiny-yolo/tiny-yolo-voc.prototxt

 

My question is: is it because I used the wrong network type? What type of network should I use? My understanding is that the YOLO network performs classification and localization at the same time.

Thank you in advance; I would really appreciate a prompt reply.
