Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.
6392 Discussions

Is it possible to use TensorFlow SSD-MobileNet on NCS?

idata
Employee
3,226 Views

I'm working on an object detection model and I would like to use the TensorFlow version of SSD-MobileNet. I tried the Caffe version and retrained it, but the results were very poor: after training for 100 hours the mAP was still below 0.03. I tried tweaking the learning rate and aspect ratios to better suit my dataset (my objects are mostly square), but that didn't help. I then switched to the TensorFlow Object Detection API to see whether the problem was in my dataset. However, after training for just 6 hours I already got a mAP of 0.5. I also noticed that the TensorFlow version is much faster on my machine: 0.6 sec/iteration vs. 2 sec/iteration on Caffe. So the TensorFlow version works much better, and I'd like to use it instead if possible.

 

Is there any way to convert the model to NCS? And if direct conversion from TensorFlow to NCS is not possible, would it be possible to convert the model to Caffe format and then to NCS? Or could I just copy the TensorFlow model weights to the equivalent Caffe model?

0 Kudos
47 Replies
idata
Employee
1,576 Views

I'm also interested in this - any feedback on this from anyone?

0 Kudos
idata
Employee
1,576 Views

Me too! I've got a working model on object detection API. Would love to find a way to do this.

0 Kudos
idata
Employee
1,576 Views

@manto @djaenicke @owlie We apologize, but the current NCSDK (2.04.00.06) doesn't support SSD MobileNet on TensorFlow yet. As you mentioned, we do support SSD MobileNet on Caffe as an alternative.

 

As for re-training SSD MobileNet on Caffe, you can try using https://github.com/listenlink/caffe/tree/ssd for more efficient depthwise separable (DWS) convolution with CUDNN 9.

0 Kudos
idata
Employee
1,576 Views

You can try Intel OpenVINO™ toolkit. It supports Inference of SSD MobileNet from the TensorFlow Object Detection model zoo on NCS using the Myriad plugin.

0 Kudos
idata
Employee
1,576 Views

@alex_z Hi, can you please explain more or share some GitHub repositories?

0 Kudos
idata
Employee
1,576 Views

@WuXinyang Hi! Sorry for the delayed response. Download and install the latest version of the OpenVINO toolkit (https://software.intel.com/en-us/openvino-toolkit/choose-download). Inside the installation folder you can find C++/Python examples and several pre-trained models. Windows and Linux are supported by the toolkit. The main idea is the same as with the Movidius SDK: you convert a trained model into the Intermediate Representation format using the Model Optimizer, then the Inference Engine reads, loads, and infers the Intermediate Representation on different devices such as the CPU, Intel GPU, or Myriad 2 VPU.

0 Kudos
idata
Employee
1,576 Views

@alex_z Hi, thanks for your reply! In fact, I already successfully set up the OpenVINO SDK and used it to convert a trained SSD object detection model, but I ran into some problems. Did you ever try any models on the NCS with OpenVINO?

0 Kudos
idata
Employee
1,576 Views

@WuXinyang Yes, I have converted the ssd_mobilenet_v1_coco model from the TensorFlow detection model zoo, as well as a custom-trained model based on SSD-MobileNet v1 that I previously used with the OpenCV DNN module. Both models then ran on the NCS successfully.

0 Kudos
idata
Employee
1,576 Views

@alex_z OMG, amazing! Would you mind giving some instructions on how to implement it? Maybe you could post them on your blog. I'm sure many people want to make TensorFlow SSD models work on the NCS!

0 Kudos
idata
Employee
1,576 Views

@alex_z I just set up the SDK and tried some sample applications, but I don't know how to compile the TensorFlow model into the IR format. And after the conversion, I guess I need to use some API in my code like:

 

auto netBuilder = new InferenceEngine::CNNNetReader();
netBuilder->ReadNetwork("Model.xml");
netBuilder->ReadWeights("Model.bin");

 

Is my understanding right?

0 Kudos
idata
Employee
1,576 Views

PS. SSD MobileNet V2 is working on NCS via the OpenVINO SDK too.

0 Kudos
idata
Employee
1,576 Views

@WuXinyang I am not good at C++. I use the Python API.
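
 

A minimal sketch of the flow with the Python API, assuming the 2018-era openvino.inference_engine module (class and method names changed between releases, so check the samples shipped with your install; the model paths and input shape here are placeholders):

```python
# Sketch only: assumes the 2018-era OpenVINO Python API (openvino.inference_engine).
import numpy as np
from openvino.inference_engine import IENetwork, IEPlugin

plugin = IEPlugin(device="MYRIAD")                  # NCS via the Myriad plugin
net = IENetwork(model="Model.xml", weights="Model.bin")
input_blob = next(iter(net.inputs))
out_blob = next(iter(net.outputs))
exec_net = plugin.load(network=net)

# SSD MobileNet expects an NCHW blob, e.g. 1x3x300x300 for the 300x300 variants.
image = np.zeros((1, 3, 300, 300), dtype=np.float32)
res = exec_net.infer({input_blob: image})

# SSD detection_output rows: [image_id, label, confidence, xmin, ymin, xmax, ymax]
for det in res[out_blob][0][0]:
    if det[2] > 0.5:
        print("label", int(det[1]), "confidence", float(det[2]))
```

The preprocessing (resize, channel order, mean/scale) has to match what the Model Optimizer baked into the IR, so compare against the object detection sample in the toolkit.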

0 Kudos
idata
Employee
1,576 Views

@alex_z

 

Hi, the command I use to convert the TF model is the following:

 

python3 mo_tf.py --input_model /home/wuxy/Downloads/ssd_mobilenet_v1_coco_2017_11_17/frozen_inference_graph.pb --output_dir ~/models_VINO

 

and it returns an error: [ ERROR ] Graph contains a cycle. Can not proceed.

 

Can you please tell me how I can make it work?
0 Kudos
idata
Employee
1,576 Views

@WuXinyang Try the following:

 

./mo_tf.py --input_model= --tensorflow_use_custom_operations_config extensions/front/tf/ssd_support.json --output="detection_boxes,detection_scores,num_detections"
0 Kudos
idata
Employee
1,576 Views

@alex_z I found the solution now! Thanks!

0 Kudos
idata
Employee
1,576 Views

@WuXinyang You are welcome.

0 Kudos
idata
Employee
1,576 Views

@alex_z For usage on the NCS, I needed to add one flag to your command: --data_type FP16,

 

since the MYRIAD plugin does not support FP32 and the converted models are FP32 by default.

 

Again, thanks for your pointers! I had been searching for a long time for a way to make a TensorFlow object-detection model run on the NCS!
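
 

Putting the pieces from this thread together, a complete conversion command for the NCS would look something like this (the model path is a placeholder for your own frozen graph; ssd_support.json ships with the Model Optimizer):

```shell
# Convert a frozen TF SSD MobileNet graph to an FP16 IR for the Myriad plugin.
python3 mo_tf.py \
    --input_model frozen_inference_graph.pb \
    --tensorflow_use_custom_operations_config extensions/front/tf/ssd_support.json \
    --output detection_boxes,detection_scores,num_detections \
    --data_type FP16 \
    --output_dir ~/models_VINO
```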
0 Kudos
idata
Employee
1,576 Views

@WuXinyang Yes, you are right, I forgot to mention it.

0 Kudos
idata
Employee
1,576 Views

@alex_z Great to hear that it's possible to run TF object detection models on the NCS. What kind of FPS do you get with that, or what is the inference time? For example, with the ssd_mobilenet_v1_coco model.

0 Kudos
idata
Employee
1,337 Views

@mantu I have got about 10 FPS with the ssd_mobilenet_v1_coco model.

0 Kudos
Reply