Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.
6573 Discussions

Is it possible to use TensorFlow SSD-MobileNet on NCS?

idata
Employee
10,808 views

I'm working on an object detection model and would like to use the TensorFlow version of SSD-MobileNet. I tried the Caffe version and retrained it, but the results were very poor: after training for 100 hours the mAP was still below 0.03. I tried tweaking the learning rate and aspect ratios to better suit my dataset (my objects are mostly square), but that didn't help. Then I switched to the TensorFlow Object Detection API to see if there was a problem with my dataset. However, after training for just 6 hours I already got a mAP of 0.5. I also noticed that the TensorFlow version is much faster on my machine: 0.6 sec/iteration vs. 2 sec/iteration on Caffe. So the TensorFlow version works much better, and I'd like to use it instead if possible.

 

Is there any way to convert the model to NCS? And if direct conversion from TensorFlow to NCS is not possible, would it be possible to convert the model to Caffe format and then to NCS? Or could I just copy the TensorFlow model weights to the equivalent Caffe model?

0 Kudos
47 Replies
idata
Employee
4,784 views

I'm also interested in this - any feedback on this from anyone?

idata
Employee
4,784 views

Me too! I've got a working model on object detection API. Would love to find a way to do this.

idata
Employee
4,784 views

@manto @djaenicke @owlie We apologize, but the current NCSDK (2.04.00.06) doesn't support SSD MobileNet on TensorFlow yet. As you mentioned, we do support SSD MobileNet on Caffe as an alternative.

 

As for re-training SSD MobileNet on Caffe, you can try using https://github.com/listenlink/caffe/tree/ssd for more efficient depthwise separable (DWS) convolution with cuDNN 9.

idata
Employee
4,784 views

You can try the Intel OpenVINO™ toolkit. It supports inference of SSD MobileNet models from the TensorFlow Object Detection model zoo on the NCS using the MYRIAD plugin.

idata
Employee
4,784 views

@alex_z Hi, can you please explain more or point to some GitHub repositories?

idata
Employee
4,784 views

@WuXinyang Hi! Sorry for the delayed response. Download and install the latest version of the OpenVINO toolkit (https://software.intel.com/en-us/openvino-toolkit/choose-download). Inside the installation folder you can find C++/Python examples and several pre-trained models. Windows and Linux are supported by the toolkit. The main idea is the same as the Movidius SDK: you convert a trained model into the Intermediate Representation (IR) format using the Model Optimizer, then the Inference Engine reads, loads, and infers the IR on different devices such as the CPU, Intel GPU, or Myriad 2 VPU.
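The two-step workflow described above can be sketched roughly as follows. The input paths, output directory, and sample binary name are illustrative placeholders, not taken from this thread; `mo_tf.py` ships in the toolkit's Model Optimizer directory, and `object_detection_sample_ssd` is one of the bundled Inference Engine samples:

```shell
# Step 1: convert the frozen TensorFlow graph to IR (a .xml/.bin pair)
# using the Model Optimizer
python3 mo_tf.py --input_model frozen_inference_graph.pb --output_dir ./ir

# Step 2: run inference on the IR with an Inference Engine sample,
# choosing the target device with -d (CPU, GPU, or MYRIAD for the NCS)
./object_detection_sample_ssd -m ./ir/frozen_inference_graph.xml -d MYRIAD -i image.jpg
```

As the later posts in this thread show, SSD models from the Object Detection API need extra Model Optimizer flags for step 1 to succeed.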

idata
Employee
4,784 views

@alex_z Hi, thanks for your reply! In fact I already set up the OpenVINO SDK successfully and used it to convert a trained SSD object detection model, but I ran into some problems. Did you ever try any models on the NCS with OpenVINO?

idata
Employee
4,784 views

@WuXinyang Yes, I have converted the ssd_mobilenet_v1_coco model from the TensorFlow detection model zoo, as well as a custom-trained model based on SSD-MobileNet v1 that I previously used with the OpenCV DNN module. Both models then ran successfully on the NCS.

idata
Employee
4,784 views

@alex_z OMG, amazing! Would you mind giving some instructions on how to implement it? Maybe you could post them on your blog. I'm sure many people want to get the TensorFlow SSD model working on the NCS!

idata
Employee
4,784 views

@alex_z I just set up the SDK and tried some sample applications, but I don't know how to compile the TensorFlow model into the IR format. And after the conversion, I guess I need to use some API in my code like:

 

auto netBuilder = new InferenceEngine::CNNNetReader();
netBuilder->ReadNetwork("Model.xml");
netBuilder->ReadWeights("Model.bin");

 

Is my understanding correct?

idata
Employee
4,784 views

PS. SSD MobileNet V2 is working on NCS via the OpenVINO SDK too.

idata
Employee
4,784 views

@WuXinyang I am not good at C++; I use the Python API.
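For readers following along with the Python API: the SSD models discussed here produce a detection output blob of shape [1, 1, N, 7], where each row is [image_id, label, confidence, xmin, ymin, xmax, ymax] with normalized coordinates. A minimal NumPy sketch of decoding that blob into pixel boxes is below; the sample blob values and the `parse_detections` helper are made up for illustration, not taken from the thread or the toolkit:

```python
import numpy as np

# Fake detection output blob in SSD format:
# rows of [image_id, label, confidence, xmin, ymin, xmax, ymax]
res = np.array([[[[0, 1, 0.9, 0.1, 0.2, 0.4, 0.6],
                  [0, 3, 0.2, 0.5, 0.5, 0.7, 0.9]]]], dtype=np.float32)

def parse_detections(blob, conf_threshold=0.5, width=300, height=300):
    """Keep detections above the threshold and scale the normalized
    coordinates up to pixel boxes for a width x height image."""
    boxes = []
    for _, label, conf, xmin, ymin, xmax, ymax in blob[0][0]:
        if conf > conf_threshold:
            boxes.append((int(label), float(conf),
                          int(xmin * width), int(ymin * height),
                          int(xmax * width), int(ymax * height)))
    return boxes

print(parse_detections(res))
```

With the sample blob above, only the first row passes the 0.5 confidence threshold, so a single (label, confidence, xmin, ymin, xmax, ymax) tuple is returned.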

idata
Employee
4,784 views

@alex_z

 

Hi, the command I use to convert the TF model is the following:

 

python3 mo_tf.py --input_model /home/wuxy/Downloads/ssd_mobilenet_v1_coco_2017_11_17/frozen_inference_graph.pb --output_dir ~/models_VINO

 

and it returns an error: [ ERROR ] Graph contains a cycle. Can not proceed.

 

Can you please tell me how I can make it work?
idata
Employee
4,784 views

@WuXinyang Try the following:

 

./mo_tf.py --input_model= --tensorflow_use_custom_operations_config extensions/front/tf/ssd_support.json --output="detection_boxes,detection_scores,num_detections"
idata
Employee
4,784 views

@alex_z I found the solution now! Thanks!

idata
Employee
4,784 views

@WuXinyang You are welcome.

idata
Employee
4,784 views

@alex_z For use on the NCS, I needed to add a flag to your command: --data_type FP16

 

since the MYRIAD plugin does not support FP32, and converted models are FP32 by default.

 

Thanks again for your pointers! I had been searching for a long time for a way to make TensorFlow object detection models run on the NCS!
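Putting the two posts together, the complete conversion command for the NCS would look roughly like this. The input model path is a placeholder (use your own frozen graph, as in the earlier post); the other flags are exactly those given by alex_z plus the FP16 flag noted above:

```shell
python3 mo_tf.py \
  --input_model frozen_inference_graph.pb \
  --tensorflow_use_custom_operations_config extensions/front/tf/ssd_support.json \
  --output="detection_boxes,detection_scores,num_detections" \
  --data_type FP16
```

The resulting .xml/.bin IR pair can then be loaded by the Inference Engine with the MYRIAD device selected.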
idata
Employee
4,784 views

@WuXinyang Yes, you are right, I forgot to mention it.

idata
Employee
4,784 views

@alex_z Great to hear that it's possible to run TF object detection models on the NCS. What kind of FPS do you get, or what is the inference time? For example, with the ssd_mobilenet_v1_coco model.

idata
Employee
4,545 views

@mantu I get about 10 FPS with the ssd_mobilenet_v1_coco model.

Reply