Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Convert YOLOv3 Model to IR

verma__Ashish
Beginner
9,028 Views

Hi,

I have followed this guide to train YOLOv3 on Pascal VOC data:

https://github.com/AlexeyAB/darknet#how-to-train-to-detect-your-custom-objects

fine-tuning from the available darknet53.conv.74 weights.

After training I got yolov3.weights. I am trying to convert those weights to TensorFlow using this repository:

https://github.com/mystic123/tensorflow-yolo-v3

with this command:

python3 convert_weights_pb.py --class_names coco.names --data_format NHWC --weights_file yolov3.weights

But I am getting this error:

Traceback (most recent call last):
  File "convert_weights_pb.py", line 53, in <module>
    tf.app.run()
  File "/usr/local/lib/python3.5/dist-packages/tensorflow/python/platform/app.py", line 125, in run
    _sys.exit(main(argv))
  File "convert_weights_pb.py", line 43, in main
    load_ops = load_weights(tf.global_variables(scope='detector'), FLAGS.weights_file)
  File "/home/sr/yolo/tensorflow-yolo-v3/utils.py", line 114, in load_weights
    (shape[3], shape[2], shape[0], shape[1]))
ValueError: cannot reshape array of size 14583 into shape (78,256,1,1)

Do I have to specify the yolo cfg file somewhere in the flags, or am I missing something else?

Any help will be appreciated.
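For context on the reshape error: the final 1×1 convolutions in a YOLOv3 head have (classes + 5) × 3 filters, so a mismatch between the class count in --class_names and the class count the weights were trained with typically produces exactly this kind of ValueError. A small sanity check (note that the 78 in the error equals (21 + 5) × 3, which hints the names file being read had 21 entries, e.g. a stray trailing line; this diagnosis is an inference, not confirmed by the thread):

```python
# Sanity check for YOLOv3 head sizes: each detection scale predicts
# 3 anchors x (4 box coords + 1 objectness + num_classes) channels.
def yolo_head_filters(num_classes, anchors_per_scale=3):
    return (num_classes + 5) * anchors_per_scale

print(yolo_head_filters(20))  # Pascal VOC -> 75
print(yolo_head_filters(80))  # COCO -> 255
print(yolo_head_filters(21))  # 78, the filter count the converter expected here
```

If the counts disagree, pass a --class_names file whose line count matches the classes the model was actually trained with.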

Regards

Ashish

 

 

0 Kudos
54 Replies
Hyodo__Katsuya
Innovator
2,872 Views
Ummm... Sorry... At the moment it is difficult for me to identify the cause...
0 Kudos
verma__Ashish
Beginner
2,872 Views

Thanks for your cooperation and the quick replies.

Regards

Ashish

 

0 Kudos
Hyodo__Katsuya
Innovator
2,872 Views

Oh, No...

[Attachment: Screenshot 2019-02-27 23:14:33.png]

0 Kudos
verma__Ashish
Beginner
2,872 Views

Oh, any progress??

0 Kudos
Hyodo__Katsuya
Innovator
2,872 Views
Yesterday, "NaN" was output in large quantities, so I went to bed disappointed. But today, I got some information in my GitHub repository. I will work on other projects today, so I will try to verify it later.

https://github.com/PINTO0309/OpenVINO-YoloV3/issues/13#issuecomment-468203337
0 Kudos
Hyodo__Katsuya
Innovator
2,872 Views
0 Kudos
verma__Ashish
Beginner
2,872 Views

For the time being I have to run it on CPU or GPU, but since YOLOv3 needs libcpu_extension.so to be loaded, I am not able to run it on GPU either. Is there a way I can run standard YOLOv3 on GPU? The NCS is not an option for me right now.

0 Kudos
Hyodo__Katsuya
Innovator
2,872 Views

See 4.【Optional execution】 Additional installation steps for processor graphics (GPU).

https://github.com/PINTO0309/OpenVINO-YoloV3#1-work-with-laptoppc-ubuntu-1604

The GPU plugin is compatible with Intel HD Graphics (HDxxx series) only.

0 Kudos
verma__Ashish
Beginner
2,872 Views

I thought YOLOv3 needs libcpu_extension.so to be linked while initializing. So after following

"Additional installation steps for processor graphics (GPU)"

and loading that library, can we run it on GPU? Because that library is for the CPU target only; correct me if I am wrong.

Regards

Ashish

0 Kudos
Hyodo__Katsuya
Innovator
2,872 Views
"libcpu_extension.so" generation commands:

$ cd /opt/intel/computer_vision_sdk/inference_engine/samples/
$ sudo ./build_samples.sh

Path where "libcpu_extension.so" is generated:

~/inference_engine_samples_build/intel64/Release/lib/libcpu_extension.so

CPU mode:

$ cd ~/inference_engine_samples_build/intel64/Release
$ sudo ./object_detection_demo_yolov3_async -i cam -d CPU -m xxx

GPU mode:

$ sudo ./object_detection_demo_yolov3_async -i cam -d GPU -m xxx
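A note on the CPU extension: it supplies CPU implementations of layers the CPU plugin lacks, and the GPU (clDNN) plugin does not use it, which is why the GPU invocation above omits it. A minimal sketch of that device-dependent argument logic (plugin_args is a hypothetical helper; the -l flag is the demo's option for a CPU custom-layers library):

```python
def plugin_args(device,
                cpu_ext="~/inference_engine_samples_build/intel64/Release/lib/libcpu_extension.so"):
    # libcpu_extension.so only makes sense for the CPU plugin;
    # pass -l only when the target device is CPU.
    args = ["-d", device]
    if device == "CPU":
        args += ["-l", cpu_ext]
    return args

print(plugin_args("CPU"))
print(plugin_args("GPU"))
```

So running on GPU is not blocked by the extension; the demo simply should not be given the CPU library in that case.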
0 Kudos
verma__Ashish
Beginner
2,872 Views

That implies we can run the OpenVINO version of YOLOv3 on GPU, correct?

If so, then I will try that in parallel.

Regards

Ashish

0 Kudos
Hyodo__Katsuya
Innovator
2,872 Views
It works unless a "Not Supported" layer is included. I use a LattePanda Alpha (Intel Graphics HD 645) when testing GPUs. However, it takes time to retrieve the device from storage, so I have not tried it just now. Please be aware that it will not work with NVIDIA GPUs.

https://github.com/PINTO0309/OpenVINO-YoloV3#openvino-supported-layers-as-of-dec-25-2018
0 Kudos
Hyodo__Katsuya
Innovator
2,872 Views
Apart from whether the result is correct or not, values are now being output.
b920405@ubuntu:~/git/DW2TF$ python3 openvino_tiny-yolov3_test.py 
{'yolov3-tiny/convolutional13/Conv2D': array([[[[-3.4440000e+03, -4.1920000e+03, -4.2800000e+03, ...,
          -5.8920000e+03, -5.6880000e+03, -2.4320000e+03],
         [-6.5480000e+03, -6.8280000e+03, -6.2720000e+03, ...,
          -8.4880000e+03, -7.4000000e+03, -1.6550000e+03],
         [-7.5680000e+03, -8.1880000e+03, -7.0560000e+03, ...,
          -6.4480000e+03, -6.0560000e+03, -1.7450000e+03],
         ...,
         [-6.6240000e+03, -5.6600000e+03, -4.9600000e+03, ...,
          -5.2480000e+03, -4.4080000e+03, -1.3320000e+03],
         [-6.3520000e+03, -5.5080000e+03, -6.2560000e+03, ...,
          -4.8200000e+03, -4.4480000e+03, -3.0280000e+03],
         [-4.1000000e+03, -5.4800000e+03, -6.8280000e+03, ...,
          -4.6520000e+03, -4.7240000e+03, -2.1600000e+03]],

        [[-4.4200000e+03, -3.5360000e+03, -2.8040000e+03, ...,
          -3.5780000e+03, -2.5220000e+03, -3.3950000e+02],
         [-3.6440000e+03, -3.7980000e+03, -3.0300000e+03, ...,
          -9.2700000e+02, -6.2600000e+02, -1.1190000e+03],
         [-3.8820000e+03, -3.9780000e+03, -3.2780000e+03, ...,
          -1.8560000e+03, -9.9500000e+02, -6.0100000e+02],
         ...,
         [-8.1950000e+02, -5.5440000e+03, -5.2000000e+03, ...,
          -2.6500000e+03, -2.3640000e+03,  7.3250000e+02],
         [-5.4050000e+02, -6.0960000e+03, -6.3840000e+03, ...,
          -2.1860000e+03, -4.0680000e+03,  1.1481250e+02],
         [-2.3800000e+03, -4.2120000e+03, -4.2080000e+03, ...,
           1.4000000e+03, -3.1550000e+02,  3.8375000e+01]],

        [[ 1.3890000e+03,  3.9840000e+03,  4.0600000e+03, ...,
           5.7680000e+03,  5.6680000e+03,  2.7060000e+03],
         [ 1.6370000e+03,  4.5240000e+03,  3.6580000e+03, ...,
           4.8560000e+03,  4.3920000e+03,  1.5030000e+03],
         [ 2.1340000e+03,  4.6480000e+03,  2.7660000e+03, ...,
           4.8840000e+03,  4.4760000e+03,  1.1830000e+03],
         ...,
         [ 2.6740000e+03,  4.9560000e+03,  3.3560000e+03, ...,
           3.1920000e+03,  3.4980000e+03,  4.3375000e+02],
         [ 1.8990000e+03,  3.0440000e+03,  2.0350000e+03, ...,
           2.3440000e+03,  3.7800000e+03,  1.1190000e+03],
         [-1.6540000e+03, -1.0025000e+03, -3.0825000e+02, ...,
           1.4662500e+02,  1.1440000e+03,  4.3575000e+02]],

        ...,

        [[ 2.3462500e+02, -1.5560000e+03, -1.5610000e+03, ...,
          -2.3860000e+03, -2.4000000e+03, -4.7280000e+03],
         [-1.2320000e+03, -3.5140000e+03, -2.9780000e+03, ...,
          -2.3840000e+03, -3.1380000e+03, -5.9480000e+03],
         [-1.5120000e+03, -2.9620000e+03, -2.5280000e+03, ...,
          -1.9790000e+03, -3.3500000e+03, -4.7400000e+03],
         ...,
         [-1.2510000e+03, -3.0540000e+03, -3.1260000e+03, ...,
          -3.6920000e+03, -4.3680000e+03, -3.2760000e+03],
         [-1.9170000e+03, -3.4920000e+03, -2.7640000e+03, ...,
          -2.5380000e+03, -2.5380000e+03, -1.3880000e+03],
         [-1.8200000e+03, -2.0440000e+03, -1.4660000e+03, ...,
          -1.4720000e+03, -5.6000000e+02, -1.2410000e+03]],

        [[-2.8340000e+03, -4.6840000e+03, -5.2480000e+03, ...,
          -2.6560000e+03, -2.9860000e+03, -4.2000000e+03],
         [-7.2880000e+03, -9.0720000e+03, -9.5200000e+03, ...,
          -8.8400000e+03, -8.8720000e+03, -7.5520000e+03],
         [-6.7840000e+03, -9.2800000e+03, -9.8960000e+03, ...,
          -8.4800000e+03, -8.8160000e+03, -6.4800000e+03],
         ...,
         [-7.7680000e+03, -8.7600000e+03, -8.4640000e+03, ...,
          -8.0840000e+03, -7.5560000e+03, -5.7200000e+03],
         [-7.6000000e+03, -9.6880000e+03, -8.3760000e+03, ...,
          -8.5360000e+03, -7.0320000e+03, -2.8780000e+03],
         [-7.2360000e+03, -1.0672000e+04, -9.4480000e+03, ...,
          -7.4280000e+03, -5.1120000e+03, -1.3212500e+02]],

        [[ 9.6800000e+02,  1.7340000e+03,  1.5060000e+03, ...,
           1.0060000e+03,  2.5550000e+02, -4.3600000e+02],
         [ 1.5350000e+03,  2.1340000e+03,  1.9410000e+03, ...,
           6.1650000e+02, -1.6525000e+02, -1.3820000e+03],
         [ 1.2550000e+03,  2.5840000e+03,  1.8450000e+03, ...,
           1.4087500e+02,  1.1308594e+00, -1.0900000e+03],
         ...,
         [ 1.8040000e+03,  3.0340000e+03,  1.5590000e+03, ...,
          -2.5960000e+03, -1.6860000e+03, -2.1120000e+03],
         [ 3.5060000e+03,  3.3960000e+03,  7.5650000e+02, ...,
          -1.3170000e+03, -7.6500000e+02, -1.7440000e+03],
         [ 2.3840000e+03,  2.9920000e+03,  3.8600000e+02, ...,
          -4.6000000e+02, -1.1450000e+02, -8.7300000e+02]]]],
      dtype=float32), 'yolov3-tiny/convolutional10/Conv2D': array([[[[ 2.001600e+04,  1.956800e+04,  1.972800e+04, ...,
           1.985600e+04,  1.936000e+04,  2.046400e+04],
         [ 1.956800e+04,  2.161600e+04,  2.259200e+04, ...,
           2.137600e+04,  2.344000e+04,  2.099200e+04],
         [ 2.044800e+04,  2.275200e+04,  2.241600e+04, ...,
           2.366400e+04,  2.089600e+04,  2.100800e+04],
         ...,
         [ 1.913600e+04,  2.153600e+04,  2.262400e+04, ...,
           2.536000e+04,  2.337600e+04,  2.017600e+04],
         [ 1.985600e+04,  2.024000e+04,  2.305600e+04, ...,
           2.449600e+04,  2.449600e+04,  2.428800e+04],
         [ 1.825600e+04,  2.179200e+04,  2.400000e+04, ...,
           2.625600e+04,  2.596800e+04,  2.771200e+04]],

        [[ 5.828000e+03,  4.068000e+03,  3.832000e+03, ...,
           4.024000e+03,  7.475000e+02, -5.925000e+02],
         [ 5.248000e+03, -3.508000e+03, -5.200000e+03, ...,
          -4.360000e+03, -3.940000e+02, -7.240000e+02],
         [ 5.760000e+03, -1.452000e+03, -2.112000e+03, ...,
          -2.606000e+03, -8.670000e+02, -3.310000e+03],
         ...,
         [ 6.176000e+03,  3.195000e+02, -4.965625e+01, ...,
          -5.460000e+03, -1.678000e+03, -4.596000e+03],
         [ 5.108000e+03, -6.305000e+02, -1.342000e+03, ...,
          -6.005000e+02, -2.190000e+03, -4.520000e+03],
         [ 4.980000e+03, -1.668750e+02,  4.870000e+02, ...,
          -1.472000e+03, -5.056000e+03, -8.624000e+03]],

        [[ 5.400000e+03,  8.992000e+03,  7.708000e+03, ...,
           8.336000e+03,  8.760000e+03,  6.648000e+03],
         [ 9.080000e+03,  1.444000e+04,  1.332000e+04, ...,
           1.140000e+04,  1.077600e+04,  9.976000e+03],
         [ 7.432000e+03,  1.050400e+04,  1.070400e+04, ...,
           1.114400e+04,  1.234400e+04,  1.127200e+04],
         ...,
         [ 8.600000e+03,  1.412800e+04,  1.365600e+04, ...,
           1.200800e+04,  1.211200e+04,  1.210400e+04],
         [ 7.552000e+03,  1.082400e+04,  1.087200e+04, ...,
           7.900000e+03,  1.100000e+04,  7.892000e+03],
         [ 5.108000e+03,  8.084000e+03,  8.664000e+03, ...,
           7.032000e+03,  8.208000e+03,  4.812000e+03]],

        ...,

        [[ 2.265600e+04,  2.817600e+04,  2.507200e+04, ...,
           1.705600e+04,  2.280000e+03, -5.414400e+04],
         [ 3.134400e+04,  3.344000e+04,  2.300800e+04, ...,
           2.806400e+04,  9.792000e+03, -5.747200e+04],
         [ 1.872000e+04,  1.575200e+04, -8.110000e+02, ...,
           1.071200e+04, -1.637000e+03, -5.408000e+04],
         ...,
         [ 1.995200e+04,  1.948800e+04,  1.809600e+04, ...,
           1.894400e+04,  1.533600e+04, -3.753600e+04],
         [ 2.201600e+04,  1.360800e+04,  1.889600e+04, ...,
           2.196800e+04, -1.566400e+04, -2.739200e+04],
         [-3.030000e+03,  5.876000e+03,  7.700000e+03, ...,
          -5.124000e+03, -2.694400e+04, -4.772000e+03]],

        [[ 8.320000e+03,  1.628800e+04,  1.777600e+04, ...,
           1.076800e+04, -7.492000e+03, -5.305600e+04],
         [ 3.217600e+04,  5.276800e+04,  4.796800e+04, ...,
           3.404800e+04,  2.358400e+04, -3.875200e+04],
         [ 1.784000e+04,  3.275200e+04,  2.931200e+04, ...,
           4.016000e+04,  2.123200e+04, -4.432000e+04],
         ...,
         [ 2.470400e+04,  4.473600e+04,  4.688000e+04, ...,
           4.886400e+04,  3.036800e+04, -3.468800e+04],
         [ 2.620800e+04,  4.460800e+04,  5.462400e+04, ...,
           4.748800e+04,  8.856000e+03, -3.881600e+04],
         [ 1.984000e+04,  3.449600e+04,  3.795200e+04, ...,
           3.033600e+04, -1.886400e+04, -3.240000e+04]],

        [[-2.160000e+03,  4.512000e+03,  5.612000e+03, ...,
          -3.686000e+03, -1.883200e+04, -5.164800e+04],
         [ 1.363200e+04,  3.179200e+04,  2.388800e+04, ...,
           8.520000e+03,  4.168000e+03, -3.721600e+04],
         [ 7.715000e+02,  6.044000e+03,  7.520000e+03, ...,
           1.365600e+04,  3.748000e+03, -3.824000e+04],
         ...,
         [ 5.676000e+03,  1.705600e+04,  1.776000e+04, ...,
           2.241600e+04,  6.552000e+03, -3.051200e+04],
         [ 9.408000e+03,  2.260800e+04,  3.664000e+04, ...,
           2.979200e+04,  3.600000e+02, -3.472000e+04],
         [ 9.504000e+03,  2.672000e+04,  2.547200e+04, ...,
           1.897600e+04, -1.737600e+04, -3.014400e+04]]]], dtype=float32)}

 

0 Kudos
verma__Ashish
Beginner
2,872 Views

Hi, there is no unsupported layer in the model, but I am not able to run it with GPU.

And what did you change to get values other than NaN?

Regards

Ashish

0 Kudos
Hyodo__Katsuya
Innovator
2,872 Views
Please see "Edit frozen_yolov3-tiny.xml manually" at the following URL:

https://github.com/PINTO0309/OpenVINO-YoloV3/wiki/Reference-repository#conversion-success-2

I referred to the posts below:

https://software.intel.com/en-us/forums/computer-vision/topic/805425

I still need fine adjustments to "negative_slope".
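The manual XML edit referenced above boils down to setting negative_slope on the leaky-ReLU layers of the generated IR. A hedged sketch with Python's xml.etree (the toy fragment below only mimics the general IR layout; the real layer names and attributes in frozen_yolov3-tiny.xml may differ, so check them before scripting the edit):

```python
import xml.etree.ElementTree as ET

# Toy IR fragment mimicking the OpenVINO XML layout.
ir = """<net><layers>
  <layer id="1" type="ReLU"><data negative_slope="0"/></layer>
  <layer id="2" type="Convolution"><data/></layer>
</layers></net>"""

root = ET.fromstring(ir)
for layer in root.iter("layer"):
    if layer.get("type") == "ReLU":
        # Darknet's leaky activation uses a slope of 0.1.
        layer.find("data").set("negative_slope", "0.1")

print(ET.tostring(root, encoding="unicode"))
```

The same loop applied to the real IR file (with ET.parse and tree.write) would update every leaky-ReLU layer in one pass instead of editing each one by hand.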
0 Kudos
verma__Ashish
Beginner
2,872 Views

I am still worried about running the converted model with OpenVINO, as my first aim is to reduce the detection time of YOLOv3. If you are able to run the OpenVINO-converted version of YOLOv3 for a custom model with correct outputs, kindly update.

Any help will be appreciated

Regards

Ashish

0 Kudos
Batra__Dhruv
Beginner
2,872 Views

Hyodo, Katsuya wrote:

It may be that the operation overflows while converting the model.
If the following command is executed, will an overflow warning be displayed?

$ sudo python3 /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/mo_tf.py \
--input_model data/frozen_yolov3.pb \
--output_dir . \
--data_type FP32 \
--batch 1 \
--input yolov3/net1 \
--output yolov3/convolutional59/BiasAdd,yolov3/convolutional67/BiasAdd,yolov3/convolutional75/BiasAdd \
--log_level WARNING

or

$ sudo python3 /opt/intel/computer_vision_sdk/deployment_tools/model_optimizer/mo_tf.py \
--input_model data/frozen_yolov3.pb \
--output_dir . \
--data_type FP32 \
--batch 1 \
--input yolov3/net1 \
--output yolov3/convolutional59/BiasAdd,yolov3/convolutional67/BiasAdd,yolov3/convolutional75/BiasAdd \
--log_level DEBUG

Does the same procedure work on yolov3-tiny?

Please share the JSON file for yolov3-tiny.

0 Kudos
Hyodo__Katsuya
Innovator
2,872 Views

@Batra, Dhruv

I did all the work with tiny-YoloV3.

A JSON file is unnecessary.

However, I still need to adjust "negative_slope".

https://github.com/PINTO0309/OpenVINO-YoloV3/wiki/Reference-repository#conversion-success-2

If you follow Intel's standard tutorial, please refer to the following URL.

https://github.com/PINTO0309/OpenVINO-YoloV3/blob/master/script.txt

0 Kudos
verma__Ashish
Beginner
2,872 Views

I tried to adjust negative_slope on full YOLOv3 but with no success. I think the JSON file is necessary for full YOLOv3, because without it I am not able to convert the model to the xml and bin format.

Did you have any success running the converted YOLOv3 model with correct outputs?
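For reference, the JSON file in question is the Model Optimizer custom-operations config passed via --tensorflow_use_custom_operations_config. A sketch of a yolo_v3.json for a 20-class VOC model, based on the shape of the config Intel documents for this converter (the anchors and entry_points values assume the stock mystic123 frozen graph; verify them against your .pb before relying on them):

```json
[
  {
    "id": "TFYOLOV3",
    "match_kind": "general",
    "custom_attributes": {
      "classes": 20,
      "anchors": [10, 13, 16, 30, 33, 23, 30, 61, 62, 45,
                  59, 119, 116, 90, 156, 198, 373, 326],
      "coords": 4,
      "num": 9,
      "masks": [[6, 7, 8], [3, 4, 5], [0, 1, 2]],
      "entry_points": ["detector/yolo-v3/Reshape",
                       "detector/yolo-v3/Reshape_4",
                       "detector/yolo-v3/Reshape_8"]
    }
  }
]
```

The "classes" value must match the class count the weights were trained with; the rest can usually stay at the defaults for a standard full-YOLOv3 graph.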

0 Kudos
verma__Ashish
Beginner
2,985 Views

@Hyodo, Katsuya

Any updates?

0 Kudos
Hyodo__Katsuya
Innovator
2,985 Views

@verma, Ashish

Sorry, I was absorbed in the Google Edge TPU Accelerator and completely forgot about this.

I will tackle it when I calm down.

0 Kudos
Reply