Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Some questions about OpenVINO Post-Training Optimization Tool (POT)

jacky0327
New Contributor I

I would like to ask some questions about the OpenVINO Post-Training Optimization Tool (POT): https://docs.openvinotoolkit.org/latest/pot_README.html

My task is to quantize the FP32 and FP16 YOLOv4-Tiny models into INT8 models.

I have referred to similar questions; the quantization of the YOLOv4-Tiny model (FP32→INT8 / FP16→INT8) itself succeeds in OpenVINO 2021.3 and the quantized model runs normally.
https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/Quantization-8bit-for-yolov4/td-p/1205...

I referred to the parameters of the OpenVINO Accuracy Checker Tool to configure the JSON file needed by POT.
https://docs.openvinotoolkit.org/latest/omz_tools_accuracy_checker_adapters.html

The quantization completes successfully, but the mAP reported during validation is always 0. I want to know where I went wrong.

Please check my attachment, thank you.

My OpenVINO Post-Training Optimization Tool and OpenVINO Accuracy Checker Tool are installed correctly. The version of OpenVINO I use is 2021.3.
The commands I use are:

Command:

pot -c yolov4-tiny_voc.json --output-dir backup -e

The results that appear are as follows:

INFO:compression.pipeline.pipeline:Evaluation of generated model
INFO:compression.engines.ac_engine:Start inference on the whole dataset
Total dataset size: 26
26 objects processed in 1.023 seconds
INFO:compression.engines.ac_engine:Inference finished
INFO:app.run:detection_accuracy: 0.0
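
For reference, the general structure of a POT configuration file such as yolov4-tiny_voc.json is sketched below. The model paths, engine config name, and algorithm parameters here are placeholders for illustration; my actual values are in the attachment.

{
    "model": {
        "model_name": "yolov4-tiny",
        "model": "<path to IR>/yolov4-tiny.xml",
        "weights": "<path to IR>/yolov4-tiny.bin"
    },
    "engine": {
        "config": "yolov4-tiny_voc.yml"
    },
    "compression": {
        "target_device": "CPU",
        "algorithms": [
            {
                "name": "DefaultQuantization",
                "params": {
                    "preset": "performance",
                    "stat_subset_size": 300
                }
            }
        ]
    }
}

The engine section points POT at the Accuracy Checker configuration (.yml), so the -e evaluation uses the same dataset and metric definitions as accuracy_check.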

Command:

accuracy_check -c yolov4-tiny_voc.yml -td CPU

The results that appear are as follows:

26 objects processed in 16.514 seconds
09:50:28 accuracy_checker WARNING: /opt/intel/openvino_2021/deployment_tools/open_model_zoo/tools/accuracy_checker/accuracy_checker/metrics/detection.py:201: UserWarning: No detections to compute mAP
warnings.warn("No detections to compute mAP")

map: 0.00%
AP@0.5: 0.00%
AP@0.5:0.05:95: 0.00%

Please help me understand why the mAP is always 0; please refer to my attachment.

------------------------------------------------------------------------------------

Sorry, I posted the same question before, but that account never received a verification email and was eventually blocked, so I registered a new account to ask again.

Zulkifli_Intel
Moderator

Hi Peng Chang-Jan,


Greetings to you.


We were able to reproduce the case, and our result is similar to yours. We are still investigating the issue and will get back to you soon.


Regards,

Zulkifli


jacky0327
New Contributor I

Thanks, let me know if there is any progress.

Zulkifli_Intel
Moderator

Hello Peng Chang-Jan,


Thank you for your patience, and we apologize for the delay. Here are our findings.


Untested dataset – We used the MS COCO dataset to validate the accuracy as per the documentation; the VOC dataset might not work as expected due to differences in file architecture. We suggest moving from VOC to COCO if you want to eliminate the 0.00% mAP result during accuracy checker execution, or at least following the COCO file/folder architecture for your custom dataset.
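
For illustration, pointing the dataset section of the Accuracy Checker configuration at a COCO-style dataset typically uses the mscoco_detection converter, roughly as follows (the folder and annotation file names are placeholders):

datasets:
  - name: ms_coco_val
    data_source: val2017                         # folder with the validation images
    annotation_conversion:
      converter: mscoco_detection
      annotation_file: instances_val2017.json    # COCO-format annotation file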


Metrics used – You are using coco_precision to calculate the mAP, which may not work well with a non-COCO dataset. Our recommendation is to use a different metric, such as detection_accuracy, which also works with the DetectionAnnotation representation [Metrics - OpenVINO™ Toolkit (openvinotoolkit.org)].
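
As a sketch, the metric switch in the dataset section of the .yml would look something like this (exact metric parameters depend on your setup; see the Metrics documentation linked above):

metrics:
  # metric that works with DetectionAnnotation / non-COCO datasets
  - type: detection_accuracy
  # COCO-style AP, best suited to COCO-format annotations
  # - type: coco_precision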


Model Precision – The Intel Open Model Zoo yolo-v4 model was tested and validated using FP16 precision, so it would be great if you could convert the model to FP16 with the Model Optimizer instead of the current FP32.
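
For example, an FP16 conversion with the 2021.3 Model Optimizer could look like the command below. The input model name is a placeholder, and additional parameters (input shape, transformations config, etc.) may be required depending on how your model was exported:

python3 /opt/intel/openvino_2021/deployment_tools/model_optimizer/mo.py \
    --input_model frozen_yolov4_tiny.pb \
    --data_type FP16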


Regards,

Zulkifli


jacky0327
New Contributor I

Please download the compressed file of the attachment.

Regarding the points you mentioned, I replaced VOC with COCO and used the metrics you suggested.

pot -c yolov4-tiny-3l-license_plate_prune_0.46_keep_0.01_416_416_qtz.json --output-dir backup -e


The mAP is still 0. Can you help me test the .json?

Or could you correct the wrong part and send the fixed compressed file back to me? Thank you.

Zulkifli_Intel
Moderator

Hello Peng Chang-Jan

 

Sorry for the delay.

 

To perform accuracy validation in the Post-Training Optimization Tool, you need to create an Accuracy Checker configuration file (.yaml) accordingly. The POT configuration file (.json) is created and used to perform model quantization as described in this link: How to Run Examples - OpenVINO™ Toolkit (openvinotoolkit.org)

 

I have created the .yaml file to test the dataset and to obtain the result. Below is the source code. Please test this from your end.

 

Accuracy Checker Configuration File (.yaml) in attachment.
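
Since the attached .yaml is not reproduced in this thread, here is a representative sketch of an Accuracy Checker configuration for a YOLOv4-Tiny IR. The output layer names, anchors, anchor masks, class count, and dataset paths below are placeholders and must be adapted to the actual model and dataset:

models:
  - name: yolov4-tiny
    launchers:
      - framework: dlsdk
        model: yolov4-tiny.xml
        weights: yolov4-tiny.bin
        adapter:
          type: yolo_v3                 # the YOLO v3 adapter is assumed here for the tiny v4 head
          classes: 80
          coords: 4
          num: 6
          anchors: "10,14,23,27,37,58,81,82,135,169,344,319"
          anchor_masks: [[3, 4, 5], [1, 2, 3]]
          outputs:                      # placeholder output node names
            - conv2d_20/BiasAdd
            - conv2d_23/BiasAdd
    datasets:
      - name: ms_coco_val
        data_source: val2017
        annotation_conversion:
          converter: mscoco_detection
          annotation_file: instances_val2017.json
        preprocessing:
          - type: resize
            size: 416
        postprocessing:
          - type: resize_prediction_boxes
        metrics:
          - type: detection_accuracy

The adapter parameters (number of output layers, anchors, masks, class count) must match the values in the darknet .cfg the model was trained with; a mismatch there typically produces empty detections and a 0.00% mAP.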

Zulkifli_Intel
Moderator

Hello Peng Chang-Jan,


This thread will no longer be monitored since we have provided a solution. If you need any additional information from Intel, please submit a new question.


Regards,

Zulkifli


jacky0327
New Contributor I

Thank you, I solved it. I would like to ask why you set normalization:

  - type: normalization
    std: 255.0

Can OpenVINO parse the normalized output of the model? When I remove the normalization, the mAP becomes normal.

Is it possible to achieve an acceleration effect using the normalized model?
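
For reference, the step in question sits in the preprocessing section of the Accuracy Checker .yaml, roughly as below (the resize entry is illustrative). One possible explanation, assumed here rather than confirmed in this thread, is that the IR already scales its input internally (for example when --scale 255 was passed to the Model Optimizer), in which case adding the step again in the checker divides the pixel values by 255 twice:

preprocessing:
  - type: resize
    size: 416
  # divides pixel values by 255 before inference;
  # omit this step if the IR already scales its input internally
  - type: normalization
    std: 255.0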
