Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

How to enable the Post-Training Optimization Tool?

11792097
Employee

I have now installed the Accuracy Checker Tool and the Post-Training Optimization Toolkit successfully. I can also run the Accuracy Checker sample successfully and get the expected 75.02% accuracy.

But when I launch the command-line tool with the configuration file by running the following command:

pot -c /home/xu/Desktop/quantize/sample_config.yml 

The log output is shown below. It seems that there is something wrong with the configuration file sample_config.yml. (Note that I have modified the related file paths in sample_config.yml, such as model, weights, and datasets, following the guides.)

 

14:16:44 accuracy_checker WARNING: /usr/local/lib/python3.5/site-packages/pot-1.0-py3.5.egg/compression/algorithms/quantization/optimization/algorithm.py:44: UserWarning: Nevergrad package could not be imported. If you are planning to use the FQ range optimization algo, consider installing it using pip. This implies advanced usage of the tool. Note that nevergrad is compatible only with Python 3.6+
  'Nevergrad package could not be imported. If you are planning to use the'

Traceback (most recent call last):
  File "/usr/local/bin/pot", line 11, in <module>
    load_entry_point('pot==1.0', 'console_scripts', 'pot')()
  File "/usr/local/lib/python3.5/site-packages/pot-1.0-py3.5.egg/app/run.py", line 36, in main
    app(sys.argv[1:])
  File "/usr/local/lib/python3.5/site-packages/pot-1.0-py3.5.egg/app/run.py", line 43, in app
    config = Config.read_config(args.config)
  File "/usr/local/lib/python3.5/site-packages/pot-1.0-py3.5.egg/compression/configs/config.py", line 46, in read_config
    cls._configure_engine_params(config)
  File "/usr/local/lib/python3.5/site-packages/pot-1.0-py3.5.egg/compression/configs/config.py", line 231, in _configure_engine_params
    self._configure_ac_params()
  File "/usr/local/lib/python3.5/site-packages/pot-1.0-py3.5.egg/compression/configs/config.py", line 258, in _configure_ac_params
    ConfigReader.check_local_config(ac_conf)
  File "/opt/intel/openvino_2020.2.120/deployment_tools/open_model_zoo/tools/accuracy_checker/accuracy_checker/config/config_reader.py", line 198, in check_local_config
    config_checker_func(config)
  File "/opt/intel/openvino_2020.2.120/deployment_tools/open_model_zoo/tools/accuracy_checker/accuracy_checker/config/config_reader.py", line 140, in _check_models_config
    raise ConfigError('Each model must specify {}'.format(', '.join(required_model_entries)))
accuracy_checker.config.config_validator.ConfigError: Each model must specify name, launchers, datasets

 

So how can I solve this problem? Any suggestions?

Thanks!

5 Replies
SIRIGIRI_V_Intel
Employee

Could you please follow the Documentation and provide the config file with the required parameters.
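
For reference, the error above means each model entry in the config must contain name, launchers and datasets sections. A minimal sketch of that Accuracy Checker style layout is shown below; the model name, adapter and paths are placeholders for illustration only, not values from your setup:

models:
  - name: sample_model                  # placeholder model name
    launchers:
      - framework: dlsdk                # OpenVINO Inference Engine launcher
        model: <path_to_model>.xml
        weights: <path_to_model>.bin
        adapter: classification         # choose the adapter matching your model output
    datasets:
      - name: sample_dataset            # placeholder dataset name
        data_source: <path_to_images>
        annotation: <path_to_annotation>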

Regards,

Ram prasad

Mingming_X_Intel
Employee

Ram prasad (Intel) wrote:

Could you please follow the Documentation and provide the config file with the required parameters.

Regards,

Ram prasad

Hi, Ram prasad,

Thanks for your help!

Sorry for using an incorrect configuration file yesterday. Now I use a correct configuration file named sample.json that I defined myself. I run the commands below to quantize the deeplabv3 model downloaded from the Open Model Zoo:

python3 downloader.py --name deeplabv3

python3 converter.py --name deeplabv3 --mo /opt/intel/openvino_2020.2.120/deployment_tools/model_optimizer/mo.py

pot -c /home/xu/Desktop/quantize/sample.json --output-dir /home/xu/Desktop/quantize

I attach the content of sample.json below:

{
    "model": {
        "model_name": "deeplabv3",
        "model": "/opt/intel/openvino_2020.2.120/deployment_tools/tools/post_training_optimization_toolkit/libs/open_model_zoo/tools/downloader/public/deeplabv3/FP32/deeplabv3.xml",
        "weights": "/opt/intel/openvino_2020.2.120/deployment_tools/tools/post_training_optimization_toolkit/libs/open_model_zoo/tools/downloader/public/deeplabv3/FP32/deeplabv3.bin"
    },
    "engine": {
        "type": "simplified",
        // you can specify path to directory with images or video file
        // also you can specify template for file names to filter images to load
        // templates are unix style
        "data_source": "<PATH_TO_DATASET>''
    },
    "compression": {
        "target_device": "CPU",
        "algorithms": [
            {
                "name": "DefaultQuantization",
                "params": {
                    "preset": "performance",
                    "stat_subset_size": 300
                }
            }
        ]
    }
}

But I don't know how to configure the "data_source" ("<PATH_TO_DATASET>") here. I don't want to use the Accuracy Checker tool, so I set the "engine" to "simplified" mode. Why do I still need a dataset to run the Post-Training Optimization Tool? And how should I set the "data_source" for the deeplabv3 model?

Thanks!

Regards,

Mingming (11792097)

SIRIGIRI_V_Intel
Employee

The dataset is used to collect statistics during the quantization process. Please provide the dataset, or the path to a directory of images, as the "data_source".
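
For illustration (the directory path below is just a hypothetical example, not a value from your setup), in "simplified" mode the engine section only needs to point to a folder with representative input images; since "stat_subset_size" is 300 in your config, a few hundred images should be enough:

    "engine": {
        "type": "simplified",
        // hypothetical path: any directory with representative input images works
        "data_source": "/home/xu/Desktop/quantize/calibration_images"
    }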

Please refer to the Model Optimization flow.

Hope this helps.

Regards,

Ram prasad

Mingming_X_Intel
Employee

Ram prasad (Intel) wrote:

The dataset is used to collect statistics during the quantization process. Please provide the dataset, or the path to a directory of images, as the "data_source".

Please refer to the Model Optimization flow.

Hope this helps.

Regards,

Ram prasad

Hi, Ram prasad,

Thanks very much! I followed your suggestion and it worked successfully.

Regards,

Mingming

SIRIGIRI_V_Intel
Employee

I am glad that worked for you. Thank you for sharing with the community.

Regards,
Ram prasad
