Intel® Distribution of OpenVINO™ Toolkit
Community assistance about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

DeepLab v3 Tensorflow model optimizer/convert fails with ResizeBilinear layer not found



I'm trying to convert a TensorFlow model to OpenVINO using the Model Optimizer.

I'm using version openvino_2019.1.133.

I run the Model Optimizer command and everything looks OK, as you can see below:

python mo_tf.py --input_model "model_frozen_29604.pb" --output_dir "openvino" --input_shape [1,512,512,3] --output "prob" --input "input"

Model Optimizer arguments:
Common parameters:
        - Path to the Input Model:      ...model_frozen_29604.pb
        - Path for generated IR:        ...openvino
        - IR output name:       model_frozen_29604
        - Log level:    ERROR
        - Batch:        Not specified, inherited from the model
        - Input layers:         input
        - Output layers:        prob
        - Input shapes:         [1,512,512,3]
        - Mean values:  Not specified
        - Scale values:         Not specified
        - Scale factor:         Not specified
        - Precision of IR:      FP32
        - Enable fusing:        True
        - Enable grouped convolutions fusing:   True
        - Move mean values to preprocess section:       False
        - Reverse input channels:       False
TensorFlow specific parameters:
        - Input model in text protobuf format:  False
        - Path to model dump for TensorBoard:   None
        - List of shared libraries with TensorFlow custom layers implementation:        None
        - Update the configuration file with input/output node names:   None
        - Use configuration file used to generate the model with Object Detection API:  None
        - Operations to offload:        None
        - Patterns to offload:  None
        - Use the config file:  None
Model Optimizer version:        2019.1.0-341-gc9b66a2

[ SUCCESS ] Generated IR model.
[ SUCCESS ] XML file: ...openvino\model_frozen_29604.xml
[ SUCCESS ] BIN file: ...openvino\model_frozen_29604.bin
[ SUCCESS ] Total execution time: 103.00 seconds.


But then when I try to test the generated IR with the Python sample, it fails here:

if plugin.device == "CPU":
    supported_layers = plugin.get_supported_layers(net)
    not_supported_layers = [l for l in net.layers.keys() if l not in supported_layers]
    if len(not_supported_layers) != 0:
        log.error("Following layers are not supported by the plugin for specified device {}:\n {}".
                  format(plugin.device, ', '.join(not_supported_layers)))
        log.error("Please try to specify cpu extensions library path in sample's command line parameters using -l "
                  "or --cpu_extension command line argument")


and it shows me the following error message:

Following layers are not supported by the plugin for specified device CPU:
 DeepLab_v3/ASPP_layer/ResizeBilinear, DeepLab_v3/ResizeBilinear
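As the sample's own log message hints, layers like ResizeBilinear in the 2019.x releases are typically provided by the separate CPU extensions library, which has to be loaded explicitly. A minimal sketch of the check plus the extension load, assuming the 2019.1 Python API (`IEPlugin`) and a default Linux install path for `libcpu_extension_avx2.so` (adjust the path for your system, or pass it via `-l`/`--cpu_extension`):

```python
# Sketch: same unsupported-layer check as the sample above, then load
# the CPU extensions library so layers such as ResizeBilinear resolve.

def find_unsupported(net_layers, supported_layers):
    """Return the layer names present in the network but missing
    from the plugin's supported-layer list."""
    return [l for l in net_layers if l not in supported_layers]

# Pure check with stand-in layer names (illustration only):
missing = find_unsupported(
    ["input", "DeepLab_v3/ResizeBilinear", "prob"],
    {"input", "prob"},
)

try:
    from openvino.inference_engine import IEPlugin

    plugin = IEPlugin(device="CPU")
    # Assumed default path for a 2019.1 Linux install; pass your own
    # library via the sample's -l/--cpu_extension argument instead.
    plugin.add_cpu_extension(
        "/opt/intel/openvino/deployment_tools/inference_engine/"
        "lib/intel64/libcpu_extension_avx2.so")
except ImportError:
    pass  # OpenVINO not installed in this environment
```

After `add_cpu_extension`, re-running `get_supported_layers(net)` should include the previously missing layers if the extension provides them.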

Any help please?

What am I missing here?

Thank you very much.

1 Reply

I'm also getting the same error on my new OAK-D device. The model converts to IR successfully, but when I try to convert it to blob format it fails. It only shows the error for DeepLabv3. :((
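For what it's worth, the IR-to-blob step for MyriadX devices like the OAK-D is usually done with OpenVINO's `compile_tool`. A hedged sketch of the invocation (the install path is an assumption for a default Linux layout; versions and extra VPU flags vary):

```shell
# Assumed location of compile_tool in a recent OpenVINO install;
# adjust the path to match your version.
/opt/intel/openvino/deployment_tools/tools/compile_tool/compile_tool \
    -m model_frozen_29604.xml \
    -d MYRIAD \
    -o model_frozen_29604.blob
```

If the blob compile fails only for DeepLabv3, the error text from `compile_tool` itself (which layer it rejects) is the useful part to post.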
