Intel® Distribution of OpenVINO™ Toolkit
Community support and discussions about the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all things computer vision-related on Intel® platforms.

UNet model stuck on Starting inference

dopeuser
Novice
1,313 Views

I am trying to run the UNet model on a Neural Compute Stick. I successfully converted the model to FP16 and ran it on the CPU. However, as soon as I change the device to MYRIAD, it gets stuck at "Starting inference".

[ INFO ] Creating Inference Engine
[ INFO ] Loading network
[ INFO ] Preparing input blobs
[ WARNING ] Image ..\cat.2000.jpg is resized from (499, 459) to (224, 224)
[ INFO ] Batch size is 1
[ INFO ] Loading model to the plugin
[ INFO ] Starting inference

 

I am using the segmentation_demo code. My IR file is attached below.

1 Solution
RandallMan_B_Intel
1,045 Views

Hello dopeuser,


Thanks for your patience. The engineering team has checked your issue: U-Net is a heavy model that appears to exceed the memory capacity of the Intel NCS 2. When we checked memory consumption at runtime, your model used almost twice as much as the unet-camvid-onnx-0001 model. Unfortunately, neither of these models is supported on MYRIAD devices.
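For readers who want a rough self-check (an illustrative sketch, not the engineering team's exact measurement): the size of the FP16 IR's .bin file approximates the weight memory a model needs on-device, though runtime activations add more on top. The paths in the comment are placeholders.

```python
import os

def ir_weight_mb(bin_path):
    """Rough weight-memory footprint in MiB: the FP16 IR .bin stores 2 bytes per weight."""
    return os.path.getsize(bin_path) / (1024.0 * 1024.0)

# Compare your model against unet-camvid-onnx-0001 (paths are placeholders):
# print(ir_weight_mb("converted_unet/unet.bin"))
# print(ir_weight_mb("unet-camvid-onnx-0001/FP16/unet-camvid-onnx-0001.bin"))
```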


Best regards,

Randall.



11 Replies
RandallMan_B_Intel
1,302 Views

Hi dopeuser,


UNet is not supported by the Intel Neural Compute Stick (MYRIAD device). Check the MYRIAD Plugin documentation for the list of supported networks and additional information.
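You can also query per-layer support programmatically. A minimal sketch, assuming the OpenVINO 2020.4 Python API (`IECore.query_network` returns a dict of the layers the plugin can place; `split_by_support` is an illustrative helper, and the file paths are placeholders):

```python
def split_by_support(layer_map, all_layers):
    """Partition layer names by whether the plugin reported support for them."""
    supported = [name for name in all_layers if name in layer_map]
    unsupported = [name for name in all_layers if name not in layer_map]
    return supported, unsupported

if __name__ == "__main__":
    # Assumes the OpenVINO 2020.4 Python API and your IR files in the working directory.
    from openvino.inference_engine import IECore
    ie = IECore()
    net = ie.read_network(model="unet.xml", weights="unet.bin")
    layer_map = ie.query_network(network=net, device_name="MYRIAD")
    supported, unsupported = split_by_support(layer_map, list(net.layers.keys()))
    print("Layers not supported on MYRIAD:", unsupported)
```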


Regards,

Randall B.


dopeuser
Novice
1,294 Views
RandallMan_B_Intel
1,274 Views

Hi dopeuser,


Could you share more information so we can test on our end? For example, the commands you used to run your model and to convert it to FP16.


Regards,

Randall B.


dopeuser
Novice
1,266 Views
I am using the following command to convert the exported PyTorch graph:
 
python "%openvino_dir%\deployment_tools\model_optimizer\mo.py" ^
    --input_model ".\temp\unet.onnx" ^
    --log_level=ERROR ^
    --input_shape "(1,3,224,224)" ^
    --output_dir converted_unet ^
    --input=input.1 ^
    --output=Conv_338 ^
    --reverse_input_channels ^
    --data_type FP16
 
I have attached the exported graph (unet.xml).
 
For testing both the downloaded model from your website and my converted model, I used OpenVINO's Python segmentation demo, available at C:\Program Files (x86)\IntelSWTools\openvino_2020.4.287\deployment_tools\open_model_zoo\demos\python_demos\segmentation_demo\segmentation_demo.py
 
It fails only when I set device="MYRIAD" for my model. The downloaded model works well on MYRIAD, and both models work well on CPU.
 
To freeze the PyTorch model I am using:
import os
import torch

weights = torch.load(modelfname)
inference_model.load_state_dict(weights)
dummy_input = torch.randn(1, 3, 224, 224)
os.makedirs('./temp', exist_ok=True)
torch.onnx.export(inference_model, dummy_input, "./temp/unet.onnx", opset_version=11)
 
 
I also tried using opset_version=10.
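One way to sanity-check the exported graph before running Model Optimizer is to validate it and confirm that the names passed to mo.py's --input/--output flags actually exist in it. A sketch, assuming the `onnx` package is installed; `check_io_names` is an illustrative helper, and `input.1`/`Conv_338` are the names from the mo.py command above:

```python
def check_io_names(graph_inputs, graph_outputs, mo_input, mo_output):
    """Confirm the names passed to mo.py --input/--output exist in the graph."""
    problems = []
    if mo_input not in graph_inputs:
        problems.append("--input %r not found in graph inputs" % mo_input)
    if mo_output not in graph_outputs:
        problems.append("--output %r not found in graph outputs" % mo_output)
    return problems

if __name__ == "__main__":
    # Assumes the `onnx` package; path and names match the export and mo.py call above.
    import onnx
    model = onnx.load("./temp/unet.onnx")
    onnx.checker.check_model(model)  # raises if the graph is structurally invalid
    ins = [i.name for i in model.graph.input]
    outs = [out for node in model.graph.node for out in node.output]
    print(check_io_names(ins, outs, "input.1", "Conv_338") or "IO names look OK")
```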
 
RandallMan_B_Intel
1,257 Views

Hi dopeuser,


Thanks for your reply. We are currently looking into your issue and will get back to you as soon as possible.


Regards,

Randall B.


dopeuser
Novice
1,219 Views
RandallMan_B_Intel
1,208 Views

Hi dopeuser,


The engineering team is still checking the UNet model on the Intel NCS 2.


Regards,

Randall.


RandallMan_B_Intel
1,184 Views

Hello dopeuser,


Thanks for your patience. Additionally, we need your .bin file; could you provide the ONNX model so we can reproduce the issue and check whether all layers are supported on MYRIAD?


Regards,

Randall.


dopeuser
Novice
1,174 Views

I have attached the files. Let me know if you need anything else.

dopeuser
Novice
1,040 Views