The model loads fine on CPU, but for MYRIAD I get the following error.
How do I get more information about what this error means? Any help is appreciated, thanks.
INFO Creating Inference Engine
INFO Loading network files:
INFO Preparing input blobs
INFO Batch size is [1, 48, 1, 5, 48]
INFO Loading model to the plugin
Traceback (most recent call last):
File "neural.py", line 55, in <module>
File "neural.py", line 38, in main
exec_net = ie.load_network(network=net, device_name=device)
File "ie_api.pyx", line 85, in openvino.inference_engine.ie_api.IECore.load_network
File "ie_api.pyx", line 92, in openvino.inference_engine.ie_api.IECore.load_network
RuntimeError: AssertionFailed: _allocatedIntermData.count(topParent) > 0
Thanks for reaching out. It's possible that the model you are using is not supported by the Myriad Plugin.
Please provide details about the following:
Hey, the version info is:
[ INFO ] InferenceEngine:
API version............. 2.1.custom_releases/2019/R3_cb6cad9663aea3d282e0e8b3e0bf359df665d5d0
[ INFO ] Device info
myriadPlugin............ version 2.1
The model is based on a convolutional neural network. I didn't create the model, but I have recently started working with it.
The Model Optimizer command is:
mo_tf.py --input_model inference_graph.pb --input X --input_shape [1,1,5,48,48]
As for the original model, it is a TensorFlow model. What files would you like? The frozen .pb files? The source file is here: https://github.com/TheCacophonyProject/classifier-pipeline/blob/master/model_crnn.py#L441, with the LSTM removed.
The supported model format for the MYRIAD plugin is FP16. Please add --data_type FP16 to the Model Optimizer command you mentioned.
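Combined with the command shared earlier, the full Model Optimizer invocation would look like this (same input name and shape, with only the data type flag added):

```shell
mo_tf.py --input_model inference_graph.pb --input X --input_shape [1,1,5,48,48] --data_type FP16
```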
Also, verify that your network topology is among the supported networks for the MYRIAD plugin.
If the issue still persists, please share the frozen model (.pb) and any other files necessary to reproduce the issue.
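One way to check layer support before calling load_network is IECore.query_network, which reports which layers a device plugin can execute. A minimal sketch, assuming the 2019 R3 Python API (IENetwork constructor rather than the newer read_network) and hypothetical file names for the converted IR:

```python
from openvino.inference_engine import IECore, IENetwork

ie = IECore()
# Hypothetical IR file names produced by the Model Optimizer
net = IENetwork(model="inference_graph.xml", weights="inference_graph.bin")

# query_network maps each supported layer name to the device that can run it
supported = ie.query_network(network=net, device_name="MYRIAD")
unsupported = [name for name in net.layers if name not in supported]

if unsupported:
    print("Layers not supported by MYRIAD:", unsupported)
else:
    print("All layers are reported as supported by MYRIAD")
```

Any layer listed as unsupported is a likely cause of a plugin-side assertion like the one in the traceback.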