I am getting this error on compiling a Caffe Network
[Error 35] Setup Error: Not enough resources on Myriad to process this network.
How to resolve this?
@macsz If you are receiving this error, this likely means that the model's intermediate processing memory requirements are too large to be processed on the NCS device.
I'm getting the same error: "Setup Error: Not enough resources on Myriad to process this network". This is for my custom network though. I have tried with a couple of .pb files of 15 MB and 50 MB.
Are there any debug switches I can enable to get more information? Please provide an email address to which I can send my model.
Since it's my custom network, I can modify it to fit on the NCS, but I need to understand what should be modified.
When I use mvNCCompile on a .meta file of about 320 KB, I get the same error :-/
I use API version 2 and printed the current memory usage with `print('#################', device.get_option(mvncapi.DeviceOption.RO_CURRENT_MEMORY_USED))` in the TensorFlowParser.py file (at line 370, in the parse_tensor part). I suspected that the memory usage would grow while the mvNCCompile script runs, but the output shows that the memory usage stays constant. In particular, it is lower than the total memory size, which I obtained with `total_memory = device.get_option(mvncapi.DeviceOption.RO_MEMORY_SIZE)`.
@Tome_at_Intel so which "memory" do you mean in your post: "this likely means that the model's intermediate processing memory requirements are too large to be processed on the NCS device."?
@sneey The NCS comes with 4 Gbits (about 500 MB) of DRAM (see the NCS specs here). A portion of the 500 MB is set aside for graph file allocation and another portion is set aside for intermediate processing. If you are receiving this error, your model is likely exceeding one of these memory limits.
You can check your model's intermediate processing memory requirement by editing the FileIO.py file in your /usr/local/bin/ncsdk/Controllers directory. There is a debug flag; if you set it to True, you can see the memory requirements for your model when running mvNCCheck or mvNCCompile.
Thanks for your help @Tome_at_Intel! The output tells me the stick has 133 MB available for intermediate processing, but my model needs 148 MB…
So a solution could be to get a stick with more than 500 MB of DRAM, i.e. the 4 GB version?
The "NCS specs here" link says "VPU includes 4 Gbits of LPDDR3 DRAM"… so now I am a bit confused :-/ A 500 MB version is not mentioned.