I am having this issue on a Raspberry Pi 4. My other IR models work fine on the Pi; only this one fails, so OpenVINO itself is working on the Pi.
I have also tested this model on an Intel CPU and GPU, where it works perfectly, so the IR model itself is also OK. The problem only occurs when loading it on the Pi.
(Model is attached)
Traceback (most recent call last):
  File "recogniser.py", line 60, in <module>
    exec_dnet = ie.load_network(dnet,"CPU")
  File "ie_api.pyx", line 367, in openvino.inference_engine.ie_api.IECore.load_network
  File "ie_api.pyx", line 379, in openvino.inference_engine.ie_api.IECore.load_network
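For reference, here is a minimal sketch of what the failing loading step could look like with a device fallback added. This is an assumption, not the poster's actual code: the model filename `dnet.xml` and the MYRIAD fallback are hypothetical, and the helper simply returns `(None, None)` if no device can load the network.

```python
# Hypothetical sketch of the load_network step with a device fallback.
# Assumes IR files named dnet.xml / dnet.bin next to the script.

def load_with_fallback(model_xml, devices=("CPU", "MYRIAD")):
    """Try each device in turn; return (exec_net, device) or (None, None)."""
    try:
        from openvino.inference_engine import IECore
    except ImportError:
        return None, None  # OpenVINO runtime not installed
    ie = IECore()
    try:
        net = ie.read_network(model=model_xml)
    except Exception:
        return None, None  # IR files missing or unreadable
    for dev in devices:
        if dev not in ie.available_devices:
            continue
        try:
            # This is the call that raises on the Pi when the CPU
            # plugin cannot allocate enough memory for the model.
            return ie.load_network(net, dev), dev
        except RuntimeError:
            continue  # try the next device instead of crashing
    return None, None
```

Wrapping `load_network` this way at least turns the hard failure into a diagnosable one, and lets an attached NCS2 (MYRIAD) take over if present.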
Thank you for reaching out to us.
I validated your model with the Benchmark C++ Tool using the CPU plugin on a Raspberry Pi 4B (4 GB RAM) and encountered the same error as you did.
The error occurs because the memory requested for intermediate processing exceeds the memory available on the Raspberry Pi.
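As a rough illustration of why this happens (the tensor shape below is hypothetical, not taken from the attached model), you can estimate the memory a single intermediate FP32 tensor needs:

```python
def tensor_bytes(shape, dtype_bytes=4):
    """Approximate bytes needed for one tensor (FP32 = 4 bytes/element)."""
    total = 1
    for dim in shape:
        total *= dim
    return total * dtype_bytes

# Hypothetical intermediate feature map: batch 1, 512 channels, 104x104
mb = tensor_bytes((1, 512, 104, 104)) / (1024 ** 2)  # roughly 21 MB
```

A deep network keeps many such tensors alive at once, so total intermediate memory can exhaust the Pi 4's 4 GB of RAM even though the same model loads without issue on a desktop machine.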
Intel Core processors supported by the CPU plugin are far superior in performance and offer more advanced features than the Raspberry Pi's ARM CPU, which has limited capabilities.
As such, it is impractical to expect the ARM CPU to run inference on complex models that can, by contrast, be inferenced on an Intel CPU.
For discussion purposes, let me share the performance differences among different hardware. Running the Object Detection Demo with the person-vehicle-bike-detection-crossroad-0078 model, I obtained 0.2 FPS on the ARM CPU (Raspberry Pi 4B), 15 FPS on an Intel CPU (Intel Core i7), and 2 FPS on a Movidius VPU (NCS2 on Raspberry Pi 4).
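Those figures work out to the following relative speedups (simple arithmetic on the FPS numbers above):

```python
# FPS figures from the hardware comparison above
arm_cpu_fps, intel_cpu_fps, ncs2_fps = 0.2, 15.0, 2.0

# Speedups relative to the Pi's ARM CPU
intel_speedup = intel_cpu_fps / arm_cpu_fps  # about 75x faster
ncs2_speedup = ncs2_fps / arm_cpu_fps        # about 10x faster
```

So even the NCS2 accelerator attached to the Pi gives roughly an order of magnitude over the Pi's own CPU, and a desktop Intel CPU roughly another order of magnitude beyond that.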
This thread will no longer be monitored since we have provided a solution.
If you need any additional information from Intel, please submit a new question.