I am currently using an NCS2 in production with the OpenCV build provided in the OpenVINO 2020.2 package.
The current solution uses the NCS2 with an SSD MobileNetV2 network. If only my software is running on the PC, the performance is great (90 ms per inference). However, when my software runs in parallel with other programs that consume a high amount of CPU (80-90%), the inference times become very unstable and slow, ranging from 100 ms to 1500 ms; sometimes, when the CPU load is really high, inference even breaks with the exception "Failed to queue inference: NC_ERROR". The computer I use has an Intel Atom E3845 CPU and runs Windows 10.
So, does CPU usage have a big impact on inference with models on the NCS2, acting as a bottleneck? Is there any way to avoid the inference exception (which causes my program to break)?
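In the meantime I wrap the call in a simple retry guard, roughly like this (a sketch only; `infer` stands in for my actual OpenCV forward pass, and the retry count and delay are arbitrary values I picked):

```python
import time

def infer_with_retry(infer, frame, retries=3, delay=0.1):
    """Call infer(frame), retrying when the device refuses the request.

    `infer` is a placeholder for the real NCS2 forward pass; in my code
    the failing call raises an exception whose message contains
    "Failed to queue inference: NC_ERROR" under high CPU load.
    """
    last_exc = None
    for attempt in range(retries):
        try:
            return infer(frame)
        except Exception as exc:  # in my real code this is cv2.error
            last_exc = exc
            time.sleep(delay * (attempt + 1))  # brief back-off before retrying
    raise last_exc  # give up after the last attempt
```

This keeps a transient NC_ERROR from killing the whole program, at the cost of an occasional extra delay on that frame.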
Also, another curious thing I experienced: if I run a program that uses two separate threads in parallel for inference, the process runs perfectly on my Intel Xeon E3-1245 with the NCS2. However, if I do exactly the same thing on the Atom PC, it breaks from the beginning. Is the NCS2 able to handle parallel calls from multiple threads?
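For reference, a stripped-down sketch of that two-thread pattern (with `run_inference` standing in for the actual forward pass on the NCS2, and the frame sources simplified to plain lists):

```python
import threading

def worker(name, frames, run_inference, results):
    # Each thread pushes its own stream of frames through the same NCS2 device.
    for frame in frames:
        results.append((name, run_inference(frame)))

def run_two_streams(frames_a, frames_b, run_inference):
    """Run two inference streams in parallel threads, as my program does."""
    results = []  # list.append is atomic in CPython, so no extra lock here
    t1 = threading.Thread(target=worker, args=("cam0", frames_a, run_inference, results))
    t2 = threading.Thread(target=worker, args=("cam1", frames_b, run_inference, results))
    t1.start(); t2.start()
    t1.join(); t2.join()
    return results
```

Both threads call `run_inference` concurrently against the one device; that is the part that works on the Xeon but breaks on the Atom.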
Thank you so much for you attention.
Thanks for reaching out.
I would suggest running the Benchmark app with default parameters on the NCS2 device while your Atom processor is busy with the other applications. Does the Benchmark app show degraded performance when CPU usage is high?
Please share the output text file of the benchmark app.
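On Windows with OpenVINO 2020.2, a typical invocation looks something like the following (run after initializing the environment with setupvars.bat, from the benchmark tool's directory; the model path is a placeholder for your SSD MobileNetV2 IR file):

```shell
:: Run the benchmark on the NCS2 (MYRIAD device) and write a report file
python benchmark_app.py -m ssd_mobilenet_v2.xml -d MYRIAD -niter 100 -report_type no_counters
```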
Besides that, I suggest throttling the other applications by changing their processor affinity setting.
Open the “Task Manager,” then go to “Details.”
Right-click on any program or service, and click “Set affinity.”
You’ll be able to limit a program or service to certain processor cores.
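The same limit can also be applied when launching a process from the command line, as a rough sketch (the application path is a placeholder; the affinity value is a hexadecimal bitmask of allowed cores, so 3 means cores 0 and 1):

```shell
:: From cmd.exe: launch a program restricted to cores 0 and 1
start /affinity 3 "" "C:\path\to\other_app.exe"
```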
You could read more about other methods in this article. You could also try using third party software recommended in the article, but please note that we have not tested or validated them.
Additionally, could you share more information about the parallel inference error on the Atom PC?
Please include error messages if available.
Intel will no longer monitor this thread since we have provided a solution. If you need any additional information from Intel, please submit a new question.