Running inference on Intel Atom C2308

Hi guys,

I have created a Docker container that runs my application, which uses OpenVINO for inference. Everything works fine on my host machine (CPU mode), but when I deploy this container on a different machine with the Atom C2308 processor, it crashes with an Illegal Instruction error; see the log below:

Mon Aug 6 13:39:36 2018: Debug: Starting
Mon Aug 6 13:39:36 2018: Debug: Loading OpenVINO localisation model
Mon Aug 6 13:39:36 2018: Debug: Target platform "CPU"
Mon Aug 6 13:39:36 2018: Debug: Inference plugin loaded
Mon Aug 6 13:39:36 2018: Debug: Plugin extension added
Mon Aug 6 13:39:36 2018: Debug: Network loaded
inputDims=300 300 3 1 
outputDims=1 1 100 7 
SSD Mode
Illegal instruction
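
For reference, here is a minimal sketch (assuming GCC or Clang on x86-64; the file name and the feature list are just illustrative, not part of my application) that prints which SIMD extensions the CPU it runs on actually reports. Running it inside the container on the Atom box should show whether AVX is available at all:

// cpu_features.cpp -- hypothetical helper, not part of the application above.
// Build: g++ -O2 -o cpu_features cpu_features.cpp
#include <cstdio>

int main() {
    __builtin_cpu_init();  // harmless here; only required before constructors run
    std::printf("sse4.2 : %s\n", __builtin_cpu_supports("sse4.2") ? "yes" : "no");
    std::printf("avx    : %s\n", __builtin_cpu_supports("avx")    ? "yes" : "no");
    std::printf("avx2   : %s\n", __builtin_cpu_supports("avx2")   ? "yes" : "no");
    return 0;
}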

Am I doing something wrong, or is this Atom processor currently not supported?

Any help will be very much appreciated!

Thank you,
Martin Peniak
Cortexica Vision Systems

1 Reply

Just a guess, but I don't believe the Atom processor supports all of the instructions your host does (e.g., AVX). Try recompiling your application on the Atom, or use appropriate compiler flags to limit the architectural features used.
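
If recompiling everything on the Atom box is awkward, two possible routes (sketches only, not verified against your build): build with an -march value that matches the deployment target, e.g. -march=silvermont rather than -march=native (the Atom C2000 series uses Silvermont cores, which top out at SSE4.2 and have no AVX); or, for your own hand-written hot spots, let GCC build several clones of a function and pick one at load time via function multiversioning, roughly like this (file name and function are illustrative):

// dispatch_example.cpp -- illustrative only, not from the original application.
// Build: g++ -O2 -o dispatch_example dispatch_example.cpp   (GCC 6+ on Linux/x86-64)
#include <cstdio>

// GCC emits one clone per listed target plus a resolver that selects the best
// clone the running CPU supports, so the same binary works on an AVX2 host and
// on a pre-AVX Atom without raising Illegal Instruction.
__attribute__((target_clones("avx2", "sse4.2", "default")))
float dot(const float* a, const float* b, int n) {
    float s = 0.0f;
    for (int i = 0; i < n; ++i)  // auto-vectorised differently in each clone
        s += a[i] * b[i];
    return s;
}

int main() {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    std::printf("dot = %f\n", dot(a, b, 8));  // resolves to the SSE4.2 clone on the Atom
    return 0;
}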
