Intel® Distribution of OpenVINO™ Toolkit
Community assistance for the Intel® Distribution of OpenVINO™ toolkit, OpenCV, and all aspects of computer vision on Intel® platforms.

Not able to run inference on Intel discrete graphics cards

ShashankKumar
Employee

Hi, 

I have an IR model that runs inference correctly on the CPU and iGPU, but when we run it on an Intel discrete graphics card the whole system crashes and inference doesn't even start.

It gets stuck at the point where we call the infer request:

results = compiled_model.infer_new_request({0: I0, 1: I1})

I am using the OpenVINO API 2.0 for inference.
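
For reference, here is a minimal sketch of the flow described above; the model path, the placeholder inputs, and the "GPU.1" device name (which usually addresses the discrete card when an iGPU is also present) are assumptions, not my exact production code:

from openvino.runtime import Core
import numpy as np

core = Core()
model = core.read_model("model.xml")  # placeholder path to the IR model

# Compile for the discrete GPU; "GPU.1" is an assumption and typically maps
# to the second (discrete) GPU when an iGPU is also present.
compiled_model = core.compile_model(model, "GPU.1")

# Placeholder inputs; assumes the model has two inputs with static shapes.
I0 = np.zeros(tuple(compiled_model.input(0).shape), dtype=np.float32)
I1 = np.zeros(tuple(compiled_model.input(1).shape), dtype=np.float32)

# The system crash happens at this call on the discrete GPU.
results = compiled_model.infer_new_request({0: I0, 1: I1})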

IntelSupport
Community Manager

Hi ShashankKumar,

 

Thanks for reaching out.

 

The only discrete graphics supported with OpenVINO are the Intel® Data Center GPU Flex Series and Intel® Arc™ GPUs. You may refer to the System Requirements documentation.

 

Meanwhile, below is an example of multi-device execution with GPUs as the target devices.

compiled_model = core.compile_model(model, "MULTI:GPU.1,GPU.0")

 

The Hello Query Device Python sample can be used to print all available devices with their supported metrics and default values for configuration parameters.
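
As a rough illustration of what the sample does (a minimal sketch using the OpenVINO 2.0 Python API, not the sample itself), you can list the available devices and their full names to check which index corresponds to the discrete card:

from openvino.runtime import Core

core = Core()

# Print every device OpenVINO can see (e.g. CPU, GPU.0, GPU.1) with its full
# name, so you can tell which GPU index is the integrated card and which is
# the discrete card.
for device in core.available_devices:
    full_name = core.get_property(device, "FULL_DEVICE_NAME")
    print(f"{device}: {full_name}")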

 

 

Regards,

Aznie


IntelSupport
Community Manager

Hi ShashankKumar,


This thread will no longer be monitored since we have provided a solution. If you need any additional information from Intel, please submit a new question.



Regards,

Aznie

