Intel® oneAPI Data Parallel C++
Support for Intel® oneAPI DPC++ Compiler, Intel® oneAPI DPC++ Library, Intel ICX Compiler, Intel® DPC++ Compatibility Tool, and GDB*

Resulting executable and CPU/GPUs

eldiener
New Contributor I

If I create an executable with the Data Parallel C++ compiler, is that executable limited in any way when run on non-Intel CPUs or GPUs?

3 Replies
RahulV_intel
Moderator

Hi,

DPC++ compilation can be performed in two ways:

  • JIT (Just in Time) compilation (Online compilation)
  • AOT (Ahead of Time) compilation (Offline compilation)

JIT compilation embeds a generic SPIR-V image in the executable. Machine code is generated "on the fly," at run time, for whichever device is being targeted.

AOT compilation embeds a device-specific image in the executable. Machine code is generated for the target device during the compilation phase ("ahead of time").
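As a rough sketch of the two modes (the flags shown reflect the oneAPI releases current at the time of writing and may differ in your version; `vector_add.cpp` is just a placeholder source file):

```shell
# JIT (online): the executable carries generic SPIR-V, which the
# runtime translates to device code when the kernel first runs.
dpcpp vector_add.cpp -o vadd_jit

# AOT (offline) for an x86_64 CPU target:
dpcpp -fsycl-targets=spir64_x86_64 vector_add.cpp -o vadd_cpu

# AOT (offline) for an Intel GPU; a device name is forwarded to the
# GPU backend (Gen9 shown purely as an example):
dpcpp -fsycl-targets=spir64_gen \
      -Xsycl-target-backend "-device gen9" \
      vector_add.cpp -o vadd_gpu
```

Note that the AOT binaries above will only run on hardware matching the target they were built for, while the JIT binary defers that choice to run time.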

 

In short, JIT-compiled executables should be portable (provided the same oneAPI version is installed on both machines).

More information here:

https://software.intel.com/content/www/us/en/develop/documentation/oneapi-programming-guide/top/programming-interface/cpu-flow.html

Currently, oneAPI doesn't support non-Intel accelerators. You can check the hardware requirements here:

https://software.intel.com/content/www/us/en/develop/articles/intel-oneapi-base-toolkit-system-requirements.html
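Independently of the requirements list, you can ask the SYCL runtime which devices it actually sees on a given machine. A minimal sketch using the standard `sycl::device::get_devices()` API (the set of devices reported depends on the drivers and runtimes installed):

```cpp
#include <CL/sycl.hpp>
#include <iostream>

int main() {
    // Every device the runtime reports is a potential JIT target:
    // the embedded SPIR-V is translated for it on first use.
    for (const auto &dev : sycl::device::get_devices()) {
        std::cout << dev.get_info<sycl::info::device::name>() << '\n';
    }
    return 0;
}
```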

 

However, the open-source GitHub release of DPC++ has limited support for non-Intel accelerators: https://github.com/intel/llvm/blob/sycl/sycl/doc/GetStartedGuide.md

Thanks,

Rahul

RahulV_intel
Moderator

Hi,


Any updates on this? If the solution provided helped, could you please let me know if I can close this thread from my end?


Thanks,

Rahul


RahulV_intel
Moderator

Hi,


I have not heard back from you, so I will go ahead and close this thread from my end. Feel free to post a new query if your issue persists.


Regards,

Rahul

