Intel® oneAPI DPC++/C++ Compiler

icpx: unknown target CPU 'coffeelake'

AlexanderS
Beginner

Hello,

 

The documentation for -march says I can use -march=coffeelake.

 

However, I get an "unknown target CPU 'coffeelake'" error:

 

$ icpx -o test test.cpp -march=coffeelake
error: unknown target CPU 'coffeelake'
note: valid target CPU values are: nocona, core2, penryn, bonnell, atom, silvermont, slm, goldmont, goldmont-plus, tremont, nehalem, corei7, westmere, sandybridge, corei7-avx, ivybridge, core-avx-i, haswell, core-avx2, broadwell, common-avx512, skylake, skylake-avx512, skx, cascadelake, cooperlake, cannonlake, icelake-client, rocketlake, icelake-server, tigerlake, sapphirerapids, alderlake, knl, knm, k8, athlon64, athlon-fx, opteron, k8-sse3, athlon64-sse3, opteron-sse3, amdfam10, barcelona, btver1, btver2, bdver1, bdver2, bdver3, bdver4, znver1, znver2, znver3, x86-64, x86-64-v2, x86-64-v3, x86-64-v4
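
(For reference, test.cpp is nothing special; a minimal stand-in like the sketch below, which only reports whether AVX2 ends up enabled for the chosen target, is enough to reproduce the error.)

// test.cpp -- minimal placeholder used only to exercise the -march flag;
// it reports whether the compiler enabled AVX2 for the selected target.
#include <cstdio>

int main() {
#if defined(__AVX2__)
    std::printf("AVX2 enabled for this target\n");
#else
    std::printf("AVX2 not enabled for this target\n");
#endif
    return 0;
}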

 

$ icpx -v
Intel(R) oneAPI DPC++/C++ Compiler 2022.0.0 (2022.0.0.20211123)
Target: x86_64-unknown-linux-gnu
Thread model: posix
InstalledDir: /opt/intel/oneapi/compiler/2022.0.2/linux/bin-llvm
Found candidate GCC installation: /usr/lib64/gcc/x86_64-suse-linux/11
Found candidate GCC installation: /usr/lib64/gcc/x86_64-suse-linux/7
Found candidate GCC installation: /usr/lib64/gcc/x86_64-suse-linux/8
Selected GCC installation: /usr/lib64/gcc/x86_64-suse-linux/11
Candidate multilib: .;@m64
Selected multilib: .;@m64

 

Is the documentation incorrect, or is it a bug in the compiler?

 

P.S. I'm compiling for an i9-9900K CPU.
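
(A possible interim workaround, assuming Coffee Lake adds no instructions beyond the Skylake client ISA, which I believe is the case for the i9-9900K, is to pick one of the values the compiler does accept:

$ icpx -o test test.cpp -march=skylake

-march=native may also work here, since the build host is the target machine, but I haven't verified that icpx accepts it.)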

 

Thanks!

VarshaS_Intel
Moderator

Hi,


Thanks for posting in Intel Communities.


We are working on your issue internally. We will get back to you soon.


Thanks & Regards,

Varsha


Alex_Y_Intel
Moderator

Your question has been escalated to our development team. We'll work on it internally and come back with an update later.

