Intel® MPI Library
Get help with building, analyzing, optimizing, and scaling high-performance computing (HPC) applications.

Nested building of PETSc for OpenFOAM when using Kokkos

FCLC
Beginner

Hi, 

 

This is a follow-up question related to https://community.intel.com/t5/Intel-oneAPI-HPC-Toolkit/MPI-linking-with-LLVM-based-compilers/m-p/1498469/emcs_t/S2h8ZW1haWx8dG9waWNfc3Vic2NyaXB0aW9ufExKODNLTkpFOTEzRzM5fDE0OTg0Njl8U1VCU0NSSVBUSU9OU3xoSw#M10707

When attempting to build PETSc with the LLVM-based compilers and Intel MPI, with Kokkos enabled, the build fails during the nested CMake configuration calls.

An example build command is:

`./configure --prefix=/home/felix/petsc4foam/ThirdParty-v2212/platforms/linux64IcxSPDPInt32/petsc-3.18.2 --PETSC_DIR=/home/felix/petsc4foam/ThirdParty-v2212/petsc-3.18.2 --with-petsc-arch=SPDPInt32 --with-clanguage=C --with-fc=0 --with-x=0 --with-debugging=0 --with-cc="mpiicc -cc=icx" --with-cxx="mpiicpc -cxx=icpx" --with-debugging=0 COPTFLAGS='-diag-disable=10441 -O3 -march=native' CXXOPTFLAGS='-O3 -march=native' --with-blaslapack-dir=$MKLROOT --with-mkl_sparse_optimize=1 --with-precision=single --download-viennacl=1 --download-kokkos=1 `

but this can be simplified to:

`./configure --with-cc="mpiicc -cc=icx" --with-cxx="mpiicpc -cxx=icpx" --download-kokkos=1 --with-fc=0`

 

which will then spit out:

```
=============================================================================================
                 Configuring KOKKOS with CMake; this may take several minutes
=============================================================================================
Executing: /usr/bin/cmake .. -DCMAKE_INSTALL_PREFIX=/home/felix/petsc4foam/petsc-3.19.2/arch-linux-c-debug -DCMAKE_INSTALL_NAME_DIR:STRING="/home/felix/petsc4foam/petsc-3.19.2/arch-linux-c-debug/lib" -DC>
stdout:
-- Setting default Kokkos CXX standard to 17
-- The CXX compiler identification is unknown
-- Configuring incomplete, errors occurred!
See also "/home/felix/petsc4foam/petsc-3.19.2/arch-linux-c-debug/externalpackages/git.kokkos/petsc-build/CMakeFiles/CMakeOutput.log".
See also "/home/felix/petsc4foam/petsc-3.19.2/arch-linux-c-debug/externalpackages/git.kokkos/petsc-build/CMakeFiles/CMakeError.log".
                    Error configuring KOKKOS with CMake Could not execute "['/usr/bin/cmake .. -DCMAKE_INSTALL_PREFIX=/home/felix/petsc4foam/petsc-3.19.2/arch-linux-c-debug -DCMAKE_INSTALL_NAME_DIR:STRIN>
-- Setting default Kokkos CXX standard to 17
-- The CXX compiler identification is unknown
-- Configuring incomplete, errors occurred!
See also "/home/felix/petsc4foam/petsc-3.19.2/arch-linux-c-debug/externalpackages/git.kokkos/petsc-build/CMakeFiles/CMakeOutput.log".
See also "/home/felix/petsc4foam/petsc-3.19.2/arch-linux-c-debug/externalpackages/git.kokkos/petsc-build/CMakeFiles/CMakeError.log".
CMake Error at CMakeLists.txt:116 (PROJECT):
  The CMAKE_CXX_COMPILER:

    mpiicpc -cxx=icpx

  is not a full path and was not found in the PATH.

  Tell CMake where to find the compiler by setting either the environment
  variable "CXX" or the CMake cache entry CMAKE_CXX_COMPILER to the full path
  to the compiler, or to the compiler name if it is in the PATH.
```
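
The root cause is visible in the CMake error above: the whole string "mpiicpc -cxx=icpx" is handed to CMake as CMAKE_CXX_COMPILER, and CMake treats it as a single executable name rather than a command plus flag. A minimal reproduction outside PETSc (a sketch, assuming any small CMake project in the current directory) would be:

```
# Fails with "is not a full path and was not found in the PATH",
# because CMake looks for an executable literally named "mpiicpc -cxx=icpx":
cmake -B build -DCMAKE_CXX_COMPILER="mpiicpc -cxx=icpx"
```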


The ideal resolution is probably to introduce something along the lines of mpiicx, mpiicpx, and mpiifx wrappers.

icc and co. are the default compilers for Intel MPI, but they are slated for removal this year (2H 2023).
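
For illustration, here is roughly what the difference would look like at the compile-command level (a sketch: mpiicpx does not exist at the time of writing, and its exact behaviour is an assumption):

```
# Today: the LLVM compiler has to be smuggled in through a flag, which
# breaks tools (such as CMake) that expect the compiler to be a single word.
mpiicpc -cxx=icpx hello.cpp -o hello

# With a hypothetical native wrapper, the compiler is one word again:
mpiicpx hello.cpp -o hello
```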

6 Replies
RabiyaSK_Intel
Employee

Hi,

 

Thanks for posting in Intel Communities.

 

We are able to reproduce your issue. We have informed the concerned development team about it. We will get back to you soon.

 

Thanks & Regards,

Shaik Rabiya

 

FCLC
Beginner

Hi Shaik,

 

More than likely there are workarounds via environment variables that could get around this particular case, but those workarounds will be relatively fragile, much like the `mpiicc -cc=icx` fix from the other day.

Longer term, with the deprecation of icc (and therefore of mpiicc), there will be a need for an mpiicx family of MPI compiler wrappers anyway.

Would it be possible to forward the feedback requesting introduction of mpiicx compilers as native instead of the current system of extra flags and environment variables?

Cheers,

 

Felix

RabiyaSK_Intel
Employee

Hi,

 

Thank you for your patience.

 

>>>Would it be possible to forward the feedback requesting introduction of mpiicx compilers as native instead of the current system of extra flags and environment variables?

Thanks for the feedback. We have forwarded it to the concerned development team. The requested MPI wrappers (mpiicx, mpiicpx, mpiifx) will be available in the next release.

 

Meanwhile, you can try exporting instead:

```
export I_MPI_CC=icx
export I_MPI_CXX=icpx
export I_MPI_F90=ifx

./configure --download-kokkos=1 --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=0

make -j
```
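
One way to sanity-check that the exports have taken effect is the wrappers' -show option, which prints the underlying compile line without compiling anything:

```
mpiicc -show    # should now print an icx command line
mpiicpc -show   # should now print an icpx command line
```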

If you still face any issues, you can reach out to us.

 

Thanks & Regards,

Shaik Rabiya

FCLC
Beginner

@RabiyaSK_Intel wrote:

Hi,

 

Thank you for your patience.

 

>>>Would it be possible to forward the feedback requesting introduction of mpiicx compilers as native instead of the current system of extra flags and environment variables?

Thanks for the feedback. We have forwarded it to the concerned development team. The requested MPI wrappers (mpiicx, mpiicpx, mpiifx) will be available in the next release.

 

Meanwhile, you can try exporting instead:

 

export I_MPI_CC=icx
export I_MPI_CXX=icpx
export I_MPI_F90=ifx

./configure --download-kokkos=1 --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=0 

make -j

 

If you still face any issues, you can reach out to us.

 

Thanks & Regards,

Shaik Rabiya


Hi Shaik,

Thank you so much for the MPI wrapper solution! Happy to hear it's coming. Would you happen to have an approximate timeline? I believe we're currently on version 2023.1.0 of the HPC toolkit.

RabiyaSK_Intel
Employee

Hi,


Thanks for accepting our solution.


>>> Would you happen to have an approximate timeline?

We apologize, but we cannot provide an exact date or approximate timeline for the 2023.2 release of the oneAPI toolkits.


However, you can check the Intel MPI Library or HPC Toolkit release notes for the inclusion of the Intel LLVM compiler-based wrappers.

The link for Intel MPI Library release notes is:

https://www.intel.com/content/www/us/en/developer/articles/release-notes/mpi-library-release-notes.html


The link for HPC Toolkit release notes is:

https://www.intel.com/content/www/us/en/developer/articles/release-notes/intel-oneapi-hpc-toolkit-release-notes.html


Thanks & Regards,

Shaik Rabiya


FCLC
Beginner

Hi again!

Wanted to say thanks for acknowledging the issue.

With the latest release of the oneAPI HPC Toolkit, installing Intel MPI now provides new wrappers that work *directly* with the LLVM-based compilers.
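
Assuming the new wrapper names match what was requested earlier in this thread (mpiicx, mpiicpx, mpiifx), the Kokkos-enabled configure line should reduce to something like the following sketch, with no extra flags or environment variables:

```
./configure --download-kokkos=1 --with-cc=mpiicx --with-cxx=mpiicpx --with-fc=0
```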

Wanted to send a warm thank you to yourself and the other members of the Intel software and community teams for addressing this issue!

If you could pass that along, I'd be grateful.

Felix
