Intel® MPI Library
Get help with building, analyzing, optimizing, and scaling high-performance computing (HPC) applications.

MPI_f08 with polymorphic argument CLASS(*)

hakostra1
New Contributor II
938 Views

Over in the Fortran forum I reported a bug last summer. It was originally about calls to some MPI_f08 routines for which ifx gave compilation errors. @Steve_Lionel helped me boil it down to a code snippet that did not depend on MPI, and the bug was fixed in version 2024.1 of the Intel Fortran compiler.

However, my original MPI code still does not compile. I have now boiled it down to the following example:

SUBROUTINE test(baz, dtype)
    USE MPI_f08
    IMPLICIT NONE (type, external)

    ! Subroutine arguments
    CLASS(*) :: baz
    TYPE(MPI_Datatype) :: dtype

    ! Local variables
    TYPE(MPI_Request) :: recvreq

    CALL MPI_Irecv(baz, 1, dtype, 0, 0, MPI_COMM_SELF, recvreq)
END SUBROUTINE test

Compiling this with mpiifx gives:

example-2.F90(12): error #8769: If the actual argument is unlimited polymorphic, the corresponding dummy argument must also be unlimited polymorphic.   [BAZ]
    CALL MPI_Irecv(baz, 1, dtype, 0, 0, MPI_COMM_SELF, recvreq)
-------------------^
compilation aborted for example-2.F90 (code 1)

The compiler and MPI library versions are:

$ mpiifx --version
ifx (IFX) 2024.1.0 20240308
Copyright (C) 1985-2024 Intel Corporation. All rights reserved.

$ mpirun --version
Intel(R) MPI Library for Linux* OS, Version 2021.12 Build 20240213 (id: 4f55822)
Copyright 2003-2024, Intel Corporation.

Since the compiler bug is apparently fixed in the compiler version I now use, I wonder whether there could instead be a bug somewhere in the MPI_f08 bindings shipped with the MPI library. Please refer to the original post in the Fortran part of the forum for more information on the original problem.
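
For reference, the MPI standard declares the choice buffer in the mpi_f08 bindings as assumed-type, assumed-rank, and Fortran 2018 allows an unlimited polymorphic (CLASS(*)) actual argument to be associated with such a dummy. Below is a sketch of what I believe the standard interface looks like, paraphrased from the MPI specification rather than copied from any vendor's module:

```fortran
! Paraphrase of the standard mpi_f08 interface for MPI_Irecv.
! buf is assumed-type, assumed-rank (TYPE(*), DIMENSION(..)), so a
! CLASS(*) actual argument should be accepted if the module files
! declare it this way.
SUBROUTINE MPI_Irecv(buf, count, datatype, source, tag, comm, request, ierror)
    TYPE(*), DIMENSION(..), ASYNCHRONOUS :: buf
    INTEGER, INTENT(IN) :: count, source, tag
    TYPE(MPI_Datatype), INTENT(IN) :: datatype
    TYPE(MPI_Comm), INTENT(IN) :: comm
    TYPE(MPI_Request), INTENT(OUT) :: request
    INTEGER, OPTIONAL, INTENT(OUT) :: ierror
END SUBROUTINE
```

If the shipped module files declare buf differently, that could explain error #8769.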

Both GFortran + Open MPI and the NAG Fortran compiler + MPICH compile the example above without any errors or warnings.

Thanks in advance for any help.

7 Replies
TobiasK
Moderator
756 Views

@hakostra1

We recently made some changes to the F08 bindings; let me check this.
TobiasK
Moderator
722 Views

@hakostra1

Thanks for reporting this. Are you using Windows or Linux?
hakostra1
New Contributor II
713 Views

Thanks for looking into this. I'm using Linux.

TobiasK
Moderator
684 Views

@hakostra1

For Linux, please try the following (after setting up the oneAPI 2024.1 environment):

cd $I_MPI_ROOT/opt/mpi/binding
tar -xf intel-mpi-binding-kit.tar.gz
cd f08
make MPI_INST=${I_MPI_ROOT} F90=ifx NAME=ifx
cd include/ifx
mkdir ${I_MPI_ROOT}/include/mpi/back
cp ${I_MPI_ROOT}/include/mpi/* ${I_MPI_ROOT}/include/mpi/back
cp * ${I_MPI_ROOT}/include/mpi/

and try your example again.

Best

Tobias

hakostra1
New Contributor II
631 Views

Yes, the example compiles now. Thanks. I guess this means there is a problem with the included MPI_f08 module files, then?

Are there any plans to update/fix these in a future release? This is quite a tedious fix to apply on every computer and workstation where I want to compile my software, and in many cases I do not even have the privileges needed to apply it...

Anyways, thanks for the help so far!

TobiasK
Moderator
611 Views

@hakostra1 

We will fix it with the next release.
In case you cannot modify the installation folder, you can of course just recompile the module files into a folder of your choice and add that folder to your compile/link line.
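
Building on the binding-kit steps earlier in this thread, a per-user rebuild might look like the sketch below. $HOME/mpi-f08-mods is an arbitrary example path, the make variables are the ones shown above, and the oneAPI environment is assumed to be set up already:

```shell
# Rebuild the mpi_f08 module files into a user-writable directory,
# so no write access to $I_MPI_ROOT is required. Paths are examples.
MODDIR=$HOME/mpi-f08-mods
mkdir -p "$MODDIR"
cd "$MODDIR"
tar -xf "$I_MPI_ROOT/opt/mpi/binding/intel-mpi-binding-kit.tar.gz"
cd f08
make MPI_INST="$I_MPI_ROOT" F90=ifx NAME=ifx
cp include/ifx/* "$MODDIR"

# Point the compiler at the rebuilt modules when building the application:
mpiifx -I "$MODDIR" example-2.F90 -o example-2
```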

hakostra1
New Contributor II
609 Views

Great to hear it will be fixed! That makes life much easier for everyone.

Thanks a lot for looking into this!