Intel® MPI Library
Get help with building, analyzing, optimizing, and scaling high-performance computing (HPC) applications.

system call intrinsics not working when using ILP64 version of Intel MPI

John_Y_
Beginner

Hi,

I am using Intel MPI 5.1.2 and ifort 16.0.1 under Linux. I have a code that uses the 'system' intrinsic to perform some file/directory manipulations, and it no longer works properly with the latest version of Parallel Studio (it works with earlier versions). Attached is a very simple test case that illustrates the problem by attempting to create a directory using either the 'system' function or the 'execute_command_line' function. Basically, with the standard 4-byte MPI interface there are no issues. However, with the 8-byte ILP64 MPI interface, the 'system' call fails silently and the 'execute_command_line' call crashes the program. Also attached is a screenshot of the test case run from the command line on my system.
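
The attachment isn't reproduced here, but a minimal test along the same lines might look like the following sketch (the directory names and the IFPORT import are illustrative choices, not necessarily what the attachment uses):

    program test_ilp64
      use mpi
      use ifport, only: system   ! ifort portability extension providing SYSTEM
      implicit none
      integer :: ierr            ! default integer: 8 bytes under -i8 (ILP64)
      integer(4) :: rc_sys       ! IFPORT's SYSTEM returns a 4-byte integer
      integer :: rc_ecl          ! EXECUTE_COMMAND_LINE takes a default integer
      call MPI_INIT(ierr)
      ! Both calls simply try to create a directory; with the ILP64
      ! interface the first reportedly fails silently and the second crashes.
      rc_sys = system('mkdir testdir_system')
      print *, 'system() returned: ', rc_sys
      call execute_command_line('mkdir testdir_ecl', exitstat=rc_ecl)
      print *, 'execute_command_line exitstat: ', rc_ecl
      call MPI_FINALIZE(ierr)
    end program test_ilp64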

Another issue I notice with the MPI included in the latest Parallel Studio is that I keep getting an 'undefined symbol: MPI_F_STATUSES_IGNORE' warning when running the test case with the ILP64 interface. The ldd output is also shown in the screenshot. If I run

    nm /PATH/TO/libmpi_ilp64.so.4 | grep STATUSES_IGNORE

    132:                 U MPI_F_STATUSES_IGNORE

I see that the symbol appears in the library, but the 'U' flag means it is undefined there: the library references it and expects it to be resolved from another library at load time.
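
For what it's worth, to see which shared object on the link line actually defines the symbol, something like the following should work (the path and the library name are illustrative, assuming GNU nm):

    nm -D --defined-only /PATH/TO/libmpifort.so | grep MPI_F_STATUSES_IGNORE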

Steven_L_Intel1
Employee

I'm going to move this to the Clustering (MPI) forum. I don't see that it's a Fortran issue. I'll note that on Linux, EXECUTE_COMMAND_LINE just calls system.
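
If you want to rule out the Fortran runtime entirely, here is a quick, untested sketch that calls the C library's system() directly through ISO_C_BINDING; if this works where SYSTEM fails, the problem is in the runtime or the ILP64 preload rather than in libc:

    program c_system_test
      use iso_c_binding, only: c_int, c_char, c_null_char
      implicit none
      interface
        ! Bind directly to the C library's system(3)
        function c_system(command) bind(c, name='system')
          import :: c_int, c_char
          integer(c_int) :: c_system
          character(kind=c_char), intent(in) :: command(*)
        end function c_system
      end interface
      integer(c_int) :: rc   ! c_int stays 4 bytes even when compiling with -i8
      rc = c_system('mkdir testdir_c' // c_null_char)
      print *, 'C system() returned: ', rc
    end program c_system_test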

John_Y_
Beginner

Hi,

I was just wondering if anybody had a chance to look at the issue described above.

Thanks,

John

Owen_W_
Beginner

It seems this is still an issue. I encountered the same behavior this week with Intel MPI 5.1.3 and ifort 16.0.3 under Linux.

Robert_Adams
Novice

Has this issue been addressed?

I'm seeing the same issue in Intel MPI 5.1.3.

Thanks,

Rob

Mark_L_Intel
Moderator

Hi,

  I used older MPI and compiler versions pretty close to the ones you used, but I could reproduce the issue. Can you provide the exact commands you used to compile and run the attached a.f90? Sorry, the screenshot has a rather small font.

Mark

Mark_L_Intel
Moderator

Correction to the previous post: I meant "could NOT reproduce".

Mark_L_Intel
Moderator

After a little more tweaking, I think I was able to reproduce the issue with these commands:

-bash-4.2$ mpiifort -i8 -o test-ilp64.x test-ilp64.f90

-bash-4.2$ mpirun -ilp64 -n 1 ./test-ilp64.x

Is that how you were running your example?

Thanks

Mark

John_Young
New Contributor I

Hi Mark,

Mark L. (Intel) wrote:

-bash-4.2$ mpiifort -i8 -o test-ilp64.x test-ilp64.f90

-bash-4.2$ mpirun -ilp64 -n 1 ./test-ilp64.x

Yes, that is how I was compiling and running the program. 

Thanks,

John

Mark_L_Intel
Moderator

John,

  I talked to engineering; they asked me to submit an internal ticket, which I intend to do. I do not have a time estimate for the resolution at this point.

If you have an Intel Premier Support (IPS) account, you could also submit this through IPS; that way you can track the issue directly.

Thanks,

Mark

William_D_2
Beginner

This still seems to be an open issue. Any ideas on when it will be fixed?

Mark_L_Intel
Moderator

Let me check. What version of Intel MPI are you using now? Can you try the latest, 2017 Update 2?

kostas_s_1
Beginner

Still unresolved with 2017.2 and 2017.3.

Sachin_m_
Novice

Still unresolved with 2018.0.2

Sergey_Y_Intel
Employee

Sachin m. wrote:

Still unresolved with 2018.0.2

Please check the latest Intel MPI. The issue has been fixed since 2018 Update 3.
