Intel® MPI Library
Get help with building, analyzing, optimizing, and scaling high-performance computing (HPC) applications.

Cannot use MPI 2019.0.045 beta

Hector
Beginner

Hi there!

I am unable to use the Intel MPI Library from Parallel Studio XE 2019 Beta on Windows.

I am trying to compile the following code with mpicc:
http://people.sc.fsu.edu/~jburkardt/c_src/hello_mpi/hello_mpi.c

It seems to compile ok.

However, when I run it with mpiexec there is no output.
mpiexec -n 1 hello_mpi.exe
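
For reference, that program is essentially a minimal MPI hello-world; a sketch along these lines (not the exact contents of the linked file) should print at least one line even with a single rank:

#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
  int rank, size;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  /* Each rank prints one line, so "mpiexec -n 1" should produce exactly one line of output. */
  printf("Hello from rank %d of %d\n", rank, size);

  MPI_Finalize();
  return 0;
}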

I don't have this problem with Intel MPI 2018.

Thanks for your help,

Hector

Chuck1
Beginner

Exact same problem here: it compiles fine, hydra_service is running fine, but there is no output, as you pointed out.

I asked about it here first but got no response:

https://software.intel.com/en-us/forums/intel-math-kernel-library/topic/776981

ALaza1
Novice

Same problem here: I compile under Version 19.0.0.045 Beta Build 20180317 and then run with a plain mpiexec, and the program exits immediately. I also get the fi_info error.

I previously did an mpiexec -register successfully.

I am also using:

I_MPI_FABRICS=shm:tcp
I_MPI_PIN_PROCESSOR_LIST=allcores
I_MPI_ROOT=C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2019.0.045\windows\mpi\intel64\bin\..\..
I_MPI_TCP_NETMASK=192.168.3.0/24

C:\cygwin64\home\art\Fortran\PI>mpiifort.bat -fpp -O3 -Qip -QaxCORE-AVX2 -QxSSE4.2 -traceback -o pi pi.f90
mpifc.bat for the Intel(R) MPI Library 2019 Beta for Windows*
Copyright 2007-2018 Intel Corporation.

Intel(R) Visual Fortran Intel(R) 64 Compiler for applications running on Intel(R) 64, Version 19.0.0.045 Beta Build 20180317
Copyright (C) 1985-2018 Intel Corporation.  All rights reserved.

ifort: NOTE: The Beta evaluation period for this product ends on 11-oct-2018 UTC.
Microsoft (R) Incremental Linker Version 14.14.26429.4
Copyright (C) Microsoft Corporation.  All rights reserved.

-out:pi.exe
-subsystem:console
-incremental:no
"/LIBPATH:C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2019.0.045\windows\mpi\intel64\bin\..\..\intel64\lib\relea
se_mt"
"/LIBPATH:C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2019.0.045\windows\mpi\intel64\bin\..\..\intel64\lib"
impi.lib
pi.obj
C:\cygwin64\home\art\Fortran\PI>mpiexec pi

C:\cygwin64\home\art\Fortran\PI>

Art

James_T_Intel
Moderator

We are attempting to identify the source of this problem.  Please try running with -localonly to help determine if all of the necessary DLL files are being loaded.
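
For example, a rough loader probe along these lines (a sketch that assumes the runtime DLL is named impi.dll, matching the impi.lib in the link lines above; it is not an official diagnostic) can show whether the MPI runtime DLL resolves from the current PATH:

#include <stdio.h>
#include <windows.h>

int main(void)
{
  /* Assumption: the Intel MPI runtime DLL is impi.dll; adjust the name if your install differs. */
  HMODULE h = LoadLibraryA("impi.dll");
  if (h == NULL) {
    printf("impi.dll could not be loaded (error %lu); check PATH / mpivars.bat\n", GetLastError());
    return 1;
  }
  char path[MAX_PATH];
  GetModuleFileNameA(h, path, MAX_PATH);
  printf("impi.dll loaded from: %s\n", path);
  FreeLibrary(h);
  return 0;
}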

Also, please submit bugs such as this to our support site at https://www.intel.com/supporttickets so that we can better track submissions and our resolution process.

ALaza1
Novice

Installed v19.0.0.070 Update 1 today.

1) The fi_info error goes away.

2) A firewall rule for smpd.exe gets added, but none is added for hydra_service. I think hydra_service obsoletes smpd.exe anyway.

3) mpiexec -localonly works OK, but without -localonly it fails to open a socket.

Example:

C:\cygwin64\home\art\Fortran\PI>mpiifort.bat -fpp -O3 -Qip -QaxCORE-AVX2 -QxSSE4.2 -traceback -o pi pi.f90
mpifc.bat for the Intel(R) MPI Library 2019 Beta Update 1 for Windows*
Copyright 2007-2018 Intel Corporation.

Intel(R) Visual Fortran Intel(R) 64 Compiler for applications running on Intel(R) 64, Version 19.0.0.070 Beta Build 20180524
Copyright (C) 1985-2018 Intel Corporation.  All rights reserved.

Microsoft (R) Incremental Linker Version 14.14.26429.4
Copyright (C) Microsoft Corporation.  All rights reserved.

-out:pi.exe
-subsystem:console
-incremental:no
"/LIBPATH:C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2019.0.070\windows\mpi\intel64\bin\..\..\intel64\lib\relea
se_mt"
"/LIBPATH:C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2019.0.070\windows\mpi\intel64\bin\..\..\intel64\lib"
impi.lib
pi.obj
C:\cygwin64\home\art\Fortran\PI>mpiexec pi
[mpiexec@blaze] ..\windows\src\hydra_sock.c (229): Retrying connection, retry_count=1, retries=0
[mpiexec@blaze] bstrap\service\service_launch.c (123): assert (!closed) failed
[mpiexec@blaze] bstrap\src\hydra_bstrap.c (348): error launching bstrap proxy
[mpiexec@blaze] mpiexec.c (1402): error setting up the boostrap proxies

C:\cygwin64\home\art\Fortran\PI>mpiexec -localonly pi
MPI startup(): shm:tcp fabric is unknown or has been removed from the product, please use ofi or shm:ofi instead
MPI startup(): I_MPI_TCP_NETMASK variable has been removed from the product, its value is ignored
Hostname = blaze Process     0  of     1 is alive
 number of intervals =            1
  pi is approximately:  3.200000000000000  Error is:  0.058407346410207
 number of intervals =            2
  pi is approximately:  3.162352941176470  Error is:  0.020760287586677
 number of intervals =            3
  pi is approximately:  3.150849209865604  Error is:  0.009256556275810
 number of intervals =            4
  pi is approximately:  3.146800518393943  Error is:  0.005207864804150
 number of intervals =            5
  pi is approximately:  3.144925864003328  Error is:  0.003333210413535
 number of intervals =            6
  pi is approximately:  3.143907427222438  Error is:  0.002314773632645
 number of intervals =            7
  pi is approximately:  3.143293317527468  Error is:  0.001700663937675
 number of intervals =            8
  pi is approximately:  3.142894729591688  Error is:  0.001302076001895
 number of intervals =            9
  pi is approximately:  3.142621456557613  Error is:  0.001028802967820
 program done, going to sleep for 1 min.
[mpiexec@blaze] Sending Ctrl-C to processes as requested
[mpiexec@blaze] Press Ctrl-C again to force abort
[mpiexec@blaze] mpiexec.c (1527): assert (exitcodes != NULL) failed

C:\cygwin64\home\art\Fortran\PI>mpiexec  pi
[mpiexec@blaze] ..\windows\src\hydra_sock.c (229): Retrying connection, retry_count=1, retries=0
[mpiexec@blaze] bstrap\service\service_launch.c (123): assert (!closed) failed
[mpiexec@blaze] bstrap\src\hydra_bstrap.c (348): error launching bstrap proxy
[mpiexec@blaze] mpiexec.c (1402): error setting up the boostrap proxies

C:\cygwin64\home\art\Fortran\PI>
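
For reference, pi.f90 here is just the usual midpoint-rule estimate of pi, looping over 1 to 9 intervals as in the output above; a rough C sketch of the same computation (not the exact Fortran source) would be:

#include <stdio.h>
#include <math.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
  int rank, size;

  MPI_Init(&argc, &argv);
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);

  /* Midpoint-rule estimate of pi = integral of 4/(1+x^2) on [0,1],
     looping over 1..9 intervals as in the output above (n = 1 gives 3.2). */
  for (int n = 1; n <= 9; n++) {
    double h = 1.0 / n, local = 0.0, pi = 0.0;
    for (int i = rank; i < n; i += size) {
      double x = h * (i + 0.5);              /* midpoint of subinterval i */
      local += h * 4.0 / (1.0 + x * x);
    }
    MPI_Reduce(&local, &pi, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
    if (rank == 0)
      printf(" number of intervals = %d\n  pi is approximately: %.15f  Error is: %.15f\n",
             n, pi, fabs(pi - 4.0 * atan(1.0)));
  }

  MPI_Finalize();
  return 0;
}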

 

art
