I installed parallel_studio_xe_2018_update2_cluster_edition_setup on Microsoft Windows [Version 10.0.17134.165], and I noticed that MPI fails to run on my laptop. I built the simple MPI test (test.f90) that comes with the suite, and it fails when running in parallel.
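For context, the build and run steps are essentially the following (from an Intel compiler command prompt, so the MPI environment is already set up; the exact commands may differ slightly, but they are along these lines, with mpiifort being the Intel MPI Fortran wrapper):

mpiifort test.f90 -o test.exe
mpiexec -np 2 -localonly test.exe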
Here is the relevant environment information (mpiexec locations on PATH and the MPI version):
where mpiexec
C:\Program Files (x86)\IntelSWTools\compilers_and_libraries_2018.2.185\windows\mpi\intel64\bin\mpiexec.exe
C:\Program Files (x86)\Common Files\Intel\Shared Libraries\redist\intel64_win\mpirt\mpiexec.exe
C:\Program Files (x86)\Common Files\Intel\Shared Libraries\redist\ia32_win\mpirt\mpiexec.exe

mpiexec --version
Intel(R) MPI Library for Windows* OS, Version 2018 Update 2 Build 20180125
Copyright 2003-2018 Intel Corporation.
This is the error that I receive when I run it:
mpiexec -np 2 -localonly test.exe
[unset]: Error reading initack on 620
Error on readline:: No error
[unset]: write_line error; fd=620 buf=:cmd=init pmi_version=1 pmi_subversion=1 :
system msg for write_line failure : No error
[unset]: Unable to write to PMI_fd
[unset]: write_line error; fd=620 buf=:cmd=barrier_in :
system msg for write_line failure : No error
[unset]: write_line error; fd=620 buf=:cmd=get_ranks2hosts :
system msg for write_line failure : No error
[unset]: expecting cmd="put_ranks2hosts", got cmd=""
Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(805): fail failed
MPID_Init(1743)......: channel initialization failed
MPID_Init(2144)......: PMI_Init returned -1
[unset]: write_line error; fd=620 buf=:cmd=abort exitcode=68204815 :
system msg for write_line failure : No error
[unset]: Error reading initack on 684
Error on readline:: No error
[unset]: write_line error; fd=684 buf=:cmd=init pmi_version=1 pmi_subversion=1 :
system msg for write_line failure : No error
[unset]: Unable to write to PMI_fd
[unset]: write_line error; fd=684 buf=:cmd=barrier_in :
system msg for write_line failure : No error
[unset]: write_line error; fd=684 buf=:cmd=get_ranks2hosts :
system msg for write_line failure : No error
[unset]: expecting cmd="put_ranks2hosts", got cmd=""
Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(805): fail failed
MPID_Init(1743)......: channel initialization failed
MPID_Init(2144)......: PMI_Init returned -1
[unset]: write_line error; fd=684 buf=:cmd=abort exitcode=68204815 :
system msg for write_line failure : No error
I want to mention that this same test works fine on some other desktop machines running Windows 10, but fails on others.
Could you please let me know if there is a fix for this issue?
Thanks !
I have also tested the latest update (Update 3), and it gives the same error.
Could you please let me know if there is a fix for this? We have a large open CFD code project that uses parallel MPI, and this issue has been reported by a couple of users who are no longer able to run it in parallel.
Thanks !
Below is another example I used to test it with.
#include "mpi.h" #include <stdio.h> #include <string.h> int main (int argc, char *argv[]) { int i, rank, size, namelen; char name[MPI_MAX_PROCESSOR_NAME]; MPI_Status stat; MPI_Init (&argc, &argv); MPI_Comm_size (MPI_COMM_WORLD, &size); MPI_Comm_rank (MPI_COMM_WORLD, &rank); MPI_Get_processor_name (name, &namelen); if (rank == 0) { printf ("Hello world: rank %d of %d running on %s\n", rank, size, name); for (i = 1; i < size; i++) { MPI_Recv (&rank, 1, MPI_INT, i, 1, MPI_COMM_WORLD, &stat); MPI_Recv (&size, 1, MPI_INT, i, 1, MPI_COMM_WORLD, &stat); MPI_Recv (&namelen, 1, MPI_INT, i, 1, MPI_COMM_WORLD, &stat); MPI_Recv (name, namelen + 1, MPI_CHAR, i, 1, MPI_COMM_WORLD, &stat); printf ("Hello world: rank %d of %d running on %s\n", rank, size, name); } } else { MPI_Send (&rank, 1, MPI_INT, 0, 1, MPI_COMM_WORLD); MPI_Send (&size, 1, MPI_INT, 0, 1, MPI_COMM_WORLD); MPI_Send (&namelen, 1, MPI_INT, 0, 1, MPI_COMM_WORLD); MPI_Send (name, namelen + 1, MPI_CHAR, 0, 1, MPI_COMM_WORLD); } MPI_Finalize (); return (0); }
I compiled the above code with:
mpicc -o test.exe test.c
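For reference, on a machine where MPI works, this example prints one line per rank, along these lines (HOSTNAME is a placeholder for the actual machine name):

Hello world: rank 0 of 2 running on HOSTNAME
Hello world: rank 1 of 2 running on HOSTNAME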
On the affected machine, however, running it with mpiexec gives the same error:
mpiexec -np 2 -localonly test.exe
[unset]: Error reading initack on 668
Error on readline:: No error
[unset]: write_line error; fd=668 buf=:cmd=init pmi_version=1 pmi_subversion=1 :
system msg for write_line failure : No error
[unset]: Unable to write to PMI_fd
[unset]: write_line error; fd=668 buf=:cmd=barrier_in :
system msg for write_line failure : No error
[unset]: write_line error; fd=668 buf=:cmd=get_ranks2hosts :
system msg for write_line failure : No error
[unset]: expecting cmd="put_ranks2hosts", got cmd=""
Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(805): fail failed
MPID_Init(1743)......: channel initialization failed
MPID_Init(2144)......: PMI_Init returned -1
[unset]: write_line error; fd=668 buf=:cmd=abort exitcode=68204815 :
system msg for write_line failure : No error
[unset]: Error reading initack on 684
Error on readline:: No error
[unset]: write_line error; fd=684 buf=:cmd=init pmi_version=1 pmi_subversion=1 :
system msg for write_line failure : No error
[unset]: Unable to write to PMI_fd
[unset]: write_line error; fd=684 buf=:cmd=barrier_in :
system msg for write_line failure : No error
[unset]: write_line error; fd=684 buf=:cmd=get_ranks2hosts :
system msg for write_line failure : No error
[unset]: expecting cmd="put_ranks2hosts", got cmd=""
Fatal error in MPI_Init: Other MPI error, error stack:
MPIR_Init_thread(805): fail failed
MPID_Init(1743)......: channel initialization failed
MPID_Init(2144)......: PMI_Init returned -1
[unset]: write_line error; fd=684 buf=:cmd=abort exitcode=68204815 :
system msg for write_line failure : No error