Intel® MPI Library

mpd ring on user's account using the mpd ring of root

Sangamesh_B_
Beginner
On a 512-core Red Hat Linux cluster, the mpd ring under a user's account is not working on top of the root user's mpd ring.

The following is the error:

:~> mpdtrace
mpdtrace (send_dict_msg 554):send_dict_msg raised exception: sock= errmsg=:[Errno 32] Broken pipe:
mpdtb:
/opt/intel/impi/3.2.1.009/bin64/mpdlib.py, 554, send_dict_msg
/opt/intel/impi/3.2.1/bin64/mpdtrace, 79, mpdtrace
/opt/intel/impi/3.2.1/bin64/mpdtrace, 118,

mpdtrace: unexpected msg from mpd=:{'error_msg': 'invalid secretword to root mpd'}: Please examine the /tmp/mpd2.logfile_username logfile on each host in the ring

The .mpd.conf file in the user's home directory has the correct setting:
:~> cat .mpd.conf
MPD_USE_ROOT_MPD=1

However, if the user creates their own mpd ring under their account, it works.

May I know what's going wrong here?
Dmitry_K_Intel2
Employee
Hi San,

It looks like your .mpd.conf should contain:
MPD_SECRET_WORD=your_secret_word
MPD_USE_ROOT_MPD=yes
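A minimal sketch of recreating the file with the settings above. The secret word value is a placeholder and must match the one used by the root-owned mpd ring; note also that mpd refuses to run unless the file is readable by its owner only.

```shell
# Recreate ~/.mpd.conf with the suggested settings.
# "your_secret_word" is a placeholder; it must match the
# secret word configured for the root mpd ring.
cat > "$HOME/.mpd.conf" <<'EOF'
MPD_SECRET_WORD=your_secret_word
MPD_USE_ROOT_MPD=yes
EOF
# mpd requires the config file to be private to its owner.
chmod 600 "$HOME/.mpd.conf"
```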


Note that version 3.2.1 is quite old, and I'd recommend updating it to the latest 4.0 Update 3, which doesn't create mpd rings by default because it has a new process manager called Hydra.

Regards!
Dmitry
Sangamesh_B_
Beginner
Thanks for the reply..

If Intel MPI is updated to the latest 4.0 Update 3, do all the applications have to be recompiled, or will just linking against the new library work?

Dmitry_K_Intel2
Employee
Hi San,

The following functions have been changed in 4.x according to the MPI-2.1 standard:
MPI_Cart_create()
MPI_Cart_map()
MPI_Cart_sub()
MPI_Graph_create()

Also, the constant MPI_MAX_ERROR_STRING has been changed from 512 to 1024.
If your applications don't use these functions and this constant, you don't need to recompile them. You just need to set LD_LIBRARY_PATH to the new location (note: the directory structure is not the same).
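One quick way to find out is to grep your sources for the changed functions. This sketch uses a throwaway demo file to stand in for a real source tree; point grep at your own source directory instead.

```shell
# Sketch: scan sources for the MPI calls changed in 4.x.
# The demo file below is a placeholder for a real source tree.
mkdir -p /tmp/impi_api_check
cat > /tmp/impi_api_check/demo.c <<'EOF'
/* example reference to one of the changed functions */
MPI_Cart_create(comm, 2, dims, periods, 1, &cart);
EOF
# List files that reference any of the changed functions.
grep -rlE 'MPI_(Cart_(create|map|sub)|Graph_create)' /tmp/impi_api_check
```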

If your applications depend on these functions, you can set I_MPI_COMPATIBILITY=3 and run them without recompiling.

The Intel MPI Library will be installed in another directory (so you'll have two different versions), and by running the mpivars.sh script from the different locations you'll be able to switch between the libraries. So you can easily return to the 3.x version if anything goes wrong.
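A sketch of switching versions in a shell session; the install prefixes below are hypothetical and should be replaced with the actual paths on your cluster.

```shell
# Hypothetical install prefixes -- adjust to your cluster layout.
# Switch the environment to the 4.0 Update 3 install:
source /opt/intel/impi/4.0.3/bin64/mpivars.sh

# ...or fall back to the old 3.2.1 install:
# source /opt/intel/impi/3.2.1/bin64/mpivars.sh
```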

Also, I should mention that in version 4.0 Update 3 there is mpiexec (used with mpd) and mpiexec.hydra (used with the Hydra process manager). I'd suggest using 'mpirun' as a universal launcher (it will run mpiexec.hydra by default).
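A sketch of a launch line under 4.0 Update 3, assuming the mpivars.sh environment is already loaded; the application name and rank count are placeholders.

```shell
# Only needed if the application uses the changed MPI-2.1 calls
# and was built against Intel MPI 3.x:
export I_MPI_COMPATIBILITY=3

# mpirun dispatches to mpiexec.hydra by default in 4.0 Update 3.
# "./your_app" and "-n 512" are placeholders.
mpirun -n 512 ./your_app
```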

Regards!
Dmitry