Intel® MPI Library

Two InfiniBand networks.

Adalberto_Fazzio
Beginner
Hello all!
I have an SGI cluster with two separate InfiniBand networks, and I want to use both of them for MPI communication. How do I configure Intel MPI to do this?
Does this involve the I_MPI_DEVICE environment variable?
Thank you for your attention.
Best regards.
Dmitry_S_Intel
Moderator
Hi!

Yes, it is possible with multirail.

Release Notes:
"Native InfiniBand* interface (OFED* verbs) support with multirail capability for ultimate InfiniBand* performance
- Set I_MPI_FABRICS=ofa for OFED* verbs only
- Set I_MPI_FABRICS=shm:ofa for shared memory and OFED* verbs
- Set I_MPI_OFA_NUM_ADAPTERS, etc., for multirail transfers"

Please also take a look at I_MPI_OFA_NUM_ADAPTERS and I_MPI_OFA_NUM_PORTS.
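As a rough sketch, a job script for a two-rail system might look like the lines below. The adapter and port counts are assumptions for your hardware, and ./your_app and the rank count are placeholders; please check the Intel MPI Library Reference Manual for the exact semantics of each variable:

    # Shared memory within a node, OFED verbs between nodes
    export I_MPI_FABRICS=shm:ofa
    # Spread traffic across both InfiniBand adapters (multirail)
    export I_MPI_OFA_NUM_ADAPTERS=2
    # Number of ports to use on each adapter
    export I_MPI_OFA_NUM_PORTS=1
    mpirun -n 64 ./your_app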

--
Dmitry Sivkov
Adalberto_Fazzio
Beginner
Hi Dmitry, thank you for your quick response!
Your advice will be very helpful; I'll take a look at those options in the manual to be sure of what I'm doing :)
Thank you again!