Intel® MPI Library

CFD: Intel MPI + Mellanox InfiniBand + Win 10...?

Pedersen__Nick

Hi!

Does anyone know if it is possible to run Intel MPI with Mellanox InfiniBand (ConnectX-5 or 6) cards running Mellanox's latest WinOF-2 v2.2 in a Windows 10 environment? I've been googling and reading for hours but I can't find any concrete information.

This is for running Ansys CFX/Fluent on a relatively small CFD cluster of 4 compute nodes. The current release of CFX/Fluent (2019 R3) runs on Intel MPI 2018 Release 3 by default.

Older versions of Intel MPI (2017, for example) specifically listed "Windows* OpenFabrics* (WinOF*) 2.0 or higher" and "Mellanox* WinOF* Rev 4.40 or higher" as supported InfiniBand software. The "Windows OpenFabrics (WinOF)" project appears to be dead and does not support Windows 10, and the older Mellanox WinOF Rev 4.40 does not support the newest Mellanox IB cards.

The release notes for Intel MPI 2018 and newer do not mention this older InfiniBand software, and instead mention Intel Omni-Path.

Mellanox's own release notes for WinOF-2 v2.2 mention only Microsoft MS MPI as a supported MPI implementation. ANSYS does run on MS MPI, but I believe that would mean moving the cluster to a Windows Server OS environment. I currently run the cluster successfully on Windows 10 using Intel MPI, but over 10GigE rather than InfiniBand.
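For reference, this is roughly how my current setup selects its fabric. A minimal sketch (Windows cmd), assuming the Intel MPI 2018 `I_MPI_FABRICS` and `I_MPI_DEBUG` environment variables; the host names and process count are placeholders, and I have not verified that the `dapl` fabric actually initializes against WinOF-2 on Windows 10:

```shell
:: Current working configuration: TCP over the 10GigE network.
set I_MPI_FABRICS=shm:tcp

:: What I would want to try for InfiniBand: the DAPL fabric.
:: (Unverified assumption that a DAPL provider exists under WinOF-2 v2.2.)
set I_MPI_FABRICS=shm:dapl

:: I_MPI_DEBUG=2 makes the library print which fabric it actually
:: selected at startup, which would confirm whether IB is being used.
mpiexec -n 4 -hosts node1,node2,node3,node4 -genv I_MPI_DEBUG 2 hostname
```

If the DAPL path fails to initialize, Intel MPI would normally either fall back or abort depending on whether the fabric list allows a fallback, so the debug output is the quickest way to see what is really happening.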

Thanks for any pointers!

Cheers.
