Code:
import mpi4py
import time
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
print("rank", rank)

if __name__ == '__main__':
    if rank == 0:
        mem = np.array([0], dtype='i')
        win = MPI.Win.Create(mem, comm=comm)
    else:
        win = MPI.Win.Create(None, comm=comm)
    print(rank, "end")
(py3.6.8) ➜ ~ mpirun -n 2 python -u test.py
rank 0
rank 1
0 end
1 end
Abort(806449679): Fatal error in internal_Finalize: Other MPI error, error stack:
internal_Finalize(50)...........: MPI_Finalize failed
MPII_Finalize(345)..............:
MPID_Finalize(511)..............:
MPIDI_OFI_mpi_finalize_hook(895):
destroy_vni_context(1137).......: OFI domain close failed (ofi_init.c:1137:destroy_vni_context:Device or resource busy)
Why is this happening, and how can I debug it? The same code does not produce this error on another machine.
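For reference, this is the variant I plan to test next. It is only a guess on my part that the RMA window has to be released explicitly before MPI_Finalize runs at interpreter exit; the win.Free() call and the final Barrier below are my additions and are not part of the failing reproducer above:

import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if __name__ == '__main__':
    if rank == 0:
        mem = np.array([0], dtype='i')
        win = MPI.Win.Create(mem, comm=comm)
    else:
        win = MPI.Win.Create(None, comm=comm)

    # Guess: free the window before mpi4py's atexit handler calls
    # MPI_Finalize, so the OFI domain is no longer busy at shutdown.
    win.Free()
    comm.Barrier()
    print(rank, "end")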
Hi,
Since this is a duplicate of the thread at https://community.intel.com/t5/Intel-oneAPI-HPC-Toolkit/mpirun-error/m-p/1426821#M9990, we will no longer monitor this thread. We will continue addressing this issue in the other thread.
Thanks & Regards,
Santosh
