Intel® oneAPI HPC Toolkit
Get help with building, analyzing, optimizing, and scaling high-performance computing (HPC) applications.

assertion failed intel_transport_init.h

RHRK_TUKL
Beginner
968 Views

oneAPI version 2021.3, running under SLURM.

mpiexec.hydra (and likewise mpiexec and mpirun) produces:

Assertion failed in file ../../src/mpid/ch4/shm/posix/eager/include/intel_transport_init.h at line 1057: llc_id >= 0

ShivaniK_Intel
Moderator
940 Views

Hi,


Thanks for reaching out to us.


Could you please provide the sample reproducer code and the steps to reproduce the issue at our end?


Also, could you please provide your system environment details (OS version)?


Thanks & Regards

Shivani


RHRK_TUKL
Beginner
934 Views

#SBATCH --nodes=2 
#SBATCH --ntasks=4
#SBATCH --cpus-per-task=2
#SBATCH --ntasks-per-node=2

mpiicc -o oneA test_affinity.c

export I_MPI_DEBUG=100

mpiexec.hydra ./oneA
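
For completeness, assembled into a single batch script it looks roughly like this (the setvars.sh path is the default oneAPI install location and stands in for our local environment setup, which may differ per site):

#!/bin/bash
#SBATCH --nodes=2
#SBATCH --ntasks=4
#SBATCH --cpus-per-task=2
#SBATCH --ntasks-per-node=2

# Load the oneAPI 2021.3 compiler and MPI environment
source /opt/intel/oneapi/setvars.sh

mpiicc -o oneA test_affinity.c

export I_MPI_DEBUG=100
mpiexec.hydra ./oneA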

 

OS: Linux 3.10.0-1160.42.2.el7.x86_64 (RHEL/CentOS 7)

ShivaniK_Intel
Moderator
918 Views

Hi,


Thanks for providing the details.


We see that the sample reproducer code was not attached. Could you please provide it so that we can investigate the issue further?


Thanks & Regards

Shivani


RHRK_TUKL
Beginner
905 Views

OK, it is that simple. In my opinion, execution does not even get past MPI_Init().
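
For reference, a minimal sketch of what such a reproducer boils down to (assuming nothing beyond a bare MPI hello-world is needed, since the assertion fires during startup):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    /* The assertion fires inside the shared-memory transport setup,
     * so execution never gets past this call. */
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("rank %d of %d\n", rank, size);

    MPI_Finalize();
    return 0;
}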

ShivaniK_Intel
Moderator
835 Views

Hi,


Thanks for providing the requested details. We tried this on CentOS Linux 8 and were unable to reproduce the issue on our end.


Could you please provide the complete error log so that we can investigate your issue further?


Thanks & Regards

Shivani 


RHRK_TUKL
Beginner
715 Views

Hi,

we have installed the new release, 2021.4.

Now everything seems to work as it should.

The case may be closed.

Regards,

 Josef Schüle

RHRK_TUKL
Beginner
818 Views

Hello,

here is the SLURM error file produced with I_MPI_DEBUG=100 after calling

mpiexec.hydra ./exe

ShivaniK_Intel
Moderator
770 Views

Hi,


Thanks for providing the SLURM error file. Could you also provide the output file that contains the I_MPI_DEBUG log?


Also, could you let us know which libfabric provider you are using?
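
For example, a quick way to check (assuming the fi_info utility from the bundled libfabric is on your PATH):

# List the libfabric providers available on the node
fi_info -l

# Intel MPI also prints the selected provider at startup when
# I_MPI_DEBUG is set, in a line such as:
#   [0] MPI startup(): libfabric provider: mlx
export I_MPI_DEBUG=5
mpiexec.hydra ./exe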


Thanks & Regards

Shivani



ShivaniK_Intel
Moderator
653 Views

Hi,


Glad to know that your issue is resolved. If you need any additional information, please post a new question as this thread will no longer be monitored by Intel.


Thanks & Regards

Shivani

