
Module load in oneapi for coarray run

sverdrup
Beginner

Hi all,

I'm trying to compile and run a Fortran code with coarrays. The compilation of the code seems OK, but when I run it I obtain the following error:

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= RANK 0 PID 721 RUNNING AT 41ca5757f459
= KILLED BY SIGNAL: 11 (Segmentation fault)
===================================================================================

===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= RANK 1 PID 722 RUNNING AT 41ca5757f459
= KILLED BY SIGNAL: 11 (Segmentation fault)
===================================================================================

I've also tried a hello world code with coarrays, but the result is the same, so it doesn't seem to be a memory error in my code! Is it possible that I have to load other modules?
These are my currently loaded modules:

Currently Loaded Modulefiles:
 1) mpi/2021.6.0   2) tbb/latest   3) compiler-rt/latest   4) oclfpga/latest   5) compiler/2022.1.0   6) init_opencl/2022.1.0   7) vtune/2022.2.0

and this is the list of the available modules from the oneAPI modulefiles:

--------------------------------------------------------------- /opt/intel/oneapi/modulefiles ---------------------------------------------------------------
advisor/2022.1.0 compiler-rt32/latest dev-utilities/2021.6.0 dnnl/latest inspector/2022.1.0 mkl/latest vpl/2022.1.0
advisor/latest compiler/2022.1.0 dev-utilities/latest dpl/2021.7.0 inspector/latest mkl32/2022.1.0 vpl/latest
ccl/2021.6.0 compiler/latest dnnl-cpu-gomp/2022.1.0 dpl/latest intel_ipp_intel64/2021.6.0 mkl32/latest vtune/2022.2.0
ccl/latest compiler32/2022.1.0 dnnl-cpu-gomp/latest icc/2022.1.0 intel_ipp_intel64/latest mpi/2021.6.0 vtune/latest
clck/2021.6.0 compiler32/latest dnnl-cpu-iomp/2022.1.0 icc/latest intel_ippcp_intel64/2021.6.0 mpi/latest
clck/latest dal/2021.6.0 dnnl-cpu-iomp/latest icc32/2022.1.0 intel_ippcp_intel64/latest oclfpga/2022.1.0
compiler-rt/2022.1.0 dal/latest dnnl-cpu-tbb/2022.1.0 icc32/latest itac/2021.6.0 oclfpga/latest
compiler-rt/latest debugger/2021.6.0 dnnl-cpu-tbb/latest init_opencl/2022.1.0 itac/latest tbb/2021.6.0
compiler-rt32/2022.1.0 debugger/latest dnnl/2022.1.0 init_opencl/latest mkl/2022.1.0 tbb/latest
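
For reference, the coarray hello world I tried is equivalent to this minimal sketch (file name, flags, and image count are just examples, assuming the classic ifort driver from the compiler/2022.1.0 module):

cat > hello.f90 <<'EOF'
program hello
  implicit none
  ! this_image() and num_images() are coarray intrinsics
  write(*,*) 'Hello from image', this_image(), 'of', num_images()
end program hello
EOF

ifort -coarray=shared hello.f90 -o hello     # shared-memory coarrays, single node
FOR_COARRAY_NUM_IMAGES=2 ./hello             # run with 2 images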

Thanks in advance,

Best regards

26 Replies
Ron_Green
Moderator

netCDF and HDF5 are kicking me around.  I've never found it easy to build those packages.  I tried to install with YUM, but all I got was the libraries, no include or module file(s).

Ron_Green
Moderator

I found hdf5-devel, netcdf-devel, and netcdf-fortran-devel, but these are built for gfortran (sigh).  Looks like I have to go back to source builds.

 

Which brings up a good question: you DO have netcdf-fortran and hdf5 built for INTEL FORTRAN, right?  You aren't trying to use libs built for gfortran, are you?

sverdrup
Beginner

Hello Ron, sorry for the delay...

The installation of the netCDF libraries for the Intel compilers is quite complicated. You have to compile the libraries and their various dependencies with the Intel Fortran and C compilers.

I've written a little bash script to download and compile all the libraries; I hope it will be useful to you.

You will definitely have to change something, and at the end remember to export the library paths.
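
In outline, the script does something like this (a trimmed sketch; the version numbers, install prefix, and configure options are examples to adapt, and the zlib/szip prerequisites are omitted):

export PREFIX=$HOME/local/netcdf-intel
export CC=icc CXX=icpc FC=ifort F77=ifort

# HDF5 first, since netcdf-c depends on it
cd hdf5-1.12.0
./configure --prefix=$PREFIX --enable-fortran
make && make install

# then netcdf-c, pointed at the HDF5 installed above
cd ../netcdf-c-4.9.0
CPPFLAGS=-I$PREFIX/include LDFLAGS=-L$PREFIX/lib ./configure --prefix=$PREFIX
make && make install

# finally netcdf-fortran, built with ifort against that netcdf-c
cd ../netcdf-fortran-4.5.4
CPPFLAGS=-I$PREFIX/include LDFLAGS=-L$PREFIX/lib ./configure --prefix=$PREFIX
make && make install

# and remember to export the library paths at the end
export LD_LIBRARY_PATH=$PREFIX/lib:$LD_LIBRARY_PATH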

If you need help, don't hesitate to ask. Best regards,

a.

Ron_Green
Moderator

I have it built and running to a certain point.  Seems it needs a data file in the /data directory??  Oh, and what OS distro and version are you running under?

 

Also, do this 

 

ldd <executable>

My executable is named 'bathy' and shows this:

 

 

ldd bathy
	linux-vdso.so.1 (0x00007fff961e0000)
	libnetcdff.so.7 => /cts/tools/library/netcdf4-intel-fortran/4.5.4/lib/libnetcdff.so.7 (0x00007f1d30b28000)
	libsz.so.2 => /cts/tools/library/szip/2.1.1/lib/libsz.so.2 (0x00007f1d30b13000)
	libicaf.so => /nfs/pdx/disks/cts2/tools/oneapi/2022.2.0-up1/compiler/2022.1.0/linux/compiler/lib/intel64_lin/libicaf.so (0x00007f1d30aa5000)
	libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f1d3092f000)
	libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f1d3090c000)
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f1d3071a000)
	/lib64/ld-linux-x86-64.so.2 (0x00007f1d30dc2000)
	libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f1d306ff000)
	libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f1d306f9000)
	libnetcdf.so.19 => /cts/tools/library/netcdf4-intel/4.9.0/lib/libnetcdf.so.19 (0x00007f1d304b8000)
	libhdf5_hl.so.200 => /cts/tools/library/hdf5/1.12.0/lib/libhdf5_hl.so.200 (0x00007f1d3048e000)
	libhdf5.so.200 => /cts/tools/library/hdf5/1.12.0/lib/libhdf5.so.200 (0x00007f1d2fe0b000)
	libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f1d2fdef000)
	libimf.so => /nfs/pdx/disks/cts2/tools/oneapi/2022.2.0-up1/compiler/2022.1.0/linux/compiler/lib/intel64_lin/libimf.so (0x00007f1d2f761000)
	libifport.so.5 => /nfs/pdx/disks/cts2/tools/oneapi/2022.2.0-up1/compiler/2022.1.0/linux/compiler/lib/intel64_lin/libifport.so.5 (0x00007f1d2f533000)
	libifcoremt.so.5 => /nfs/pdx/disks/cts2/tools/oneapi/2022.2.0-up1/compiler/2022.1.0/linux/compiler/lib/intel64_lin/libifcoremt.so.5 (0x00007f1d2f393000)
	libsvml.so => /nfs/pdx/disks/cts2/tools/oneapi/2022.2.0-up1/compiler/2022.1.0/linux/compiler/lib/intel64_lin/libsvml.so (0x00007f1d2d331000)
	libintlc.so.5 => /nfs/pdx/disks/cts2/tools/oneapi/2022.2.0-up1/compiler/2022.1.0/linux/compiler/lib/intel64_lin/libintlc.so.5 (0x00007f1d2d0b9000)
	libmpi.so.12 => /nfs/pdx/disks/cts2/tools/oneapi/2022.2.0-up1/mpi/2021.6.0/lib/release/libmpi.so.12 (0x00007f1d2b871000)
	libirng.so => /nfs/pdx/disks/cts2/tools/oneapi/2022.2.0-up1/compiler/2022.1.0/linux/compiler/lib/intel64_lin/libirng.so (0x00007f1d2b507000)
	libmpifort.so.12 => /nfs/pdx/disks/cts2/tools/oneapi/2022.2.0-up1/mpi/2021.6.0/lib/libmpifort.so.12 (0x00007f1d2b153000)
	librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f1d2b149000)

 

Ron_Green
Moderator

and try

 

export I_MPI_FABRICS=shm

and run on one node only
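
i.e. something like this (FOR_COARRAY_NUM_IMAGES is the standard Intel Fortran control for the image count; 2 is just an example):

export I_MPI_FABRICS=shm
export FOR_COARRAY_NUM_IMAGES=2   # optional: fix the image count for a shared-memory run
./bathy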
