Intel® MPI Library
Get help with building, analyzing, optimizing, and scaling high-performance computing (HPC) applications.

Difference between mpicc and mpiicc

Fuli_F_
Beginner

I wrote a simple MPI program as follows:

#include "mpi.h"
#include <stdio.h>
#include <math.h>
int main(int argc, char *argv[])
{
    int myid,numprocs;
    int namelen;
    char pro_name[MPI_MAX_PROCESSOR_NAME];
    MPI_Init(&argc,&argv);
    MPI_Comm_rank(MPI_COMM_WORLD,&myid);
    MPI_Comm_size(MPI_COMM_WORLD,&numprocs);
    MPI_Get_processor_name(pro_name,&namelen);
    printf("Process %d of %d on %s\n",myid,numprocs,pro_name);
    MPI_Finalize();
    return 0;
}

When I compile it with "mpicc -o xxx xxx.c" and run it with "mpirun -np 8 ./xxx", it correctly creates 8 processes. But when I compile it with "mpiicc -o xxx xxx.c" and run it with the same command, every process reports itself as process 0 of 1. I want to know what the difference is between mpicc and mpiicc. Is this caused by a mistake I made during installation, and how can I fix it? By the way, I installed Intel MPI and the Intel compilers by installing Intel Cluster Studio (l_ics_2013.1.039_intel64).

Here is the result:

[root@localhost test]# mpicc -o mpitest mpi-test.c
[root@localhost test]# mpirun -np 8 ./mpitest 
Process 3 of 8 on localhost.localdomain
Process 4 of 8 on localhost.localdomain
Process 5 of 8 on localhost.localdomain
Process 6 of 8 on localhost.localdomain
Process 7 of 8 on localhost.localdomain
Process 0 of 8 on localhost.localdomain
Process 1 of 8 on localhost.localdomain
Process 2 of 8 on localhost.localdomain
[root@localhost test]# mpiicc -o mpitest mpi-test.c
[root@localhost test]# mpirun -np 8 ./mpitest 
Process 0 of 1 on localhost.localdomain
Process 0 of 1 on localhost.localdomain
Process 0 of 1 on localhost.localdomain
Process 0 of 1 on localhost.localdomain
Process 0 of 1 on localhost.localdomain
Process 0 of 1 on localhost.localdomain
Process 0 of 1 on localhost.localdomain
Process 0 of 1 on localhost.localdomain

 

7 Replies
James_T_Intel
Moderator

The difference between the two is that mpiicc always uses icc, while mpicc lets you select the compiler and defaults to gcc.  But there's definitely something wrong here.  Please run the following commands and attach the output.txt file:

[plain]echo "***mpicc show" > output.txt
mpicc -show -o gtest mpi-test.c >> output.txt
echo "***mpiicc show" >> output.txt
mpiicc -show -o itest mpi-test.c >> output.txt
mpicc -o gtest mpi-test.c
mpiicc -o itest mpi-test.c
echo "***ldd gtest" >> output.txt
ldd gtest >> output.txt
echo "***ldd itest" >> output.txt
ldd itest >> output.txt
echo "***runtest gtest" >> output.txt
mpirun -n 8 -genv I_MPI_DEBUG 5 -verbose ./gtest 2>&1 >> output.txt
echo "***runtest itest" >> output.txt
mpirun -n 8 -genv I_MPI_DEBUG 5 -verbose ./itest 2>&1 >> output.txt[/plain]

My guess is that there's something wrong in the configuration, and one of these should show the problem.
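
For reference, the Intel MPI Library wrappers also let you pick the underlying compiler explicitly. A minimal sketch, assuming the -cc option and the I_MPI_CC variable behave as documented for Intel MPI 4.x:

[plain]# mpiicc always drives icc
mpiicc -o itest mpi-test.c

# mpicc defaults to gcc; the underlying compiler can be overridden
mpicc -cc=icc -o itest mpi-test.c    # per-invocation override (assumed -cc option)
export I_MPI_CC=icc                  # or via the environment (assumed variable)
mpicc -o itest mpi-test.c[/plain]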

Fuli_F_
Beginner

James Tullos (Intel) wrote:

The difference between the two is that mpiicc always uses icc, while mpicc lets you select the compiler and defaults to gcc.  But there's definitely something wrong here.  Please run the following commands and attach the output.txt file:

 

echo "***mpicc show" > output.txt
mpicc -show -o gtest mpi-test.c >> output.txt
echo "***mpiicc show" >> output.txt
mpiicc -show -o itest mpi-test.c >> output.txt
mpicc -o gtest mpi-test.c
mpiicc -o itest mpi-test.c
echo "***ldd gtest" >> output.txt
ldd gtest >> output.txt
echo "***ldd itest" >> output.txt
ldd itest >> output.txt
echo "***runtest gtest" >> output.txt
mpirun -n 8 -genv I_MPI_DEBUG 5 -verbose ./gtest 2>&1 >> output.txt
echo "***runtest itest" >> output.txt
mpirun -n 8 -genv I_MPI_DEBUG 5 -verbose ./itest 2>&1 >> output.txt

Thanks for your reply! I ran these commands and here is the result:

***mpicc show
icc -o gtest mpi-test.c -I/usr/local/include -pthread -Wl,-rpath -Wl,/usr/local/lib -Wl,--enable-new-dtags -L/usr/local/lib -lmpi
***mpiicc show
icc -o itest mpi-test.c -I/opt/intel//impi/4.1.1.036/intel64/include -L/opt/intel//impi/4.1.1.036/intel64/lib -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker /opt/intel//impi/4.1.1.036/intel64/lib -Xlinker -rpath -Xlinker /opt/intel/mpi-rt/4.1 -lmpigf -lmpi -lmpigi -ldl -lrt -lpthread
***ldd gtest
        linux-vdso.so.1 =>  (0x00007fffeddff000)
        libmpi.so.1 => /usr/local/lib/libmpi.so.1 (0x00007fd0ac7a8000)
        libm.so.6 => /lib64/libm.so.6 (0x0000003fc3e00000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x0000003fcfe00000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003fc4600000)
        libc.so.6 => /lib64/libc.so.6 (0x0000003fc4200000)
        libdl.so.2 => /lib64/libdl.so.2 (0x0000003fc4a00000)
        libopen-rte.so.7 => /usr/local/lib/libopen-rte.so.7 (0x00007fd0ac4fb000)
        libopen-pal.so.6 => /usr/local/lib/libopen-pal.so.6 (0x00007fd0ac1f5000)
        librt.so.1 => /lib64/librt.so.1 (0x0000003fc5200000)
        libnsl.so.1 => /lib64/libnsl.so.1 (0x0000003fd4e00000)
        libutil.so.1 => /lib64/libutil.so.1 (0x0000003fd3e00000)
        libimf.so => /opt/intel/composer_xe_2013_sp1.0.080/compiler/lib/intel64/libimf.so (0x00007fd0abd36000)
        libsvml.so => /opt/intel/composer_xe_2013_sp1.0.080/compiler/lib/intel64/libsvml.so (0x00007fd0ab2da000)
        libirng.so => /opt/intel/composer_xe_2013_sp1.0.080/compiler/lib/intel64/libirng.so (0x00007fd0ab0d3000)
        libintlc.so.5 => /opt/intel/composer_xe_2013_sp1.0.080/compiler/lib/intel64/libintlc.so.5 (0x00007fd0aae7d000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003fc3a00000)
***ldd itest
        linux-vdso.so.1 =>  (0x00007fff9a7ff000)
        libmpigf.so.4 => /opt/intel//impi/4.1.1.036/intel64/lib/libmpigf.so.4 (0x00007f6207515000)
        libmpi.so.4 => /opt/intel//impi/4.1.1.036/intel64/lib/libmpi.so.4 (0x00007f6206eb5000)
        libdl.so.2 => /lib64/libdl.so.2 (0x0000003fc4a00000)
        librt.so.1 => /lib64/librt.so.1 (0x0000003fc5200000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003fc4600000)
        libm.so.6 => /lib64/libm.so.6 (0x0000003fc3e00000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x0000003fcfe00000)
        libc.so.6 => /lib64/libc.so.6 (0x0000003fc4200000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003fc3a00000)
***runtest gtest
***runtest itest

Unfortunately, I got the error "mpirun: Error: unknown option "-genv". Type 'mpirun --help' for usage." when I ran the command "mpirun -n 8 -genv I_MPI_DEBUG 5 -verbose ./itest 2>&1 >> output.txt".

Fuli_F_
Beginner

Hi, James,

Thanks for your reply. I ran your commands and here are the results:

***mpicc show
icc -o gtest mpi-test.c -I/usr/local/include -pthread -Wl,-rpath -Wl,/usr/local/lib -Wl,--enable-new-dtags -L/usr/local/lib -lmpi
***mpiicc show
icc -o itest mpi-test.c -I/opt/intel//impi/4.1.1.036/intel64/include -L/opt/intel//impi/4.1.1.036/intel64/lib -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker /opt/intel//impi/4.1.1.036/intel64/lib -Xlinker -rpath -Xlinker /opt/intel/mpi-rt/4.1 -lmpigf -lmpi -lmpigi -ldl -lrt -lpthread
***ldd gtest
        linux-vdso.so.1 =>  (0x00007fffeddff000)
        libmpi.so.1 => /usr/local/lib/libmpi.so.1 (0x00007fd0ac7a8000)
        libm.so.6 => /lib64/libm.so.6 (0x0000003fc3e00000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x0000003fcfe00000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003fc4600000)
        libc.so.6 => /lib64/libc.so.6 (0x0000003fc4200000)
        libdl.so.2 => /lib64/libdl.so.2 (0x0000003fc4a00000)
        libopen-rte.so.7 => /usr/local/lib/libopen-rte.so.7 (0x00007fd0ac4fb000)
        libopen-pal.so.6 => /usr/local/lib/libopen-pal.so.6 (0x00007fd0ac1f5000)
        librt.so.1 => /lib64/librt.so.1 (0x0000003fc5200000)
        libnsl.so.1 => /lib64/libnsl.so.1 (0x0000003fd4e00000)
        libutil.so.1 => /lib64/libutil.so.1 (0x0000003fd3e00000)
        libimf.so => /opt/intel/composer_xe_2013_sp1.0.080/compiler/lib/intel64/libimf.so (0x00007fd0abd36000)
        libsvml.so => /opt/intel/composer_xe_2013_sp1.0.080/compiler/lib/intel64/libsvml.so (0x00007fd0ab2da000)
        libirng.so => /opt/intel/composer_xe_2013_sp1.0.080/compiler/lib/intel64/libirng.so (0x00007fd0ab0d3000)
        libintlc.so.5 => /opt/intel/composer_xe_2013_sp1.0.080/compiler/lib/intel64/libintlc.so.5 (0x00007fd0aae7d000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003fc3a00000)
***ldd itest
        linux-vdso.so.1 =>  (0x00007fff9a7ff000)
        libmpigf.so.4 => /opt/intel//impi/4.1.1.036/intel64/lib/libmpigf.so.4 (0x00007f6207515000)
        libmpi.so.4 => /opt/intel//impi/4.1.1.036/intel64/lib/libmpi.so.4 (0x00007f6206eb5000)
        libdl.so.2 => /lib64/libdl.so.2 (0x0000003fc4a00000)
"output.txt" 35L, 2300C                                                  1,1          椤剁                                                                        2
        libmpi.so.4 => /opt/intel//impi/4.1.1.036/intel64/lib/libmpi.so.4 (0x00007f6206eb5000)
        libdl.so.2 => /lib64/libdl.so.2 (0x0000003fc4a00000)
        librt.so.1 => /lib64/librt.so.1 (0x0000003fc5200000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003fc4600000)
        libm.so.6 => /lib64/libm.so.6 (0x0000003fc3e00000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x0000003fcfe00000)
        libc.so.6 => /lib64/libc.so.6 (0x0000003fc4200000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003fc3a00000)
***runtest gtest
***runtest itest
 

Unfortunately, I got this error "mpirun: Error: unknown option "-genv". Type 'mpirun --help' for usage." when I ran "mpirun -n 8 -genv I_MPI_DEBUG 5 -verbose ./gtest 2>&1 >> output.txt".

By the way, ICS is installed on a standalone server, so when the installer asked me to create the machines.LINUX file, I carelessly wrote a line that doesn't match the output of the hostname command.

James_T_Intel
Moderator

First, correct the machine file.  The host name listed should match the output from hostname.

Second, what is the output from

[plain]which mpirun[/plain]

Fuli_F_
Beginner

James Tullos (Intel) wrote:

First, correct the machine file.  The host name listed should match the output from hostname.

Second, what is the output from

 

which mpirun

 

Thank you very much! The cause is probably that I compiled with the Intel wrapper mpiicc but launched the binary with the mismatched Open MPI mpirun. Another member of our team had installed Open MPI on the server without telling us, so the mpicc and mpirun found first on the PATH are the ones in /usr/local/bin.
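
A quick way to confirm which stack a binary and launcher come from, as a rough sketch using standard tools (file names taken from this thread):

[plain]which -a mpicc mpirun       # list every mpicc/mpirun on the PATH, in search order
ldd ./itest | grep -i mpi   # show which MPI library the mpiicc-built binary links against[/plain]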

James_T_Intel
Moderator

To avoid this in the future, simply source the environment variable script provided in the installation before using the Intel® MPI Library.  This will set the paths correctly and also set I_MPI_ROOT.
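
A minimal sketch for the installation path shown earlier in this thread (the exact location of mpivars.sh may differ between Intel MPI versions):

[plain]# put the Intel MPI wrappers and launcher first on the PATH and set I_MPI_ROOT
source /opt/intel/impi/4.1.1.036/intel64/bin/mpivars.sh

# sanity check: both should now resolve under /opt/intel/impi
which mpiicc mpirun
echo $I_MPI_ROOT[/plain]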

TimP
Honored Contributor III

It's still highly desirable to install the various MPI builds in their own directories, e.g. /opt/openmpi_16/, so they don't get mixed in at build or run time unless the corresponding PATH and LD_LIBRARY_PATH have been set, possibly by module load (http://modules.sourceforge.net).
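
For example, switching between stacks with environment modules might look roughly like this (the module names are hypothetical and depend on the local modulefiles):

[plain]module avail              # list the MPI stacks that have modulefiles
module load openmpi/1.6    # prepend the /opt/openmpi_16 dirs to PATH and LD_LIBRARY_PATH (assuming the modulefile does so)
which mpicc mpirun         # verify the expected build now comes first
module unload openmpi/1.6  # remove it again before switching to Intel MPI[/plain]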
