Error in f2py with MPI

Hi,
I am trying to use f2py with MPI, following this example (Tutorial — MPI for Python 3.1.5 documentation). The code compiles with the NVIDIA Fortran compiler, but I get the following error when running it:

hello.sayhello(fcomm)
*** The MPI_Comm_f2c() function was called before MPI_INIT was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.

The same code works properly when I switch to gfortran. How can I fix this error?
I am using HPC SDK 22.11.
This is how I compile the code:
(nv fortran) f2py -c --f90exec=mpif90 --fcompiler=nv hello.f90 -m hello
(gfortran) f2py -c --f90exec=mpif90 --fcompiler=gfortran hello.f90 -m hello

Hi ruihengsong,

I don’t use Python or f2py myself, so I can only guess at what the issue is.

The error indicates that MPI_Init wasn’t called before calling the MPI function. Looking at the examples you linked to, there’s no explicit call to MPI_Init, so I presume it gets implicitly called within Python when loading the MPI module.

My guess is that your Python MPI module was built with a different MPI than the one you’re using to compile with nvfortran. Hence, the wrong MPI runtime is getting initialized.

Which mpif90 are you using?
What is the output from the command “ldd hello”?
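Since the question is which libmpi each side links against, a small sketch like this (standard library only; it just runs `ldd` and parses the output) can compare the MPI runtime mpi4py was built against with the one linked into the f2py module. The file paths in the comments are placeholders, not exact names:

```python
import re
import subprocess

def mpi_libs(ldd_output):
    """Extract the resolved paths of MPI shared libraries from `ldd` output."""
    paths = []
    for line in ldd_output.splitlines():
        m = re.match(r"\s*libmpi\S*\s*=>\s*(\S+)", line)
        if m:
            paths.append(m.group(1))
    return paths

def mpi_libs_of(binary):
    """Run `ldd` on a shared object and return the MPI libraries it links."""
    out = subprocess.run(["ldd", binary], capture_output=True, text=True).stdout
    return mpi_libs(out)

# Compare the two sides; differing libmpi paths would explain the
# "MPI_Comm_f2c() before MPI_INIT" abort:
#   from mpi4py import MPI
#   print(mpi_libs_of(MPI.__file__))      # MPI that mpi4py was built with
#   print(mpi_libs_of("<path to the f2py-built hello module>"))
```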

-Mat

I think you are right. The Python MPI module might have been built with GCC.
I use /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/comm_libs/mpi/bin/mpif90.
Here is the output from ‘ldd hello’:
linux-vdso.so.1 (0x00007ffc20af9000)
libmpi_usempif08.so.40 => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/comm_libs/mpi/lib/libmpi_usempif08.so.40 (0x00007f1de6604000)
libmpi_usempi_ignore_tkr.so.40 => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/comm_libs/mpi/lib/libmpi_usempi_ignore_tkr.so.40 (0x00007f1de63fd000)
libmpi_mpifh.so.40 => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/comm_libs/mpi/lib/libmpi_mpifh.so.40 (0x00007f1de61a5000)
libmpi.so.40 => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/comm_libs/mpi/lib/libmpi.so.40 (0x00007f1de5d15000)
libnvf.so => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/compilers/lib/libnvf.so (0x00007f1de5603000)
libnvomp.so => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/compilers/lib/libnvomp.so (0x00007f1de48f9000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f1de48ea000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f1de48c7000)
libnvcpumath.so => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/compilers/lib/libnvcpumath.so (0x00007f1de44af000)
libnvc.so => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/compilers/lib/libnvc.so (0x00007f1de424a000)
librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f1de423e000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f1de404c000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f1de4031000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f1de3ee2000)
libopen-rte.so.40 => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/comm_libs/mpi/lib/libopen-rte.so.40 (0x00007f1de3b7a000)
libopen-pal.so.40 => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/comm_libs/mpi/lib/libopen-pal.so.40 (0x00007f1de364d000)
librdmacm.so.1 => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/comm_libs/mpi/lib/librdmacm.so.1 (0x00007f1de3438000)
libibverbs.so.1 => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/comm_libs/mpi/lib/libibverbs.so.1 (0x00007f1de322b000)
libnuma.so.1 => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/comm_libs/mpi/lib/libnuma.so.1 (0x00007f1de3020000)
libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007f1de3019000)
libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f1de2ffd000)
libnvhpcatm.so => /opt/nvidia/hpc_sdk/Linux_x86_64/22.11/comm_libs/mpi/lib/../../../../compilers/lib/libnvhpcatm.so (0x00007f1de2df2000)
libatomic.so.1 => /lib/x86_64-linux-gnu/libatomic.so.1 (0x00007f1de2de8000)
/lib64/ld-linux-x86-64.so.2 (0x00007f1de683f000)
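If mpi4py was indeed built against a different (GCC-toolchain) MPI, rebuilding it against the HPC SDK’s bundled Open MPI should make both sides initialize the same runtime. A sketch, assuming the mpicc sits next to the mpif90 path above (adjust for your install):

```shell
# Rebuild mpi4py against the NVIDIA HPC SDK's Open MPI.
# MPICC must point at the same MPI installation used to compile hello.f90;
# mpi4py's build honors the MPICC environment variable.
env MPICC=/opt/nvidia/hpc_sdk/Linux_x86_64/22.11/comm_libs/mpi/bin/mpicc \
    python -m pip install --no-cache-dir --force-reinstall mpi4py
```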