Hi,
Is there an up-to-date compile script available for linking the MUMPS library into ElmerSolver_mpi?
I have all the required library and include files for OpenMPI, MUMPS, ScaLAPACK and BLACS.
I would appreciate help from the forum members.
Thanks,
saa_1973.
MUMPS Library Linking
Re: MUMPS Library Linking
Hello,
I get the following error when I try to link MUMPS in Elmer:
./libelmersolver.so: undefined reference to `dmumps_'
collect2: ld returned 1 exit status
make[2]: *** [ElmerSolver_mpi] Error 1
I have used the following compile script.
-----------------------------------------------------------------------------------------------------------------------
#!/bin/sh -f
# This compilation script worked on a RHEL5.0
# kernel: x86_64 2.6.18-92.1.22.el5
export CC="mpicc"
export CXX="mpic++"
export FC="mpif90"
export F77="mpif90"
export ELMER_INSTALL="/home/cfd/elmer"
export ELMER_HOME="/home/cfd/elmer"
export MPIDIR="/usr/lib64/openmpi"
export HYPRE="/home/cfd/hypre"
export CPPFLAGS="-I$HYPRE/include"
export MUMPS="/home/cfd/Downloads/MUMPS_4.9.2"
export BLACS="/usr/lib64/blacs-openmpi"
export SCALAPACK="/usr/lib64/scalapack-openmpi"
export FCPPFLAGS="$FCPPFLAGS -DHAVE_MUMPS"
export FCFLAGS="$FCFLAGS -I$MUMPS/include"
modules="matc umfpack mathlibs meshgen2d eio hutiter fem elmerparam elmergrid"
##### configure, build and install #########
for m in $modules; do
echo "module $m"
echo "###############"
cd $m ; make clean; ./configure --enable-static=yes --prefix=$ELMER_INSTALL --with-mpi=yes \
--with-mpi-dir=$MPIDIR --with-mpi-lib-dir="$MPIDIR/lib -lmpi_f77 -lmpi" --with-mpi-inc-dir="/usr/include/openmpi-x86_64" \
--with-64bits=yes --with-hypre="-L$HYPRE/lib -lHYPRE" --with-mumps="/home/cfd/Downloads/MUMPS_4.9.2/libdmumps.a \
/home/cfd/Downloads/MUMPS_4.9.2/lib/libmumps_common.a /home/cfd/Downloads/MUMPS_4.9.2/lib/libpord.a \
/usr/lib64/scalapack-openmpi/libscalapack.a /usr/lib64/blacs-openmpi/libmpiblacs.a /usr/lib64/blacs-openmpi/libmpiblacsCinit.a \
/usr/lib64/blacs-openmpi/libmpiblacsF77init.a /usr/lib64/blacs-openmpi/libmpiblacs.a /usr/lib64/libblas.a /usr/lib64/liblapack.a \
/usr/lib64/liblapack_pic.a" && make install && cd ..
done
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
I would appreciate a clue from other users. Is MUMPS support available in the current svn revision 4312? Perhaps some changes are required in the compile script.
Thanks,
saa_1973.
Re: MUMPS Library Linking
Hi,
I don't think the configure script has the option "--with-mumps". You
could try adding the MUMPS libraries to your --with-mpi-lib-dir option,
before the "-lmpi_f77 -lmpi" part?
Juha
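A sketch of what that could look like, reusing the variables and library names from the compile script above (untested against any particular Elmer revision):

```shell
# Hypothetical configure fragment: MUMPS and its dependencies folded into
# --with-mpi-lib-dir, ahead of the -lmpi_f77 -lmpi part, with no --with-mumps
./configure --prefix=$ELMER_INSTALL --with-mpi=yes --with-mpi-dir=$MPIDIR \
  --with-mpi-lib-dir="$MPIDIR/lib -L$MUMPS/lib -ldmumps -lmumps_common -lpord \
    -L$SCALAPACK -lscalapack -L$BLACS -lmpiblacs -lmpiblacsCinit -lmpiblacsF77init \
    -lmpi_f77 -lmpi"
```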
Re: MUMPS Library Linking
Hello Juha,
Thanks for your reply and help. I could not run the configure script with the MUMPS libraries included along with the MPI libs, so I added an export LDFLAGS line to the configure script instead, and I now get the following errors:
./libelmersolver.so: undefined reference to `metis_nodend_'
./libelmersolver.so: undefined reference to `numroc_'
./libelmersolver.so: undefined reference to `blacs_gridexit_'
./libelmersolver.so: undefined reference to `ParMETIS_V3_NodeND'
./libelmersolver.so: undefined reference to `pdgetrf_'
./libelmersolver.so: undefined reference to `descinit_'
./libelmersolver.so: undefined reference to `blacs_gridinit_'
./libelmersolver.so: undefined reference to `pdpotrf_'
./libelmersolver.so: undefined reference to `pdgetrs_'
./libelmersolver.so: undefined reference to `blacs_gridinfo_'
./libelmersolver.so: undefined reference to `pdpotrs_'
./libelmersolver.so: undefined reference to `metis_nodewnd_'
collect2: ld returned 1 exit status
make[2]: *** [ElmerSolver] Error 1
make[2]: Leaving directory `/home/cfd/trunk/fem/src'
make[1]: *** [install-recursive] Error 1
make[1]: Leaving directory `/home/cfd/trunk/fem/src'
make: *** [install-recursive] Error 1
--------------------------------------------------------------------------------------------------------------------------------------------------------
#!/bin/sh -f
# This compilation script worked on a RHEL5.0
# kernel: x86_64 2.6.18-92.1.22.el5
export CC="mpicc -fPIC"
export CXX="mpic++"
export FC="mpif90"
export F77="mpif90"
export ELMER_INSTALL="/home/cfd/elmer"
export ELMER_HOME="/home/cfd/elmer"
export MPIDIR="/usr/lib64/openmpi"
export HYPRE="/home/cfd/hypre"
export CPPFLAGS="-I$HYPRE/include"
export MUMPS="/home/cfd/Downloads/MUMPS_4.9.2"
export BLACS="/usr/lib64/blacs-openmpi"
export SCALAPACK="/usr/lib64/scalapack-openmpi"
export FCPPFLAGS="$FCPPFLAGS -DHAVE_MUMPS"
export FCFLAGS="$FCFLAGS -I$MUMPS/include"
export LDFLAGS="-L$MUMPS/lib -ldmumps -lmumps_common -lpord -L$MPIDIR/lib -lmpi_f77 -lmpi"
modules="matc umfpack mathlibs meshgen2d eio hutiter fem elmerparam elmergrid"
##### configure, build and install #########
for m in $modules; do
echo "module $m"
echo "###############"
##### parallel #######
cd $m ; make clean; ./configure --enable-static=yes --prefix=$ELMER_INSTALL --with-mpi=yes \
--with-mpi-dir=$MPIDIR --with-mpi-lib-dir="$MPIDIR/lib -lmpi_f77 -lmpi" --with-mpi-inc-dir="/usr/include/openmpi-x86_64" \
--with-64bits=yes --with-hypre="-L$HYPRE/lib -lHYPRE" --with-mumps="/home/cfd/Downloads/MUMPS_4.9.2/lib/libdmumps.a \
/home/cfd/Downloads/MUMPS_4.9.2/lib/libmumps_common.a /home/cfd/Downloads/MUMPS_4.9.2/lib/libpord.a \
/usr/lib64/scalapack-openmpi/libscalapack.a /usr/lib64/blacs-openmpi/libmpiblacs.a /usr/lib64/blacs-openmpi/libmpiblacsCinit.a \
/usr/lib64/blacs-openmpi/libmpiblacsF77init.a /usr/lib64/blacs-openmpi/libmpiblacs.a /usr/lib64/libblas.a /usr/lib64/liblapack.a \
/usr/lib64/liblapack_pic.a" && make install && cd ..
done
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Please inform if the configure script needs further edit.
Thanks,
saa_1973.
Re: MUMPS Library Linking
Hi,
OK, there is an option "--mpi-library" (in recent versions) in addition to --with-mpi-lib-dir.
Give it a value in the style you gave LDFLAGS. If you want to use environment
variables, you could use MPI_LIBS instead of LDFLAGS.
Anyway, the trouble now seems to be that your link is missing some of the
libraries MUMPS requires, namely BLACS and METIS (at least).
Juha
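If the environment-variable route is taken, a possible shape (the library list is assumed from the earlier scripts, not verified against any particular Elmer revision):

```shell
# Hypothetical: let configure pick the full parallel link line up from MPI_LIBS,
# with the MUMPS dependencies (ScaLAPACK, BLACS) spelled out explicitly
export MPI_LIBS="-L$MUMPS/lib -ldmumps -lmumps_common -lpord \
  -L$SCALAPACK -lscalapack -L$BLACS -lmpiblacs -lmpiblacsCinit -lmpiblacsF77init \
  -L$MPIDIR/lib -lmpi_f77 -lmpi"
```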
Re: MUMPS Library Linking
Hi, I've just made a patch to the configure system which detects the MUMPS library and links to it if it's there and usable. I basically copied the HYPRE section of acx_elmer.m4 and substituted mumps for hypre. It works for me.
Drawbacks:
- The -I/usr/include is somewhat Debian-specific; it would be better to have something which looks (or asks) for the MUMPS directory and builds from that.
- Linking with -ldmumps only works on Debian. The advantage is that it doesn't hard-wire the PORD ordering, so one can use it with PORD, Scotch or METIS, provided the dmumps library is linked properly. I built in an alternative which has the mumps_common, pord, scalapack and blacs linkages, but that's only narrowly useful, and is untested; the right way is to have separate tests for BLACS, ScaLAPACK and MUMPS.
So there it is, hopefully useful, and if not, others can tweak it as needed.
-Adam
Attachments:
- mumps.patch: patch to include MUMPS detection in the configure system.
Re: MUMPS Library Linking
Hi, I just committed this patch (finally -- thought I had done it three months ago when I wrote it).
Re: MUMPS Library Linking
Hi all,
I am facing the same problem as saa_1973, and the error is:
./libelmersolver.so: undefined reference to `blacs_gridexit_'
./libelmersolver.so: undefined reference to `pb_topset_'
./libelmersolver.so: undefined reference to `blacs_gridinit_'
./libelmersolver.so: undefined reference to `blacs_abort_'
./libelmersolver.so: undefined reference to `pb_topget_'
./libelmersolver.so: undefined reference to `blacs_gridinfo_'
collect2: ld returned 1 exit status
make[3]: *** [ElmerSolver] Error 1
make[3]: Leaving directory `/home/aizat/trunk/fem/src'
make[2]: *** [all-recursive] Error 1
make[2]: Leaving directory `/home/aizat/trunk/fem/src'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/home/aizat/trunk/fem'
make: *** [all] Error 2
Juha suggested that BLACS and METIS (at least) were missing from the libraries MUMPS requires. I have already compiled the MUMPS library with both BLACS and METIS, yet the error above still occurs. Do you have any suggestions on how to solve this problem? Thanks
-Aizat-
Re: MUMPS Library Linking
Hello Aizat,
Try using ldd on the MUMPS library; that should tell you where its dependent libraries are. Either that, or add the ParMETIS and BLACS libs to the elmersolver library linking command, or to the MUMPS libraries line.
You may also need to play with LD_LIBRARY_PATH.
I wonder if anyone at Red Hat is working on a package for MUMPS, ParMETIS or Elmer...
-Adam
Re: MUMPS Library Linking
Hi adam,
Thank you for the prompt reply.
1) I have tried including both BLACS and ParMetis during the compilation of ElmerSolver, as shown below:
mpif90 -m64 -fPIC -Wl,-rpath=/usr/local/elmer/lib -L. -L/home/aizat/MUMPS_4.9.2/lib -ldmumps -lmumps_common -lpord -L/home/aizat/scalapack-1.8.0 -lscalapack /home/aizat/BLACS/LIB/*.a /home/aizat/ParMetis-3.1.1/*.a /home/aizat/BLACS/LIB/*.a -L/usr/local/openmpi/lib -lmpi_f77 -lmpi -L/usr/local/elmer/lib -o ElmerSolver Solver.o mpi_stubs.o -L. -lelmersolver /home/aizat/BLACS/LIB/*.a /home/aizat/ParMetis-3.1.1/*.a
2) I have also included both BLACS and ParMetis in LD_LIBRARY_PATH (though I am not sure how to play with this variable!):
LD_LIBRARY_PATH=$HOME/Polyde/lib:/usr/local/lib/pgplot:/usr/local/openmpi/lib:/home/aizat/BLACS/LIB:/home/aizat/ParMetis-3.1.1:/home/aizat/scalapack-1.8.0
3) Besides, I also checked the MUMPS settings and have already included both BLACS and ParMetis in Makefile.inc:
#LMETISDIR = /local/metis/
#IMETIS = # Metis doesn't need include files (Fortran interface avail.)
LMETISDIR = /home/aizat/ParMetis-3.1.1/
IMETISDIR = /home/aizat/ParMetis-3.1.1/
# You have to choose one among the following two lines depending on
# the type of analysis you want to perform. If you want to perform only
# sequential analysis choose the first (remember to add -Dmetis in the ORDERINGSF
# variable below); for both parallel and sequential analysis choose the second
# line (remember to add -Dparmetis in the ORDERINGSF variable below)
#LMETIS = -L$(LMETISDIR) -lmetis
LMETIS = -L$(LMETISDIR) -lparmetis -lmetis
# The following variables will be used in the compilation process.
# Please note that -Dptscotch and -Dparmetis imply -Dscotch and -Dmetis respectively.
#ORDERINGSF = -Dscotch -Dmetis -Dpord -Dptscotch -Dparmetis
ORDERINGSF = -Dpord -Dparmetis -Dmetis
ORDERINGSC = $(ORDERINGSF)
LORDERINGS = $(LMETIS) $(LPORD) $(LSCOTCH)
IORDERINGSF = $(ISCOTCH)
IORDERINGSC = $(IMETIS) $(IPORD) $(ISCOTCH)
#End orderings
########################################################################
################################################################################
PLAT =
#RM = /bin/rm -f
#CC = gcc
CC =mpicc -fPIC
#FC = gfortran
FC = mpif90 -fPIC
#FL = gfortran
FL = mpif90 -fPIC
AR = ar vr
#RANLIB = ranlib
RANLIB = echo
SCALAP = /home/aizat/scalapack-1.8.0/libscalapack.a /home/aizat/BLACS/LIB/blacs_MPI-LINUX-0.a /home/aizat/BLACS/LIB/blacsCinit_MPI-LINUX-0.a /home/aizat/BLACS/LIB/blacsF77init_MPI-LINUX-0.a
#INCPAR = -I/usr/local/include
INCPAR = -I/usr/local/openmpi/include
# LIBPAR = $(SCALAP) -L/usr/local/lib/ -llammpio -llamf77mpi -lmpi -llam -lutil -ldl -lpthread
LIBPAR = $(SCALAP) -L/usr/local/openmpi/lib/ -lmpi
# See point 17 in the FAQ to have more details on the compilation of mpich with gfortran
INCSEQ = -I$(topdir)/libseq
LIBSEQ = -L$(topdir)/libseq -lmpiseq
#LIBBLAS = -L/usr/lib/xmm/ -lf77blas -latlas
LIBBLAS = -L/usr/lib64/ -lblas
LIBOTHERS = -lpthread
#Preprocessor defs for calling Fortran from C (-DAdd_ or -DAdd__ or -DUPPER)
CDEFS = -DAdd_
#Begin Optimized options
OPTF = -O -Dintel_ -DALLOW_NON_INIT
OPTL = -O
OPTC = -O
#End Optimized options
INC = $(INCPAR)
LIB = $(LIBPAR)
LIBSEQNEEDED =
4) Lastly, I am not sure how to use ldd during the compilation of Elmer (can you give me a hint, as I am new to Linux?).
Thanks a lot for your help,
-Aizat-
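On the ldd question: ldd is not used during compilation; it is run on a finished executable or shared library and prints the shared libraries that file needs and where the runtime loader finds them (the Elmer paths below are assumed from the build log above):

```shell
# Try ldd first on a binary that certainly exists on any Linux system
ldd /bin/sh
# For the Elmer library, something along these lines:
#   ldd /home/aizat/trunk/fem/src/libelmersolver.so | grep 'not found'
# Each "not found" line names a library whose directory has to be added to
# LD_LIBRARY_PATH (and exported), e.g.:
#   export LD_LIBRARY_PATH=/home/aizat/MUMPS_4.9.2/lib:$LD_LIBRARY_PATH
```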