Trouble compiling examples: deal.ii configured with PETSc and shared libraries


Uday K

Jan 29, 2014, 7:31:03 AM1/29/14
to dea...@googlegroups.com
Dear deal.ii users, 

I was hoping to get some help with a problem I am currently facing. I successfully compiled PETSc (version 3.3-p7) and deal.II (version 8.0.0). However, when I try to compile the first example, I get the following error from make:

=============================================
[  0%] Building CXX object CMakeFiles/step-1.dir/step-1.cc.o
Linking CXX executable step-1
/usr/bin/ld: /usr/local/src/petsc-3.3-p7/x86_64/lib/libdmumps.a(dmumps_part2.o): undefined reference to symbol 'scotchfdgraphorderinit_'
/usr/bin/ld: note: 'scotchfdgraphorderinit_' is defined in DSO /usr/local/src/petsc-3.3-p7/x86_64/lib/libpetsc.so so try adding it to the linker command line
/usr/local/src/petsc-3.3-p7/x86_64/lib/libpetsc.so: could not read symbols: Invalid operation
collect2: ld returned 1 exit status
make[2]: *** [step-1] Error 1
make[1]: *** [CMakeFiles/step-1.dir/all] Error 2
make: *** [all] Error 2
=============================================

I am new to CMake, and thus not very familiar with where and how to pass information about libpetsc to the linker. I would have thought that since deal.II installed correctly with full information about PETSc, it should not complain about missing PETSc symbols.

Next, in the case of example 17, make succeeds, but make run fails with a segmentation fault:

=============================================
[ 50%] Built target step-17
[100%] Run step-17 with Debug configuration
CMake Error at CMakeFiles/run_target.cmake:6 (MESSAGE):
  
  Program terminated with exit code: Segmentation fault

make[3]: *** [CMakeFiles/run] Error 1
make[2]: *** [CMakeFiles/run.dir/all] Error 2
make[1]: *** [CMakeFiles/run.dir/rule] Error 2
make: *** [run] Error 2
=============================================

This is how I have configured PETSc:

./config/configure.py --with-shared-libraries=1 --download-mpich --with-x=0 --download-hypre=1 --download-scalapack --download-ptscotch --download-mumps --download-umfpack

and this is how I configured deal.ii (after installing scalapack, blacs, mumps, umfpack, p4est, hdf5, netcdf):

cmake -DCMAKE_C_COMPILER=${PETSC_BIN}/${CC} -DCMAKE_CXX_COMPILER=${PETSC_BIN}/${CX} -DCMAKE_Fortran_COMPILER=${PETSC_BIN}/${FC} -DCMAKE_INSTALL_PREFIX=/usr/local/bin/deal.II -DCMAKE_PREFIX_PATH=$PETSC_LIB -DSCALAPACK_DIR=$PETSC_LIB -DBLACS_DIR=$PETSC_LIB -DPETSC_DIR=$PETSC_DIR  -DPETSC_ARCH=$PETSC_ARCH -DMUMPS_DIR=$PETSC_LIB -DUMFPACK_DIR=$PETSC_LIB -DDEAL_II_WITH_P4EST=ON -DP4EST_DIR=/usr/local/src/p4est-install/FAST -DP4EST_INCLUDE_DIR=/usr/local/src/p4est-install/FAST/include -DSC_INCLUDE_DIR=/usr/local/src/p4est-install/FAST/include -DDEAL_II_WITH_PETSC=ON -DDEAL_II_WITH_MPI=ON -DDEAL_II_WITH_UMFPACK=ON -DMUMPS_INCLUDE_DIR=${PETSC_INC} -DUMFPACK_INCLUDE_DIR=${PETSC_INC} -DAMD_INCLUDE_DIR=${PETSC_INC} -DDEAL_II_WITH_HDF5=ON -DDEAL_II_WITH_NETCDF=ON ../deal.II

with the following environment variables defined:

export PETSC_DIR=/usr/local/src/petsc-3.3-p7
export PETSC_ARCH=x86_64
export PETSC_LIB=${PETSC_DIR}/${PETSC_ARCH}/lib
export PETSC_BIN=${PETSC_DIR}/${PETSC_ARCH}/bin
export PETSC_INC=${PETSC_DIR}/${PETSC_ARCH}/include
export CC=mpicc
export CX=mpicxx
export FC=mpif90

Any help on resolving the above would be appreciated. I noted a similar query on this forum, but there the user built PETSc with static libraries (https://groups.google.com/d/msg/dealii/7kPvGoE6z54/f-Sv36BA01UJ).

Thanks!
 Uday

Timo Heister

Jan 29, 2014, 8:49:04 AM1/29/14
to dea...@googlegroups.com
Can you please
1. post your detailed.log from your deal.II build directory
2. the output of "ldd step-17" (in your step-17 directory obviously)
> --
> The deal.II project is located at http://www.dealii.org/
> For mailing list/forum options, see
> https://groups.google.com/d/forum/dealii?hl=en
> ---
> You received this message because you are subscribed to the Google Groups
> "deal.II User Group" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to dealii+un...@googlegroups.com.
> For more options, visit https://groups.google.com/groups/opt_out.



--
Timo Heister
http://www.math.clemson.edu/~heister/

Uday K

Jan 30, 2014, 4:00:00 AM1/30/14
to dea...@googlegroups.com
Dear Timo,

 Here you go. 

detailed.log from build:

###
#
#  deal.II configuration:
#        CMAKE_BUILD_TYPE:       DebugRelease
#        BUILD_SHARED_LIBS:      ON
#        CMAKE_INSTALL_PREFIX:   /usr/local/bin/deal.II
#        CMAKE_SOURCE_DIR:       /usr/local/src/deal.II (Version 8.0.0)
#        CMAKE_BINARY_DIR:       /usr/local/src/build
#        CMAKE_CXX_COMPILER:     GNU 4.6.3 on platform Linux x86_64
#                                /usr/local/src/petsc-3.3-p7/x86_64/bin/mpicxx
#
#  Compiler flags used for this build:
#        CMAKE_CXX_FLAGS:              -pedantic -fpic -Wall -Wpointer-arith -Wwrite-strings -Wsynth -Wsign-compare -Wswitch -Wno-deprecated -Wno-deprecated-declarations -std=c++0x  -fPIC -Wno-parentheses -Wno-long-long -Wno-long-long
#        DEAL_II_CXX_FLAGS_RELEASE:    -O2 -funroll-loops -funroll-all-loops -fstrict-aliasing -felide-constructors -Wno-unused
#        DEAL_II_CXX_FLAGS_DEBUG:      -O0 -ggdb -Wa,--compress-debug-sections
#        DEAL_II_LINKER_FLAGS:         -Wl,--as-needed -rdynamic  -Wl,-rpath  -Wl,/usr/local/src/petsc-3.3-p7/x86_64/lib
#        DEAL_II_LINKER_FLAGS_RELEASE: 
#        DEAL_II_LINKER_FLAGS_DEBUG:   -ggdb
#
#  Configured Features (DEAL_II_ALLOW_BUNDLED = ON, DEAL_II_ALLOW_AUTODETECTION = ON):
#      ( DEAL_II_WITH_64BIT_INDICES = OFF )
#      ( DEAL_II_WITH_ARPACK = OFF )
#        DEAL_II_WITH_BOOST set up with bundled packages
#        DEAL_II_WITH_FUNCTIONPARSER set up with bundled packages
#        DEAL_II_WITH_HDF5 set up with external dependencies
#            HDF5_INCLUDE_DIRS = /usr/include
#            HDF5_LIBRARIES = /usr/lib/libhdf5_hl.so;/usr/lib/libhdf5.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#        DEAL_II_WITH_LAPACK set up with external dependencies
#            LAPACK_LIBRARIES = /usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so
#      ( DEAL_II_WITH_METIS = OFF )
#        DEAL_II_WITH_MPI set up with external dependencies
#            MPI_CXX_COMPILER = /usr/local/src/petsc-3.3-p7/x86_64/bin/mpicxx
#            MPI_CXX_COMPILE_FLAGS =  -fPIC
#            MPI_CXX_INCLUDE_PATH = /usr/local/src/petsc-3.3-p7/x86_64/include
#            MPI_CXX_LIBRARIES = /usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichcxx.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#            MPI_CXX_LINK_FLAGS =  -Wl,-rpath  -Wl,/usr/local/src/petsc-3.3-p7/x86_64/lib
#        DEAL_II_WITH_MUMPS set up with external dependencies
#            MUMPS_INCLUDE_DIRS = /usr/local/src/petsc-3.3-p7/x86_64/include
#            MUMPS_LIBRARIES = /usr/local/src/petsc-3.3-p7/x86_64/lib/libdmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmumps_common.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libpord.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libscalapack.a;/usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libblacs.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#        DEAL_II_WITH_NETCDF set up with external dependencies
#            NETCDF_INCLUDE_DIRS = /usr/include
#            NETCDF_LIBRARIES = /usr/lib/libnetcdf_c++.so;/usr/lib/libnetcdf.so
#        DEAL_II_WITH_P4EST set up with external dependencies
#            P4EST_INCLUDE_DIRS = /usr/local/src/p4est-install/FAST/include;/usr/local/src/p4est-install/FAST/include
#            P4EST_LIBRARIES = /usr/local/src/p4est-install/FAST/lib/libp4est.so;/usr/local/src/p4est-install/FAST/lib/libsc.so;/usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#        DEAL_II_WITH_PETSC set up with external dependencies
#            PETSC_INCLUDE_DIRS = /usr/local/src/petsc-3.3-p7/x86_64/include;/usr/local/src/petsc-3.3-p7/include
#            PETSC_LIBRARIES = /usr/local/src/petsc-3.3-p7/x86_64/lib/libpetsc.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libcmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libdmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libsmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libzmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmumps_common.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libpord.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libHYPRE.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichcxx.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libptesmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libptscotch.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libptscotcherr.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libscalapack.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libblacs.a;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libumfpack.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libamd.a;/usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichcxx.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/libdl.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/x86_64-linux-gnu/libdl.so
#      ( DEAL_II_WITH_SLEPC = OFF )
#        DEAL_II_WITH_THREADS set up with bundled packages
#      ( DEAL_II_WITH_TRILINOS = OFF )
#        DEAL_II_WITH_UMFPACK set up with external dependencies
#            UMFPACK_INCLUDE_DIRS = /usr/local/src/petsc-3.3-p7/x86_64/include;/usr/local/src/petsc-3.3-p7/x86_64/include
#            UMFPACK_LIBRARIES = /usr/local/src/petsc-3.3-p7/x86_64/lib/libumfpack.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libamd.a;/usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/librt.so
#        DEAL_II_WITH_ZLIB set up with external dependencies
#            ZLIB_INCLUDE_DIRS = /usr/include
#            ZLIB_LIBRARIES = /usr/lib/x86_64-linux-gnu/libz.so
#
#  Component configuration:
#        DEAL_II_COMPONENT_COMPAT_FILES
#      ( DEAL_II_COMPONENT_DOCUMENTATION = OFF )
#        DEAL_II_COMPONENT_EXAMPLES
#        DEAL_II_COMPONENT_MESH_CONVERTER
#      ( DEAL_II_COMPONENT_PARAMETER_GUI = OFF )
#
###

and the output of ldd step-17:

================
linux-vdso.so.1 =>  (0x00007fff76d09000)
libdeal_II.g.so.8.0.0 => /usr/local/bin/deal.II/lib/libdeal_II.g.so.8.0.0 (0x00007faf26b06000)
libpetsc.so => /usr/local/src/petsc-3.3-p7/x86_64/lib/libpetsc.so (0x00007faf252aa000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007faf2507b000)
libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007faf24d7b000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007faf24b64000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007faf247a4000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007faf245a0000)
libhdf5.so.6 => /usr/lib/libhdf5.so.6 (0x00007faf23f5e000)
libnetcdf_c++.so.5 => /usr/lib/libnetcdf_c++.so.5 (0x00007faf23d41000)
libp4est.so.0 => /usr/local/src/p4est-install/FAST/lib/libp4est.so.0 (0x00007faf23ac8000)
libsc.so.0 => /usr/local/src/p4est-install/FAST/lib/libsc.so.0 (0x00007faf23866000)
liblapack.so.3gf => /usr/lib/liblapack.so.3gf (0x00007faf22c70000)
libblas.so.3gf => /usr/lib/libblas.so.3gf (0x00007faf229d6000)
libmpich.so.3 => /usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so.3 (0x00007faf2265e000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007faf22362000)
libgfortran.so.3 => /usr/lib/x86_64-linux-gnu/libgfortran.so.3 (0x00007faf2204b000)
libmpl.so.1 => /usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so.1 (0x00007faf21e44000)
/lib64/ld-linux-x86-64.so.2 (0x00007faf2fe18000)
libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007faf21c2d000)
librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007faf21a24000)
libnetcdf.so.6 => /usr/lib/libnetcdf.so.6 (0x00007faf216e6000)
libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007faf214af000)
libcurl-gnutls.so.4 => /usr/lib/x86_64-linux-gnu/libcurl-gnutls.so.4 (0x00007faf21257000)
libhdf5_hl.so.6 => /usr/lib/libhdf5_hl.so.6 (0x00007faf21025000)
libidn.so.11 => /usr/lib/x86_64-linux-gnu/libidn.so.11 (0x00007faf20df1000)
liblber-2.4.so.2 => /usr/lib/x86_64-linux-gnu/liblber-2.4.so.2 (0x00007faf20be3000)
libldap_r-2.4.so.2 => /usr/lib/x86_64-linux-gnu/libldap_r-2.4.so.2 (0x00007faf20994000)
libgssapi_krb5.so.2 => /usr/lib/x86_64-linux-gnu/libgssapi_krb5.so.2 (0x00007faf20755000)
libgnutls.so.26 => /usr/lib/x86_64-linux-gnu/libgnutls.so.26 (0x00007faf20499000)
libgcrypt.so.11 => /lib/x86_64-linux-gnu/libgcrypt.so.11 (0x00007faf2021b000)
librtmp.so.0 => /usr/lib/x86_64-linux-gnu/librtmp.so.0 (0x00007faf20000000)
libresolv.so.2 => /lib/x86_64-linux-gnu/libresolv.so.2 (0x00007faf1fde4000)
libsasl2.so.2 => /usr/lib/x86_64-linux-gnu/libsasl2.so.2 (0x00007faf1fbc9000)
libgssapi.so.3 => /usr/lib/x86_64-linux-gnu/libgssapi.so.3 (0x00007faf1f98a000)
libkrb5.so.3 => /usr/lib/x86_64-linux-gnu/libkrb5.so.3 (0x00007faf1f6bc000)
libk5crypto.so.3 => /usr/lib/x86_64-linux-gnu/libk5crypto.so.3 (0x00007faf1f494000)
libcom_err.so.2 => /lib/x86_64-linux-gnu/libcom_err.so.2 (0x00007faf1f28f000)
libkrb5support.so.0 => /usr/lib/x86_64-linux-gnu/libkrb5support.so.0 (0x00007faf1f087000)
libtasn1.so.3 => /usr/lib/x86_64-linux-gnu/libtasn1.so.3 (0x00007faf1ee76000)
libp11-kit.so.0 => /usr/lib/x86_64-linux-gnu/libp11-kit.so.0 (0x00007faf1ec63000)
libgpg-error.so.0 => /lib/x86_64-linux-gnu/libgpg-error.so.0 (0x00007faf1ea5f000)
libheimntlm.so.0 => /usr/lib/x86_64-linux-gnu/libheimntlm.so.0 (0x00007faf1e857000)
libkrb5.so.26 => /usr/lib/x86_64-linux-gnu/libkrb5.so.26 (0x00007faf1e5d1000)
libasn1.so.8 => /usr/lib/x86_64-linux-gnu/libasn1.so.8 (0x00007faf1e331000)
libhcrypto.so.4 => /usr/lib/x86_64-linux-gnu/libhcrypto.so.4 (0x00007faf1e0fd000)
libroken.so.18 => /usr/lib/x86_64-linux-gnu/libroken.so.18 (0x00007faf1dee7000)
libkeyutils.so.1 => /lib/x86_64-linux-gnu/libkeyutils.so.1 (0x00007faf1dce3000)
libwind.so.0 => /usr/lib/x86_64-linux-gnu/libwind.so.0 (0x00007faf1dab9000)
libheimbase.so.1 => /usr/lib/x86_64-linux-gnu/libheimbase.so.1 (0x00007faf1d8aa000)
libhx509.so.5 => /usr/lib/x86_64-linux-gnu/libhx509.so.5 (0x00007faf1d660000)
libsqlite3.so.0 => /usr/lib/x86_64-linux-gnu/libsqlite3.so.0 (0x00007faf1d3bc000)
libcrypt.so.1 => /lib/x86_64-linux-gnu/libcrypt.so.1 (0x00007faf1d183000)
================

 Thanks,
Uday

Matthias Maier

Jan 30, 2014, 4:27:28 AM1/30/14
to dea...@googlegroups.com

Am 30. Jan 2014, 10:00 schrieb Uday K <uda...@gmail.com>:

> Dear Timo,
>
> Here you go.
>
> detailed.log from build:
>
> # MUMPS_LIBRARIES = /usr/local/src/petsc-3.3-p7/x86_64/lib
> /libdmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/
> libmumps_common.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libpord.a;/
> usr/local/src/petsc-3.3-p7/x86_64/lib/libscalapack.a;/usr/lib/
> liblapack.so;/usr/lib/libblas.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/
> libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/
> x86_64-linux-gnu/4.6/libquadmath.so;/usr/local/src/petsc-3.3-p7/
> x86_64/lib/libblacs.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/
> libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;
> /usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/
> petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/
> lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/
> x86_64-linux-gnu/libpthread.so;/usr/local/src/petsc-3.3-p7/x86_64/lib
> /libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/
> libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/
> usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/
> petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;
> /usr/lib/x86_64-linux-gnu/libpthread.so

The problem is simply that the mumps configuration module does not pick
up scotch at the moment. I'll fix that later.

For the moment, I suggest that you reconfigure deal.II with direct MUMPS
support disabled: -DDEAL_II_WITH_MUMPS=OFF. Please note that you can still
use MUMPS (if you wish to do so) through the appropriate PETSc wrapper.
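Concretely, the reconfigure might look something like this (a sketch, assuming you re-run in the existing build directory so the CMake cache keeps the remaining options):

```shell
# In the deal.II build directory: turn off direct MUMPS support and rebuild.
cmake -DDEAL_II_WITH_MUMPS=OFF .
make install
```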

Best,
Matthias

Uday K

Jan 30, 2014, 5:34:37 AM1/30/14
to dea...@googlegroups.com
Dear Matthias, while this has resolved the problem with step-1, which now runs successfully, the problem with step-17 remains. I am also getting the same error on step-48. On trying out step-46, make passes, but make run fails with a more verbose error message relating to UMFPACK. For my purposes, a sparse direct solver (MUMPS/UMFPACK) that works out of the box is essential.

Error message for make run on step-46:

===
[ 50%] Built target step-46
[100%] Run step-46 with Debug configuration
Refinement cycle 0
   Number of active cells: 64
   Number of degrees of freedom: 531
   Assembling...
   Solving...


----------------------------------------------------
Exception on processing: 

--------------------------------------------------------
An error occurred in line <288> of file </usr/local/src/deal.II/source/lac/sparse_direct.cc> in function
    void dealii::SparseDirectUMFPACK::factorize(const Matrix&) [with Matrix = dealii::SparseMatrix<double>]
The violated condition was: 
    status == UMFPACK_OK
The name and call sequence of the exception was:
    ExcUMFPACKError("umfpack_dl_numeric", status)
Additional Information: 
UMFPACK routine umfpack_dl_numeric returned error status 1. See the file <bundled/umfpack/UMFPACK/Include/umfpack.h> for a description of 'status codes'.

Stacktrace:
-----------
#0  /usr/local/bin/deal.II/lib/libdeal_II.g.so.8.0.0: void dealii::SparseDirectUMFPACK::factorize<dealii::SparseMatrix<double> >(dealii::SparseMatrix<double> const&)
#1  /usr/local/bin/deal.II/lib/libdeal_II.g.so.8.0.0: void dealii::SparseDirectUMFPACK::initialize<dealii::SparseMatrix<double> >(dealii::SparseMatrix<double> const&, dealii::SparseDirectUMFPACK::AdditionalData)
#2  step-46: Step46::FluidStructureProblem<2>::solve()
#3  step-46: Step46::FluidStructureProblem<2>::run()
#4  step-46: main
--------------------------------------------------------

Aborting!
----------------------------------------------------
CMake Error at CMakeFiles/run_target.cmake:6 (MESSAGE):
  
  Program terminated with exit code: 1

make[3]: *** [CMakeFiles/run] Error 1
make[2]: *** [CMakeFiles/run.dir/all] Error 2
make[1]: *** [CMakeFiles/run.dir/rule] Error 2
make: *** [run] Error 2
===

[Oddly, there is no step-47 on the tutorial page!]
The output of ldd step-17 is as before:
===
linux-vdso.so.1 =>  (0x00007fff287cb000)
libdeal_II.g.so.8.0.0 => /usr/local/bin/deal.II/lib/libdeal_II.g.so.8.0.0 (0x00007ffbbf8cc000)
libpetsc.so => /usr/local/src/petsc-3.3-p7/x86_64/lib/libpetsc.so (0x00007ffbbe070000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007ffbbde41000)
libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007ffbbdb41000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007ffbbd92a000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007ffbbd56a000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007ffbbd366000)
libhdf5.so.6 => /usr/lib/libhdf5.so.6 (0x00007ffbbcd24000)
libnetcdf_c++.so.5 => /usr/lib/libnetcdf_c++.so.5 (0x00007ffbbcb07000)
libp4est.so.0 => /usr/local/src/p4est-install/FAST/lib/libp4est.so.0 (0x00007ffbbc88e000)
libsc.so.0 => /usr/local/src/p4est-install/FAST/lib/libsc.so.0 (0x00007ffbbc62c000)
liblapack.so.3gf => /usr/lib/liblapack.so.3gf (0x00007ffbbba36000)
libblas.so.3gf => /usr/lib/libblas.so.3gf (0x00007ffbbb79c000)
libmpich.so.3 => /usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so.3 (0x00007ffbbb424000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007ffbbb128000)
libgfortran.so.3 => /usr/lib/x86_64-linux-gnu/libgfortran.so.3 (0x00007ffbbae11000)
libmpl.so.1 => /usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so.1 (0x00007ffbbac0a000)
/lib64/ld-linux-x86-64.so.2 (0x00007ffbc8bdb000)
libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007ffbba9f3000)
librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007ffbba7ea000)
libnetcdf.so.6 => /usr/lib/libnetcdf.so.6 (0x00007ffbba4ac000)
libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007ffbba275000)
libcurl-gnutls.so.4 => /usr/lib/x86_64-linux-gnu/libcurl-gnutls.so.4 (0x00007ffbba01d000)
libhdf5_hl.so.6 => /usr/lib/libhdf5_hl.so.6 (0x00007ffbb9deb000)
libidn.so.11 => /usr/lib/x86_64-linux-gnu/libidn.so.11 (0x00007ffbb9bb7000)
liblber-2.4.so.2 => /usr/lib/x86_64-linux-gnu/liblber-2.4.so.2 (0x00007ffbb99a9000)
libldap_r-2.4.so.2 => /usr/lib/x86_64-linux-gnu/libldap_r-2.4.so.2 (0x00007ffbb975a000)
libgssapi_krb5.so.2 => /usr/lib/x86_64-linux-gnu/libgssapi_krb5.so.2 (0x00007ffbb951b000)
libgnutls.so.26 => /usr/lib/x86_64-linux-gnu/libgnutls.so.26 (0x00007ffbb925f000)
libgcrypt.so.11 => /lib/x86_64-linux-gnu/libgcrypt.so.11 (0x00007ffbb8fe1000)
librtmp.so.0 => /usr/lib/x86_64-linux-gnu/librtmp.so.0 (0x00007ffbb8dc6000)
libresolv.so.2 => /lib/x86_64-linux-gnu/libresolv.so.2 (0x00007ffbb8baa000)
libsasl2.so.2 => /usr/lib/x86_64-linux-gnu/libsasl2.so.2 (0x00007ffbb898f000)
libgssapi.so.3 => /usr/lib/x86_64-linux-gnu/libgssapi.so.3 (0x00007ffbb8750000)
libkrb5.so.3 => /usr/lib/x86_64-linux-gnu/libkrb5.so.3 (0x00007ffbb8482000)
libk5crypto.so.3 => /usr/lib/x86_64-linux-gnu/libk5crypto.so.3 (0x00007ffbb825a000)
libcom_err.so.2 => /lib/x86_64-linux-gnu/libcom_err.so.2 (0x00007ffbb8055000)
libkrb5support.so.0 => /usr/lib/x86_64-linux-gnu/libkrb5support.so.0 (0x00007ffbb7e4d000)
libtasn1.so.3 => /usr/lib/x86_64-linux-gnu/libtasn1.so.3 (0x00007ffbb7c3c000)
libp11-kit.so.0 => /usr/lib/x86_64-linux-gnu/libp11-kit.so.0 (0x00007ffbb7a29000)
libgpg-error.so.0 => /lib/x86_64-linux-gnu/libgpg-error.so.0 (0x00007ffbb7825000)
libheimntlm.so.0 => /usr/lib/x86_64-linux-gnu/libheimntlm.so.0 (0x00007ffbb761d000)
libkrb5.so.26 => /usr/lib/x86_64-linux-gnu/libkrb5.so.26 (0x00007ffbb7397000)
libasn1.so.8 => /usr/lib/x86_64-linux-gnu/libasn1.so.8 (0x00007ffbb70f7000)
libhcrypto.so.4 => /usr/lib/x86_64-linux-gnu/libhcrypto.so.4 (0x00007ffbb6ec3000)
libroken.so.18 => /usr/lib/x86_64-linux-gnu/libroken.so.18 (0x00007ffbb6cad000)
libkeyutils.so.1 => /lib/x86_64-linux-gnu/libkeyutils.so.1 (0x00007ffbb6aa9000)
libwind.so.0 => /usr/lib/x86_64-linux-gnu/libwind.so.0 (0x00007ffbb687f000)
libheimbase.so.1 => /usr/lib/x86_64-linux-gnu/libheimbase.so.1 (0x00007ffbb6670000)
libhx509.so.5 => /usr/lib/x86_64-linux-gnu/libhx509.so.5 (0x00007ffbb6426000)
libsqlite3.so.0 => /usr/lib/x86_64-linux-gnu/libsqlite3.so.0 (0x00007ffbb6182000)
libcrypt.so.1 => /lib/x86_64-linux-gnu/libcrypt.so.1 (0x00007ffbb5f49000)
===

Thanks,
 Uday

Wolfgang Bangerth

Jan 30, 2014, 12:31:23 PM1/30/14
to dea...@googlegroups.com
On 01/30/2014 04:34 AM, Uday K wrote:
> Dear Matthias, while this seems to have resolved the problem with
> running step-1 successfully, the problem with step-17 remains. I am also
> getting the same error on step-48. On trying out step-46, make passes,
> but make run fails with a more verbose error message relating to
> umfpack. For my purposes, a sparse direct solver (mumps/umfpack) that
> runs out of the box is essential.
>
> Error message for make run on step-48:
>
> ===
> [ 50%] Built target step-46
> [100%] Run step-46 with Debug configuration
> Refinement cycle 0
> Number of active cells: 64
> Number of degrees of freedom: 531
> Assembling...
> Solving...

This works in mainline (I just verified). The error message seems to
indicate that the matrix is singular. Can you try with deal.II 8.1?

As for your problem with step-17: you show that PETSc segfaults. Can you
try to run the tests that come with PETSc and that the PETSc
documentation suggests one runs right after installation?

Best
W.


--
------------------------------------------------------------------------
Wolfgang Bangerth email: bang...@math.tamu.edu
www: http://www.math.tamu.edu/~bangerth/

Timo Heister

Jan 30, 2014, 5:01:53 PM1/30/14
to dea...@googlegroups.com
> This works in mainline (I just verified). The error message seems to
> indicate that the matrix is singular. Can you try with deal.II 8.1?

and if you do that, please also run "make test" in the deal.II build
directory and report any failing tests.

> As for your problem with step-17: you show that PETSc segfaults. Can you try
> to run the tests that come with PETSc and that the PETSc documentation
> suggests one runs right after installation?

And if not, are you running with debug mode on in PETSc? You can try
to run in a debugger to see where the problem is happening.
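A possible debugger session, sketched for the step-17 binary (the directory layout is assumed; adjust paths as needed):

```shell
# From the example's build directory, run the debug binary under gdb.
cd examples/step-17
gdb ./step-17
# (gdb) run
# (gdb) backtrace    # after the segfault, shows where it happened
```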

Uday K

Jan 31, 2014, 2:05:41 AM1/31/14
to dea...@googlegroups.com


On Friday, 31 January 2014 03:31:53 UTC+5:30, Timo Heister wrote:
> This works in mainline (I just verified). The error message seems to
> indicate that the matrix is singular. Can you try with deal.II 8.1?

> and if you do that, please also run "make test" in the deal.II build
> directory and report any failing tests.

Interestingly (in version 8.0.0), make test fails because there is no such target:

===
make test
make: *** No rule to make target `test'.  Stop.
===

So, is there no hope for version 8.0.0, and will I need to move to the newer version?
 

> As for your problem with step-17: you show that PETSc segfaults. Can you try
> to run the tests that come with PETSc and that the PETSc documentation
> suggests one runs right after installation?

> And if not, are you running with debug mode on in PETSc? You can try
> to run in a debugger to see where the problem is happening.

The tests that come with PETSc (compiled in debug mode) run successfully. This is the output of make all test in the PETSc directory:

===
Using configure Options: --with-shared-libraries=1 --with-x=0 --download-mpich --download-hypre=1 --download-blacs --download-scalapack --download-mumps --download-ptscotch --download-umfpack=yes
Using configuration flags:
#define INCLUDED_PETSCCONF_H
#define IS_COLORING_MAX 65535
#define STDC_HEADERS 1
#define MPIU_COLORING_VALUE MPI_UNSIGNED_SHORT
#define PETSC_UINTPTR_T uintptr_t
#define PETSC_HAVE_PTHREAD 1
#define PETSC_STATIC_INLINE static inline
#define PETSC_REPLACE_DIR_SEPARATOR '\\'
#define PETSC_RESTRICT  __restrict__
#define PETSC_HAVE_SO_REUSEADDR 1
#define PETSC_HAVE_MPI 1
#define PETSC_PREFETCH_HINT_T2 _MM_HINT_T2
#define PETSC_PREFETCH_HINT_T0 _MM_HINT_T0
#define PETSC_PREFETCH_HINT_T1 _MM_HINT_T1
#define PETSC__GNU_SOURCE 1
#define PETSC_HAVE_FORTRAN 1
#define PETSC_HAVE_HYPRE 1
#define PETSC_LIB_DIR "/usr/local/src/petsc-3.3-p7/x86_64/lib"
#define PETSC_USE_SOCKET_VIEWER 1
#define PETSC__POSIX_C_SOURCE_200112L 1
#define PETSC_SLSUFFIX "so"
#define PETSC_FUNCTION_NAME_CXX __func__
#define PETSC_HAVE_UMFPACK 1
#define PETSC_HAVE_MUMPS 1
#define PETSC_HAVE_ATOLL 1
#define PETSC_HAVE_DOUBLE_ALIGN_MALLOC 1
#define PETSC_UNUSED __attribute((unused))
#define PETSC_FUNCTION_NAME_C __func__
#define PETSC_HAVE_VALGRIND 1
#define PETSC_USE_SINGLE_LIBRARY 1
#define PETSC_HAVE_BUILTIN_EXPECT 1
#define PETSC_HAVE_PTSCOTCH 1
#define PETSC_DIR_SEPARATOR '/'
#define PETSC_PATH_SEPARATOR ':'
#define PETSC__BSD_SOURCE 1
#define PETSC_HAVE_XMMINTRIN_H 1
#define PETSC_PREFETCH_HINT_NTA _MM_HINT_NTA
#define PETSC_Prefetch(a,b,c) _mm_prefetch((const char*)(a),(c))
#define PETSC_HAVE_BLASLAPACK 1
#define PETSC_HAVE_STRING_H 1
#define PETSC_HAVE_SYS_TYPES_H 1
#define PETSC_HAVE_ENDIAN_H 1
#define PETSC_HAVE_SYS_PROCFS_H 1
#define PETSC_HAVE_DLFCN_H 1
#define PETSC_HAVE_SCHED_H 1
#define PETSC_HAVE_STDINT_H 1
#define PETSC_HAVE_LINUX_KERNEL_H 1
#define PETSC_HAVE_TIME_H 1
#define PETSC_HAVE_MATH_H 1
#define PETSC_TIME_WITH_SYS_TIME 1
#define PETSC_HAVE_SYS_PARAM_H 1
#define PETSC_HAVE_SYS_SOCKET_H 1
#define PETSC_HAVE_UNISTD_H 1
#define PETSC_HAVE_STDLIB_H 1
#define PETSC_HAVE_SYS_WAIT_H 1
#define PETSC_HAVE_LIMITS_H 1
#define PETSC_HAVE_SYS_UTSNAME_H 1
#define PETSC_HAVE_NETINET_IN_H 1
#define PETSC_HAVE_FENV_H 1
#define PETSC_HAVE_PTHREAD_H 1
#define PETSC_HAVE_FLOAT_H 1
#define PETSC_HAVE_SEARCH_H 1
#define PETSC_HAVE_SYS_RESOURCE_H 1
#define PETSC_HAVE_SYS_TIMES_H 1
#define PETSC_HAVE_NETDB_H 1
#define PETSC_HAVE_MALLOC_H 1
#define PETSC_HAVE_PWD_H 1
#define PETSC_HAVE_FCNTL_H 1
#define PETSC_HAVE_STRINGS_H 1
#define PETSC_HAVE_MEMORY_H 1
#define PETSC_HAVE_SYS_SYSINFO_H 1
#define PETSC_HAVE_SYS_TIME_H 1
#define PETSC_USING_F90 1
#define PETSC_USING_F2003 1
#define PETSC_HAVE_RTLD_NOW 1
#define PETSC_HAVE_RTLD_LOCAL 1
#define PETSC_HAVE_RTLD_LAZY 1
#define PETSC_C_STATIC_INLINE static inline
#define PETSC_HAVE_FORTRAN_UNDERSCORE 1
#define PETSC_HAVE_CXX_NAMESPACE 1
#define PETSC_HAVE_RTLD_GLOBAL 1
#define PETSC_C_RESTRICT  __restrict__
#define PETSC_CXX_RESTRICT  __restrict__
#define PETSC_CXX_STATIC_INLINE static inline
#define PETSC_HAVE_LIBBLAS 1
#define PETSC_HAVE_LIBDL 1
#define PETSC_HAVE_LIBGFORTRAN 1
#define PETSC_HAVE_LIBSCALAPACK 1
#define PETSC_HAVE_LIBLAPACK 1
#define PETSC_HAVE_LIBPTSCOTCH 1
#define PETSC_HAVE_LIBM 1
#define PETSC_HAVE_LIBDMUMPS 1
#define PETSC_HAVE_LIBPTESMUMPS 1
#define PETSC_HAVE_LIBMUMPS_COMMON 1
#define PETSC_HAVE_LIBPTHREAD 1
#define PETSC_HAVE_LIBMPICHCXX 1
#define PETSC_HAVE_LIBBLACS 1
#define PETSC_HAVE_LIBZMUMPS 1
#define PETSC_HAVE_LIBSTDC__ 1
#define PETSC_HAVE_LIBHYPRE 1
#define PETSC_HAVE_LIBAMD 1
#define PETSC_HAVE_LIBSMUMPS 1
#define PETSC_HAVE_LIBRT 1
#define PETSC_HAVE_LIBPTSCOTCHERR 1
#define PETSC_HAVE_LIBMPICHF90 1
#define PETSC_HAVE_LIBCMUMPS 1
#define PETSC_HAVE_LIBUMFPACK 1
#define PETSC_HAVE_LIBPORD 1
#define PETSC_HAVE_ERF 1
#define PETSC_HAVE_LIBQUADMATH 1
#define PETSC_ARCH "x86_64"
#define PETSC_DIR "/usr/local/src/petsc-3.3-p7"
#define HAVE_GZIP 1
#define PETSC_CLANGUAGE_C 1
#define PETSC_USE_EXTERN_CXX  
#define PETSC_USE_ERRORCHECKING 1
#define PETSC_MISSING_DREAL 1
#define PETSC_SIZEOF_MPI_COMM 4
#define PETSC_BITS_PER_BYTE 8
#define PETSC_SIZEOF_MPI_FINT 4
#define PETSC_SIZEOF_VOID_P 8
#define PETSC_RETSIGTYPE void
#define PETSC_HAVE_CXX_COMPLEX 1
#define PETSC_SIZEOF_LONG 8
#define PETSC_USE_FORTRANKIND 1
#define PETSC_SIZEOF_SIZE_T 8
#define PETSC_SIZEOF_CHAR 1
#define PETSC_SIZEOF_DOUBLE 8
#define PETSC_SIZEOF_FLOAT 4
#define PETSC_HAVE_C99_COMPLEX 1
#define PETSC_SIZEOF_INT 4
#define PETSC_SIZEOF_LONG_LONG 8
#define PETSC_SIZEOF_SHORT 2
#define PETSC_HAVE_STRCASECMP 1
#define PETSC_HAVE_POPEN 1
#define PETSC_HAVE_SIGSET 1
#define PETSC_HAVE_GETWD 1
#define PETSC_HAVE_VSNPRINTF 1
#define PETSC_HAVE_TIMES 1
#define PETSC_HAVE_DLSYM 1
#define PETSC_HAVE_SNPRINTF 1
#define PETSC_HAVE_GETPWUID 1
#define PETSC_HAVE_GETHOSTBYNAME 1
#define PETSC_HAVE_SLEEP 1
#define PETSC_HAVE_DLERROR 1
#define PETSC_HAVE_FORK 1
#define PETSC_HAVE_RAND 1
#define PETSC_HAVE_GETTIMEOFDAY 1
#define PETSC_HAVE_DLCLOSE 1
#define PETSC_HAVE_UNAME 1
#define PETSC_HAVE_GETHOSTNAME 1
#define PETSC_HAVE_MKSTEMP 1
#define PETSC_HAVE_SIGACTION 1
#define PETSC_HAVE_DRAND48 1
#define PETSC_HAVE_MEMALIGN 1
#define PETSC_HAVE_VA_COPY 1
#define PETSC_HAVE_CLOCK 1
#define PETSC_HAVE_ACCESS 1
#define PETSC_HAVE_SIGNAL 1
#define PETSC_HAVE_USLEEP 1
#define PETSC_HAVE_GETRUSAGE 1
#define PETSC_HAVE_VFPRINTF 1
#define PETSC_HAVE_NANOSLEEP 1
#define PETSC_HAVE_GETDOMAINNAME 1
#define PETSC_HAVE_TIME 1
#define PETSC_HAVE_LSEEK 1
#define PETSC_HAVE_SOCKET 1
#define PETSC_HAVE_SYSINFO 1
#define PETSC_HAVE_READLINK 1
#define PETSC_HAVE_REALPATH 1
#define PETSC_HAVE_DLOPEN 1
#define PETSC_HAVE_MEMMOVE 1
#define PETSC_HAVE__GFORTRAN_IARGC 1
#define PETSC_SIGNAL_CAST  
#define PETSC_HAVE_GETCWD 1
#define PETSC_HAVE_VPRINTF 1
#define PETSC_HAVE_BZERO 1
#define PETSC_HAVE_GETPAGESIZE 1
#define PETSC_USE_PROC_FOR_SIZE 1
#define PETSC_USE_GDB_DEBUGGER 1
#define PETSC_USE_INFO 1
#define PETSC_PETSC_USE_BACKWARD_LOOP 1
#define PETSC_Alignx(a,b)   
#define PETSC_USE_DEBUG 1
#define PETSC_USE_LOG 1
#define PETSC_IS_COLOR_VALUE_TYPE short
#define PETSC_USE_CTABLE 1
#define PETSC_USE_SCALAR_REAL 1
#define PETSC_HAVE_ISINF 1
#define PETSC_HAVE_ISNAN 1
#define PETSC_USE_REAL_DOUBLE 1
#define PETSC_HAVE_FORTRAN_GET_COMMAND_ARGUMENT 1
#define PETSC_HAVE_GFORTRAN_IARGC 1
#define PETSC_HAVE_DYNAMIC_LIBRARIES 1
#define PETSC_HAVE_SHARED_LIBRARIES 1
#define PETSC_USE_SHARED_LIBRARIES 1
#define PETSC_HAVE_MPI_COMM_C2F 1
#define PETSC_HAVE_MPI_INIT_THREAD 1
#define PETSC_HAVE_MPI_LONG_DOUBLE 1
#define PETSC_HAVE_MPI_COMM_F2C 1
#define PETSC_HAVE_MPI_FINT 1
#define PETSC_HAVE_MPI_F90MODULE 1
#define PETSC_HAVE_MPI_FINALIZED 1
#define PETSC_HAVE_MPI_COMM_SPAWN 1
#define PETSC_HAVE_MPI_WIN_CREATE 1
#define PETSC_HAVE_MPI_REPLACE 1
#define PETSC_HAVE_MPI_EXSCAN 1
#define PETSC_HAVE_MPIIO 1
#define PETSC_HAVE_MPI_C_DOUBLE_COMPLEX 1
#define PETSC_HAVE_MPI_ALLTOALLW 1
#define PETSC_HAVE_MPI_IN_PLACE 1
#define PETSC_MEMALIGN 16
#define PETSC_LEVEL1_DCACHE_LINESIZE 64
#define PETSC_LEVEL1_DCACHE_SIZE 32768
#define PETSC_LEVEL1_DCACHE_ASSOC 8
#define PETSC_BLASLAPACK_UNDERSCORE 1
#define PETSC_HAVE_SCHED_CPU_SET_T 1
#define PETSC_HAVE_PTHREAD_BARRIER_T 1
#define PETSC_HAVE_SYS_SYSCTL_H 1
-----------------------------------------
Using C/C++ include paths: -I/usr/local/src/petsc-3.3-p7/include -I/usr/local/src/petsc-3.3-p7/x86_64/include
Using C/C++ compiler: /usr/local/src/petsc-3.3-p7/x86_64/bin/mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g3 -fno-inline -O0   
Using Fortran include/module paths: -I/usr/local/src/petsc-3.3-p7/include -I/usr/local/src/petsc-3.3-p7/x86_64/include
Using Fortran compiler: /usr/local/src/petsc-3.3-p7/x86_64/bin/mpif90 -fPIC  -Wall -Wno-unused-variable -Wno-unused-dummy-argument -g   
-----------------------------------------
Using C/C++ linker: /usr/local/src/petsc-3.3-p7/x86_64/bin/mpicc
Using C/C++ flags: -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -g3 -fno-inline -O0
Using Fortran linker: /usr/local/src/petsc-3.3-p7/x86_64/bin/mpif90
Using Fortran flags: -fPIC  -Wall -Wno-unused-variable -Wno-unused-dummy-argument -g
-----------------------------------------
Using libraries: -Wl,-rpath,/usr/local/src/petsc-3.3-p7/x86_64/lib -L/usr/local/src/petsc-3.3-p7/x86_64/lib  -lpetsc -Wl,-rpath,/usr/local/src/petsc-3.3-p7/x86_64/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lHYPRE -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.6 -L/usr/lib/gcc/x86_64-linux-gnu/4.6 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpichcxx -lstdc++ -lptesmumps -lptscotch -lptscotcherr -lscalapack -lblacs -lpthread -lumfpack -lamd -llapack -lblas -lm -lmpichf90 -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpichcxx -lstdc++ -lrt -lm -ldl -lmpich -lopa -lmpl -lrt -lpthread -lgcc_s -ldl 
------------------------------------------
Using mpiexec: /usr/local/src/petsc-3.3-p7/x86_64/bin/mpiexec
==========================================
Building PETSc using CMake with 5 build threads
==========================================
Re-run cmake file: Makefile older than: ../CMakeLists.txt
-- Configuring done
-- Generating done
-- Build files have been written to: /usr/local/src/petsc-3.3-p7/x86_64
[100%] Built target petsc
=========================================
Now to check if the libraries are working do:
make PETSC_DIR=/usr/local/src/petsc-3.3-p7 PETSC_ARCH=x86_64 test
=========================================
Running test examples to verify correct installation
Using PETSC_DIR=/usr/local/src/petsc-3.3-p7 and PETSC_ARCH=x86_64
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI processes
Fortran example src/snes/examples/tutorials/ex5f run successfully with 1 MPI process
Completed test examples
=== 

Timo Heister

Jan 31, 2014, 7:33:10 AM
to dea...@googlegroups.com
>> > indicate that the matrix is singular. Can you try with deal.II 8.1?
>>
>> and if you do that, please also run "make test" in the deal.II build
>> directory and report any failing tests.
>
> Interestingly (in version 8.0.0), make test fails because there is no target
> defined:

This is new in 8.1.

Wolfgang Bangerth

Jan 31, 2014, 8:39:25 AM
to dea...@googlegroups.com

> So, is there no hope for version 8.0.0 and I'll need to go the higher version?

I would suggest you do. As for the PETSc error -- you may have to run your
program under a debugger to find out where the segmentation fault happens.
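A minimal way to do that (assuming the debug executable is ./step-17 in the current directory; adjust to taste) is something like:

```shell
# Serial run under the debugger; type "run" at the gdb prompt, and
# after the segfault "backtrace" to see where it happened:
gdb --args ./step-17

# For an MPI run, a common trick is one gdb per rank, each in its
# own xterm (requires a working X display):
mpirun -np 2 xterm -e gdb ./step-17
```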

Uday K

Feb 2, 2014, 11:00:26 AM
to dea...@googlegroups.com
On Friday, 31 January 2014 19:09:25 UTC+5:30, Wolfgang Bangerth wrote:

> > So, is there no hope for version 8.0.0 and I'll need to go the higher version?
>
> I would suggest you do. As for the PETSc error -- you may have to run your
> program under a debugger to find out where the segmentation fault happens.


Well, I didn't have better luck with version 8.1.0. In this case, while make install succeeds, make test fails. I'm pasting the contents of detailed.log as well as the log file tests/quick_tests/quicktests.log from the build directory:

===
detailed.log
===
###
#
#  deal.II configuration:
#        CMAKE_BUILD_TYPE:       DebugRelease
#        BUILD_SHARED_LIBS:      ON
#        CMAKE_INSTALL_PREFIX:   /usr/local/bin/deal.II
#        CMAKE_SOURCE_DIR:       /usr/local/src/deal.II (Version 8.1.0)
#        CMAKE_BINARY_DIR:       /usr/local/src/build
#        CMAKE_CXX_COMPILER:     GNU 4.6.3 on platform Linux x86_64
#                                /usr/local/src/petsc-3.3-p7/x86_64/bin/mpicxx
#        CMAKE_C_COMPILER:       /usr/local/src/petsc-3.3-p7/x86_64/bin/mpicc
#        CMAKE_Fortran_COMPILER: /usr/local/src/petsc-3.3-p7/x86_64/bin/mpif90
#        CMAKE_GENERATOR:        Unix Makefiles
#
#  Compiler flags used for this build:
#        CMAKE_CXX_FLAGS:              -pedantic -fpic -Wall -Wpointer-arith -Wwrite-strings -Wsynth -Wsign-compare -Wswitch -Wno-long-long -Wno-deprecated -Wno-deprecated-declarations -std=c++0x  -fPIC -Wno-parentheses -Wno-long-long -Wno-long-long
#        DEAL_II_CXX_FLAGS_RELEASE:    -O2 -funroll-loops -funroll-all-loops -fstrict-aliasing -Wno-unused
#        DEAL_II_CXX_FLAGS_DEBUG:      -O0 -ggdb -Wa,--compress-debug-sections
#        DEAL_II_LINKER_FLAGS:         -Wl,--as-needed -rdynamic  -Wl,-rpath  -Wl,/usr/local/src/petsc-3.3-p7/x86_64/lib
#        DEAL_II_LINKER_FLAGS_RELEASE:
#        DEAL_II_LINKER_FLAGS_DEBUG:   -ggdb
#
#  Configured Features (DEAL_II_ALLOW_BUNDLED = ON, DEAL_II_ALLOW_AUTODETECTION = ON):
#      ( DEAL_II_WITH_64BIT_INDICES = OFF )
#      ( DEAL_II_WITH_ARPACK = OFF )
#        DEAL_II_WITH_BOOST set up with bundled packages
#        DEAL_II_WITH_FUNCTIONPARSER set up with bundled packages
#        DEAL_II_WITH_HDF5 set up with external dependencies
#            HDF5_INCLUDE_DIRS = /usr/include
#            HDF5_LIBRARIES = /usr/lib/libhdf5_hl.so;/usr/lib/libhdf5.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#        DEAL_II_WITH_LAPACK set up with external dependencies
#            LAPACK_LIBRARIES = /usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/libc.so
#      ( DEAL_II_WITH_METIS = OFF )
#        DEAL_II_WITH_MPI set up with external dependencies
#            MPI_VERSION = 2.2
#            MPI_CXX_COMPILER = /usr/local/src/petsc-3.3-p7/x86_64/bin/mpicxx
#            MPI_CXX_COMPILE_FLAGS =  -fPIC
#            MPI_CXX_INCLUDE_PATH = /usr/local/src/petsc-3.3-p7/x86_64/include
#            MPI_CXX_LIBRARIES = /usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichcxx.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#            MPI_CXX_LINK_FLAGS =  -Wl,-rpath  -Wl,/usr/local/src/petsc-3.3-p7/x86_64/lib
#      ( DEAL_II_WITH_MUMPS = OFF )
#        DEAL_II_WITH_NETCDF set up with external dependencies
#            NETCDF_INCLUDE_DIRS = /usr/include
#            NETCDF_LIBRARIES = /usr/lib/libnetcdf_c++.so;/usr/lib/libnetcdf.so
#        DEAL_II_WITH_P4EST set up with external dependencies
#            P4EST_VERSION = 0.3.4.2
#            P4EST_DIR = /usr/local/src/p4est-install/FAST
#            P4EST_INCLUDE_DIRS = /usr/local/src/p4est-install/FAST/include;/usr/local/src/p4est-install/FAST/include
#            P4EST_LIBRARIES = /usr/local/src/p4est-install/FAST/lib/libp4est.so;/usr/local/src/p4est-install/FAST/lib/libsc.so;/usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/libc.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#        DEAL_II_WITH_PETSC set up with external dependencies
#            PETSC_VERSION = 3.3.0.3
#            PETSC_DIR = /usr/local/src/petsc-3.3-p7
#            PETSC_INCLUDE_DIRS = /usr/local/src/petsc-3.3-p7/include;/usr/local/src/petsc-3.3-p7/x86_64/include
#            PETSC_LIBRARIES = /usr/local/src/petsc-3.3-p7/x86_64/lib/libpetsc.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libcmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libdmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libsmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libzmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmumps_common.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libpord.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libHYPRE.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichcxx.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libptesmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libptscotch.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libptscotcherr.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libscalapack.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libblacs.a;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libumfpack.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libamd.a;/usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichcxx.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/libdl.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/x86_64-linux-gnu/libdl.so
#      ( DEAL_II_WITH_SLEPC = OFF )
#        DEAL_II_WITH_THREADS set up with bundled packages
#      ( DEAL_II_WITH_TRILINOS = OFF )
#        DEAL_II_WITH_UMFPACK set up with external dependencies
#            UMFPACK_VERSION = 5.5.1
#            UMFPACK_DIR = /usr/local/src/petsc-3.3-p7/x86_64/lib
#            UMFPACK_INCLUDE_DIRS = /usr/local/src/petsc-3.3-p7/x86_64/include;/usr/local/src/petsc-3.3-p7/x86_64/include
#            UMFPACK_LIBRARIES = /usr/local/src/petsc-3.3-p7/x86_64/lib/libumfpack.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libamd.a;/usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpichf90.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpich.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libopa.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/libc.so;/usr/lib/x86_64-linux-gnu/librt.so
#        DEAL_II_WITH_ZLIB set up with external dependencies
#            ZLIB_VERSION = 1.2.3.4
#            ZLIB_INCLUDE_DIRS = /usr/include
#            ZLIB_LIBRARIES = /usr/lib/x86_64-linux-gnu/libz.so
#
#  Component configuration:
#        DEAL_II_COMPONENT_COMPAT_FILES
#      ( DEAL_II_COMPONENT_DOCUMENTATION = OFF )
#        DEAL_II_COMPONENT_EXAMPLES
#        DEAL_II_COMPONENT_MESH_CONVERTER
#      ( DEAL_II_COMPONENT_PARAMETER_GUI = OFF )
#
###
===


===
tests/quick_tests/quicktests.log:
===
[HANDLER_OUTPUT]
Test project /usr/local/src/build/tests/quick_tests

    Start 1: step.debug
1/7 Test #1: step.debug .......................   Passed    4.25 sec
    Start 2: step.release
2/7 Test #2: step.release .....................   Passed    3.10 sec
    Start 3: affinity.debug
3/7 Test #3: affinity.debug ...................   Passed    3.02 sec
    Start 4: mpi.debug
4/7 Test #4: mpi.debug ........................***Failed    3.07 sec
Test mpi.debug: RUN
===============================   OUTPUT BEGIN  ===============================
[  0%] Built target kill-mpi.debug-OK
[  0%] Built target expand_instantiations_exe
[  1%] Built target obj_meshworker.inst
[  1%] Built target obj_meshworker.debug
[  8%] Built target obj_boost_serialization.debug
[ 10%] Built target obj_functionparser.debug
[ 18%] Built target obj_tbb.debug
[ 23%] Built target obj_numerics.inst
[ 30%] Built target obj_numerics.debug
[ 38%] Built target obj_fe.inst
[ 47%] Built target obj_fe.debug
[ 49%] Built target obj_dofs.inst
[ 52%] Built target obj_dofs.debug
[ 55%] Built target obj_lac.inst
[ 71%] Built target obj_lac.debug
[ 71%] Built target obj_base.inst
[ 83%] Built target obj_base.debug
[ 86%] Built target obj_grid.inst
[ 89%] Built target obj_grid.debug
[ 89%] Built target obj_hp.inst
[ 91%] Built target obj_hp.debug
[ 93%] Built target obj_multigrid.inst
[ 94%] Built target obj_multigrid.debug
[ 96%] Built target obj_distributed.inst
[ 96%] Built target obj_distributed.debug
[ 96%] Built target obj_algorithms.inst
[ 98%] Built target obj_algorithms.debug
[ 98%] Built target obj_integrators.debug
[ 98%] Built target obj_matrix_free.inst
[100%] Built target obj_matrix_free.debug
[100%] Built target deal_II.g
[100%] Built target mpi.debug
mpi.debug: RUN failed. Output:
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 4746 on node Bhairavi exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
make[7]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run] Error 1
make[6]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run.dir/all] Error 2
make[5]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run.dir/rule] Error 2
make[4]: *** [mpi.debug.run] Error 2


mpi.debug: ******    RUN failed    *******

===============================    OUTPUT END   ===============================
Expected stage PASSED - aborting
CMake Error at /usr/local/src/deal.II/cmake/scripts/run_test.cmake:124 (MESSAGE):
  *** abort



    Start 5: tbb.debug
5/7 Test #5: tbb.debug ........................   Passed    3.04 sec
    Start 6: p4est.debug
6/7 Test #6: p4est.debug ......................***Failed    3.14 sec
Test p4est.debug: RUN
===============================   OUTPUT BEGIN  ===============================
[  0%] Built target kill-p4est.debug-OK
[  0%] Built target expand_instantiations_exe
[  1%] Built target obj_meshworker.inst
[  1%] Built target obj_meshworker.debug
[  8%] Built target obj_boost_serialization.debug
[ 10%] Built target obj_functionparser.debug
[ 18%] Built target obj_tbb.debug
[ 23%] Built target obj_numerics.inst
[ 30%] Built target obj_numerics.debug
[ 38%] Built target obj_fe.inst
[ 47%] Built target obj_fe.debug
[ 49%] Built target obj_dofs.inst
[ 52%] Built target obj_dofs.debug
[ 55%] Built target obj_lac.inst
[ 71%] Built target obj_lac.debug
[ 71%] Built target obj_base.inst
[ 83%] Built target obj_base.debug
[ 86%] Built target obj_grid.inst
[ 89%] Built target obj_grid.debug
[ 89%] Built target obj_hp.inst
[ 91%] Built target obj_hp.debug
[ 93%] Built target obj_multigrid.inst
[ 94%] Built target obj_multigrid.debug
[ 96%] Built target obj_distributed.inst
[ 96%] Built target obj_distributed.debug
[ 96%] Built target obj_algorithms.inst
[ 98%] Built target obj_algorithms.debug
[ 98%] Built target obj_integrators.debug
[ 98%] Built target obj_matrix_free.inst
[100%] Built target obj_matrix_free.debug
[100%] Built target deal_II.g
[100%] Built target p4est.debug
p4est.debug: RUN failed. Output:
--------------------------------------------------------------------------
mpirun noticed that process rank 3 with PID 5183 on node Bhairavi exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
2 total processes killed (some possibly by mpirun during cleanup)
make[7]: *** [tests/quick_tests/CMakeFiles/p4est.debug.run] Error 1
make[6]: *** [tests/quick_tests/CMakeFiles/p4est.debug.run.dir/all] Error 2
make[5]: *** [tests/quick_tests/CMakeFiles/p4est.debug.run.dir/rule] Error 2
make[4]: *** [p4est.debug.run] Error 2


p4est.debug: ******    RUN failed    *******

===============================    OUTPUT END   ===============================
Expected stage PASSED - aborting
CMake Error at /usr/local/src/deal.II/cmake/scripts/run_test.cmake:124 (MESSAGE):
  *** abort



    Start 7: step-petsc.debug
7/7 Test #7: step-petsc.debug .................***Failed    3.11 sec
Test step-petsc.debug: RUN
===============================   OUTPUT BEGIN  ===============================
[  0%] Built target kill-step-petsc.debug-OK
[  0%] Built target expand_instantiations_exe
[  1%] Built target obj_meshworker.inst
[  1%] Built target obj_meshworker.debug
[  8%] Built target obj_boost_serialization.debug
[ 10%] Built target obj_functionparser.debug
[ 18%] Built target obj_tbb.debug
[ 23%] Built target obj_numerics.inst
[ 30%] Built target obj_numerics.debug
[ 38%] Built target obj_fe.inst
[ 47%] Built target obj_fe.debug
[ 49%] Built target obj_dofs.inst
[ 52%] Built target obj_dofs.debug
[ 55%] Built target obj_lac.inst
[ 71%] Built target obj_lac.debug
[ 71%] Built target obj_base.inst
[ 83%] Built target obj_base.debug
[ 86%] Built target obj_grid.inst
[ 89%] Built target obj_grid.debug
[ 89%] Built target obj_hp.inst
[ 91%] Built target obj_hp.debug
[ 93%] Built target obj_multigrid.inst
[ 94%] Built target obj_multigrid.debug
[ 96%] Built target obj_distributed.inst
[ 96%] Built target obj_distributed.debug
[ 96%] Built target obj_algorithms.inst
[ 98%] Built target obj_algorithms.debug
[ 98%] Built target obj_integrators.debug
[ 98%] Built target obj_matrix_free.inst
[100%] Built target obj_matrix_free.debug
[100%] Built target deal_II.g
[100%] Built target step-petsc.debug
step-petsc.debug: RUN failed. Output:
Segmentation fault (core dumped)
make[7]: *** [tests/quick_tests/CMakeFiles/step-petsc.debug.run] Error 1
make[6]: *** [tests/quick_tests/CMakeFiles/step-petsc.debug.run.dir/all] Error 2
make[5]: *** [tests/quick_tests/CMakeFiles/step-petsc.debug.run.dir/rule] Error 2
make[4]: *** [step-petsc.debug.run] Error 2


step-petsc.debug: ******    RUN failed    *******

===============================    OUTPUT END   ===============================
Expected stage PASSED - aborting
CMake Error at /usr/local/src/deal.II/cmake/scripts/run_test.cmake:124 (MESSAGE):
  *** abort




57% tests passed, 3 tests failed out of 7

Total Test time (real) =  22.73 sec

The following tests FAILED:
          4 - mpi.debug (Failed)
          6 - p4est.debug (Failed)
          7 - step-petsc.debug (Failed)
[ERROR_MESSAGE]
Errors while running CTest

=== 

Timo Heister

Feb 2, 2014, 2:06:39 PM
to dea...@googlegroups.com
> Start 4: mpi.debug
> 4/7 Test #4: mpi.debug ........................***Failed 3.07 sec
> mpi.debug: RUN failed. Output:
> --------------------------------------------------------------------------
> mpirun noticed that process rank 0 with PID 4746 on node Bhairavi exited on
> signal 11 (Segmentation fault).

That means that your MPI is broken (the test is just a simple MPI
ping-pong example). Can you use your system's MPI instead of letting
PETSc install mpich? (You need to use the same system MPI consistently for
PETSc, deal.II, and p4est.)
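For what it's worth, with a system MPI installed, the PETSc side of that would look roughly like the following — the --with-cc/--with-cxx/--with-fc flags are the standard PETSc configure options; keep whatever other --download-* options you used before:

```shell
# Reconfigure PETSc against the system MPI compiler wrappers
# instead of letting it download and build its own MPICH:
./configure PETSC_ARCH=x86_64 \
    --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 \
    --with-shared-libraries=1
```

and then point p4est and deal.II at the same mpicc/mpicxx when configuring them.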

Uday K

Feb 5, 2014, 5:30:48 AM
to dea...@googlegroups.com
Thanks for the tip, Timo; I will build with the system MPI libraries. 

This entire exercise has raised a question in my mind regarding the need for PETSc. Specifically, what sort of linear algebra functionality does PETSc offer that deal.ii doesn't? In the past, I have written my own FEM code from scratch, and for whatever little linear algebra I needed, I made use of the open-source wrapper Seldon, which interfaces C++ with MUMPS. Given the growing complexity of my code, I would like to switch to deal.ii, and hence the question of what else is needed on top of deal.ii.

A recent example where I felt I needed a more sophisticated linear algebra library is as follows: my sparse system is over the field of complex numbers. If all I care about is taking this complex matrix and solving it with MUMPS, there are no issues. But if I want to extract the real and imaginary parts of the matrix and perform some additional math on them, I found an *efficient* solution impossible with Seldon (e.g. storage was being needlessly doubled). 

I also wanted to say that I really appreciate the efforts you all have put into making and maintaining this fantastic library :).

 Thanks,
Uday

Toby D. Young

Feb 5, 2014, 7:14:18 AM
to dea...@googlegroups.com
On 2014-02-05, at 02:30:48,
Uday K <uda...@gmail.com> wrote:

Uday,

> This entire exercise has raised a question in my mind regarding the
> need of PETSc. Specifically, what sort of linear algebra
> functionality does PETSc offer that deal.ii doesn't?

It's not really a question of linear algebra. The real reason to use
PETSc (or Trilinos) is the MPI support. With a decent grid partitioner (say,
p4est), you are able to distribute your problem over many machines and
obtain a significant speed-up for large problems. This is nicely
illustrated in step-40.

> A recent example where I felt I needed a more sophisticated linear
> algebra library was as follows: My sparse equation is over the field
> of complex numbers.

PETSc with complex numbers is something we are slowly working on. :-)
In the meantime, you can solve (using PETSc in parallel) by splitting
complex-valued functions into their real and imaginary parts.
This is done in step-29.
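For the record, the splitting amounts to rewriting the complex system (A_r + i A_i)(x_r + i x_i) = b_r + i b_i as an equivalent real system of twice the size (step-29 orders the unknowns somewhat differently, but the idea is the same):

```latex
% Real part:      A_r x_r - A_i x_i = b_r
% Imaginary part: A_i x_r + A_r x_i = b_i
\begin{pmatrix} A_r & -A_i \\ A_i & \phantom{-}A_r \end{pmatrix}
\begin{pmatrix} x_r \\ x_i \end{pmatrix}
=
\begin{pmatrix} b_r \\ b_i \end{pmatrix}
```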

Best,
Toby


--
Toby D. Young
Assistant Professor

Division of Computational Method
Institute of Fundamental Technological Research
of the Polish Academy of Sciences
ul Pawinskiego 5b
02-106 Warsaw
Poland

www: http://www.ippt.pan.pl/~tyoung

Session on quantum many-body theory:
http://www.ippt.pan.pl/~tyoung/icnaam.html

Timo Heister

Feb 5, 2014, 9:28:47 AM
to dea...@googlegroups.com
> This entire exercise has raised a question in my mind regarding the need of
> PETSc. Specifically, what sort of linear algebra functionality does PETSc
> offer that deal.ii doesn't?

You need PETSc or Trilinos to do parallel computations with MPI (*).
Additionally, PETSc and Trilinos provide access to a lot more
solvers/preconditioners.

*: well, that is not quite correct, because you can solve some
problems matrix-free in parallel (step-48).

Uday K

Feb 5, 2014, 12:50:51 PM
to dea...@googlegroups.com
Thanks for the info, Toby & Timo.

Meanwhile, I downloaded and installed MPICH from source (the latest stable version, mpich-3.0.4). I also recompiled p4est from source, making sure it picks up the new MPI. Following this, I attempted to install deal.ii, which completed successfully. What failed again, however, was make test. Here is its output:

===

[  0%] Built target expand_instantiations_exe
[  0%] Built target obj_functionparser.release
[  0%] Built target obj_integrators.release
[  5%] [  7%] Built target obj_numerics.inst
Built target obj_meshworker.inst
[ 10%] Built target obj_dofs.inst
[ 19%] [ 23%] Built target obj_fe.inst
Built target obj_lac.inst
[ 23%] Built target obj_base.inst
[ 23%] [ 26%] Built target obj_hp.inst
Built target obj_grid.inst
[ 28%] Built target obj_multigrid.inst
[ 30%] [ 30%] Built target obj_algorithms.inst
Built target obj_distributed.inst
[ 37%] [ 37%] Built target obj_tbb.release
Built target obj_matrix_free.inst
[ 37%] Built target obj_meshworker.release
[ 44%] Built target obj_boost_serialization.release
[ 46%] Built target obj_hp.release
[ 48%] Built target obj_multigrid.release
[ 51%] Built target obj_grid.release
[ 51%] Built target obj_distributed.release
[ 53%] Built target obj_dofs.release
[ 53%] Built target obj_algorithms.release
[ 53%] Built target obj_matrix_free.release
[ 66%] Built target obj_base.release
[ 82%] Built target obj_lac.release
[ 89%] Built target obj_numerics.release
[100%] Built target obj_fe.release
[100%] Built target deal_II
Scanning dependencies of target build_library
[100%] Built target build_library
Scanning dependencies of target test
Test project /usr/local/src/build/tests/quick_tests
    Start 1: step.debug
1/6 Test #1: step.debug .......................   Passed    9.06 sec
    Start 2: step.release
2/6 Test #2: step.release .....................   Passed    6.51 sec
    Start 3: affinity.debug
3/6 Test #3: affinity.debug ...................   Passed    6.39 sec
    Start 4: mpi.debug
4/6 Test #4: mpi.debug ........................***Failed    6.23 sec
Test mpi.debug: RUN
===============================   OUTPUT BEGIN  ===============================
Scanning dependencies of target kill-mpi.debug-OK
[  0%] Built target kill-mpi.debug-OK
[  0%] Built target expand_instantiations_exe
[  6%] Built target obj_boost_serialization.debug
[  6%] Built target obj_functionparser.debug
[ 15%] Built target obj_tbb.debug
[ 20%] Built target obj_numerics.inst
[ 29%] Built target obj_fe.inst
[ 32%] Built target obj_dofs.inst
[ 36%] Built target obj_lac.inst
[ 36%] Built target obj_base.inst
[ 39%] Built target obj_grid.inst
[ 39%] Built target obj_hp.inst
[ 41%] Built target obj_multigrid.inst
[ 43%] Built target obj_distributed.inst
[ 43%] Built target obj_algorithms.inst
[ 43%] Built target obj_integrators.debug
[ 43%] Built target obj_matrix_free.inst
[ 44%] Built target obj_meshworker.inst
[ 51%] Built target obj_numerics.debug
[ 60%] Built target obj_fe.debug
[ 63%] Built target obj_dofs.debug
[ 79%] Built target obj_lac.debug
[ 91%] Built target obj_base.debug
[ 94%] Built target obj_grid.debug
[ 96%] Built target obj_hp.debug
[ 98%] Built target obj_multigrid.debug
[ 98%] Built target obj_distributed.debug
[100%] Built target obj_algorithms.debug
[100%] Built target obj_matrix_free.debug
[100%] Built target obj_meshworker.debug
[100%] Built target deal_II.g
Scanning dependencies of target mpi.debug
[100%] Building CXX object tests/quick_tests/CMakeFiles/mpi.debug.dir/mpi.cc.o
/usr/local/src/deal.II/tests/quick_tests/mpi.cc: In function ‘int main(int, char**)’:
/usr/local/src/deal.II/tests/quick_tests/mpi.cc:44:7: warning: variable ‘err’ set but not used [-Wunused-but-set-variable]
Linking CXX executable mpi.debug
/usr/bin/ld: warning: libmpich.so.3, needed by /usr/local/src/p4est-install/FAST/lib/libp4est.so, may conflict with libmpich.so.10
[100%] Built target mpi.debug
Scanning dependencies of target mpi.debug.run
mpi.debug: RUN failed. Output:
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 26695 on node Bhairavi exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
make[7]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run] Error 1
make[6]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run.dir/all] Error 2
make[5]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run.dir/rule] Error 2
make[4]: *** [mpi.debug.run] Error 2


mpi.debug: ******    RUN failed    *******

===============================    OUTPUT END   ===============================
Expected stage PASSED - aborting
CMake Error at /usr/local/src/deal.II/cmake/scripts/run_test.cmake:124 (MESSAGE):
  *** abort



    Start 5: tbb.debug
5/6 Test #5: tbb.debug ........................   Passed    6.34 sec
    Start 6: p4est.debug
6/6 Test #6: p4est.debug ......................***Failed    7.17 sec
Test p4est.debug: RUN
===============================   OUTPUT BEGIN  ===============================
Scanning dependencies of target kill-p4est.debug-OK
[  0%] Built target kill-p4est.debug-OK
[  0%] Built target expand_instantiations_exe
[  6%] Built target obj_boost_serialization.debug
[  6%] Built target obj_functionparser.debug
[ 15%] Built target obj_tbb.debug
[ 20%] Built target obj_numerics.inst
[ 28%] Built target obj_fe.inst
[ 32%] Built target obj_dofs.inst
[ 35%] Built target obj_lac.inst
[ 35%] Built target obj_base.inst
[ 38%] Built target obj_grid.inst
[ 38%] Built target obj_hp.inst
[ 40%] Built target obj_multigrid.inst
[ 42%] Built target obj_distributed.inst
[ 42%] Built target obj_algorithms.inst
[ 42%] Built target obj_integrators.debug
[ 42%] Built target obj_matrix_free.inst
[ 44%] Built target obj_meshworker.inst
[ 50%] Built target obj_numerics.debug
[ 59%] Built target obj_fe.debug
[ 62%] Built target obj_dofs.debug
[ 77%] Built target obj_lac.debug
[ 89%] Built target obj_base.debug
[ 93%] Built target obj_grid.debug
[ 94%] Built target obj_hp.debug
[ 96%] Built target obj_multigrid.debug
[ 96%] Built target obj_distributed.debug
[ 98%] Built target obj_algorithms.debug
[ 98%] Built target obj_matrix_free.debug
[ 98%] Built target obj_meshworker.debug
[ 98%] Built target deal_II.g
Scanning dependencies of target p4est.debug
[100%] Building CXX object tests/quick_tests/CMakeFiles/p4est.debug.dir/p4est.cc.o
/usr/local/src/deal.II/tests/quick_tests/p4est.cc: In function ‘int main(int, char**)’:
/usr/local/src/deal.II/tests/quick_tests/p4est.cc:80:16: warning: unused variable ‘myid’ [-Wunused-variable]
Linking CXX executable p4est.debug
[100%] Built target p4est.debug
Scanning dependencies of target p4est.debug.run
p4est.debug: RUN failed. Output:
--------------------------------------------------------------------------
mpirun noticed that process rank 0 with PID 27324 on node Bhairavi exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
2 total processes killed (some possibly by mpirun during cleanup)
make[7]: *** [tests/quick_tests/CMakeFiles/p4est.debug.run] Error 1
make[6]: *** [tests/quick_tests/CMakeFiles/p4est.debug.run.dir/all] Error 2
make[5]: *** [tests/quick_tests/CMakeFiles/p4est.debug.run.dir/rule] Error 2
make[4]: *** [p4est.debug.run] Error 2


p4est.debug: ******    RUN failed    *******

===============================    OUTPUT END   ===============================
Expected stage PASSED - aborting
CMake Error at /usr/local/src/deal.II/cmake/scripts/run_test.cmake:124 (MESSAGE):
  *** abort




67% tests passed, 2 tests failed out of 6

Total Test time (real) =  41.71 sec
Errors while running CTest

The following tests FAILED:
          4 - mpi.debug (Failed)
          6 - p4est.debug (Failed)


*******************************     WARNING     *******************************

Some of the tests failed!

Please scroll up or check the file tests/quick_tests/quicktests.log for the
error messages. If you are unable to fix the problems, see the FAQ or write
to the mailing list linked at http://www.dealii.org


The p4est test can fail if you are running an OpenMPI version before 1.5.
This is a known problem and the only work around is to update to a more
recent version or use a different MPI library like MPICH.

[100%] Built target test
===

And here is the output of detailed.log 

==
###
#
#  deal.II configuration:
#        CMAKE_BUILD_TYPE:       DebugRelease
#        BUILD_SHARED_LIBS:      ON
#        CMAKE_INSTALL_PREFIX:   /usr/local/bin/deal.II
#        CMAKE_SOURCE_DIR:       /usr/local/src/deal.II (Version 8.1.0)
#        CMAKE_BINARY_DIR:       /usr/local/src/build
#        CMAKE_CXX_COMPILER:     GNU 4.6.3 on platform Linux x86_64
#                                /usr/local/mpich-install/bin/mpicxx
#        CMAKE_C_COMPILER:       /usr/local/mpich-install/bin/mpicc
#        CMAKE_Fortran_COMPILER: /usr/local/mpich-install/bin/mpif90
#        CMAKE_GENERATOR:        Unix Makefiles
#
#  Compiler flags used for this build:
#        CMAKE_CXX_FLAGS:              -pedantic -fpic -Wall -Wpointer-arith -Wwrite-strings -Wsynth -Wsign-compare -Wswitch -Wno-long-long -Wno-deprecated -Wno-deprecated-declarations -std=c++0x -Wno-parentheses -Wno-long-long
#        DEAL_II_CXX_FLAGS_RELEASE:    -O2 -funroll-loops -funroll-all-loops -fstrict-aliasing -Wno-unused
#        DEAL_II_CXX_FLAGS_DEBUG:      -O0 -ggdb -Wa,--compress-debug-sections
#        DEAL_II_LINKER_FLAGS:         -Wl,--as-needed -rdynamic  -Wl,-rpath  -Wl,/usr/local/mpich-install/lib
#        DEAL_II_LINKER_FLAGS_RELEASE:
#        DEAL_II_LINKER_FLAGS_DEBUG:   -ggdb
#
#  Configured Features (DEAL_II_ALLOW_BUNDLED = ON, DEAL_II_ALLOW_AUTODETECTION = ON):
#      ( DEAL_II_WITH_64BIT_INDICES = OFF )
#      ( DEAL_II_WITH_ARPACK = OFF )
#        DEAL_II_WITH_BOOST set up with bundled packages
#        DEAL_II_WITH_FUNCTIONPARSER set up with bundled packages
#        DEAL_II_WITH_HDF5 set up with external dependencies
#            HDF5_INCLUDE_DIRS = /usr/include
#            HDF5_LIBRARIES = /usr/lib/libhdf5_hl.so;/usr/lib/libhdf5.so;/usr/local/mpich-install/lib/libmpich.so;/usr/local/mpich-install/lib/libopa.so;/usr/local/mpich-install/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#        DEAL_II_WITH_LAPACK set up with external dependencies
#            LAPACK_LIBRARIES = /usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/local/mpich-install/lib/libmpichf90.so;/usr/local/mpich-install/lib/libmpich.so;/usr/local/mpich-install/lib/libopa.so;/usr/local/mpich-install/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/libc.so
#      ( DEAL_II_WITH_METIS = OFF )
#        DEAL_II_WITH_MPI set up with external dependencies
#            MPI_VERSION = 3.0
#            MPI_CXX_COMPILER = /usr/local/mpich-install/bin/mpicxx
#            MPI_CXX_COMPILE_FLAGS =
#            MPI_CXX_INCLUDE_PATH = /usr/local/mpich-install/include
#            MPI_CXX_LIBRARIES = /usr/local/mpich-install/lib/libmpichcxx.so;/usr/local/mpich-install/lib/libmpich.so;/usr/local/mpich-install/lib/libopa.so;/usr/local/mpich-install/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#            MPI_CXX_LINK_FLAGS =  -Wl,-rpath  -Wl,/usr/local/mpich-install/lib
#      ( DEAL_II_WITH_MUMPS = OFF )
#        DEAL_II_WITH_NETCDF set up with external dependencies
#            NETCDF_INCLUDE_DIRS = /usr/include
#            NETCDF_LIBRARIES = /usr/lib/libnetcdf_c++.so;/usr/lib/libnetcdf.so
#        DEAL_II_WITH_P4EST set up with external dependencies
#            P4EST_VERSION = 0.3.4.2
#            P4EST_DIR = /usr/local/src/p4est-install/FAST
#            P4EST_INCLUDE_DIRS = /usr/local/src/p4est-install/FAST/include;/usr/local/src/p4est-install/FAST/include
#            P4EST_LIBRARIES = /usr/local/src/p4est-install/FAST/lib/libp4est.so;/usr/local/src/p4est-install/FAST/lib/libsc.so;/usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/local/mpich-install/lib/libmpichf90.so;/usr/local/mpich-install/lib/libmpich.so;/usr/local/mpich-install/lib/libopa.so;/usr/local/mpich-install/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/libc.so;/usr/local/mpich-install/lib/libmpich.so;/usr/local/mpich-install/lib/libopa.so;/usr/local/mpich-install/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#      ( DEAL_II_WITH_PETSC = OFF )
#      ( DEAL_II_WITH_SLEPC = OFF )
#        DEAL_II_WITH_THREADS set up with bundled packages
#      ( DEAL_II_WITH_TRILINOS = OFF )
#        DEAL_II_WITH_UMFPACK set up with external dependencies
#            UMFPACK_VERSION = 5.5.1
#            UMFPACK_DIR = /usr/local/src/petsc-3.3-p7/x86_64/lib
#            UMFPACK_INCLUDE_DIRS = /usr/local/src/petsc-3.3-p7/x86_64/include;/usr/local/src/petsc-3.3-p7/x86_64/include
#            UMFPACK_LIBRARIES = /usr/local/src/petsc-3.3-p7/x86_64/lib/libumfpack.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libamd.a;/usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/local/mpich-install/lib/libmpichf90.so;/usr/local/mpich-install/lib/libmpich.so;/usr/local/mpich-install/lib/libopa.so;/usr/local/mpich-install/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/libc.so;/usr/lib/x86_64-linux-gnu/librt.so
#        DEAL_II_WITH_ZLIB set up with external dependencies
#            ZLIB_VERSION = 1.2.3.4
#            ZLIB_INCLUDE_DIRS = /usr/include
#            ZLIB_LIBRARIES = /usr/lib/x86_64-linux-gnu/libz.so
#
#  Component configuration:
#        DEAL_II_COMPONENT_COMPAT_FILES
#      ( DEAL_II_COMPONENT_DOCUMENTATION = OFF )
#        DEAL_II_COMPONENT_EXAMPLES
#        DEAL_II_COMPONENT_MESH_CONVERTER
#      ( DEAL_II_COMPONENT_PARAMETER_GUI = OFF )
#
###
==

 Any help in resolving these issues would be welcome.

Best,
 Uday

Matthias Maier

unread,
Feb 5, 2014, 12:59:53 PM2/5/14
to dea...@googlegroups.com

On Feb 5, 2014, at 18:50, Uday K <uda...@gmail.com> wrote:

> Thanks for the info, Toby & Timo.
>
> Meanwhile, I downloaded and installed mpich from source (the latest
> stable version, mpich-3.0.4). I also recompiled p4est from source,
> making sure it picks up the new versions of mpi. Following this, I
> attempted to install deal.ii, which passed successfully. What failed
> again, however, is make test. Here is the output of make test:
>
> --------------------------------------------------------------------------
> mpirun noticed that process rank 1 with PID 26695 on node Bhairavi
> exited on signal 11 (Segmentation fault).
> --------------------------------------------------------------------------
> make[7]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run] Error 1
> make[6]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run.dir/all]
> Error 2
> make[5]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run.dir/rule]
> Error 2
> make[4]: *** [mpi.debug.run] Error 2
>
>
> mpi.debug: ****** RUN failed *******
>


Just a wild guess: which mpirun is actually in your PATH? This outcome
is very likely if you, e.g., accidentally use mpirun from OpenMPI to run
a program linked against MPICH.

Best,
Matthias

Timo Heister

unread,
Feb 5, 2014, 3:03:10 PM2/5/14
to dea...@googlegroups.com
> Just a wild guess: Which mpirun is actually in path? These outcome is
> very likely if you e.g. accidentally use mpirun from openmpi to run a
> program linked against mpich.

Yeah, it is again two MPI versions being mixed:

>> /usr/bin/ld: warning: libmpich.so.3, needed by /usr/local/src/p4est-install/FAST/lib/libp4est.so, may conflict with libmpich.so.10

Here you can see that p4est is linking to a different mpich than deal.II is.

Make sure you have only one MPI installation, that it is in your PATH,
and then configure all libraries with it.
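One way to see which libmpich each component actually resolves to is to run ldd on the shared objects. The paths below are taken from the log in this thread (the deal.II library path is a guess based on the reported build directory) — adjust them for your setup:

```shell
# Check which libmpich each shared object pulls in; seeing two different
# versions (e.g. libmpich.so.3 vs libmpich.so.10) means mixed MPI installs.
for lib in /usr/local/src/p4est-install/FAST/lib/libp4est.so \
           /usr/local/src/build/lib/libdeal_II.g.so; do
  if [ -e "$lib" ]; then
    echo "== $lib"
    ldd "$lib" | grep -i mpich || echo "   (no mpich dependency found)"
  else
    echo "$lib: not present on this machine"
  fi
done
```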

Uday K

unread,
Feb 6, 2014, 2:16:19 AM2/6/14
to dea...@googlegroups.com
Thanks for pointing that out. The p4est install via a non-login shell wasn't seeing the custom install of mpich and hence picked up the old version. I fixed that, and the ld warning is gone. But, alas, make test still fails at the same two tests, mpi.debug and p4est.debug :(. I verified that the mpirun being picked up is the intended one, first by making sure that it is found first in PATH, and second by checking the output of mpirun -info, which is:

==
$ mpirun -info
HYDRA build details:
    Version:                                 3.0.4
    Release Date:                            Wed Apr 24 10:08:10 CDT 2013
    CC:                              cc    
    CXX:                             c++    
    F77:                             gfortran   
    F90:                             gfortran   
    Configure options:                       '--disable-option-checking' '--prefix=/usr/local/mpich-install' '--enable-shared' '--cache-file=/dev/null' '--srcdir=.' 'CC=cc' 'CFLAGS= -O2' 'LDFLAGS= ' 'LIBS=-lrt -lpthread ' 'CPPFLAGS= -I/usr/local/src/mpich-3.0.4/src/mpl/include -I/usr/local/src/mpich-3.0.4/src/mpl/include -I/usr/local/src/mpich-3.0.4/src/openpa/src -I/usr/local/src/mpich-3.0.4/src/openpa/src -I/usr/local/src/mpich-3.0.4/src/mpi/romio/include'
    Process Manager:                         pmi
    Launchers available:                     ssh rsh fork slurm ll lsf sge manual persist
    Topology libraries available:            hwloc
    Resource management kernels available:   user slurm ll lsf sge pbs cobalt
    Checkpointing libraries available:       blcr
    Demux engines available:                 poll select
==

Is there some configure option to mpich that I'm missing? This is presently how I have configured mpich:
./configure --enable-shared --prefix=/usr/local/mpich-install 2>&1 | tee c.txt

Here is the log of make test:

===
[  0%] [  0%] Built target obj_functionparser.release
[  0%] Built target obj_integrators.release
Built target expand_instantiations_exe
[  0%] [  5%] [  7%] Built target obj_base.inst
Built target obj_dofs.inst
Built target obj_lac.inst
[  8%] Built target obj_meshworker.inst
[ 17%] Built target obj_fe.inst
[ 23%] Built target obj_numerics.inst
[ 30%] Built target obj_tbb.release
[ 30%] [ 32%] Built target obj_hp.inst
[ 35%] Built target obj_multigrid.inst
Built target obj_grid.inst
[ 37%] [ 37%] Built target obj_distributed.inst
[ 37%] Built target obj_algorithms.inst
Built target obj_matrix_free.inst
[ 37%] Built target obj_meshworker.release
[ 44%] Built target obj_boost_serialization.release
[ 46%] Built target obj_hp.release
[ 48%] Built target obj_multigrid.release
[ 50%] Built target obj_dofs.release
[ 53%] Built target obj_grid.release
[ 53%] Built target obj_distributed.release
[ 53%] Built target obj_matrix_free.release
[ 53%] Built target obj_algorithms.release
[ 66%] Built target obj_base.release
[ 82%] Built target obj_lac.release
[ 89%] Built target obj_numerics.release
[100%] Built target obj_fe.release
[100%] Built target deal_II
Scanning dependencies of target build_library
[100%] Built target build_library
Scanning dependencies of target test
Test project /usr/local/src/build/tests/quick_tests
    Start 1: step.debug
1/6 Test #1: step.debug .......................   Passed    9.09 sec
    Start 2: step.release
2/6 Test #2: step.release .....................   Passed    6.46 sec
    Start 3: affinity.debug
3/6 Test #3: affinity.debug ...................   Passed    6.26 sec
    Start 4: mpi.debug
4/6 Test #4: mpi.debug ........................***Failed    6.30 sec
[100%] Built target mpi.debug
Scanning dependencies of target mpi.debug.run
mpi.debug: RUN failed. Output:
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 3708 on node Bhairavi exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
make[7]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run] Error 1
make[6]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run.dir/all] Error 2
make[5]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run.dir/rule] Error 2
make[4]: *** [mpi.debug.run] Error 2


mpi.debug: ******    RUN failed    *******

===============================    OUTPUT END   ===============================
Expected stage PASSED - aborting
CMake Error at /usr/local/src/deal.II/cmake/scripts/run_test.cmake:124 (MESSAGE):
  *** abort



    Start 5: tbb.debug
5/6 Test #5: tbb.debug ........................   Passed    6.28 sec
    Start 6: p4est.debug
6/6 Test #6: p4est.debug ......................***Failed    7.15 sec
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 4349 on node Bhairavi exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
make[7]: *** [tests/quick_tests/CMakeFiles/p4est.debug.run] Error 1
make[6]: *** [tests/quick_tests/CMakeFiles/p4est.debug.run.dir/all] Error 2
make[5]: *** [tests/quick_tests/CMakeFiles/p4est.debug.run.dir/rule] Error 2
make[4]: *** [p4est.debug.run] Error 2


p4est.debug: ******    RUN failed    *******

===============================    OUTPUT END   ===============================
Expected stage PASSED - aborting
CMake Error at /usr/local/src/deal.II/cmake/scripts/run_test.cmake:124 (MESSAGE):
  *** abort




67% tests passed, 2 tests failed out of 6

Total Test time (real) =  41.54 sec

The following tests FAILED:
 4 - mpi.debug (Failed)
 6 - p4est.debug (Failed)
Errors while running CTest


*******************************     WARNING     *******************************

Some of the tests failed!

Please scroll up or check the file tests/quick_tests/quicktests.log for the
error messages. If you are unable to fix the problems, see the FAQ or write
to the mailing list linked at http://www.dealii.org


The p4est test can fail if you are running an OpenMPI version before 1.5.
This is a known problem and the only work around is to update to a more
recent version or use a different MPI library like MPICH.

[100%] Built target test
===

Toby D. Young

unread,
Feb 6, 2014, 6:16:26 AM2/6/14
to dea...@googlegroups.com

Uday,

Something is very screwy here. I cannot see where. :-(
My guess: your MPICH is broken.

Did you check that your MPICH works correctly without deal.II?
https://wiki.mpich.org/mpich/index.php/Testing_MPICH
First, check that your compiler is sane.

Did you run "make test" for PETSc? That is a good idea.

Are you compiling your own MPICH? This is not the best idea. Use your
Linux distribution to get MPICH/OpenMPI. Use yum, or apt-get, or emerge,
or something similar. This should check dependencies and put a decent
MPICH on your machine for you.

Then I suggest you take a deep breath and start again from scratch.
Remove your deal.II, PETSc, and p4est, and start again from zero. This
will remove your history and (I hope) the confusion between the
libraries.

I rarely recommend this. In your case I think you know what you are
doing, but we missed a step somewhere and compilers are still getting
mixed up. Either that, or your MPICH really is broken. :-|

Come back if things blow up again.
We want to help you get this working. :-)

Timo Heister

unread,
Feb 6, 2014, 8:29:10 AM2/6/14
to dea...@googlegroups.com
>> mpi.debug: RUN failed. Output:
>> --------------------------------------------------------------------------
>> mpirun noticed that process rank 1 with PID 3708 on node Bhairavi exited on signal 11 (Segmentation fault).

> My guess, your MPICH is broken.

Yes. Take a look at deal.II/tests/quick_tests/mpi.cc. This is a very
simple MPI test that should not fail. It doesn't use any deal.II
feature. A first test would be to make this program work without
deal.II (remove the deal.II headers and the reference to Triangulation
at the bottom and compile with mpicxx).
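A stripped-down stand-alone check could look like this (a sketch, not the actual mpi.cc; it assumes mpicxx and mpirun are in PATH and skips the run if they are not):

```shell
# Write a bare MPI hello-world, build it with mpicxx, and run it on two
# ranks -- this exercises only MPI itself, no deal.II at all.
cat > /tmp/mpi_hello.cc <<'EOF'
#include <mpi.h>
#include <cstdio>

int main(int argc, char **argv)
{
  MPI_Init(&argc, &argv);
  int rank = 0, size = 0;
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  MPI_Comm_size(MPI_COMM_WORLD, &size);
  std::printf("hi from rank %d of %d\n", rank, size);
  MPI_Finalize();
  return 0;
}
EOF
if command -v mpicxx >/dev/null 2>&1; then
  mpicxx /tmp/mpi_hello.cc -o /tmp/mpi_hello \
    && mpirun -np 2 /tmp/mpi_hello \
    || echo "build or run failed"
else
  echo "mpicxx not found in PATH"
fi
```

If this segfaults too, the problem is in the MPI installation itself, not in deal.II.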

I realize more and more how helpful "make test" is. At first I thought
a test like mpi.cc would be unnecessary, but it was definitely a good
idea.

> Are you compiling your own MPICH? This is not the best idea. Use your
> Linux distribution to get MPICH/OpenMPI.

Agreed.

Uday K

unread,
Feb 7, 2014, 1:56:54 AM2/7/14
to dea...@googlegroups.com
Thanks for your patience in helping me with this. Things still 'blow' up. Here is what I did:

1. Taking your advice, I got rid of the version of mpich that I had compiled myself. I then made sure that the versions of mpi** in PATH were the system-built versions. I also removed openmpi, as well as the build directories of petsc, p4est, and deal.ii, to remove any possible ambiguities.

2. Next, I installed PETSc 3.3-p7 from source and I noted that it picked up the system mpi. make test in the PETSc directories also succeeded. 

3. Then, I installed p4est using the script on the deal.ii site. 

4. I configured deal.ii with the system mpi, petsc, and p4est. (make -j8 install) goes through fine, whereas (make test) bombs, now at an additional test, step-petsc.debug. As Timo suggested, I edited mpi.cc in tests/quick_tests/, compiled it with mpicxx, and ran it with two processes, and it goes through fine (two "hi" statements with different IDs are reported).

Here is how make test fails:

==
==

and here is detailed.log

==
###
#
#  deal.II configuration:
#        CMAKE_BUILD_TYPE:       DebugRelease
#        BUILD_SHARED_LIBS:      ON
#        CMAKE_INSTALL_PREFIX:   /usr/local/bin/deal.II
#        CMAKE_SOURCE_DIR:       /usr/local/src/deal.II (Version 8.1.0)
#        CMAKE_BINARY_DIR:       /usr/local/src/build
#        CMAKE_CXX_COMPILER:     GNU 4.6.3 on platform Linux x86_64
#                                /usr/bin/c++
#        CMAKE_C_COMPILER:       /usr/bin/cc
#        CMAKE_Fortran_COMPILER: /usr/bin/gfortran
#        CMAKE_GENERATOR:        Unix Makefiles
#
#  Compiler flags used for this build:
#        CMAKE_CXX_FLAGS:              -pedantic -fpic -Wall -Wpointer-arith -Wwrite-strings -Wsynth -Wsign-compare -Wswitch -Wno-long-long -Wno-deprecated -Wno-deprecated-declarations -std=c++0x -Wno-parentheses -Wno-long-long -Wno-long-long
#        DEAL_II_CXX_FLAGS_RELEASE:    -O2 -funroll-loops -funroll-all-loops -fstrict-aliasing -Wno-unused
#        DEAL_II_CXX_FLAGS_DEBUG:      -O0 -ggdb -Wa,--compress-debug-sections
#        DEAL_II_LINKER_FLAGS:         -Wl,--as-needed -rdynamic  -Wl,-Bsymbolic-functions  -Wl,-z,relro -pthread
#        DEAL_II_LINKER_FLAGS_RELEASE: 
#        DEAL_II_LINKER_FLAGS_DEBUG:   -ggdb
#
#  Configured Features (DEAL_II_ALLOW_BUNDLED = ON, DEAL_II_ALLOW_AUTODETECTION = ON):
#      ( DEAL_II_WITH_64BIT_INDICES = OFF )
#      ( DEAL_II_WITH_ARPACK = OFF )
#        DEAL_II_WITH_BOOST set up with external dependencies
#            BOOST_VERSION = 1.46.1
#            BOOST_DIR = 
#            BOOST_INCLUDE_DIRS = /usr/include
#            BOOST_LIBRARIES = /usr/lib/libboost_serialization-mt.so;/usr/lib/libboost_system-mt.so;/usr/lib/libboost_thread-mt.so
#        DEAL_II_WITH_FUNCTIONPARSER set up with bundled packages
#        DEAL_II_WITH_HDF5 set up with external dependencies
#            HDF5_INCLUDE_DIRS = /usr/include
#            HDF5_LIBRARIES = /usr/lib/libhdf5_hl.so;/usr/lib/libhdf5.so;/usr/lib/libmpich.so;/usr/lib/libopa.so;/usr/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/libcr.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#        DEAL_II_WITH_LAPACK set up with external dependencies
#            LAPACK_LIBRARIES = /usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/libc.so
#      ( DEAL_II_WITH_METIS = OFF )
#        DEAL_II_WITH_MPI set up with external dependencies
#            MPI_VERSION = 2.2
#            MPI_CXX_COMPILER = /usr/bin/mpicxx
#            MPI_CXX_COMPILE_FLAGS = 
#            MPI_CXX_INCLUDE_PATH = /usr/include/mpich2
#            MPI_CXX_LIBRARIES = /usr/lib/libmpichcxx.so;/usr/lib/libmpich.so;/usr/lib/libopa.so;/usr/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/libcr.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#            MPI_CXX_LINK_FLAGS =  -Wl,-Bsymbolic-functions  -Wl,-z,relro
#      ( DEAL_II_WITH_MUMPS = OFF )
#        DEAL_II_WITH_NETCDF set up with external dependencies
#            NETCDF_INCLUDE_DIRS = /usr/include
#            NETCDF_LIBRARIES = /usr/lib/libnetcdf_c++.so;/usr/lib/libnetcdf.so
#        DEAL_II_WITH_P4EST set up with external dependencies
#            P4EST_VERSION = 0.3.4.2
#            P4EST_DIR = /usr/local/src/p4est-install/FAST
#            P4EST_INCLUDE_DIRS = /usr/local/src/p4est-install/FAST/include;/usr/local/src/p4est-install/FAST/include
#            P4EST_LIBRARIES = /usr/local/src/p4est-install/FAST/lib/libp4est.so;/usr/local/src/p4est-install/FAST/lib/libsc.so;/usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/libc.so;/usr/lib/libmpich.so;/usr/lib/libopa.so;/usr/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/libcr.so;/usr/lib/x86_64-linux-gnu/libpthread.so
#        DEAL_II_WITH_PETSC set up with external dependencies
#            PETSC_VERSION = 3.3.0.3
#            PETSC_DIR = /usr/local/src/petsc-3.3-p7
#            PETSC_INCLUDE_DIRS = /usr/local/src/petsc-3.3-p7/include;/usr/local/src/petsc-3.3-p7/x86_64/include;/usr/include/mpich2
#            PETSC_LIBRARIES = /usr/local/src/petsc-3.3-p7/x86_64/lib/libpetsc.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libcmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libdmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libsmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libzmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libmumps_common.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libpord.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libHYPRE.a;/usr/lib/libmpichcxx.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libptesmumps.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libptscotch.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libptscotcherr.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libscalapack.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libblacs.a;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/local/src/petsc-3.3-p7/x86_64/lib/libumfpack.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libamd.a;/usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/libmpichf90.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/libmpichcxx.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/libz.so;/usr/lib/x86_64-linux-gnu/libz.so;/usr/lib/x86_64-linux-gnu/libdl.so;/usr/lib/libmpich.so;/usr/lib/libopa.so;/usr/lib/libmpl.so;/usr/lib/x86_64-linux-gnu/librt.so;/usr/lib/libcr.so;/usr/lib/x86_64-linux-gnu/libpthread.so;/usr/lib/x86_64-linux-gnu/libdl.so
#      ( DEAL_II_WITH_SLEPC = OFF )
#        DEAL_II_WITH_THREADS set up with bundled packages
#      ( DEAL_II_WITH_TRILINOS = OFF )
#        DEAL_II_WITH_UMFPACK set up with external dependencies
#            UMFPACK_VERSION = 5.5.1
#            UMFPACK_DIR = /usr/local/src/petsc-3.3-p7/x86_64/lib
#            UMFPACK_INCLUDE_DIRS = /usr/local/src/petsc-3.3-p7/x86_64/include;/usr/local/src/petsc-3.3-p7/x86_64/include
#            UMFPACK_LIBRARIES = /usr/local/src/petsc-3.3-p7/x86_64/lib/libumfpack.a;/usr/local/src/petsc-3.3-p7/x86_64/lib/libamd.a;/usr/lib/liblapack.so;/usr/lib/libblas.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libgfortran.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/gcc/x86_64-linux-gnu/4.6/libquadmath.so;/usr/lib/x86_64-linux-gnu/libm.so;/usr/lib/x86_64-linux-gnu/libc.so;/usr/lib/x86_64-linux-gnu/librt.so
#        DEAL_II_WITH_ZLIB set up with external dependencies
#            ZLIB_VERSION = 1.2.3.4
#            ZLIB_INCLUDE_DIRS = /usr/include
#            ZLIB_LIBRARIES = /usr/lib/x86_64-linux-gnu/libz.so
#
#  Component configuration:
#        DEAL_II_COMPONENT_COMPAT_FILES
#      ( DEAL_II_COMPONENT_DOCUMENTATION = OFF )
#        DEAL_II_COMPONENT_EXAMPLES
#        DEAL_II_COMPONENT_MESH_CONVERTER
#      ( DEAL_II_COMPONENT_PARAMETER_GUI = OFF )
#
###
==

Uday K

unread,
Feb 7, 2014, 1:59:32 AM2/7/14
to dea...@googlegroups.com
oops, forgot to paste the output of make test:

==
$ sudo make test 
[  0%] [  0%] Built target obj_integrators.release
Built target expand_instantiations_exe
[  0%] Built target obj_functionparser.release
[  1%] [  1%] [ 14%] [ 21%] [ 21%] [ 23%] Built target obj_base.inst
Built target obj_meshworker.inst
Built target obj_dofs.inst
Built target obj_fe.inst
Built target obj_numerics.inst
Built target obj_lac.inst
[ 25%] Built target obj_grid.inst
[ 25%] [ 25%] [ 25%] Built target obj_algorithms.inst
[ 27%] [ 27%] Built target obj_distributed.inst
Built target obj_hp.inst
Built target obj_matrix_free.inst
Built target obj_multigrid.inst
[ 36%] Built target obj_tbb.release
[ 41%] [ 41%] [ 43%] [ 45%] [ 45%] Built target obj_meshworker.release
Built target obj_dofs.release
[ 45%] [ 47%] Built target obj_hp.release
Built target obj_multigrid.release
Built target obj_distributed.release
Built target obj_algorithms.release
Built target obj_matrix_free.release
[ 60%] [ 65%] Built target obj_grid.release
Built target obj_base.release
[ 81%] Built target obj_lac.release
[ 89%] Built target obj_numerics.release
[100%] Built target obj_fe.release
[100%] Built target deal_II
[100%] Built target build_library
Test project /usr/local/src/build/tests/quick_tests
    Start 1: step.debug
1/7 Test #1: step.debug .......................   Passed    1.16 sec
    Start 2: step.release
2/7 Test #2: step.release .....................   Passed    0.25 sec
    Start 3: affinity.debug
3/7 Test #3: affinity.debug ...................   Passed    0.18 sec
    Start 4: mpi.debug
4/7 Test #4: mpi.debug ........................***Failed    0.23 sec
Test mpi.debug: RUN
===============================   OUTPUT BEGIN  ===============================
[  0%] [  0%] Built target obj_functionparser.debug
[  0%] Built target expand_instantiations_exe
Built target kill-mpi.debug-OK
[  0%] Built target obj_integrators.debug
[  0%] [  0%] [  1%] Built target obj_meshworker.inst
Built target obj_base.inst
Built target obj_dofs.inst
[  7%] [ 15%] [ 22%] Built target obj_lac.inst
Built target obj_fe.inst
Built target obj_numerics.inst
[ 22%] [ 24%] [ 24%] Built target obj_grid.inst
Built target obj_hp.inst
[ 26%] [ 26%] Built target obj_algorithms.inst
[ 26%] Built target obj_distributed.inst
Built target obj_multigrid.inst
Built target obj_matrix_free.inst
[ 35%] Built target obj_tbb.debug
[ 35%] [ 36%] Built target obj_meshworker.debug
[ 38%] [ 40%] Built target obj_hp.debug
Built target obj_distributed.debug
Built target obj_multigrid.debug
[ 42%] Built target obj_algorithms.debug
[ 47%] Built target obj_dofs.debug
[ 61%] Built target obj_base.debug
[ 66%] Built target obj_grid.debug
[ 66%] Built target obj_matrix_free.debug
[ 82%] Built target obj_lac.debug
[ 92%] Built target obj_fe.debug
[100%] Built target obj_numerics.debug
[100%] Built target deal_II.g
[100%] Built target mpi.debug
mpi.debug: RUN failed. Output:

=====================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   EXIT CODE: 139
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
=====================================================================================
APPLICATION TERMINATED WITH THE EXIT STRING: Segmentation fault (signal 11)
make[7]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run] Error 1
make[6]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run.dir/all] Error 2
make[5]: *** [tests/quick_tests/CMakeFiles/mpi.debug.run.dir/rule] Error 2
make[4]: *** [mpi.debug.run] Error 2


mpi.debug: ******    RUN failed    *******

===============================    OUTPUT END   ===============================
Expected stage PASSED - aborting
CMake Error at /usr/local/src/deal.II/cmake/scripts/run_test.cmake:124 (MESSAGE):
  *** abort



    Start 5: tbb.debug
5/7 Test #5: tbb.debug ........................   Passed    0.17 sec
    Start 6: p4est.debug
6/7 Test #6: p4est.debug ......................***Failed    0.23 sec
Test p4est.debug: RUN
===============================   OUTPUT BEGIN  ===============================
[  0%] Built target expand_instantiations_exe
[  0%] Built target kill-p4est.debug-OK
[  0%] Built target obj_integrators.debug
[  0%] Built target obj_functionparser.debug
[  1%] [  7%] [ 15%] [ 21%] [ 21%] [ 21%] [ 22%] Built target obj_meshworker.inst
Built target obj_numerics.inst
Built target obj_fe.inst
Built target obj_lac.inst
Built target obj_base.inst
Built target obj_hp.inst
Built target obj_multigrid.inst
[ 24%] [ 26%] Built target obj_dofs.inst
Built target obj_grid.inst
[ 26%] Built target obj_distributed.inst
[ 26%] [ 26%] Built target obj_algorithms.inst
Built target obj_matrix_free.inst
[ 35%] Built target obj_tbb.debug
[ 35%] Built target obj_meshworker.debug
[ 49%] [ 54%] [ 56%] [ 57%] Built target obj_base.debug
Built target obj_grid.debug
Built target obj_hp.debug
Built target obj_distributed.debug
[ 59%] Built target obj_algorithms.debug
[ 59%] Built target obj_matrix_free.debug
[ 61%] Built target obj_multigrid.debug
[ 77%] Built target obj_lac.debug
[ 87%] Built target obj_fe.debug
[ 94%] Built target obj_numerics.debug
[100%] Built target obj_dofs.debug
[100%] Built target deal_II.g
[100%] Built target p4est.debug
p4est.debug: RUN failed. Output:
Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPI_Init_thread(521): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPI_Init_thread(521): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPI_Init_thread(521): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPI_Init_thread(521): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPI_Init_thread(521): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPI_Init_thread(521): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPI_Init_thread(521): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPI_Init_thread(521): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPI_Init_thread(521): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPI_Init_thread(521): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
make[7]: *** [tests/quick_tests/CMakeFiles/p4est.debug.run] Error 1
make[6]: *** [tests/quick_tests/CMakeFiles/p4est.debug.run.dir/all] Error 2
make[5]: *** [tests/quick_tests/CMakeFiles/p4est.debug.run.dir/rule] Error 2
make[4]: *** [p4est.debug.run] Error 2


p4est.debug: ******    RUN failed    *******

===============================    OUTPUT END   ===============================
Expected stage PASSED - aborting
CMake Error at /usr/local/src/deal.II/cmake/scripts/run_test.cmake:124 (MESSAGE):
  *** abort



    Start 7: step-petsc.debug
7/7 Test #7: step-petsc.debug .................***Failed    0.18 sec
Test step-petsc.debug: RUN
===============================   OUTPUT BEGIN  ===============================
[  0%] [  0%] Built target expand_instantiations_exe
Built target obj_integrators.debug
[  0%] [  0%] Built target obj_functionparser.debug
Built target kill-step-petsc.debug-OK
[  0%] [ 15%] [ 15%] Built target obj_meshworker.inst
Built target obj_numerics.inst
Built target obj_fe.inst
[ 17%] Built target obj_dofs.inst
[ 22%] Built target obj_lac.inst
[ 22%] [ 24%] [ 24%] [ 33%] [ 35%] [ 35%] Built target obj_base.inst
[ 35%] [ 35%] Built target obj_grid.inst
Built target obj_tbb.debug
Built target obj_hp.inst
Built target obj_multigrid.inst
Built target obj_distributed.inst
Built target obj_algorithms.inst
Built target obj_matrix_free.inst
[ 35%] Built target obj_meshworker.debug
[ 36%] [ 38%] [ 43%] Built target obj_distributed.debug
[ 45%] [ 47%] [ 59%] Built target obj_hp.debug
Built target obj_grid.debug
Built target obj_algorithms.debug
Built target obj_base.debug
Built target obj_matrix_free.debug
[ 61%] Built target obj_multigrid.debug
[ 66%] Built target obj_dofs.debug
[ 77%] Built target obj_fe.debug
[ 92%] Built target obj_lac.debug
[100%] Built target obj_numerics.debug
[100%] Built target deal_II.g
[100%] Built target step-petsc.debug
step-petsc.debug: RUN failed. Output:
Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPI_Init_thread(521): Cannot call MPI_INIT or MPI_INIT_THREAD more than once
make[7]: *** [tests/quick_tests/CMakeFiles/step-petsc.debug.run] Error 1
make[6]: *** [tests/quick_tests/CMakeFiles/step-petsc.debug.run.dir/all] Error 2
make[5]: *** [tests/quick_tests/CMakeFiles/step-petsc.debug.run.dir/rule] Error 2
make[4]: *** [step-petsc.debug.run] Error 2


step-petsc.debug: ******    RUN failed    *******

===============================    OUTPUT END   ===============================
Expected stage PASSED - aborting
CMake Error at /usr/local/src/deal.II/cmake/scripts/run_test.cmake:124 (MESSAGE):
  *** abort




57% tests passed, 3 tests failed out of 7

Total Test time (real) =   2.39 sec

The following tests FAILED:
 4 - mpi.debug (Failed)
 6 - p4est.debug (Failed)
 7 - step-petsc.debug (Failed)
Errors while running CTest


*******************************     WARNING     *******************************

Some of the tests failed!

Please scroll up or check the file tests/quick_tests/quicktests.log for the
error messages. If you are unable to fix the problems, see the FAQ or write
to the mailing list linked at http://www.dealii.org


The p4est test can fail if you are running an OpenMPI version before 1.5.
This is a known problem and the only work around is to update to a more
recent version or use a different MPI library like MPICH.


Additional information about PETSc issues is available
at:

[100%] Built target test

==

Timo Heister

unread,
Feb 7, 2014, 9:54:27 AM2/7/14
to dea...@googlegroups.com
> 1. Taking your advice, I got rid of the version of mpich that I had compiled
> myself, and made sure that the versions of mpi** in PATH were the system-built
> versions. To remove any possible ambiguities, I also removed openmpi and the
> build directories of petsc, p4est, and deal.ii.

good.

> 2. Next, I installed PETSc 3.3-p7 from source and I noted that it picked up
> the system mpi. make test in the PETSc directories also succeeded.

good.

> 4. I configured deal.ii with the system mpi, petsc, and p4est. (make -j8
> install) goes through fine, whereas (make test) bombs, now at an additional
> test, step-petsc.debug. As Timo suggested, I edited mpi.cc in
> tests/quick_tests/, compiled with mpicxx, and ran with two processes, and it
> goes through fine (two "hi" statements with different IDs are reported).

> mpi.debug: RUN failed. Output:
>
> =====================================================================================
> = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
> = EXIT CODE: 139
> = CLEANING UP REMAINING PROCESSES
> = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
> =====================================================================================
> APPLICATION TERMINATED WITH THE EXIT STRING: Segmentation fault (signal 11)


okay, so mpi.cc works when you compile it manually, but fails if it is
run through "make test"? Very strange. Are you sure you deleted the
build directory? Can you please post the output of "ldd mpi.debug"
(you find it in build/tests/quick_tests/) and the output of ldd of
your version of mpi.cc if you compile it manually? What happens if you
run the binary mpi.debug manually (mpirun -n 2 ./mpi.debug)?

Uday Khankhoje

unread,
Feb 7, 2014, 12:45:31 PM2/7/14
to dea...@googlegroups.com
On Fri, Feb 7, 2014 at 8:24 PM, Timo Heister <hei...@clemson.edu> wrote:

okay, so mpi.cc works when you compile it manually, but fails if it is
run through "make test"? Very strange. Are you sure you deleted the
build directory?

Yes. I tried it yet again to be sure.
 
Can you please post the output of "ldd mpi.debug"
(you find it in build/tests/quick_tests/) and the output of ldd of
your version of mpi.cc if you compile it manually? What happens if you
run the binary mpi.debug manually (mpirun -n 2 ./mpi.debug)?


Manually running mpi.debug produces the same error message as above, i.e.:

=====================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   EXIT CODE: 139
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
=====================================================================================
APPLICATION TERMINATED WITH THE EXIT STRING: Segmentation fault (signal 11)

Here is the output of ldd on mpi.debug:

====
        linux-vdso.so.1 =>  (0x00007fff3cdf5000)
        libdeal_II.g.so.8.1.0 => /usr/local/src/build/lib/libdeal_II.g.so.8.1.0 (0x00007f9b0abc3000)
        libhdf5.so.6 => /usr/lib/libhdf5.so.6 (0x00007f9b0a581000)
        libmpich.so.3 => /usr/lib/libmpich.so.3 (0x00007f9b0a1a4000)
        libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f9b09f71000)
        libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f9b09c70000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f9b098b0000)
        libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f9b0969a000)
        libpetsc.so => /usr/local/src/petsc-3.4.3/arch-linux2-c-debug/lib/libpetsc.so (0x00007f9b07bdf000)
        libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f9b079db000)
        libnetcdf_c++.so.5 => /usr/lib/libnetcdf_c++.so.5 (0x00007f9b077be000)
        libp4est.so.0 => /usr/local/src/p4est-install/FAST/lib/libp4est.so.0 (0x00007f9b07544000)
        libsc.so.0 => /usr/local/src/p4est-install/FAST/lib/libsc.so.0 (0x00007f9b0730c000)
        liblapack.so.3gf => /usr/lib/liblapack.so.3gf (0x00007f9b06716000)
        libblas.so.3gf => /usr/lib/libblas.so.3gf (0x00007f9b0647b000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f9b0617f000)
        libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f9b05f68000)
        librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f9b05d5f000)
        libcr.so.0 => /usr/lib/libcr.so.0 (0x00007f9b05b55000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f9b13aea000)
        libgfortran.so.3 => /usr/lib/x86_64-linux-gnu/libgfortran.so.3 (0x00007f9b0583d000)
        libmpl.so.1 => /usr/lib/libmpl.so.1 (0x00007f9b05638000)
        libnetcdf.so.6 => /usr/lib/libnetcdf.so.6 (0x00007f9b052fa000)
        libquadmath.so.0 => /usr/lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007f9b050c3000)
        libcurl-gnutls.so.4 => /usr/lib/x86_64-linux-gnu/libcurl-gnutls.so.4 (0x00007f9b04e6b000)
        libhdf5_hl.so.6 => /usr/lib/libhdf5_hl.so.6 (0x00007f9b04c38000)
        libidn.so.11 => /usr/lib/x86_64-linux-gnu/libidn.so.11 (0x00007f9b04a05000)
        liblber-2.4.so.2 => /usr/lib/x86_64-linux-gnu/liblber-2.4.so.2 (0x00007f9b047f7000)
        libldap_r-2.4.so.2 => /usr/lib/x86_64-linux-gnu/libldap_r-2.4.so.2 (0x00007f9b045a7000)
        libgssapi_krb5.so.2 => /usr/lib/x86_64-linux-gnu/libgssapi_krb5.so.2 (0x00007f9b04369000)
        libgnutls.so.26 => /usr/lib/x86_64-linux-gnu/libgnutls.so.26 (0x00007f9b040ad000)
        libgcrypt.so.11 => /lib/x86_64-linux-gnu/libgcrypt.so.11 (0x00007f9b03e2e000)
        librtmp.so.0 => /usr/lib/x86_64-linux-gnu/librtmp.so.0 (0x00007f9b03c14000)
        libresolv.so.2 => /lib/x86_64-linux-gnu/libresolv.so.2 (0x00007f9b039f8000)
        libsasl2.so.2 => /usr/lib/x86_64-linux-gnu/libsasl2.so.2 (0x00007f9b037dc000)
        libgssapi.so.3 => /usr/lib/x86_64-linux-gnu/libgssapi.so.3 (0x00007f9b0359e000)
        libkrb5.so.3 => /usr/lib/x86_64-linux-gnu/libkrb5.so.3 (0x00007f9b032d0000)
        libk5crypto.so.3 => /usr/lib/x86_64-linux-gnu/libk5crypto.so.3 (0x00007f9b030a7000)
        libcom_err.so.2 => /lib/x86_64-linux-gnu/libcom_err.so.2 (0x00007f9b02ea3000)
        libkrb5support.so.0 => /usr/lib/x86_64-linux-gnu/libkrb5support.so.0 (0x00007f9b02c9b000)
        libtasn1.so.3 => /usr/lib/x86_64-linux-gnu/libtasn1.so.3 (0x00007f9b02a89000)
        libp11-kit.so.0 => /usr/lib/x86_64-linux-gnu/libp11-kit.so.0 (0x00007f9b02877000)
        libgpg-error.so.0 => /lib/x86_64-linux-gnu/libgpg-error.so.0 (0x00007f9b02673000)
        libheimntlm.so.0 => /usr/lib/x86_64-linux-gnu/libheimntlm.so.0 (0x00007f9b0246b000)
        libkrb5.so.26 => /usr/lib/x86_64-linux-gnu/libkrb5.so.26 (0x00007f9b021e5000)
        libasn1.so.8 => /usr/lib/x86_64-linux-gnu/libasn1.so.8 (0x00007f9b01f45000)
        libhcrypto.so.4 => /usr/lib/x86_64-linux-gnu/libhcrypto.so.4 (0x00007f9b01d10000)
        libroken.so.18 => /usr/lib/x86_64-linux-gnu/libroken.so.18 (0x00007f9b01afb000)
        libkeyutils.so.1 => /lib/x86_64-linux-gnu/libkeyutils.so.1 (0x00007f9b018f7000)
        libwind.so.0 => /usr/lib/x86_64-linux-gnu/libwind.so.0 (0x00007f9b016cd000)
        libheimbase.so.1 => /usr/lib/x86_64-linux-gnu/libheimbase.so.1 (0x00007f9b014be000)
        libhx509.so.5 => /usr/lib/x86_64-linux-gnu/libhx509.so.5 (0x00007f9b01273000)
        libsqlite3.so.0 => /usr/lib/x86_64-linux-gnu/libsqlite3.so.0 (0x00007f9b00fd0000)
        libcrypt.so.1 => /lib/x86_64-linux-gnu/libcrypt.so.1 (0x00007f9b00d97000)
====


Here is the output of ldd on the executable from the modified mpi.cc:

====
        linux-vdso.so.1 =>  (0x00007fff4f5ff000)
        libmpich.so.3 => /usr/lib/libmpich.so.3 (0x00007f9fa6435000)
        libmpl.so.1 => /usr/lib/libmpl.so.1 (0x00007f9fa6230000)
        libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007f9fa5f19000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f9fa5b59000)
        librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f9fa5951000)
        libcr.so.0 => /usr/lib/libcr.so.0 (0x00007f9fa5746000)
        libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f9fa5529000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f9fa6814000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f9fa522d000)
        libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f9fa5016000)
        libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f9fa4e12000)
====
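
[Editor's note] A quick way to cross-check listings like the two above is to filter the ldd output for MPI-related libraries only. The sketch below is illustrative: the `mpi_libs` helper name is made up, and the here-doc contents are canned excerpts standing in for real `ldd ./mpi.debug` output.

```shell
#!/bin/sh
# mpi_libs: print only the MPI-related entries from a saved ldd listing.
# (Hypothetical helper; in real use, first run e.g. `ldd ./mpi.debug > a.txt`.)
mpi_libs() {
    grep -E 'libmpi|libmpich|libopen-pal|libmpl' "$1" | awk '{print $1, $3}'
}

# Canned excerpts standing in for the two ldd listings quoted above.
cat > /tmp/ldd_quicktest.txt <<'EOF'
        libmpich.so.3 => /usr/lib/libmpich.so.3 (0x00007f9b0a1a4000)
        libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x0)
        libmpl.so.1 => /usr/lib/libmpl.so.1 (0x00007f9b05638000)
EOF
cat > /tmp/ldd_manual.txt <<'EOF'
        libmpich.so.3 => /usr/lib/libmpich.so.3 (0x00007f9fa6435000)
        libmpl.so.1 => /usr/lib/libmpl.so.1 (0x00007f9fa6230000)
EOF

mpi_libs /tmp/ldd_quicktest.txt    # both binaries resolve libmpich.so.3
echo '--'
mpi_libs /tmp/ldd_manual.txt
```

If the two filtered lists point at the same `.so` paths, as they do here, the binaries are loading the same MPI runtime and the mismatch must come from somewhere else.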

Regards,
 Uday

Timo Heister

unread,
Feb 7, 2014, 2:59:35 PM2/7/14
to dea...@googlegroups.com
It looks like the same MPI libraries are used. I have to admit that I
am slowly running out of ideas. Things you might try:
- run in the debugger to get the call stack.
$ mpirun -n 2 xterm -e gdb --args ./mpi.debug
should open two gdb windows where you can type "run" and then look at
the stack trace using "bt" after the crash. Maybe it will help us to
see where the problem is.
- remove hdf5 from deal.II (maybe it does some weird MPI stuff)
> --
> The deal.II project is located at http://www.dealii.org/
> For mailing list/forum options, see
> https://groups.google.com/d/forum/dealii?hl=en
> ---
> You received this message because you are subscribed to the Google Groups
> "deal.II User Group" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to dealii+un...@googlegroups.com.
> For more options, visit https://groups.google.com/groups/opt_out.



Uday Khankhoje

unread,
Feb 8, 2014, 8:10:49 AM2/8/14
to dealii
On Sat, Feb 8, 2014 at 1:29 AM, Timo Heister <hei...@clemson.edu> wrote:
It looks like the same MPI libraries are used. I have to admit that I
am slowly running out of ideas. Things you might try:
- run in the debugger to get the call stack.
$ mpirun -n 2 xterm -e gdb --args ./mpi.debug
should open two gdb windows where you can type "run" and then look at
the stack trace using "bt" after the crash. Maybe it will help us to
see where the problem is.
- remove hdf5 from deal.II (maybe it does some weird MPI stuff)

Timo, it looks like your guess about hdf5 was spot-on!

On running gdb, here is what I got:

==
Program received signal SIGSEGV, Segmentation fault.
0x00007fffeebd8716 in MPIR_ToPointer () from /usr/lib/libhdf5.so.6
==

and the backtrace revealed libhdf5 to be the culprit (though it's not clear why):

==
#0  0x00007fffeebd8716 in MPIR_ToPointer () from /usr/lib/libhdf5.so.6
#1  0x00007fffeebda236 in PMPI_Comm_rank () from /usr/lib/libhdf5.so.6
#2  0x0000000000401709 in main (argc=1, argv=0x7fffffffdec8)
    at /usr/local/src/deal.II/tests/quick_tests/mpi.cc:31
==
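
[Editor's note] The backtrace pattern above, where MPI entry points resolve inside libhdf5 rather than the MPI library, can be confirmed by inspecting the library's dynamic symbol table with `nm -D`. The sketch below greps canned `nm` output so it is self-contained; the symbol lines mimic what a mismatched libhdf5 might export (`MPIR_ToPointer` is an MPICH-1-era symbol, so a defined copy of it is a strong hint that an old MPI was baked into the library).

```shell
#!/bin/sh
# In real use:  nm -D /usr/lib/libhdf5.so.6 | grep -Ei 'mpi'
# Canned stand-in output for the sake of a runnable example:
cat > /tmp/nm_libhdf5.txt <<'EOF'
0000000000038716 T MPIR_ToPointer
000000000003a236 T PMPI_Comm_rank
000000000004c110 T H5Fopen
EOF

# Any 'T' (defined) MPI symbol means the library carries its own MPI code,
# which will shadow the real MPI library's symbols at run time.
grep -E ' T (P?MPI|MPIR)' /tmp/nm_libhdf5.txt
```

A clean, properly linked parallel HDF5 should only reference MPI symbols as undefined (`U`), not define (`T`) them.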

I recompiled deal.ii without hdf5 support, and lo and behold, make test goes through without a problem! I can live without hdf5 support for now, and will try building it from source if I really need it.

Thanks to you all for taking the pains to help me with this. I really appreciate it.

Warm regards,
Uday

PS: The part about being able to use MUMPS even though deal.ii has been compiled without MUMPS (but PETSc has) wasn't fully clear to me.

Timo Heister

unread,
Feb 8, 2014, 10:17:50 AM2/8/14
to dea...@googlegroups.com
> Timo, looks like your guess about hdf5 was spot-on!
>
> On running gdb, here is what I got:
>
> ==
> Program received signal SIGSEGV, Segmentation fault.
> 0x00007fffeebd8716 in MPIR_ToPointer () from /usr/lib/libhdf5.so.6
> ==
>
> and the backtrace revealed libhdf5 to be the culprit (though its not clear
> why):

It might be that hdf5 has been compiled with a different MPI version,
maybe that version of hdf5 is just broken, or your MPI doesn't have
ROMIO support. It is actually pretty easy to install hdf5 yourself.
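
[Editor's note] For anyone following this route: the key when building HDF5 yourself is to configure it against the same MPI compiler wrapper used for PETSc and deal.II. The sketch below only assembles and prints the configure invocation; the version, prefix, and flags shown are placeholder assumptions, not an endorsed recipe.

```shell
#!/bin/sh
# Hypothetical configure line for a parallel HDF5 build.
HDF5_PREFIX=$HOME/opt/hdf5-parallel   # placeholder install location
CONFIGURE="./configure CC=mpicc --enable-parallel --prefix=$HDF5_PREFIX"

# In the unpacked hdf5 source tree one would then run:
#   $CONFIGURE && make && make install
echo "$CONFIGURE"
```

Using `CC=mpicc` (the same mpicc that PETSc picked up) avoids exactly the mixed-MPI situation diagnosed in this thread.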

> Thanks to you all for taking the pains to help me with this. I really
> appreciate it.

Great, everything is working for you now! I guess we all learned
something in the process. :-)

> PS: The part about being able to use MUMPS even though deal.ii has been
> compiled without MUMPS (but PETSc has) wasn't fully clear to me.

Yeah, there is SparseDirectMUMPS and PETScWrappers::SparseDirectMUMPS.