c++ compiler not working


stefano....@unito.it

Jan 11, 2021, 1:26:41 PM
to PLUMED users
Dear all,
I am trying to install Plumed 2.6.2 on a CentOS 7 Intel Xeon Gold workstation. I am using devtoolset-7, so the gcc and g++ version is 7.3.1. The configure command is:
./configure CXX=mpicxx CC=mpicc FC=mpifort --prefix=/usr/local LDFLAGS=/home/stefano/CCDC/Python_API_2020/miniconda/lib/libblas.so LIBS=-lblas LDFLAGS=/home/stefano/CCDC/Python_API_2020/miniconda/lib/liblapack.so LIBS=-llapack

and I am getting the following error:
checking whether the C++ compiler works... no
configure: error: in `/home/stefano/plumed-2.6.2':
configure: error: C++ compiler cannot create executables
See `config.log' for more details

The content of config.log is:

This file contains any messages produced by compilers while
running configure, to aid debugging if configure makes a mistake.

It was created by PLUMED configure 2, which was
generated by GNU Autoconf 2.69.  Invocation command line was

  $ ./configure CXX=mpicxx CC=mpicc FC=mpifort --prefix=/usr/local LDFLAGS=/home/stefano/CCDC/Python_API_2020/miniconda/lib/libblas.so LIBS=-lblas LDFLAGS=/home/stefano/CCDC/Python_API_2020/miniconda/lib/liblapack.so LIBS=-llapack

## --------- ##
## Platform. ##
## --------- ##

hostname = gen71.pharm.unito.it
uname -m = x86_64
uname -r = 3.10.0-1127.19.1.el7.x86_64
uname -s = Linux
uname -v = #1 SMP Tue Aug 25 17:23:54 UTC 2020

/usr/bin/uname -p = x86_64
/bin/uname -X     = unknown

/bin/arch              = x86_64
/usr/bin/arch -k       = unknown
/usr/convex/getsysinfo = unknown
/usr/bin/hostinfo      = unknown
/bin/machine           = unknown
/usr/bin/oslevel       = unknown
/bin/universe          = unknown

PATH: /opt/rh/devtoolset-7/root/usr/bin
PATH: /home/stefano/CCDC/Discovery_2020/bin
PATH: /home/stefano/.local/bin
PATH: /home/stefano/bin
PATH: /usr/local/bin
PATH: /usr/bin
PATH: /usr/sbin
PATH: /usr/local/sbin
PATH: /home/stefano/gamess
PATH: /usr/lib64/openmpi3/bin


## ----------- ##
## Core tests. ##
## ----------- ##

configure:2424: Optional modules are disabled by default
configure:3420: checking for C++ compiler version
configure:3429: mpicxx --version >&5
g++ (GCC) 7.3.1 20180303 (Red Hat 7.3.1-5)
Copyright (C) 2017 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

configure:3440: $? = 0
configure:3429: mpicxx -v >&5
Using built-in specs.
COLLECT_GCC=/opt/rh/devtoolset-7/root/usr/bin/g++
COLLECT_LTO_WRAPPER=/opt/rh/devtoolset-7/root/usr/libexec/gcc/x86_64-redhat-linux/7/lto-wrapper
Target: x86_64-redhat-linux
Configured with: ../configure --enable-bootstrap --enable-languages=c,c++,fortran,lto --prefix=/opt/rh/devtoolset-7/root/usr --mandir=/opt/rh/devtoolset-7/root/usr/share/man --infodir=/opt/rh/devtoolset-7/root/usr/share/info --with-bugurl=http://bugzilla.redhat.com/bugzilla --enable-shared --enable-threads=posix --enable-checking=release --enable-multilib --with-system-zlib --enable-__cxa_atexit --disable-libunwind-exceptions --enable-gnu-unique-object --enable-linker-build-id --with-gcc-major-version-only --enable-plugin --with-linker-hash-style=gnu --enable-initfini-array --with-default-libstdcxx-abi=gcc4-compatible --with-isl=/builddir/build/BUILD/gcc-7.3.1-20180303/obj-x86_64-redhat-linux/isl-install --enable-libmpx --enable-gnu-indirect-function --with-tune=generic --with-arch_32=i686 --build=x86_64-redhat-linux
Thread model: posix
gcc version 7.3.1 20180303 (Red Hat 7.3.1-5) (GCC)
configure:3440: $? = 0
configure:3429: mpicxx -V >&5
g++: error: unrecognized command line option '-V'
g++: fatal error: no input files
compilation terminated.
configure:3440: $? = 1
configure:3429: mpicxx -qversion >&5
g++: error: unrecognized command line option '-qversion'; did you mean '--version'?
g++: fatal error: no input files
compilation terminated.
configure:3440: $? = 1
configure:3460: checking whether the C++ compiler works
configure:3482: mpicxx -O3  /home/stefano/CCDC/Python_API_2020/miniconda/lib/liblapack.so conftest.cpp -llapack >&5
/opt/rh/devtoolset-7/root/usr/libexec/gcc/x86_64-redhat-linux/7/ld: cannot find -llapack
collect2: error: ld returned 1 exit status
configure:3486: $? = 1
configure:3524: result: no
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "PLUMED"
| #define PACKAGE_TARNAME "plumed"
| #define PACKAGE_VERSION "2"
| #define PACKAGE_STRING "PLUMED 2"
| #define PACKAGE_BUGREPORT ""
| #define PACKAGE_URL ""
| /* end confdefs.h.  */
|
| int
| main ()
| {
|
|   ;
|   return 0;
| }
configure:3529: error: in `/home/stefano/plumed-2.6.2':
configure:3531: error: C++ compiler cannot create executables
See `config.log' for more details

## ---------------- ##
## Cache variables. ##
## ---------------- ##

ac_cv_env_BASH_COMPLETION_DIR_set=
ac_cv_env_BASH_COMPLETION_DIR_value=
ac_cv_env_CCC_set=
ac_cv_env_CCC_value=
ac_cv_env_CC_set=set
ac_cv_env_CC_value=mpicc
ac_cv_env_CFLAGS_set=
ac_cv_env_CFLAGS_value=
ac_cv_env_CPPFLAGS_set=
ac_cv_env_CPPFLAGS_value=
ac_cv_env_CXXCPP_set=
ac_cv_env_CXXCPP_value=
ac_cv_env_CXXFLAGS_set=
ac_cv_env_CXXFLAGS_value=
ac_cv_env_CXX_set=set
ac_cv_env_CXX_value=mpicxx
ac_cv_env_FCFLAGS_set=
ac_cv_env_FCFLAGS_value=
ac_cv_env_FC_set=set
ac_cv_env_FC_value=mpifort
ac_cv_env_LDFLAGS_set=set
ac_cv_env_LDFLAGS_value=/home/stefano/CCDC/Python_API_2020/miniconda/lib/liblapack.so
ac_cv_env_LDSHARED_set=
ac_cv_env_LDSHARED_value=
ac_cv_env_LIBS_set=set
ac_cv_env_LIBS_value=-llapack
ac_cv_env_MPIEXEC_set=
ac_cv_env_MPIEXEC_value=
ac_cv_env_PYTHON_BIN_set=
ac_cv_env_PYTHON_BIN_value=
ac_cv_env_SOEXT_set=
ac_cv_env_SOEXT_value=
ac_cv_env_STATIC_LIBS_set=
ac_cv_env_STATIC_LIBS_value=
ac_cv_env_build_alias_set=
ac_cv_env_build_alias_value=
ac_cv_env_host_alias_set=
ac_cv_env_host_alias_value=
ac_cv_env_target_alias_set=
ac_cv_env_target_alias_value=

## ----------------- ##
## Output variables. ##
## ----------------- ##

AR_CR=''
BASH_COMPLETION_DIR=''
CC='mpicc'
CFLAGS=''
CPPFLAGS=''
CXX='mpicxx'
CXXCPP=''
CXXFLAGS='-O3'
DEFS=''
ECHO_C=''
ECHO_N='-n'
ECHO_T=''
EGREP=''
EXEEXT=''
FC='mpifort'
FCFLAGS=''
GREP=''
LDFLAGS='/home/stefano/CCDC/Python_API_2020/miniconda/lib/liblapack.so'
LDSHARED=''
LD_RO=''
LIBOBJS=''
LIBS='-llapack'
LTLIBOBJS=''
MPIEXEC=''
OBJEXT=''
OPENMP_CXXFLAGS=''
PACKAGE_BUGREPORT=''
PACKAGE_NAME='PLUMED'
PACKAGE_STRING='PLUMED 2'
PACKAGE_TARNAME='plumed'
PACKAGE_URL=''
PACKAGE_VERSION='2'
PATH_SEPARATOR=':'
PYTHON_BIN=''
SHELL='/bin/sh'
SOEXT=''
STATIC_LIBS=''
ac_ct_CC=''
ac_ct_CXX=''
ac_ct_FC=''
bindir='${exec_prefix}/bin'
build_alias=''
build_dir=''
datadir='${datarootdir}'
datarootdir='${prefix}/share'
disable_dependency_tracking=''
docdir='${datarootdir}/doc/${PACKAGE_TARNAME}'
dot=''
doxygen=''
dvidir='${docdir}'
exec_prefix='NONE'
host_alias=''
htmldir='${docdir}'
includedir='${prefix}/include'
infodir='${datarootdir}/info'
libdir='${exec_prefix}/lib'
libexecdir='${exec_prefix}/libexec'
localedir='${datarootdir}/locale'
localstatedir='${prefix}/var'
make_doc=''
make_pdfdoc=''
make_static_archive=''
mandir='${datarootdir}/man'
oldincludedir='/usr/include'
pdfdir='${docdir}'
pkgconfig_bin=''
prefix='/usr/local'
program_can_run=''
program_can_run_mpi=''
program_name=''
program_transform_name='s,x,x,'
psdir='${docdir}'
readelf=''
sbindir='${exec_prefix}/sbin'
sharedstatedir='${prefix}/com'
sysconfdir='${prefix}/etc'
target_alias=''
use_absolute_soname=''
use_loader_path=''
xxd=''

## ----------- ##
## confdefs.h. ##
## ----------- ##

/* confdefs.h */
#define PACKAGE_NAME "PLUMED"
#define PACKAGE_TARNAME "plumed"
#define PACKAGE_VERSION "2"
#define PACKAGE_STRING "PLUMED 2"
#define PACKAGE_BUGREPORT ""
#define PACKAGE_URL ""

configure: exit 77
Am I missing anything?
Thanks in advance
Stefano

debadutta patra

Jun 21, 2021, 2:16:11 AM
to PLUMED users
I had a similar problem while installing Plumed 2.7 on Ubuntu 21.04. In my case I was missing the package libopenmpi-dev. You can try compiling without MPI support to verify this. If OpenMPI is causing the issue, I believe the package you will require is openmpi-devel.
Another thing worth mentioning: on Fedora 33 I was not able to compile with MPI support using the default compilers, and had to use the Intel oneAPI compilers to get MPI working. It is worth giving that a try if nothing else works for you.
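To make the suggestion concrete, a non-MPI test build could look like the sketch below. The `--disable-mpi` flag follows PLUMED's documented configure options; the package names are the usual ones for each distribution, not taken from this thread:

```shell
# Configure without MPI to check whether the mpicxx wrapper is the problem
# (uses the plain devtoolset compilers instead of the MPI wrappers):
./configure CXX=g++ CC=gcc FC=gfortran --disable-mpi

# If the non-MPI build succeeds, install the MPI development package
# and reconfigure with the wrappers:
sudo apt install libopenmpi-dev    # Debian/Ubuntu
sudo yum install openmpi-devel     # CentOS 7 / Fedora
```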

Good luck

Debadutta

Alin Marin Elena

Jun 21, 2021, 2:26:07 AM
to plumed...@googlegroups.com
Hi Stefano,

without the config log it is difficult to say what is going bananas.
My guess is that you are too liberal with your LIBS and LDFLAGS usage: you
should not have two of each, for example.
Another point: use OpenBLAS for blas/lapack rather than the
reference Netlib implementations.

Remove LIBS and LDFLAGS and see if that fixes the error; new
ones will probably crop up...

Regards,
Alin


Without Questions there are no Answers!
______________________________________________________________________
Dr. Alin Marin ELENA
http://alin.elena.space/
______________________________________________________________________

Giovanni Bussi

Jun 21, 2021, 3:21:40 PM
to plumed...@googlegroups.com
Dear all,

let me add that we are regularly testing PLUMED on CentOS 7 and Fedora 34. To see the exact commands we use, you can check the corresponding Docker files:



These files can be used as a source of inspiration for, e.g., the proper packages to install with yum on these specific operating systems.
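On CentOS 7 with devtoolset, the prerequisites might look roughly like the following. The package names here are my assumption as a starting point, not copied from the actual PLUMED Docker files:

```shell
# Hypothetical CentOS 7 build prerequisites (adjust as needed):
yum install -y centos-release-scl
yum install -y devtoolset-7-gcc devtoolset-7-gcc-c++ devtoolset-7-gcc-gfortran \
               openmpi3-devel openblas-devel zlib-devel
scl enable devtoolset-7 bash   # puts gcc 7 on the PATH, as in the thread
```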

Giovanni

