pbdMPI install failure with MacPorts Open MPI


ChungKai Sun

Jun 30, 2014, 9:37:52 PM
to rbigdatap...@googlegroups.com

If you installed Open MPI through MacPorts and see the error message "error: Cannot find mpi.h header file", you may need to pass the following configure arguments for pbdMPI to install successfully.

install.packages('pbdMPI', 
configure.args='--with-mpi-type=OPENMPI --with-mpi-include=/opt/local/include/openmpi-gcc48/ --with-mpi-libpath=/opt/local/lib/openmpi-gcc48')

Thank you Drew!
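If your MacPorts Open MPI lives under a different prefix, the right include and library directories can be located with something like the following (the openmpi-gcc48 paths above are specific to the gcc48 variant; adjust for yours):

```
# Locate the MacPorts Open MPI header and library (paths vary by variant)
find /opt/local/include -name mpi.h
find /opt/local/lib -name 'libmpi*.dylib'
```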




Will Landau

Sep 4, 2014, 11:34:10 AM
to rbigdatap...@googlegroups.com
I'm trying to install pbdMPI too, and when I run that command inside R, I get 

> install.packages('pbdMPI', configure.args='--with-mpi-type=OPENMPI --with-mpi-include=/opt/local/include/openmpi-gcc48/ --with-mpi-libpath=/opt/local/lib/openmpi-gcc48')
--- Please select a CRAN mirror for use in this session ---
Content type 'application/x-gzip' length 617637 bytes (603 Kb)
opened URL
==================================================
downloaded 603 Kb


The downloaded binary packages are in
/var/folders/gs/srjvqghj5hdgq90q9ldwz47m0000gn/T//RtmpJWzDeA/downloaded_packages
> library(pbdMPI)
Loading required package: rlecuyer
Error : .onLoad failed in loadNamespace() for 'pbdMPI', details:
  call: dyn.load(file, DLLpath = DLLpath, ...)
  error: unable to load shared object '/Library/Frameworks/R.framework/Versions/3.0/Resources/library/pbdMPI/libs/pbdMPI.so':
  dlopen(/Library/Frameworks/R.framework/Versions/3.0/Resources/library/pbdMPI/libs/pbdMPI.so, 6): Library not loaded: /usr/lib/libmpi.0.dylib
  Referenced from: /Library/Frameworks/R.framework/Versions/3.0/Resources/library/pbdMPI/libs/pbdMPI.so
  Reason: image not found
Error: package or namespace load failed for ‘pbdMPI’
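The dyn.load error above suggests the pre-built CRAN binary was linked against an Open MPI (/usr/lib/libmpi.0.dylib) that doesn't exist on this machine. One way to confirm which dylibs the shared object expects is otool, using the path from the error message:

```
# List the install names of the dylibs pbdMPI.so was linked against
otool -L /Library/Frameworks/R.framework/Versions/3.0/Resources/library/pbdMPI/libs/pbdMPI.so
```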

So the install inside R gave me a binary package that fails to load, even though I have Open MPI. When I try from the command line, R attempts the installation, but it fails at the final load test.

> R CMD INSTALL pbdMPI --with-mpi-type=OPENMPI --with-mpi-include=/opt/local/include/openmpi-gcc48/ --with-mpi-libpath=/opt/local/lib/openmpi-gcc48

Warning: unknown option ‘--with-mpi-type=OPENMPI’

Warning: unknown option ‘--with-mpi-include=/opt/local/include/openmpi-gcc48/’

Warning: unknown option ‘--with-mpi-libpath=/opt/local/lib/openmpi-gcc48’

* installing to library ‘/Library/Frameworks/R.framework/Versions/3.0/Resources/library’

* installing *source* package ‘pbdMPI’ ...

** package ‘pbdMPI’ successfully unpacked and MD5 sums checked

checking for sed... /usr/bin/sed

checking for mpicc... mpicc

checking for ompi_info... ompi_info

found sed, mpicc, and ompi_info ...

>> TMP_INC_DIRS = /usr/local/include

checking /usr/local/include ...

found /usr/local/include/mpi.h ...

>> TMP_LIB_DIRS = /usr/local/lib

checking /usr/local/lib ...

found /usr/local/lib/libmpi.dylib ...

found mpi.h and libmpi.so ...

>> TMP_INC = /usr/local/include

>> TMP_LIB = /usr/local/lib

checking for openpty in -lutil... yes

checking for main in -lpthread... yes

 

******************* Results of pbdMPI package configure *****************

 

>> TMP_INC = /usr/local/include

>> TMP_LIB = /usr/local/lib

>> TMP_LIBNAME = libmpi.dylib

>> MPI_ROOT = 

>> MPITYPE = OPENMPI

>> MPI_INCLUDE_PATH = /usr/local/include

>> MPI_LIBPATH = /usr/local/lib

>> MPI_LIBNAME = libmpi.dylib

>> MPI_LIBS =  -lutil -lpthread

>> MPI_DEFS = -DMPI2

>> MPI_INCL2 = 

>> MPI_LDFLAGS = 

>> PKG_CPPFLAGS = -I/usr/local/include  -DMPI2 -DOPENMPI

>> PKG_LIBS = -L/usr/local/lib -lmpi  -lutil -lpthread

>> PROF_LDFLAGS = 

 

*************************************************************************

 

configure: creating ./config.status

config.status: creating src/Makevars

configure: creating ./config.status

config.status: creating src/Makevars

config.status: creating R/zzz.r

** libs

make: Nothing to be done for `all'.

installing via 'install.libs.R' to /Library/Frameworks/R.framework/Versions/3.0/Resources/library/pbdMPI

** R

** data

*** moving datasets to lazyload DB

** demo

** inst

** preparing package for lazy loading

** help

*** installing help indices

** building package indices

** installing vignettes

   ‘pbdMPI-guide.Rnw’ 

** testing if installed package can be loaded

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_allocator_basic: dlopen(/usr/local/lib/openmpi/mca_allocator_basic.so, 9): Symbol not found: _ompi_free_list_item_t_class

  Referenced from: /usr/local/lib/openmpi/mca_allocator_basic.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_allocator_basic.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_rcache_vma: dlopen(/usr/local/lib/openmpi/mca_rcache_vma.so, 9): Symbol not found: _ompi_free_list_item_t_class

  Referenced from: /usr/local/lib/openmpi/mca_rcache_vma.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_rcache_vma.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_mpool_grdma: dlopen(/usr/local/lib/openmpi/mca_mpool_grdma.so, 9): Symbol not found: _mca_mpool_base_page_size

  Referenced from: /usr/local/lib/openmpi/mca_mpool_grdma.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_mpool_grdma.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_mpool_sm: dlopen(/usr/local/lib/openmpi/mca_mpool_sm.so, 9): Symbol not found: _ompi_allocator_base_framework

  Referenced from: /usr/local/lib/openmpi/mca_mpool_sm.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_mpool_sm.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_bml_r2: dlopen(/usr/local/lib/openmpi/mca_bml_r2.so, 9): Symbol not found: _mca_bml_base_endpoint_t_class

  Referenced from: /usr/local/lib/openmpi/mca_bml_r2.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_bml_r2.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_btl_self: dlopen(/usr/local/lib/openmpi/mca_btl_self.so, 9): Symbol not found: _mca_btl_base_active_message_trigger

  Referenced from: /usr/local/lib/openmpi/mca_btl_self.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_btl_self.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_btl_sm: dlopen(/usr/local/lib/openmpi/mca_btl_sm.so, 9): Symbol not found: _mca_btl_base_active_message_trigger

  Referenced from: /usr/local/lib/openmpi/mca_btl_sm.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_btl_sm.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_btl_tcp: dlopen(/usr/local/lib/openmpi/mca_btl_tcp.so, 9): Symbol not found: _mca_btl_base_active_message_trigger

  Referenced from: /usr/local/lib/openmpi/mca_btl_tcp.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_btl_tcp.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_btl_vader: dlopen(/usr/local/lib/openmpi/mca_btl_vader.so, 9): Symbol not found: _mca_btl_base_active_message_trigger

  Referenced from: /usr/local/lib/openmpi/mca_btl_vader.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_btl_vader.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_pml_bfo: dlopen(/usr/local/lib/openmpi/mca_pml_bfo.so, 9): Symbol not found: _mca_bml

  Referenced from: /usr/local/lib/openmpi/mca_pml_bfo.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_pml_bfo.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_pml_cm: dlopen(/usr/local/lib/openmpi/mca_pml_cm.so, 9): Symbol not found: _mca_pml_base_recv_requests

  Referenced from: /usr/local/lib/openmpi/mca_pml_cm.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_pml_cm.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_pml_ob1: dlopen(/usr/local/lib/openmpi/mca_pml_ob1.so, 9): Symbol not found: _mca_bml

  Referenced from: /usr/local/lib/openmpi/mca_pml_ob1.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_pml_ob1.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_vprotocol_pessimist: dlopen(/usr/local/lib/openmpi/mca_vprotocol_pessimist.so, 9): Symbol not found: _mca_pml_base_request_t_class

  Referenced from: /usr/local/lib/openmpi/mca_vprotocol_pessimist.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_vprotocol_pessimist.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_coll_basic: dlopen(/usr/local/lib/openmpi/mca_coll_basic.so, 9): Symbol not found: _mca_coll_base_module_t_class

  Referenced from: /usr/local/lib/openmpi/mca_coll_basic.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_coll_basic.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_coll_hierarch: dlopen(/usr/local/lib/openmpi/mca_coll_hierarch.so, 9): Symbol not found: _mca_bml

  Referenced from: /usr/local/lib/openmpi/mca_coll_hierarch.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_coll_hierarch.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_coll_inter: dlopen(/usr/local/lib/openmpi/mca_coll_inter.so, 9): Symbol not found: _mca_coll_base_module_t_class

  Referenced from: /usr/local/lib/openmpi/mca_coll_inter.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_coll_inter.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_coll_libnbc: dlopen(/usr/local/lib/openmpi/mca_coll_libnbc.so, 9): Symbol not found: _mca_coll_base_module_t_class

  Referenced from: /usr/local/lib/openmpi/mca_coll_libnbc.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_coll_libnbc.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_coll_ml: dlopen(/usr/local/lib/openmpi/mca_coll_ml.so, 9): Symbol not found: _mca_bcol_base_components_in_use

  Referenced from: /usr/local/lib/openmpi/mca_coll_ml.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_coll_ml.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_coll_self: dlopen(/usr/local/lib/openmpi/mca_coll_self.so, 9): Symbol not found: _mca_coll_base_module_t_class

  Referenced from: /usr/local/lib/openmpi/mca_coll_self.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_coll_self.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_coll_sm: dlopen(/usr/local/lib/openmpi/mca_coll_sm.so, 9): Symbol not found: _mca_coll_base_module_t_class

  Referenced from: /usr/local/lib/openmpi/mca_coll_sm.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_coll_sm.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_coll_tuned: dlopen(/usr/local/lib/openmpi/mca_coll_tuned.so, 9): Symbol not found: _mca_coll_base_module_t_class

  Referenced from: /usr/local/lib/openmpi/mca_coll_tuned.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_coll_tuned.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_osc_rdma: dlopen(/usr/local/lib/openmpi/mca_osc_rdma.so, 9): Symbol not found: _mca_pml

  Referenced from: /usr/local/lib/openmpi/mca_osc_rdma.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_osc_rdma.so (ignored)

[landau.local:01208] mca: base: component_find: unable to open /usr/local/lib/openmpi/mca_osc_sm: dlopen(/usr/local/lib/openmpi/mca_osc_sm.so, 9): Symbol not found: _ompi_info_t_class

  Referenced from: /usr/local/lib/openmpi/mca_osc_sm.so

  Expected in: flat namespace

 in /usr/local/lib/openmpi/mca_osc_sm.so (ignored)

--------------------------------------------------------------------------

No available pml components were found!


This means that there are no components of this type installed on your

system or all the components reported that they could not be used.


This is a fatal error; your MPI process is likely to abort.  Check the

output of the "ompi_info" command and ensure that components of this

type are available on your system.  You may also wish to check the

value of the "component_path" MCA parameter and ensure that it has at

least one directory that contains valid MCA components.

--------------------------------------------------------------------------

[landau.local:01208] PML ob1 cannot be selected

ERROR: loading failed

* removing ‘/Library/Frameworks/R.framework/Versions/3.0/Resources/library/pbdMPI’

* restoring previous ‘/Library/Frameworks/R.framework/Versions/3.0/Resources/library/pbdMPI’


Here's my Open MPI configuration. I would appreciate any help.

> ompi_info
                 Package: Open MPI lan...@landau.local Distribution
                Open MPI: 1.8.2
  Open MPI repo revision: r32596M
   Open MPI release date: Aug 25, 2014
                Open RTE: 1.8.2
  Open RTE repo revision: r32596M
   Open RTE release date: Aug 25, 2014
                    OPAL: 1.8.2
      OPAL repo revision: r32596M
       OPAL release date: Aug 25, 2014
                 MPI API: 3.0
            Ident string: 1.8.2
                  Prefix: /usr/local
 Configured architecture: x86_64-apple-darwin13.3.0
          Configure host: landau.local
           Configured by: landau
           Configured on: Wed Sep  3 11:42:49 CDT 2014
          Configure host: landau.local
                Built by: landau
                Built on: Wed Sep  3 12:08:37 CDT 2014
              Built host: landau.local
              C bindings: yes
            C++ bindings: yes
             Fort mpif.h: yes (single underscore)
            Fort use mpi: yes (limited: overloading)
       Fort use mpi size: deprecated-ompi-info-value
        Fort use mpi_f08: no
 Fort mpi_f08 compliance: The mpi_f08 module was not built
  Fort mpi_f08 subarrays: no
           Java bindings: no
  Wrapper compiler rpath: unnecessary
              C compiler: gcc
     C compiler absolute: /usr/bin/gcc
  C compiler family name: GNU
      C compiler version: 4.2.1
            C++ compiler: g++
   C++ compiler absolute: /usr/bin/g++
           Fort compiler: gfortran
       Fort compiler abs: /usr/local/bin/gfortran
         Fort ignore TKR: no
   Fort 08 assumed shape: no
      Fort optional args: no
      Fort BIND(C) (all): no
      Fort ISO_C_BINDING: yes
 Fort SUBROUTINE BIND(C): no
       Fort TYPE,BIND(C): no
 Fort T,BIND(C,name="a"): no
            Fort PRIVATE: no
          Fort PROTECTED: no
           Fort ABSTRACT: no
       Fort ASYNCHRONOUS: no
          Fort PROCEDURE: no
 Fort f08 using wrappers: no
             C profiling: yes
           C++ profiling: yes
   Fort mpif.h profiling: yes
  Fort use mpi profiling: yes
   Fort use mpi_f08 prof: no
          C++ exceptions: no
          Thread support: posix (MPI_THREAD_MULTIPLE: no, OPAL support: yes,
                          OMPI progress: no, ORTE progress: yes, Event lib:
                          yes)
           Sparse Groups: no
  Internal debug support: no
  MPI interface warnings: yes
     MPI parameter check: runtime
Memory profiling support: no
Memory debugging support: no
         libltdl support: yes
   Heterogeneous support: no
 mpirun default --prefix: no
         MPI I/O support: yes
       MPI_WTIME support: gettimeofday
     Symbol vis. support: yes
   Host topology support: yes
          MPI extensions: 
   FT Checkpoint support: no (checkpoint thread: no)
   C/R Enabled Debugging: no
     VampirTrace support: yes
  MPI_MAX_PROCESSOR_NAME: 256
    MPI_MAX_ERROR_STRING: 256
     MPI_MAX_OBJECT_NAME: 64
        MPI_MAX_INFO_KEY: 36
        MPI_MAX_INFO_VAL: 256
       MPI_MAX_PORT_NAME: 1024
  MPI_MAX_DATAREP_STRING: 128
           MCA backtrace: execinfo (MCA v2.0, API v2.0, Component v1.8.2)
            MCA compress: bzip (MCA v2.0, API v2.0, Component v1.8.2)
            MCA compress: gzip (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA crs: none (MCA v2.0, API v2.0, Component v1.8.2)
                  MCA db: hash (MCA v2.0, API v1.0, Component v1.8.2)
                  MCA db: print (MCA v2.0, API v1.0, Component v1.8.2)
               MCA event: libevent2021 (MCA v2.0, API v2.0, Component v1.8.2)
               MCA hwloc: hwloc172 (MCA v2.0, API v2.0, Component v1.8.2)
                  MCA if: bsdx_ipv6 (MCA v2.0, API v2.0, Component v1.8.2)
                  MCA if: posix_ipv4 (MCA v2.0, API v2.0, Component v1.8.2)
         MCA installdirs: env (MCA v2.0, API v2.0, Component v1.8.2)
         MCA installdirs: config (MCA v2.0, API v2.0, Component v1.8.2)
               MCA pstat: test (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA sec: basic (MCA v2.0, API v1.0, Component v1.8.2)
               MCA shmem: mmap (MCA v2.0, API v2.0, Component v1.8.2)
               MCA shmem: posix (MCA v2.0, API v2.0, Component v1.8.2)
               MCA shmem: sysv (MCA v2.0, API v2.0, Component v1.8.2)
               MCA timer: darwin (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA dfs: app (MCA v2.0, API v1.0, Component v1.8.2)
                 MCA dfs: orted (MCA v2.0, API v1.0, Component v1.8.2)
                 MCA dfs: test (MCA v2.0, API v1.0, Component v1.8.2)
              MCA errmgr: default_app (MCA v2.0, API v3.0, Component v1.8.2)
              MCA errmgr: default_hnp (MCA v2.0, API v3.0, Component v1.8.2)
              MCA errmgr: default_orted (MCA v2.0, API v3.0, Component
                          v1.8.2)
              MCA errmgr: default_tool (MCA v2.0, API v3.0, Component v1.8.2)
                 MCA ess: env (MCA v2.0, API v3.0, Component v1.8.2)
                 MCA ess: hnp (MCA v2.0, API v3.0, Component v1.8.2)
                 MCA ess: singleton (MCA v2.0, API v3.0, Component v1.8.2)
                 MCA ess: slurm (MCA v2.0, API v3.0, Component v1.8.2)
                 MCA ess: tool (MCA v2.0, API v3.0, Component v1.8.2)
               MCA filem: raw (MCA v2.0, API v2.0, Component v1.8.2)
             MCA grpcomm: bad (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA iof: hnp (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA iof: mr_hnp (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA iof: mr_orted (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA iof: orted (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA iof: tool (MCA v2.0, API v2.0, Component v1.8.2)
                MCA odls: default (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA oob: tcp (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA plm: isolated (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA plm: rsh (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA plm: slurm (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA ras: simulator (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA ras: slurm (MCA v2.0, API v2.0, Component v1.8.2)
               MCA rmaps: lama (MCA v2.0, API v2.0, Component v1.8.2)
               MCA rmaps: mindist (MCA v2.0, API v2.0, Component v1.8.2)
               MCA rmaps: ppr (MCA v2.0, API v2.0, Component v1.8.2)
               MCA rmaps: rank_file (MCA v2.0, API v2.0, Component v1.8.2)
               MCA rmaps: resilient (MCA v2.0, API v2.0, Component v1.8.2)
               MCA rmaps: round_robin (MCA v2.0, API v2.0, Component v1.8.2)
               MCA rmaps: seq (MCA v2.0, API v2.0, Component v1.8.2)
               MCA rmaps: staged (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA rml: oob (MCA v2.0, API v2.0, Component v1.8.2)
              MCA routed: binomial (MCA v2.0, API v2.0, Component v1.8.2)
              MCA routed: debruijn (MCA v2.0, API v2.0, Component v1.8.2)
              MCA routed: direct (MCA v2.0, API v2.0, Component v1.8.2)
              MCA routed: radix (MCA v2.0, API v2.0, Component v1.8.2)
               MCA state: app (MCA v2.0, API v1.0, Component v1.8.2)
               MCA state: hnp (MCA v2.0, API v1.0, Component v1.8.2)
               MCA state: novm (MCA v2.0, API v1.0, Component v1.8.2)
               MCA state: orted (MCA v2.0, API v1.0, Component v1.8.2)
               MCA state: staged_hnp (MCA v2.0, API v1.0, Component v1.8.2)
               MCA state: staged_orted (MCA v2.0, API v1.0, Component v1.8.2)
               MCA state: tool (MCA v2.0, API v1.0, Component v1.8.2)
           MCA allocator: basic (MCA v2.0, API v2.0, Component v1.8.2)
           MCA allocator: bucket (MCA v2.0, API v2.0, Component v1.8.2)
                MCA bcol: basesmuma (MCA v2.0, API v2.0, Component v1.8.2)
                MCA bcol: ptpcoll (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA bml: r2 (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA btl: self (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA btl: sm (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA btl: tcp (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA btl: vader (MCA v2.0, API v2.0, Component v1.8.2)
                MCA coll: basic (MCA v2.0, API v2.0, Component v1.8.2)
                MCA coll: hierarch (MCA v2.0, API v2.0, Component v1.8.2)
                MCA coll: inter (MCA v2.0, API v2.0, Component v1.8.2)
                MCA coll: libnbc (MCA v2.0, API v2.0, Component v1.8.2)
                MCA coll: ml (MCA v2.0, API v2.0, Component v1.8.2)
                MCA coll: self (MCA v2.0, API v2.0, Component v1.8.2)
                MCA coll: sm (MCA v2.0, API v2.0, Component v1.8.2)
                MCA coll: tuned (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA dpm: orte (MCA v2.0, API v2.0, Component v1.8.2)
                MCA fbtl: posix (MCA v2.0, API v2.0, Component v1.8.2)
               MCA fcoll: dynamic (MCA v2.0, API v2.0, Component v1.8.2)
               MCA fcoll: individual (MCA v2.0, API v2.0, Component v1.8.2)
               MCA fcoll: static (MCA v2.0, API v2.0, Component v1.8.2)
               MCA fcoll: two_phase (MCA v2.0, API v2.0, Component v1.8.2)
               MCA fcoll: ylib (MCA v2.0, API v2.0, Component v1.8.2)
                  MCA fs: ufs (MCA v2.0, API v2.0, Component v1.8.2)
                  MCA io: ompio (MCA v2.0, API v2.0, Component v1.8.2)
                  MCA io: romio (MCA v2.0, API v2.0, Component v1.8.2)
               MCA mpool: grdma (MCA v2.0, API v2.0, Component v1.8.2)
               MCA mpool: sm (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA osc: rdma (MCA v2.0, API v3.0, Component v1.8.2)
                 MCA osc: sm (MCA v2.0, API v3.0, Component v1.8.2)
                 MCA pml: v (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA pml: bfo (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA pml: cm (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA pml: ob1 (MCA v2.0, API v2.0, Component v1.8.2)
              MCA pubsub: orte (MCA v2.0, API v2.0, Component v1.8.2)
              MCA rcache: vma (MCA v2.0, API v2.0, Component v1.8.2)
                 MCA rte: orte (MCA v2.0, API v2.0, Component v1.8.2)
                MCA sbgp: basesmsocket (MCA v2.0, API v2.0, Component v1.8.2)
                MCA sbgp: basesmuma (MCA v2.0, API v2.0, Component v1.8.2)
                MCA sbgp: p2p (MCA v2.0, API v2.0, Component v1.8.2)
            MCA sharedfp: individual (MCA v2.0, API v2.0, Component v1.8.2)
            MCA sharedfp: lockedfile (MCA v2.0, API v2.0, Component v1.8.2)
            MCA sharedfp: sm (MCA v2.0, API v2.0, Component v1.8.2)
                MCA topo: basic (MCA v2.0, API v2.1, Component v1.8.2)
           MCA vprotocol: pessimist (MCA v2.0, API v2.0, Component v1.8.2)


Wei-Chen Chen

Sep 5, 2014, 12:05:14 AM
to
Dear Will,

See FAQ, Section 8.3, question 10 for more information at
https://github.com/snoweye/pbdMPI/blob/master/inst/doc/pbdMPI-guide.pdf?raw=true

You forgot --configure-args="..." in your R CMD INSTALL call.
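That is, the configure arguments must be wrapped in a single --configure-args string. With the MacPorts paths from earlier in this thread (adjust them for your own installation), the command would look like:

```
R CMD INSTALL pbdMPI \
  --configure-args="--with-mpi-type=OPENMPI \
    --with-mpi-include=/opt/local/include/openmpi-gcc48 \
    --with-mpi-libpath=/opt/local/lib/openmpi-gcc48"
```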

Sincerely,
Wei-Chen Chen

Will Landau

Sep 7, 2014, 7:13:26 PM
to rbigdatap...@googlegroups.com
Thanks, Wei-Chen. That worked the first time I tried it, but then I messed things up again. After installing pbdMPI, I moved on to pbdSLAP, which gave me trouble because R was looking for a "gfortran-4.2" executable (which I didn't have) instead of plain "gfortran" (which I had). So I decided to clean out my software and install gcc, R, and Open MPI all through Homebrew to make these packages consistent. That was mostly successful, apart from a warning when reinstalling Open MPI ("open-mpi dependency gcc was built with a different C++ standard library (libstdc++ from clang). This may cause problems at runtime."). But then the original problem with installing pbdMPI came back:

landau.local /Users/landau/pbdr> brew doctor
Your system is ready to brew.
landau.local /Users/landau/pbdr> mpicc --showme:compile
-I/usr/local/Cellar/open-mpi/1.8.1/include
landau.local /Users/landau/pbdr> mpicc --showme:link
-L/usr/local/opt/libevent/lib -L/usr/local/Cellar/open-mpi/1.8.1/lib -lmpi
landau.local /Users/landau/pbdr> R CMD INSTALL pbdMPI_0.2-4.tar.gz 
* installing to library ‘/usr/local/Cellar/r/3.1.1/R.framework/Versions/3.1/Resources/library’
* installing *source* package ‘pbdMPI’ ...
** package ‘pbdMPI’ successfully unpacked and MD5 sums checked
checking for sed... /usr/local/Library/ENV/4.3/sed
checking for mpicc... mpicc
checking for ompi_info... ompi_info
found sed, mpicc, and ompi_info ...
>> TMP_INC_DIRS = /usr/local/Cellar/open-mpi/1.8.1/include
checking /usr/local/Cellar/open-mpi/1.8.1/include ...
found /usr/local/Cellar/open-mpi/1.8.1/include/mpi.h ...
>> TMP_LIB_DIRS = /usr/local/opt/libevent/lib /usr/local/Cellar/open-mpi/1.8.1/lib
checking /usr/local/opt/libevent/lib ...
checking /usr/local/Cellar/open-mpi/1.8.1/lib ...
found /usr/local/Cellar/open-mpi/1.8.1/lib/libmpi.dylib ...
found mpi.h and libmpi.so ...
>> TMP_INC = /usr/local/Cellar/open-mpi/1.8.1/include
>> TMP_LIB = /usr/local/Cellar/open-mpi/1.8.1/lib
checking for openpty in -lutil... yes
checking for main in -lpthread... yes
 
******************* Results of pbdMPI package configure *****************
 
>> TMP_INC = /usr/local/Cellar/open-mpi/1.8.1/include
>> TMP_LIB = /usr/local/Cellar/open-mpi/1.8.1/lib
>> TMP_LIBNAME = libmpi.dylib
>> MPI_ROOT = 
>> MPITYPE = OPENMPI
>> MPI_INCLUDE_PATH = /usr/local/Cellar/open-mpi/1.8.1/include
>> MPI_LIBPATH = /usr/local/Cellar/open-mpi/1.8.1/lib
>> MPI_LIBNAME = libmpi.dylib
>> MPI_LIBS =  -lutil -lpthread
>> MPI_DEFS = -DMPI2
>> MPI_INCL2 = 
>> MPI_LDFLAGS = 
>> PKG_CPPFLAGS = -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI
>> PKG_LIBS = -L/usr/local/Cellar/open-mpi/1.8.1/lib -lmpi  -lutil -lpthread
>> PROF_LDFLAGS = 
 
*************************************************************************
 
configure: creating ./config.status
config.status: creating src/Makevars
configure: creating ./config.status
config.status: creating src/Makevars
config.status: creating R/zzz.r
** libs
echo "MPIRUN = " > Makeconf
echo "MPIEXEC = " >> Makeconf
echo "ORTERUN = " >> Makeconf
echo "TMP_INC = /usr/local/Cellar/open-mpi/1.8.1/include" >> Makeconf
echo "TMP_LIB = /usr/local/Cellar/open-mpi/1.8.1/lib" >> Makeconf
echo "TMP_LIBNAME = libmpi.dylib" >> Makeconf
echo "MPI_ROOT = " >> Makeconf
echo "MPITYPE = OPENMPI" >> Makeconf
echo "MPI_INCLUDE_PATH = /usr/local/Cellar/open-mpi/1.8.1/include" >> Makeconf
echo "MPI_LIBPATH = /usr/local/Cellar/open-mpi/1.8.1/lib" >> Makeconf
echo "MPI_LIBNAME = libmpi.dylib" >> Makeconf
echo "MPI_LIBS =  -lutil -lpthread" >> Makeconf
echo "MPI_DEFS = -DMPI2" >> Makeconf
echo "MPI_INCL2 = " >> Makeconf
echo "MPI_LDFLAGS = " >> Makeconf
echo "PKG_CPPFLAGS = -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI" >> Makeconf
echo "PKG_LIBS = -L/usr/local/Cellar/open-mpi/1.8.1/lib -lmpi  -lutil -lpthread" >> Makeconf
echo "PROF_LDFLAGS = " >> Makeconf
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c comm_errors.c -o comm_errors.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c comm_sort_double.c -o comm_sort_double.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c comm_sort_integer.c -o comm_sort_integer.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c pkg_dl.c -o pkg_dl.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c pkg_tools.c -o pkg_tools.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd.c -o spmd.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_allgather.c -o spmd_allgather.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_allgatherv.c -o spmd_allgatherv.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_allreduce.c -o spmd_allreduce.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_bcast.c -o spmd_bcast.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_communicator.c -o spmd_communicator.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_communicator_spawn.c -o spmd_communicator_spawn.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_gather.c -o spmd_gather.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_gatherv.c -o spmd_gatherv.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_info.c -o spmd_info.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_recv.c -o spmd_recv.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_reduce.c -o spmd_reduce.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_scatter.c -o spmd_scatter.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_scatterv.c -o spmd_scatterv.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_send.c -o spmd_send.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_sendrecv.c -o spmd_sendrecv.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_sendrecv_replace.c -o spmd_sendrecv_replace.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_tool.c -o spmd_tool.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_utility.c -o spmd_utility.o
clang -I/usr/local/Cellar/r/3.1.1/R.framework/Resources/include -DNDEBUG -I/usr/local/Cellar/open-mpi/1.8.1/include  -DMPI2 -DOPENMPI -I/usr/local/opt/gettext/include -I/usr/local/opt/readline/include    -fPIC  -g -O2  -c spmd_wait.c -o spmd_wait.o
clang -dynamiclib -Wl,-headerpad_max_install_names -undefined dynamic_lookup -single_module -multiply_defined suppress -L/usr/local/opt/readline/lib -o pbdMPI.so comm_errors.o comm_sort_double.o comm_sort_integer.o pkg_dl.o pkg_tools.o spmd.o spmd_allgather.o spmd_allgatherv.o spmd_allreduce.o spmd_bcast.o spmd_communicator.o spmd_communicator_spawn.o spmd_gather.o spmd_gatherv.o spmd_info.o spmd_recv.o spmd_reduce.o spmd_scatter.o spmd_scatterv.o spmd_send.o spmd_sendrecv.o spmd_sendrecv_replace.o spmd_tool.o spmd_utility.o spmd_wait.o -L/usr/local/Cellar/open-mpi/1.8.1/lib -lmpi -lutil -lpthread -F/usr/local/Cellar/r/3.1.1/R.framework/.. -framework R -L/usr/local/opt/gettext/lib -lintl -Wl,-framework -Wl,CoreFoundation
installing via 'install.libs.R' to /usr/local/Cellar/r/3.1.1/R.framework/Versions/3.1/Resources/library/pbdMPI
** R
** data
*** moving datasets to lazyload DB
** demo
** inst
** preparing package for lazy loading
** help
*** installing help indices
** building package indices
** installing vignettes
   ‘pbdMPI-guide.Rnw’ 
** testing if installed package can be loaded
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_allocator_basic: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_allocator_basic.so, 9): Symbol not found: _ompi_free_list_item_t_class
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_allocator_basic.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_allocator_basic.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_rcache_vma: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_rcache_vma.so, 9): Symbol not found: _ompi_free_list_item_t_class
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_rcache_vma.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_rcache_vma.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_mpool_grdma: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_mpool_grdma.so, 9): Symbol not found: _mca_mpool_base_page_size
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_mpool_grdma.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_mpool_grdma.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_mpool_sm: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_mpool_sm.so, 9): Symbol not found: _ompi_allocator_base_framework
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_mpool_sm.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_mpool_sm.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_bml_r2: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_bml_r2.so, 9): Symbol not found: _mca_bml_base_endpoint_t_class
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_bml_r2.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_bml_r2.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_self: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_self.so, 9): Symbol not found: _mca_btl_base_active_message_trigger
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_self.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_self.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_sm: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_sm.so, 9): Symbol not found: _mca_btl_base_active_message_trigger
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_sm.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_sm.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_tcp: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_tcp.so, 9): Symbol not found: _mca_btl_base_active_message_trigger
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_tcp.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_tcp.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_vader: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_vader.so, 9): Symbol not found: _mca_btl_base_active_message_trigger
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_vader.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_btl_vader.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_pml_bfo: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_pml_bfo.so, 9): Symbol not found: _mca_bml
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_pml_bfo.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_pml_bfo.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_pml_cm: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_pml_cm.so, 9): Symbol not found: _mca_pml_base_recv_requests
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_pml_cm.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_pml_cm.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_pml_ob1: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_pml_ob1.so, 9): Symbol not found: _mca_bml
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_pml_ob1.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_pml_ob1.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_vprotocol_pessimist: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_vprotocol_pessimist.so, 9): Symbol not found: _mca_pml_base_request_t_class
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_vprotocol_pessimist.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_vprotocol_pessimist.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_basic: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_basic.so, 9): Symbol not found: _mca_coll_base_module_t_class
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_basic.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_basic.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_hierarch: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_hierarch.so, 9): Symbol not found: _mca_bml
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_hierarch.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_hierarch.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_inter: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_inter.so, 9): Symbol not found: _mca_coll_base_module_t_class
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_inter.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_inter.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_libnbc: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_libnbc.so, 9): Symbol not found: _mca_coll_base_module_t_class
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_libnbc.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_libnbc.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_ml: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_ml.so, 9): Symbol not found: _mca_bcol_base_components_in_use
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_ml.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_ml.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_self: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_self.so, 9): Symbol not found: _mca_coll_base_module_t_class
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_self.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_self.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_sm: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_sm.so, 9): Symbol not found: _mca_coll_base_module_t_class
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_sm.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_sm.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_tuned: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_tuned.so, 9): Symbol not found: _mca_coll_base_module_t_class
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_tuned.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_coll_tuned.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_osc_rdma: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_osc_rdma.so, 9): Symbol not found: _mca_pml
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_osc_rdma.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_osc_rdma.so (ignored)
[landau.local:59381] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_osc_sm: dlopen(/usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_osc_sm.so, 9): Symbol not found: _ompi_info_t_class
  Referenced from: /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_osc_sm.so
  Expected in: flat namespace
 in /usr/local/Cellar/open-mpi/1.8.1/lib/openmpi/mca_osc_sm.so (ignored)
--------------------------------------------------------------------------
No available pml components were found!

This means that there are no components of this type installed on your
system or all the components reported that they could not be used.

This is a fatal error; your MPI process is likely to abort.  Check the
output of the "ompi_info" command and ensure that components of this
type are available on your system.  You may also wish to check the
value of the "component_path" MCA parameter and ensure that it has at
least one directory that contains valid MCA components.
--------------------------------------------------------------------------
[landau.local:59381] PML ob1 cannot be selected
ERROR: loading failed
* removing ‘/usr/local/Cellar/r/3.1.1/R.framework/Versions/3.1/Resources/library/pbdMPI’

Wei-Chen Chen

Sep 8, 2014, 7:24:35 PM
Dear Will,

Again, please see pbdMPI vignette, page 19, FAQ, Section 8.3, Q2.

Sincerely,
Wei-Chen Chen

Will Landau

Sep 10, 2014, 9:33:00 AM
to rbigdatap...@googlegroups.com
Yeah, I should have checked the FAQ again. In any case, I now have pbdMPI, pbdSLAP, and pbdDEMO installed. Thanks for your help.



Simon Chapple

Sep 9, 2015, 12:42:45 PM
to rbigdatap...@googlegroups.com
Hi Will,
   I am running into the exact same problem you had trying to install pbdMPI on a Mac, in my case running Yosemite. I already have Open MPI installed and working fine with Rmpi, yet pbdMPI fails to install with the same error you saw, namely:

[Simons-Mac-mini.local:21582] mca: base: component_find: unable to open /usr/local/Cellar/open-mpi/1.10.0/lib/openmpi/mca_osc_sm: dlopen(/usr/local/Cellar/open-mpi/1.10.0/lib/openmpi/mca_osc_sm.so, 9): Symbol not found: _ompi_info_t_class

  Referenced from: /usr/local/Cellar/open-mpi/1.10.0/lib/openmpi/mca_osc_sm.so

  Expected in: flat namespace

 in /usr/local/Cellar/open-mpi/1.10.0/lib/openmpi/mca_osc_sm.so (ignored)

--------------------------------------------------------------------------

No available pml components were found!


I have tried all of the explicit Open MPI configure settings mentioned in this thread, following both the INSTALL instructions and the FAQ, with absolutely no difference.


Could you share what you did to overcome this problem?


******************************************************************

***** UPDATE - I figured out a way to fix this problem *****

******************************************************************


First, I did a standard R CMD INSTALL of pbdMPI (having previously extracted the source package), but added the --no-test-load flag:

R CMD INSTALL ../pbdMPI --configure-args="--with-mpi-type=OPENMPI" --no-test-load
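(Equivalently, the same flags can be passed from inside an R session: configure.args reaches the package's configure script, and install.packages forwards --no-test-load to R CMD INSTALL through its INSTALL_opts argument.)

```r
# Same install driven from an R session; --no-test-load skips the
# "testing if installed package can be loaded" step that otherwise
# aborts the install before the run-time library fix below is applied.
install.packages(
  "pbdMPI",
  configure.args = "--with-mpi-type=OPENMPI",
  INSTALL_opts   = "--no-test-load"
)
```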


Then I used the following, which is the Mac way of "injecting" a shared-library dependency via the dynamic linker when running mpiexec, e.g. with one of the SPMD demos:

export DYLD_INSERT_LIBRARIES=/usr/local/lib/libmpi.dylib

mpiexec -np 2 Rscript --vanilla allgather.r
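(For reference, a minimal SPMD script in the spirit of the allgather demo, a sketch rather than the actual demo code, using only standard pbdMPI calls:)

```r
# Minimal SPMD sketch: every rank contributes one value and every
# rank receives the combined vector back from allgather.
library(pbdMPI)
init()
x <- comm.rank() + 1          # rank-specific value
g <- unlist(allgather(x))     # same combined vector on every rank
comm.print(g)                 # printed once, by rank 0
finalize()
```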


Hope that helps someone else...


Cheers,

Simon C
