Trying to compile CANTERA in a parallel computational code

Marcelo Damasceno

Jul 14, 2015, 8:17:52 AM
to canter...@googlegroups.com
Good morning!

After a long discussion with Mr. Bryan Weber and Mr. Ray Speth, I was able to compile Cantera with either the GCC or the Intel compilers. I intend to couple this tool to a computational code used in my lab. The problem is that this code is parallel and is compiled with HDF5-MPI (Intel/GCC). When I tried a simple test, changing the Fortran compiler used to build demo.f90 from the default (F90=ifort) to the wrapper used by that code (F90=h5pfc), I got the following errors:

marcelomrd@marcelomrd-mflab:~/Softwares/Binaries/cantera/share/cantera/samples/f90$ make all
h5pfc -c demo.f90 -module /home/marcelomrd/Softwares/Binaries/cantera/include/cantera -g
h5pfc  -o demo demo.o -L/home/marcelomrd/Softwares/Binaries/cantera/lib -lcantera_fortran -lcantera  -L/usr/lib -lsundials_cvodes -lsundials_ida -lsundials_nvecserial -L/home/marcelomrd/Softwares/Binaries/intel/composer_xe_2013.3.163/mkl/lib/intel64 -lmkl_rt -L/usr/lib -lboost_system -lpthread -lstdc++
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Compare_and_swap'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Iallreduce'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Comm_split_type'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Fetch_and_op'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Win_flush_all'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Neighbor_alltoallv'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Iexscan'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Iscan'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ineighbor_alltoall'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Comm_set_info'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ineighbor_alltoallv'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Type_get_true_extent_x'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Neighbor_alltoallw'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Iscatterv'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Win_flush_local'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ireduce_scatter'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Neighbor_allgatherv'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Win_allocate'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ireduce_scatter_block'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_FORTRAN_UNWEIGHTED'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Type_create_hindexed_block'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ialltoallw'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Dist_graph_create'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Get_library_version'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ineighbor_allgatherv'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Win_flush_local_all'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ialltoallv'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Dist_graph_neighbors_count'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Rget'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Igatherv'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Win_sync'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Message_f2c'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ialltoall'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Dist_graph_neighbors'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Win_shared_query'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Comm_idup'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Mprobe'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Win_allocate_shared'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ineighbor_alltoallw'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Win_unlock_all'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Get_elements_x'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ireduce'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Raccumulate'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ibarrier'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Status_set_elements_x'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Mrecv'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Comm_get_info'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Iscatter'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `mpi_fortran_unweighted__'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Neighbor_alltoall'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ibcast'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Reduce_scatter_block'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Ineighbor_allgather'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Type_get_extent_x'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Improbe'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Get_accumulate'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Iallgatherv'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Dist_graph_create_adjacent'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_FORTRAN_WEIGHTS_EMPTY'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Type_size_x'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Neighbor_allgather'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Win_lock_all'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Igather'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Comm_dup_with_info'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Win_flush'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Comm_create_group'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Iallgather'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Imrecv'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Rget_accumulate'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `mpi_fortran_weights_empty__'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Rput'
/home/marcelomrd/Softwares/Binaries/openmpi-1.8.6/lib/libmpi_mpifh.so: undefined reference to `MPI_Message_c2f'
Makefile:17: recipe for target 'demo' failed
make: *** [demo] Error 1

In the very same code, I have a version of CHEMKIN, which compiles properly with the aforementioned compiler/wrapper. There are also other functions/modules, which are not parallel, and they compile properly as well.
Any ideas?

Bryan W. Weber

Jul 14, 2015, 9:17:15 AM
to canter...@googlegroups.com
Dear Marcelo,

It looks like the linker can't resolve certain symbols it needs, which appear to be entirely unrelated to Cantera and related instead to MPI. What is the command line used to compile CHEMKIN, since that works? Are there any additional or different libraries in that command (following -L or -l options)? You can use the nm utility to find which library contains the missing symbols, with a line like

nm -g library_to_check.so | grep -n Symbol_You_Are_Looking_For

(note that nm and grep are case sensitive).

If you have the source code, you can search in that too, to see where the symbol is.

Hope it helps,
Bryan

Marcelo Damasceno

Jul 14, 2015, 1:15:45 PM
to canter...@googlegroups.com
Indeed, I use some libraries for CHEMKIN: -lgsl -lm -lgslcblas -lsundials_nvecserial -lsundials_cvode. Some of them are also used by CANTERA, I guess.

Bryan W. Weber

Jul 14, 2015, 2:34:57 PM
to canter...@googlegroups.com
So if you add those extra options to the line for Cantera, what does it do? If it doesn't work, can you please copy & paste the whole command for CHEMKIN, as you have done for Cantera?

Bryan

paul zhang

Jul 15, 2015, 4:14:56 PM
to canter...@googlegroups.com
Hi Marcelo,

I attempted to compile Cantera on our cluster. Can you shed some light on how you compile it? I guess you are not using scons.

Thanks,
Paul

Marcelo Damasceno

Jul 17, 2015, 7:41:10 AM
to canter...@googlegroups.com
Hi Paul.

So far, I've only managed to compile Cantera on different personal computers, using the GCC or Intel compilers (I still haven't compiled it on our clusters). I'll do that as soon as I finish coupling Cantera to the computational code we use here.

Marcelo Damasceno

Jul 23, 2015, 1:45:51 PM
to Cantera Users' Group, marce...@gmail.com
I have found a CANTERA User's Guide: Fortran Version, but it was written in November 2011. Is there a newer version of this tutorial, or is it okay to use this one?

Ray Speth

Jul 23, 2015, 5:29:31 PM
to Cantera Users' Group, marce...@gmail.com
Marcelo,

I assume you mean the guide from 2001, not 2011? There isn't any more recent documentation for the Fortran module that I am aware of, and much of what's in that guide is incorrect. The best I can suggest is to look at the examples included with the Cantera source (in the directories samples/f77 and samples/f90), and then for the full Fortran API take a look at the files src/fortran/cantera*.f90 which define the functions in the Fortran 90 Cantera interface.

Regards,
Ray

Marcelo Damasceno

Jul 23, 2015, 8:24:58 PM
to Cantera Users' Group, yar...@gmail.com
You're right. Indeed, the guide was written in 2001. I'll study the files you mentioned. One more question: I have a reduced mechanism, but it is in CHEMKIN format (*.inp and *.tran). Is there a tool or a function to convert those files to something Cantera could understand?

Bryan W. Weber

Jul 24, 2015, 6:54:37 AM
to Cantera Users' Group, yar...@gmail.com, marce...@gmail.com
Dear Marcelo,

There is a script installed in the $prefix/bin directory called ck2cti that converts CHEMKIN format mechanisms to CTI format.
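A hypothetical invocation (the file names here are placeholders; the --thermo and --transport options are only needed when that data is in separate files):

```shell
ck2cti --input=mech.inp --thermo=therm.dat --transport=tran.dat --output=mech.cti
```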

Bryan

Marcelo Damasceno

Jul 25, 2015, 8:43:06 AM
to Cantera Users' Group, yar...@gmail.com, bryan....@gmail.com
Thanks, Bryan, I managed to use it. Back to that other issue I'm experiencing: I was able to couple Cantera with the computational code we use here at the lab. However, it only works with GCC. I tried to recompile Cantera without using Intel's MKL, and several tests failed:

*****************************
***    Testing Summary    ***
*****************************

Tests passed: 837
Up-to-date tests skipped: 0
Tests failed: 68
Failed tests:
    - python3:test_convert.chemkinConverterTest.test_sri_falloff
    - python3:test_equilibrium.VCS_EquilTest.test_equil_complete_stoichiometric
    - python3:test_equilibrium.MultiphaseEquilTest.test_equil_overconstrained1
    - python3:test_thermo.TestSpecies.test_fromCti
    - python3:test_equilibrium.VCS_EquilTest.test_equil_incomplete_stoichiometric
    - python3:test_kinetics.TestDuplicateReactions.test_different_type
    - python3:test_equilibrium.ChemEquilTest.test_equil_incomplete_lean
    - python3:test_equilibrium.MultiphaseEquilTest.test_equil_incomplete_stoichiometric
    - python3:test_kinetics.TestReaction.test_fromCti
    - python3:test_thermo.TestSpecies.test_listFromFile_cti
    - python3:test_convert.chemkinConverterTest.test_duplicate_species
    - python3:test_equilibrium.MultiphaseEquilTest.test_equil_incomplete_lean
    - python3:test_kinetics.TestReaction.test_chebyshev_rate
    - python3:test_convert.chemkinConverterTest.test_duplicate_thermo
    - python3:test_convert.chemkinConverterTest.test_transport_normal
    - python3:test_kinetics.TestDuplicateReactions.test_opposite_direction1
    - python3:test_kinetics.TestDuplicateReactions.test_opposite_direction2
    - python3:test_kinetics.TestDuplicateReactions.test_opposite_direction3
    - python3:test_kinetics.TestDuplicateReactions.test_opposite_direction4
    - python3:test_convert.chemkinConverterTest.test_species_only
    - python3:test_convert.chemkinConverterTest.test_chemically_activated
    - python3:test_kinetics.TestReaction.test_listFromCti
    - python3:test_convert.chemkinConverterTest.test_transport_embedded
    - python3:test_equilibrium.ChemEquilTest.test_equil_complete_stoichiometric
    - python3:test_equilibrium.ChemEquilTest.test_equil_incomplete_stoichiometric
    - python3:test_convert.chemkinConverterTest.test_empty_reaction_section
    - python3:test_equilibrium.ChemEquilTest.test_equil_complete_lean
    - python3:test_convert.chemkinConverterTest.test_reaction_units
    - python3:test_thermo.TestSpecies.test_listFromCti
    - python3:test_equilibrium.VCS_EquilTest.test_equil_overconstrained2
    - python3:test_equilibrium.VCS_EquilTest.test_equil_complete_lean
    - python3:test_kinetics.TestReaction.test_plog_rate
    - python3:test_convert.chemkinConverterTest.test_explicit_reverse_rate
    - python3:test_kinetics.TestReaction.test_plog
    - python3:test_thermo.ImportTest.test_import_phase_cti2
    - python3:test_thermo.ImportTest.test_import_phase_cti_text
    - python3:test_convert.chemkinConverterTest.test_pathologicalSpeciesNames
    - python3:test_kinetics.TestReaction.test_modify_plog
    - python3:test_convert.chemkinConverterTest.test_explicit_forward_order
    - python3:test_kinetics.TestReaction.test_chebyshev
    - python3:test_kinetics.TestReaction.test_negative_A
    - python3:test_convert.chemkinConverterTest.test_gri30
    - python3:test_kinetics.TestDuplicateReactions.test_declared_duplicate
    - python3:test_convert.chemkinConverterTest.test_nasa9
    - python3:test_convert.chemkinConverterTest.test_unterminatedSections2
    - python3:test_transport.TestTransportGeometryFlags.test_bad_geometry
    - python3:test_convert.CtmlConverterTest.test_invalid
    - python3:test_thermo.ImportTest.test_import_phase_cti
    - python3:test_kinetics.TestReaction.test_modify_chebyshev
    - python3:test_convert.CtmlConverterTest.test_sofc
    - python3:test_convert.chemkinConverterTest.test_float_stoich_coeffs
    - python3:test_kinetics.TestDuplicateReactions.test_disjoint_efficiencies
    - python3:test_convert.chemkinConverterTest.test_pdep
    - python3:test_equilibrium.MultiphaseEquilTest.test_equil_overconstrained2
    - python3:test_equilibrium.MultiphaseEquilTest.test_equil_complete_lean
    - python3:test_convert.chemkinConverterTest.test_explicit_third_bodies
    - python3:test_convert.CtmlConverterTest.test_diamond
    - python3:test_equilibrium.VCS_EquilTest.test_equil_incomplete_lean
    - python3:test_equilibrium.VCS_EquilTest.test_equil_overconstrained1
    - python3:test_convert.chemkinConverterTest.test_soot
    - python3:test_equilibrium.MultiphaseEquilTest.test_equil_complete_stoichiometric
    - python3:test_equilibrium.ChemEquilTest.test_equil_overconstrained1
    - python3:test_convert.CtmlConverterTest.test_noninteger_atomicity
    - python3:test_kinetics.TestDuplicateReactions.test_common_efficiencies
    - python3:test_equilibrium.ChemEquilTest.test_equil_overconstrained2
    - python3:test_convert.chemkinConverterTest.test_unterminatedSections
    - python3:test_kinetics.TestDuplicateReactions.test_forward_multiple
    - python3:test_convert.CtmlConverterTest.test_pdep

All of them are related to python3. Any ideas?

Marcelo Damasceno

Jul 25, 2015, 5:30:35 PM
to Cantera Users' Group, yar...@gmail.com, bryan....@gmail.com, marce...@gmail.com
I performed the whole build process again and it is even worse: all tests failed. I really have no idea what is going on here:

*****************************
***    Testing Summary    ***
*****************************

Tests passed: 0
Up-to-date tests skipped: 0
Tests failed: 646
Failed tests:
    - IAPWSTripP
    - negA-cti
    - DH_graph_acommon
    - ChemEquil_gri_matrix
    - IMSTester
    - rankine_democxx
    - IAPWSPres
    - VCS-LiSi
    - python2 ***no results for entire test suite***
    - ISSPTester2
    - HMW_graph_HvT
    - DH_graph_dilute
    - HMW_dupl_test
    - cxx_ex
    - DH_graph_bdotak
    - IAPWSphi
    - surfSolver2
    - HMW_graph_GvT
    - stoichSubSSTP
    - VPsilane_test
    - negA-xml
    - ISSPTester
    - pureFluid
    - HMW_graph_GvI
    - diamondSurf-cti
    - ChemEquil_ionizedGas
    - thermo.passed ***no results for entire test suite***
    - HMW_graph_VvT
    - surfSolver
    - stoichSolidKinetics
    - multiGasTransport
    - WaterSSTP
    - mixGasTransport
    - DH_graph_NM
    - VCS-NaCl
    - python3 ***no results for entire test suite***
    - kinetics.passed ***no results for entire test suite***
    - HMW_test_3
    - DH_graph_Pitzer
    - wtWater
    - WaterPDSS
    - transport.passed ***no results for entire test suite***
    - ChemEquil_red1
    - general.passed ***no results for entire test suite***
    - ChemEquil_gri_pairs
    - surfkin
    - HMW_graph_CpvT
    - dustyGasTransport
    - silane_equil
    - diamondSurf-xml
    - simpleTransport
    - HMW_test_1

*****************************
scons: *** [test_problems/finish_tests] One or more tests failed.
scons: building terminated because of errors.

Ray Speth

Jul 27, 2015, 5:59:47 PM
to Cantera Users' Group, bryan....@gmail.com, marce...@gmail.com
Marcelo,

Can you attach the full output from 'scons test'? Just the test summary doesn't tell us much about what might have gone wrong.

Regards,
Ray

Marcelo Damasceno

Jul 28, 2015, 9:57:13 AM
to Cantera Users' Group, bryan....@gmail.com, yar...@gmail.com
Hi, Ray. I have already solved that issue. The problem now is that when I execute a compiled Fortran 90 program, it can't find Boost:

./demo: error while loading shared libraries: libboost_system.so.1.58.0: cannot open shared object file: No such file or directory

Bryan W. Weber

Jul 28, 2015, 10:15:58 AM
to Cantera Users' Group, yar...@gmail.com, marce...@gmail.com
Dear Marcelo,

What is the value of the environment variable LD_LIBRARY_PATH? Does it include the directory where Boost is installed? What about the linker line that's used to compile the executable? Does that include any boost references?

Bryan

Carlos Felipe Forigua Rodriguez

Oct 1, 2016, 8:28:43 AM
to Cantera Users' Group, yar...@gmail.com, marce...@gmail.com
Dear developers and users,

I'm trying to do something similar, but I don't understand the question in this thread.

In order to compile Cantera (C++) + Program (FORTRAN) + MPI: is Marcelo trying to recompile Cantera with all its libraries using the MPI compiler? Is this recompilation necessary?

Thank you very much
Regards,
Carlos

Ray Speth

Oct 1, 2016, 4:17:06 PM
to Cantera Users' Group, yar...@gmail.com, marce...@gmail.com
Carlos,

No, I don't think that it is necessary (or helpful) to compile Cantera using the MPI compiler wrapper. Cantera is not an MPI code; it has no awareness of the preprocessor directives, and no need of the libraries, that those compiler wrappers might add.
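In practice (a sketch with a placeholder $CANTERA_PREFIX, not a verified recipe), that means Cantera can keep being built with the plain serial toolchain, and the MPI wrapper only enters at the application's own compile/link step:

```shell
# Compile the Fortran sources against the serially built Cantera modules...
ifort -c demo.f90 -module $CANTERA_PREFIX/include/cantera
# ...and let the MPI-aware wrapper drive the final link,
# adding Cantera's libraries alongside the MPI/HDF5 ones
h5pfc -o demo demo.o -L$CANTERA_PREFIX/lib -lcantera_fortran -lcantera -lstdc++
```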

Regards,
Ray