Working with mpich (for MOOSE environment) and openmpi (for OpenFoam) in Ubuntu 16.04 Computer


Anil Kunwar

Apr 19, 2017, 11:49:19 PM4/19/17
to moose-users
Hi All,
Since we have to install the MOOSE framework and OpenFOAM on a single Ubuntu 16.04 computer, we may have to deal with the different MPI implementations they use:
1. MOOSE uses MPICH.
2. OpenFOAM uses OpenMPI.

In order to work with both of them, using an alias for the MOOSE environment is one possible solution.

Steps:
i. Comment out the MOOSE environment source in the ~/.bashrc file:

# Source MOOSE Environment
#if [ -f /opt/moose/environments/moose_profile ]; then
#. /opt/moose/environments/moose_profile
#fi



and instead write the following alias for MOOSE in ~/.bashrc:
alias mooseswitch1='. /home/username/moose-projects/etc/bashrc'
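To sanity-check that the alias is registered, a minimal sketch (the alias body matches the line above, with /home/username written as $HOME):

```shell
# Define the switch alias (same as the ~/.bashrc line above)
alias mooseswitch1='. "$HOME/moose-projects/etc/bashrc"'

# Confirm bash has recorded it; nothing is sourced until you
# actually run `mooseswitch1` in the shell
alias mooseswitch1
```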


ii. Create a file named bashrc (or any other name, e.g. bashhome) in the ~/moose-projects/etc/ folder, paste the commented-out MOOSE environment source from above into it, and uncomment it there:

# Source MOOSE Environment
if [ -f /opt/moose/environments/moose_profile ]; then
  . /opt/moose/environments/moose_profile
fi



iii. When we open a new terminal, we need to type the MOOSE environment alias, namely mooseswitch1, before doing any task related to the MOOSE software (MPICH is active in this case):


username@username-Aspire-1602M:~$ mooseswitch1
username@username-Aspire-1602M:~$ which mpirun
/opt/moose/mpich/mpich-3.2/gcc-opt/bin/mpirun
username@username-Aspire-1602M:~$


iv. When the MOOSE alias has not been activated, OpenMPI (if installed and set as the default) is active, and the work related to installing and running OpenFOAM can be done:


username@username-Aspire-1602M:~$ which mpirun
/usr/bin/mpirun
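What makes this switching work is simply PATH precedence: `which mpirun` reports the first match on PATH. A minimal sketch of the mechanism (the MPICH path is the one from the transcript above; it does not need to exist for the demonstration):

```shell
moose_bin=/opt/moose/mpich/mpich-3.2/gcc-opt/bin

# Before mooseswitch1 runs, the MOOSE bin dir is not on PATH,
# so which(1) would find /usr/bin/mpirun (OpenMPI) first.
case ":$PATH:" in
  *":$moose_bin:"*) echo "MOOSE MPICH first" ;;
  *)                echo "system MPI first"  ;;
esac

# Sourcing the MOOSE profile effectively prepends its bin dir,
# so the MPICH mpirun now shadows the system one.
PATH="$moose_bin:$PATH"
case ":$PATH:" in
  *":$moose_bin:"*) echo "MOOSE MPICH first" ;;
  *)                echo "system MPI first"  ;;
esac
```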


Cheers,


Yours Sincerely,

Anil Kunwar


Miller, Jason M

Apr 20, 2017, 8:57:32 AM4/20/17
to moose...@googlegroups.com
What you suggest does work, but because not everyone knows bash, we include a solution for these matters in our package. By installing the moose-environment package, you also get an environment management software suite known as "Modules".


On Wed, Apr 19, 2017 at 9:49 PM, Anil Kunwar <romagu...@gmail.com> wrote:
Hi All,
Since we have to install the MOOSE framework and OpenFOAM on a single Ubuntu 16.04 computer, we may have to deal with the different MPI implementations they use:
1. MOOSE uses MPICH.
2. OpenFOAM uses OpenMPI.

In order to work with both of them, using an alias for the MOOSE environment is one possible solution.

Steps:
i. Comment out the MOOSE environment source in the ~/.bashrc file:
# Source MOOSE Environment
#if [ -f /opt/moose/environments/moose_profile ]; then
#. /opt/moose/environments/moose_profile
#fi


This is equivalent to performing a `module purge`. After 'purging' all loaded modules, your environment is set as if you had never sourced the moose_profile to begin with (except, of course, the necessary PATHs needed to make Modules work in the first place!).
 

iii. When we open a new terminal, we need to type the MOOSE environment alias, namely mooseswitch1, before doing any task related to the MOOSE software (MPICH is active in this case):


username@username-Aspire-1602M:~$ mooseswitch1


This is equivalent to performing a `module load moose-dev-gcc moose-tools` (on a Macintosh machine: moose-dev-clang moose-tools).

What we ask our users to do when they need environments other than MOOSE's is to purge the modules immediately after sourcing the moose_profile:

# Source MOOSE Environment
if [ -f /opt/moose/environments/moose_profile ]; then
  . /opt/moose/environments/moose_profile
  module purge
fi
 
That way, with every new terminal window opened, the environment is ready to be modified in any other way needed (as you demonstrated).
We also encourage others to take advantage of Modules' ability to load custom user-owned modules located in the ~/privatemodules directory when the need arises for complex environment management:

module load use.own
module load my-favorite-module

"my-favorite-module" in your case, could be the environment necessary to make OpenFOAM work:


username@username-Aspire-1602M:~$ cat ~/privatemodules/my-favorite-module
#%Module1.0#####################################################################
##
## OpenMPI OpenFOAM module
##

# Make sure I don't load any moose modules that break OpenFOAM:
conflict moose-dev-gcc moose-dev-clang

set BASE_PATH   /opt/OpenFOAM
set             MPI_PATH           $BASE_PATH/openmpi/openmpi-1.10.2

# Load some other modules that I like
module load maybe-some-other-module-i-like

# Perform some logic (in this example: if _this_ machine is not a Macintosh, do the following)
if { [uname sysname] != "Darwin" } {
   prepend-path    LD_LIBRARY_PATH    $MPI_PATH/lib
}

# Set all the paths necessary to make OpenMPI work
prepend-path    C_INCLUDE_PATH     $MPI_PATH/include
prepend-path    CPLUS_INCLUDE_PATH $MPI_PATH/include
prepend-path    FPATH              $MPI_PATH/include
prepend-path    MANPATH            $MPI_PATH/share/man

# Set our compiler
setenv CC       mpicc
setenv CXX      mpicxx
setenv F90      mpif90
setenv F77      mpif77
setenv FC       mpif90

# Set PATH to OpenMPI and the OpenFOAM executable
prepend-path    PATH              $MPI_PATH/bin


The above is of course just an example, but I wanted to demonstrate the flexibility of Modules.
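For readers who have not used Modules before, here is roughly what the modulefile above does, translated into plain bash (a sketch only; the OpenFOAM/OpenMPI paths are the hypothetical ones from the example, and the `conflict` and `module load` lines have no direct bash equivalent):

```shell
# Rough bash translation of the example modulefile (paths are illustrative)
MPI_PATH=/opt/OpenFOAM/openmpi/openmpi-1.10.2

# The non-Darwin branch: prepend-path LD_LIBRARY_PATH
if [ "$(uname -s)" != "Darwin" ]; then
  export LD_LIBRARY_PATH="$MPI_PATH/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
fi

# prepend-path for headers and man pages
export C_INCLUDE_PATH="$MPI_PATH/include${C_INCLUDE_PATH:+:$C_INCLUDE_PATH}"
export CPLUS_INCLUDE_PATH="$MPI_PATH/include${CPLUS_INCLUDE_PATH:+:$CPLUS_INCLUDE_PATH}"
export MANPATH="$MPI_PATH/share/man${MANPATH:+:$MANPATH}"

# setenv: point builds at the MPI compiler wrappers
export CC=mpicc CXX=mpicxx F77=mpif77 F90=mpif90 FC=mpif90

# prepend-path PATH so the OpenMPI binaries win
export PATH="$MPI_PATH/bin:$PATH"
```

The advantage of the modulefile over this bash version is that `module unload` reverses every one of these changes automatically.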
Hope folks find this useful!
Jason

anil kunwar

Apr 20, 2017, 9:14:00 AM4/20/17
to moose...@googlegroups.com
Hi Jason,
Thank you for the wonderful and detailed information about using "Modules". I had earlier learned about module loading and purging in MOOSE, but had not thought it could be used in this way as well.
Now I have learned that from you.
Your information is very useful. Thank you, and cheers.

Yours Sincerely,
Anil Kunwar


--
You received this message because you are subscribed to a topic in the Google Groups "moose-users" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/moose-users/XeK8asiF9I0/unsubscribe.
To unsubscribe from this group and all its topics, send an email to moose-users...@googlegroups.com.
Visit this group at https://groups.google.com/group/moose-users.
To view this discussion on the web visit https://groups.google.com/d/msgid/moose-users/CALgZTAfW8O_xC-9cMRO%3DYj4sNkZjRLQhowmL%3D7X%2BXKzP9-yU_w%40mail.gmail.com.

For more options, visit https://groups.google.com/d/optout.


Derek Gaston

Apr 21, 2017, 7:10:13 AM4/21/17
to moose...@googlegroups.com
Also: MOOSE works fine with openmpi...

Derek



Anil Kunwar

Apr 21, 2017, 8:17:34 AM4/21/17
to moose-users
Hi Derek,
Thank you for reminding me that MOOSE can work with either OpenMPI or MPICH.
1. I remember installing MOOSE with the OpenMPI 1.8.4 implementation (around April 2015). Do you mean that in the manual installation steps, http://mooseframework.org/wiki/BasicManualInstallation/Linux/, we can use OpenMPI instead of MPICH in step 4, in the same way?
2. Now, in the redistributable package for the MOOSE environment on Ubuntu 16.04 (http://mooseframework.org/getting-started/ubuntu-1204/), the MPICH implementation has been utilized. Can you point to an archive of a MOOSE environment redistributable package for Ubuntu (if someone has built one) that is based on OpenMPI?

So MOOSE works with both OpenMPI and MPICH. What I meant by "MOOSE uses the MPICH interface" refers to the MPICH implementation that comes with the recent redistributable package (http://mooseframework.org/getting-started/ubuntu-1204/).

Yours Sincerely,
Anil Kunwar

Miller, Jason M

Apr 21, 2017, 9:24:09 AM4/21/17
to moose...@googlegroups.com
Anil,

The package itself comes with both OpenMPI and MPICH, each of which is available in a version built with GCC and another built with Clang. In order to gain access to these other compiler variations, you need to load the 'advanced_modules' module:

[~]> module purge   # <-- always a good idea if you plan on messing around with different modules being loaded
[~]> module load advanced_modules
[~]> module avail

--------------------- /opt/moose/Modules/3.2.10/adv_modules ----------------------
autotools     cppunit-clang mpich-gcc     petsc-head
boost         cppunit-gcc   openmpi-clang tbb
clang         gcc           openmpi-gcc   valgrind
cmake         mpich-clang   pbs-emulator  vtk-clang

-------------------------- /opt/moose/Modules/versions ---------------------------
3.2.10

--------------------- /opt/moose/Modules/3.2.10/modulefiles ----------------------
advanced_modules icecream         module-info      moose-tools
ccache           miniconda        modules          null
dot              miniconda-dev    moose-dev-clang  use.own
git-lfs          module-git       moose-dev-gcc


Let's load the openmpi-gcc module:

[~]> module load openmpi-gcc 
[~]> module avail

--------------------- /opt/moose/Modules/3.2.10/adv_modules ----------------------
autotools     cppunit-clang mpich-gcc     petsc-head
boost         cppunit-gcc   openmpi-clang tbb
clang         gcc           openmpi-gcc   valgrind
cmake         mpich-clang   pbs-emulator  vtk-clang

-------------------------- /opt/moose/Modules/versions ---------------------------
3.2.10

--------------------- /opt/moose/Modules/3.2.10/modulefiles ----------------------
advanced_modules icecream         module-info      moose-tools
ccache           miniconda        modules          null
dot              miniconda-dev    moose-dev-clang  use.own
git-lfs          module-git       moose-dev-gcc

--------------------- /opt/moose/Modules/3.2.10/openmpi_gcc ----------------------
petsc-3.6.4     petsc-3.6.4-dbg petsc-3.7.5


By loading the openmpi-gcc module, three more modules (PETSc modules) have become available:

--------------------- /opt/moose/Modules/3.2.10/openmpi_gcc ----------------------
petsc-3.6.4     petsc-3.6.4-dbg petsc-3.7.5

The reason these three modules were not available until you loaded the openmpi-gcc module is that these three versions of PETSc were _built_ with the specific version of openmpi-gcc you loaded. There are in fact 12 different versions of PETSc available (2 compilers (gcc|clang) × 2 MPI wrappers (mpich|openmpi) × 3 versions of PETSc = 12), plus a few more one-offs that my group has asked for (PETSc debug, PETSc bleeding edge, PETSc 64-bit integers, etc.)... Actually, you know, it's easier if I just list all the modules available (on my Macintosh machine):

[]> cd /opt/moose/Modules/3.2.10
[]> find . | grep -v "bin\|init\|share"
./adv_modules
./adv_modules/autotools
./adv_modules/boost
./adv_modules/clang
./adv_modules/cmake
./adv_modules/cppunit-clang
./adv_modules/cppunit-gcc
./adv_modules/gcc
./adv_modules/mpich-clang
./adv_modules/mpich-gcc
./adv_modules/openmpi-clang
./adv_modules/openmpi-gcc
./adv_modules/pbs-emulator
./adv_modules/petsc-head
./adv_modules/tbb
./adv_modules/valgrind
./adv_modules/vtk-clang
./civet
./civet/mpich-clang-petsc_alt
./civet/mpich-clang-petsc_alt-slepc
./civet/mpich-clang-petsc_alt-trilinos-dbg
./civet/mpich-clang-petsc_alt-trilinos-opt
./civet/mpich-clang-petsc_alt-vtk
./civet/mpich-clang-petsc_alt-vtk-trilinos-dbg
./civet/mpich-clang-petsc_alt-vtk-trilinos-opt
./civet/mpich-clang-petsc_default
./civet/mpich-clang-petsc_default-slepc
./civet/mpich-clang-petsc_default-trilinos-dbg
./civet/mpich-clang-petsc_default-trilinos-opt
./civet/mpich-clang-petsc_default-vtk
./civet/mpich-clang-petsc_default-vtk-trilinos-dbg
./civet/mpich-clang-petsc_default-vtk-trilinos-opt
./civet/mpich-clang-petsc_default_64
./civet/mpich-gcc-petsc_alt
./civet/mpich-gcc-petsc_alt-slepc
./civet/mpich-gcc-petsc_alt-trilinos-dbg
./civet/mpich-gcc-petsc_alt-trilinos-opt
./civet/mpich-gcc-petsc_alt-vtk
./civet/mpich-gcc-petsc_alt-vtk-trilinos-dbg
./civet/mpich-gcc-petsc_alt-vtk-trilinos-opt
./civet/mpich-gcc-petsc_default
./civet/mpich-gcc-petsc_default-slepc
./civet/mpich-gcc-petsc_default-trilinos-dbg
./civet/mpich-gcc-petsc_default-trilinos-opt
./civet/mpich-gcc-petsc_default-vtk
./civet/mpich-gcc-petsc_default-vtk-trilinos-dbg
./civet/mpich-gcc-petsc_default-vtk-trilinos-opt
./civet/openmpi-clang-petsc_alt
./civet/openmpi-clang-petsc_alt-trilinos-dbg
./civet/openmpi-clang-petsc_alt-trilinos-opt
./civet/openmpi-clang-petsc_alt-vtk
./civet/openmpi-clang-petsc_alt-vtk-trilinos-dbg
./civet/openmpi-clang-petsc_alt-vtk-trilinos-opt
./civet/openmpi-clang-petsc_default
./civet/openmpi-clang-petsc_default-trilinos-dbg
./civet/openmpi-clang-petsc_default-trilinos-opt
./civet/openmpi-clang-petsc_default-vtk
./civet/openmpi-clang-petsc_default-vtk-trilinos-dbg
./civet/openmpi-clang-petsc_default-vtk-trilinos-opt
./civet/openmpi-gcc-petsc_alt
./civet/openmpi-gcc-petsc_alt-trilinos-dbg
./civet/openmpi-gcc-petsc_alt-trilinos-opt
./civet/openmpi-gcc-petsc_alt-vtk
./civet/openmpi-gcc-petsc_alt-vtk-trilinos-dbg
./civet/openmpi-gcc-petsc_alt-vtk-trilinos-opt
./modulefiles
./modulefiles/advanced_modules
./modulefiles/ccache
./modulefiles/civet
./modulefiles/civet/.civet
./modulefiles/dot
./modulefiles/git-lfs
./modulefiles/icecream
./modulefiles/miniconda
./modulefiles/miniconda-dev
./modulefiles/module-git
./modulefiles/module-info
./modulefiles/modules
./modulefiles/moose
./modulefiles/moose/.clang-3.9.0
./modulefiles/moose/.cppunit-1.12.1-clang
./modulefiles/moose/.cppunit-1.12.1-gcc
./modulefiles/moose/.gcc-6.2.0
./modulefiles/moose/.mpich-3.2_clang
./modulefiles/moose/.mpich-3.2_gcc
./modulefiles/moose/.mpich_petsc-3.6.4-clang
./modulefiles/moose/.mpich_petsc-3.6.4-clang-dbg
./modulefiles/moose/.mpich_petsc-3.6.4-gcc
./modulefiles/moose/.mpich_petsc-3.7.5-64-clang
./modulefiles/moose/.mpich_petsc-3.7.5-clang
./modulefiles/moose/.mpich_petsc-3.7.5-gcc
./modulefiles/moose/.mpich_petsc-bleedingedge-clang
./modulefiles/moose/.mpich_petsc-bleedingedge-gcc
./modulefiles/moose/.mpich_slepc-3.6.3-clang
./modulefiles/moose/.mpich_slepc-3.6.3-gcc
./modulefiles/moose/.mpich_slepc-3.7.3-clang
./modulefiles/moose/.mpich_slepc-3.7.3-gcc
./modulefiles/moose/.mpich_trilinos-release-12-6-2-clang-dbg
./modulefiles/moose/.mpich_trilinos-release-12-6-2-clang-opt
./modulefiles/moose/.openmpi-1.10.2_clang
./modulefiles/moose/.openmpi-1.10.2_gcc
./modulefiles/moose/.openmpi_petsc-3.6.4-clang
./modulefiles/moose/.openmpi_petsc-3.6.4-gcc
./modulefiles/moose/.openmpi_petsc-3.6.4-gcc-dbg
./modulefiles/moose/.openmpi_petsc-3.7.5-clang
./modulefiles/moose/.openmpi_petsc-3.7.5-gcc
./modulefiles/moose/.tbb44_20150728
./modulefiles/moose/.VTK-7.1.0-clang
./modulefiles/moose-dev-clang
./modulefiles/moose-dev-gcc
./modulefiles/moose-tools
./modulefiles/null
./modulefiles/use.own
./mpich_clang
./mpich_clang/petsc-3.6.4
./mpich_clang/petsc-3.6.4-dbg
./mpich_clang/petsc-3.7.5
./mpich_clang/petsc-3.7.5-64
./mpich_clang/petsc-bleedingedge
./mpich_clang/slepc-3.6.3
./mpich_clang/slepc-3.7.3
./mpich_clang/trilinos-dbg
./mpich_clang/trilinos-opt
./mpich_gcc
./mpich_gcc/petsc-3.6.4
./mpich_gcc/petsc-3.7.5
./mpich_gcc/petsc-bleedingedge
./mpich_gcc/slepc-3.6.3
./mpich_gcc/slepc-3.7.3
./openmpi_clang
./openmpi_clang/petsc-3.6.4
./openmpi_clang/petsc-3.7.5
./openmpi_gcc
./openmpi_gcc/petsc-3.6.4
./openmpi_gcc/petsc-3.6.4-dbg
./openmpi_gcc/petsc-3.7.5

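The count quoted earlier can be sanity-checked by enumerating the combinations (a toy sketch; the compiler, MPI, and version labels are the ones from the listing above):

```shell
# 2 compilers x 2 MPI wrappers x 3 PETSc versions = 12 stock builds
# (the one-off debug/64-bit/bleeding-edge variants come on top of these)
count=0
for compiler in gcc clang; do
  for mpi in mpich openmpi; do
    for petsc in 3.6.4 3.7.5 bleedingedge; do
      count=$((count + 1))
    done
  done
done
echo "$count"   # prints 12
```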

Haha, sorry. A bit of overkill for an answer?
Jason

