Dear Community,
I have successfully installed Clawpack 5.10 on my machine. However, when I try to run the official example for the Boussinesq solvers in two space dimensions (located at `geoclaw/examples/bouss/radial_flat` under the Clawpack root folder), I encounter the following error:
```
==> Applying Bouss equations to selected grids between levels 1 and 10
==> Use Bouss. in water deeper than 1.0000000000000000
Using a PETSc solver
Using Bouss equations from the start
rnode allocated...
node allocated...
listOfGrids allocated...
Storage allocated...
bndList allocated...
Gridding level 1 at t = 0.000000E+00: 4 grids with 10000 cells
Setting initial dt to 2.9999999999999999E-002
At line 42 of file /home/saad/project/clawpack/geoclaw/src/2d/bouss/setMatrixIndex.f90
Fortran runtime error: Index '20' of dimension 1 of array 'node' above upper bound of 19
Error termination. Backtrace:
#0 0x799231223960 in ???
#1 0x7992312244d9 in ???
#2 0x799231224ad6 in ???
#3 0x58f1d7e6fac4 in ???
#4 0x58f1d7e4e5e1 in ???
#5 0x58f1d7e59c94 in ???
#6 0x58f1d7e5ad19 in ???
#7 0x799230e29d8f in __libc_start_call_main
at ../sysdeps/nptl/libc_start_call_main.h:58
#8 0x799230e29e3f in __libc_start_main_impl
at ../csu/libc-start.c:392
#9 0x58f1d7dbf5b4 in ???
#10 0xffffffffffffffff in ???
```
I have PETSc 3.21 installed on my machine. Any assistance in resolving this issue would be greatly appreciated.
Best regards,
Muhammad Ali
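One way to pin down where the index overrun occurs is to rebuild with runtime bounds checking enabled. A minimal sketch, assuming the usual GNU Fortran toolchain that Clawpack's Makefiles invoke (the exact flags variable may differ in your setup):

```make
# Hypothetical Makefile fragment for the example directory: add gfortran
# runtime checks before rebuilding with `make new`.
# -fcheck=bounds aborts with a precise message on any out-of-bounds array
# access (such as the 'node' index 20 > 19 reported above), and
# -fbacktrace prints a symbolized traceback instead of bare addresses.
FFLAGS += -g -fcheck=bounds -fbacktrace
```

This does not fix the bug, but it turns the anonymous `???` backtrace frames above into file and line references, which is useful when reporting the issue.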
This error was resolved by downgrading to Clawpack version 5.9.2. The downgrade occurred automatically when I reinstalled Clawpack in another folder. After the downgrade, the 2D solvers started running successfully.
Thank you for your assistance.
Best regards,
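If you want to pin the working release deliberately rather than rely on an accidental downgrade, the Python distribution of Clawpack can be installed at a fixed version. A sketch, assuming a pip-based install:

```shell
# Pin Clawpack to the release that worked, instead of whatever
# version the installer resolves by default.
pip install clawpack==5.9.2
```

For a git-based source install, checking out the corresponding release tag accomplishes the same thing.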
--
You received this message because you are subscribed to the Google Groups "claw-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to claw-users+...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/claw-users/1d5fed15-9222-4159-9118-4d8a63135600n%40googlegroups.com.
(attachment: make all check_errors.txt)
$ make check
===================
CLAW = /Users/praveen/Applications/clawpack
OMP_NUM_THREADS = 1
BOUSS_MPI_PROCS = 6
PETSC_OPTIONS=-options_file /Users/praveen/Applications/clawpack/geoclaw/examples/bouss/petscMPIoptions
PETSC_DIR = /opt/homebrew/Caskroom/miniforge/base/envs/claw
PETSC_ARCH = .
RUNEXE = /opt/homebrew/Caskroom/miniforge/base/envs/claw/./bin/mpiexec -n 6
FFLAGS = -march=armv8.3-a -ftree-vectorize -fPIC -fno-stack-protector -O2 -pipe -isystem /opt/homebrew/Caskroom/miniforge/base/envs/claw/include -DHAVE_PETSC -ffree-line-length-none
===================
Using a PETSc solver
Using Bouss equations from the start
rnode allocated...
node allocated...
listOfGrids allocated...
Storage allocated...
bndList allocated...
Gridding level 1 at t = 0.000000E+00: 4 grids with 10000 cells
Setting initial dt to 2.9999999999999999E-002
max threads set to 1
Done reading data, starting computation ...
Total zeta at initial time: 39269.907650665169
GEOCLAW: Frame 0 output files done at time t = 0.000000D+00
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Unable to locate PCMPI allocated shared address 0x140358000
[0]PETSC ERROR: WARNING! There are unused option(s) set! Could be the program crashed before usage or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-ksp_type value: preonly source: file
[0]PETSC ERROR: Option left: name:-mpi_ksp_max_it value: 200 source: file
[0]PETSC ERROR: Option left: name:-mpi_ksp_reuse_preconditioner (no value) source: file
[0]PETSC ERROR: Option left: name:-mpi_ksp_rtol value: 1.e-9 source: file
[0]PETSC ERROR: Option left: name:-mpi_ksp_type value: gmres source: file
[0]PETSC ERROR: Option left: name:-mpi_linear_solver_server_view (no value) source: file
[0]PETSC ERROR: Option left: name:-mpi_pc_gamg_sym_graph value: true source: file
[0]PETSC ERROR: Option left: name:-mpi_pc_gamg_symmetrize_graph value: true source: file
[0]PETSC ERROR: Option left: name:-mpi_pc_type value: gamg source: file
[0]PETSC ERROR: Option left: name:-pc_mpi_minimum_count_per_rank value: 5000 source: file
[0]PETSC ERROR: Option left: name:-pc_type value: mpi source: file
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.22.0, Sep 28, 2024
[0]PETSC ERROR: /tmp/bouss/radial_flat/xgeoclaw with 6 MPI process(es) and PETSC_ARCH on MacMiniHome.local by praveen Sat Oct 12 10:54:16 2024
[0]PETSC ERROR: Configure options: AR=arm64-apple-darwin20.0.0-ar CC=mpicc CXX=mpicxx FC=mpifort CFLAGS="-ftree-vectorize -fPIC -fstack-protector-strong -O2 -pipe -isystem /opt/homebrew/Caskroom/miniforge/base/envs/claw/include " CPPFLAGS="-D_FORTIFY_SOURCE=2 -isystem /opt/homebrew/Caskroom/miniforge/base/envs/claw/include -mmacosx-version-min=11.0 -mmacosx-version-min=11.0" CXXFLAGS="-ftree-vectorize -fPIC -fstack-protector-strong -O2 -pipe -stdlib=libc++ -fvisibility-inlines-hidden -fmessage-length=0 -isystem /opt/homebrew/Caskroom/miniforge/base/envs/claw/include " FFLAGS="-march=armv8.3-a -ftree-vectorize -fPIC -fno-stack-protector -O2 -pipe -isystem /opt/homebrew/Caskroom/miniforge/base/envs/claw/include " LDFLAGS="-Wl,-headerpad_max_install_names -Wl,-dead_strip_dylibs -Wl,-rpath,/opt/homebrew/Caskroom/miniforge/base/envs/claw/lib -L/opt/homebrew/Caskroom/miniforge/base/envs/claw/lib" LIBS="-Wl,-rpath,/opt/homebrew/Caskroom/miniforge/base/envs/claw/lib -lmpi_mpifh -lgfortran" --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-clib-autodetect=0 --with-cxxlib-autodetect=0 --with-fortranlib-autodetect=0 --with-debugging=0 --with-blas-lib=libblas.dylib --with-lapack-lib=liblapack.dylib --with-yaml=1 --with-hdf5=1 --with-fftw=1 --with-hwloc=0 --with-hypre=1 --with-metis=1 --with-mpi=1 --with-mumps=1 --with-parmetis=1 --with-pthread=1 --with-ptscotch=1 --with-shared-libraries --with-ssl=0 --with-scalapack=1 --with-superlu=1 --with-superlu_dist=1 --with-superlu_dist-include=/opt/homebrew/Caskroom/miniforge/base/envs/claw/include/superlu-dist --with-superlu_dist-lib=-lsuperlu_dist --with-suitesparse=1 --with-suitesparse-dir=/opt/homebrew/Caskroom/miniforge/base/envs/claw --with-x=0 --with-scalar-type=real --with-cuda=0 --with-batch --prefix=/opt/homebrew/Caskroom/miniforge/base/envs/claw
[0]PETSC ERROR: #1 PetscShmgetMapAddresses() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/sys/utils/server.c:114
[0]PETSC ERROR: #2 PCMPISetMat() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/ksp/pc/impls/mpi/pcmpi.c:269
[0]PETSC ERROR: #3 PCSetUp_MPI() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/ksp/pc/impls/mpi/pcmpi.c:853
[0]PETSC ERROR: #4 PCSetUp() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/ksp/pc/interface/precon.c:1071
[0]PETSC ERROR: #5 KSPSetUp() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/ksp/ksp/interface/itfunc.c:415
[0]PETSC ERROR: #6 KSPSolve_Private() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/ksp/ksp/interface/itfunc.c:826
[0]PETSC ERROR: #7 KSPSolve() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/ksp/ksp/interface/itfunc.c:1075
```
# set min numbers of matrix rows per MPI rank (default is 10000)
-mpi_linear_solve_minimum_count_per_rank 5000
# Krylov linear solver:
-mpi_linear_solver_server
-mpi_linear_solver_server_view
-ksp_type gmres
-ksp_max_it 200
-ksp_reuse_preconditioner
-ksp_rtol 1.e-9
# preconditioner:
-pc_type gamg
```
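These options reach the solver through the `PETSC_OPTIONS` environment variable, as the `make check` output above shows. A minimal sketch of running the example against an edited options file (paths are illustrative; `$CLAW` is assumed to point at your Clawpack root):

```shell
# Point PETSc at the options file being discussed, then run the example.
export PETSC_OPTIONS="-options_file $CLAW/geoclaw/examples/bouss/petscMPIoptions"
cd $CLAW/geoclaw/examples/bouss/radial_flat
make .output
```

Any option reported as "Option left" in the PETSc error messages above was read from this file but never consumed, which is PETSc's hint that the option name does not match what the configured solver expects.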
On Oct 14, 2024, at 9:29 PM, Ren <renzhi...@gmail.com> wrote:
Dear Marsha,

I am on Ubuntu 20.04 with PETSc 3.21.2. Many people have run into this problem; see https://github.com/clawpack/geoclaw/issues/606. So far I have found that modifying "petscMPIoptions" avoids the problem and gives good results, but the simulation time is very long. The modified "petscMPIoptions" is shown below (the changed lines were marked in red in the original message). I do not think this is the best solution: even when I use 40 threads, it still runs for a very long time. We should find a better way to solve this problem.

Regards,
Zhiyuan Ren

```
# linear solver:
-mpi_linear_solver_server
-ksp_type gmres
-mpi_ksp_type gmres
-mpi_ksp_max_it 200
-mpi_ksp_reuse_preconditioner
# preconditioner:
-pc_type none
-mpi_pc_type gamg
-mpi_pc_gamg_symmetrize_graph true
-mpi_pc_gamg_sym_graph true
-mpi_linear_solver_server_view
```
(attachment: frame0020fig20.png)
Using SGN equations
==> Applying Bouss equations to selected grids between levels 1 and 10
==> Use Bouss. in water deeper than 1.0000000000000000
Using a PETSc solver
Using Bouss equations from the start
rnode allocated...
node allocated...
listOfGrids allocated...
Storage allocated...
bndList allocated...
Gridding level 1 at t = 0.000000E+00: 4 grids with 10000 cells
Setting initial dt to 2.9999999999999999E-002
max threads set to 1
Done reading data, starting computation ...
Total zeta at initial time: 39269.907650665169
GEOCLAW: Frame 0 output files done at time t = 0.000000D+00
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Unable to locate PCMPI allocated shared address 0x128488000
[0]PETSC ERROR: WARNING! There are unused option(s) set! Could be the program crashed before usage or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-ksp_max_it value: 200 source: file
[0]PETSC ERROR: Option left: name:-ksp_reuse_preconditioner (no value) source: file
[0]PETSC ERROR: Option left: name:-ksp_rtol value: 1.e-9 source: file
[0]PETSC ERROR: Option left: name:-ksp_type value: gmres source: file
[0]PETSC ERROR: Option left: name:-mpi_linear_solve_minimum_count_per_rank value: 5000 source: file
[0]PETSC ERROR: Option left: name:-mpi_linear_solver_server_view (no value) source: file
[0]PETSC ERROR: Option left: name:-pc_type value: gamg source: file
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.22.0, Sep 28, 2024
[0]PETSC ERROR: /tmp/bouss/radial_flat/xgeoclaw with 6 MPI process(es) and PETSC_ARCH on chandra.tifrbng.res.in by praveen Wed Oct 16 08:22:01 2024
[0]PETSC ERROR: Configure options: AR=arm64-apple-darwin20.0.0-ar CC=mpicc CXX=mpicxx FC=mpifort CFLAGS="-ftree-vectorize -fPIC -fstack-protector-strong -O2 -pipe -isystem /opt/homebrew/Caskroom/miniforge/base/envs/claw/include " CPPFLAGS="-D_FORTIFY_SOURCE=2 -isystem /opt/homebrew/Caskroom/miniforge/base/envs/claw/include -mmacosx-version-min=11.0 -mmacosx-version-min=11.0" CXXFLAGS="-ftree-vectorize -fPIC -fstack-protector-strong -O2 -pipe -stdlib=libc++ -fvisibility-inlines-hidden -fmessage-length=0 -isystem /opt/homebrew/Caskroom/miniforge/base/envs/claw/include " FFLAGS="-march=armv8.3-a -ftree-vectorize -fPIC -fno-stack-protector -O2 -pipe -isystem /opt/homebrew/Caskroom/miniforge/base/envs/claw/include " LDFLAGS="-Wl,-headerpad_max_install_names -Wl,-dead_strip_dylibs -Wl,-rpath,/opt/homebrew/Caskroom/miniforge/base/envs/claw/lib -L/opt/homebrew/Caskroom/miniforge/base/envs/claw/lib" LIBS="-Wl,-rpath,/opt/homebrew/Caskroom/miniforge/base/envs/claw/lib -lmpi_mpifh -lgfortran" --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-clib-autodetect=0 --with-cxxlib-autodetect=0 --with-fortranlib-autodetect=0 --with-debugging=0 --with-blas-lib=libblas.dylib --with-lapack-lib=liblapack.dylib --with-yaml=1 --with-hdf5=1 --with-fftw=1 --with-hwloc=0 --with-hypre=1 --with-metis=1 --with-mpi=1 --with-mumps=1 --with-parmetis=1 --with-pthread=1 --with-ptscotch=1 --with-shared-libraries --with-ssl=0 --with-scalapack=1 --with-superlu=1 --with-superlu_dist=1 --with-superlu_dist-include=/opt/homebrew/Caskroom/miniforge/base/envs/claw/include/superlu-dist --with-superlu_dist-lib=-lsuperlu_dist --with-suitesparse=1 --with-suitesparse-dir=/opt/homebrew/Caskroom/miniforge/base/envs/claw --with-x=0 --with-scalar-type=real --with-cuda=0 --with-batch --prefix=/opt/homebrew/Caskroom/miniforge/base/envs/claw
[0]PETSC ERROR: #1 PetscShmgetMapAddresses() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/sys/utils/server.c:114
[0]PETSC ERROR: #2 PCMPISetMat() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/ksp/pc/impls/mpi/pcmpi.c:269
[0]PETSC ERROR: #3 PCSetUp_MPI() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/ksp/pc/impls/mpi/pcmpi.c:853
[0]PETSC ERROR: #4 PCSetUp() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/ksp/pc/interface/precon.c:1071
[0]PETSC ERROR: #5 KSPSetUp() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/ksp/ksp/interface/itfunc.c:415
[0]PETSC ERROR: #6 KSPSolve_Private() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/ksp/ksp/interface/itfunc.c:826
[0]PETSC ERROR: #7 KSPSolve() at /Users/runner/miniforge3/conda-bld/petsc_1728030427805/work/src/ksp/ksp/interface/itfunc.c:1075