Re: compile Phantom with lower version gfortran

Daniel Price

Jul 16, 2017, 9:38:03 PM
to 李智, phantoms...@googlegroups.com
Hi Zhi,

Thanks for the bug reports. The incompatibility with gfortran v4.4 is easy to fix, as it comes from a convenience feature I used from Fortran 2008 (the newunit= specifier in the open statement). The failure with v4.7 looks more serious, so I'll take a look.
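In the meantime, a quick workaround on gfortran 4.4 is to replace the newunit= specifier with an explicit unit number found via inquire. Something like the sketch below should work (get_free_unit is just an illustrative name, not a routine already in Phantom):

   !--- return the first unused logical unit number >= 10,
   !    as a stand-in for the Fortran 2008 newunit= specifier
   integer function get_free_unit() result(iunit)
    implicit none
    logical :: inuse
    inuse = .true.
    iunit = 9
    do while (inuse .and. iunit < 1000)
       iunit = iunit + 1
       inquire(unit=iunit,opened=inuse)
    enddo
   end function get_free_unit

so that, for example,

   open(newunit=iunit,file=trim(localfile),status='old',iostat=ierr)

becomes

   iunit = get_free_unit()
   open(unit=iunit,file=trim(localfile),status='old',iostat=ierr)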

On the other two questions:

1) We have been working hard to get the MPI code going, and the current repository version contains a working MPI implementation. We have not finished everything (there are still some optimisations to do), but the code does run and should run faster on multiple nodes: just compile with MPI=yes. At this stage it will probably not scale to 600 CPUs, but it should scale to at least 40 or 60.
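For example, on your clusters (20 cores per node, 30 nodes) something along these lines should work — the input file name disc.in here is just a placeholder, and the exact launch syntax depends on your MPI installation:

   make SYSTEM=gfortran MPI=yes
   export OMP_NUM_THREADS=20
   mpirun -np 30 ./phantom disc.in

i.e. one MPI process per node, with OpenMP threads handling the cores within each node.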

2) There is a BIG difference between dissipation in SPH and in grid-based codes. In grid codes there is diffusion from i) advection errors and ii) shock capturing (e.g. PPM reconstruction). In SPH there are zero advection errors, and the dissipation terms are explicitly added rather than being an implicit part of the numerical scheme, so they can be directly understood as a Navier-Stokes viscosity term. For grid codes the dissipation depends on the grid direction and the flow velocity relative to the grid, as well as on the shock-capturing algorithm. This does not necessarily mean SPH is less dissipative in practice: dissipation in grid codes depends a lot on whether the flow aligns with the grid, while in SPH it depends on how well you can control the viscosity terms away from shocks.
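As a rough guide, in the continuum limit the standard SPH artificial viscosity behaves like a Navier-Stokes shear viscosity of order

   nu_AV ~ (1/10) alpha_AV c_s h,

where alpha_AV is the artificial viscosity parameter, c_s is the sound speed and h is the smoothing length (there is a more careful version of this estimate in the Phantom paper on arXiv). So the dissipation is resolution dependent, vanishing as h -> 0, and you can reduce it further with switches that lower alpha_AV away from shocks.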

Hope that helps!

Daniel

> On 15 Jul 2017, at 6:15 pm, 李智 <li...@shao.ac.cn> wrote:
>
> Hi Daniel,
>
> My name is Zhi Li; we just met at the TDLI summer school, and it was a great experience to learn SPH from you. I'm very interested in using your Phantom code to do some galactic disk simulations and compare them with my current results, which were produced with Athena and Athena++.
>
> However, when running the Phantom test suite on three superclusters (two at Shanghai Astronomical Observatory with Intel(R) Xeon(R) CPUs and one at the National Astronomical Observatories of China with AMD Opteron(tm) processors), I got errors with older versions of gfortran (4.7.0 and 4.4.6; the error messages are pasted below), while versions from 4.8.0 upwards give me a PASS. Running the test suite with ifort 12.1.0 and 14.0.1 also looks fine. I checked the Makefile in phantom/build/ and used SYSTEM=gfortran44 or SYSTEM=gfortran47 instead of SYSTEM=gfortran according to the installed version, but the error messages are the same. Since most superclusters update their software infrequently in order to stay stable, I think it would be good to make Phantom compatible with older compilers. In addition, installing a recent gcc without root permission on a supercluster is painful (it took me ~8 hours to build gcc-4.8.0 and gcc-5.1.0 from source).
>
> I also have two additional questions:
> 1. I have had a rough look through the Phantom wiki pages and the paper on arXiv. It looks to me like Phantom uses OpenMP to parallelise a run, which relies on shared memory rather than distributed memory. Does this mean I can only run Phantom on a single node, where the number of CPUs and threads is limited? In my case the superclusters have ~20 CPUs per node and ~30 nodes, for a total of ~600 CPUs. It would be good if I could use more than 20 CPUs to speed up the simulation.
> 2. We've discussed the artificial mass/density viscosity terms in grid-based codes like Athena. However, I cannot find such a term in my code. I'm not sure whether the viscosity term you mean is the PPM reconstruction, as that process does introduce some diffusion-like behaviour. I would really appreciate it if you could send me some references discussing artificial viscosity in modern grid-based codes.
>
> Best and many thanks,
> Zhi
>
> =================================
> with gfortran 4.7.0:
> .....
> --> testing DUSTYBOX
> WARNING! get_neighbour_list: 2h > 0.5*L in periodic neighb. search: USE HIGHER RES, BIGGER BOX or LOWER MINPART IN TREE
> [the same warning repeated 8 more times]
> (buffering remaining warnings... 2h > 0.5*L in periodic neighb. search: USE HIGHER RES, BIGGER BOX or LOWER MINPART IN TREE)
> ERROR: density iteration failed after 50 iterations
> hnew = 341.26643062507998 hi_old = 3.7499999999999999E-002 nneighi = 1
> rhoi = 3.0652441212728002E-013 gradhi = 1.2258013183626704
> error = 1516.7396916670214 tolh = 1.0000000000000000E-004
> itype = 2
> x,y,z = -0.49218750000000000 -0.24537037037037038 -0.14033534984695289
>
> FATAL ERROR! densityiterate: could not converge in density on particle 6913: error = 1.517E+03
> make[1]: Leaving directory `/data/raid2/lizhi/code/phantom/build'
>
>
> ----------------
> with gfortran 4.4.6:
> .....
> gfortran -c -O3 -Wall -Wno-unused-dummy-argument -frecord-marker=4 -gdwarf-2 -mcmodel=medium -finline-functions-called-once -finline-limit=1500 -funroll-loops -ftree-vectorize -std=f2008 -fall-intrinsics -fopenmp -fdefault-real-8 -fdefault-double-8 ../src/main/utils_datafiles.f90 -o utils_datafiles.o
> ../src/main/utils_datafiles.f90:187.16:
>
> open(newunit=iunit,file=trim(localfile),status='old',iostat=ierr)
> 1
> Error: Syntax error in OPEN statement at (1)
> ../src/main/utils_datafiles.f90:204.13:
>
> open(newunit=iunit,file=trim(dir)//'data.tmp.abcd',action='write',iostat=ierr)
> 1
> Error: Syntax error in OPEN statement at (1)
> f951: warning: unrecognized command line option "-Wno-unused-dummy-argument"
> make[2]: *** [utils_datafiles.o] Error 1
