Additional diagnostics


Steven Peckham

Aug 22, 2013, 4:03:38 PM
to mpas-atmos...@googlegroups.com

MPAS help,

I see in the registry and in the output several diagnostics at standard pressure levels (e.g., 500 mb height, T, U, V). I have been asked to add some additional fields to the diagnostics (QV at all tropospheric levels, along with adding the 700 mb level). There has also been a request for stratospheric winds at 100, 25, 20, and 10 mb.

Besides adding these arrays to the Registry.xml file and adding the calculation to mpas_atm_interp_diagnostics, would there be much else I need to do? Am I biting off a lot of work with this request?

Steven


Laura Fowler

Aug 23, 2013, 10:53:33 AM
to mpas-atmos...@googlegroups.com
Hi Steven:
You are right: besides adding your extra diagnostics to Registry.xml and adding the calculation of the interpolated diagnostics, following what we have done in mpas_interp_diagnostics.F, that is all you need to do. We will be working on supporting user-defined interpolated diagnostics through namelist.input in a future release of MPAS.
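
As a rough starting point, it can help to grep for the existing 500 hPa diagnostics and copy their pattern, both in the registry and in the interpolation code. The paths and the "500hPa" naming below are only what I would expect in a recent source tree, so adjust them for your version:

> grep -n "500hPa" src/core_atmosphere/Registry.xml
> grep -rn "500hPa" src/core_atmosphere --include="*.F"

Each new field then needs its own entry in Registry.xml (so it gets allocated and written to the output) plus a matching interpolation calculation in the diagnostics routine.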

And of course, let us know if you run into any problems implementing your new diagnostics.

Regards,
Laura
--
!----------------------------------------------------
Laura D. Fowler
Mesoscale and Microscale Meteorology Division (MMM)
National Center for Atmospheric Research
P.O. Box 3000, Boulder CO 80307-3000

e-mail: la...@ucar.edu
phone : 303-497-1628

!----------------------------------------------------

Steven Peckham

Jan 29, 2014, 9:43:00 PM
to mpas-atmos...@googlegroups.com

MPAS help,

Looking at the version 2.0 model output, I find that several diagnostics are all zero. For example, we are looking at u10 and v10: both fields are uniformly zero at the initial output time and have typical values on the order of 10 m/s at later times.

Is there a way to get the diagnostic fields computed and output at time zero? Or do I need to run forward 1 time step to get these diagnostic fields?

Steven

du...@ucar.edu

Feb 3, 2014, 8:56:57 PM
to mpas-atmos...@googlegroups.com, steven....@noaa.gov
Hi, Steven.

We can pass the horizontally interpolated U10, V10, Q2, and T2 fields through the initialization to the model initial condition file, where we can read them in the model so that they'll be written to the history file at time zero. For example code to do this, you can look at the 'atmosphere/initial_diagnostics' branch of my fork of the MPAS-Release repository: https://github.com/mgduda/MPAS-Release .

To get the code, you can:

> git clone git@github.com:mgduda/MPAS-Release.git
> cd MPAS-Release
> git checkout atmosphere/initial_diagnostics

and then build the init_atmosphere_model and atmosphere_model executables as usual. To see the code changes to pass the surface fields through to the initial model history file, you can

> git diff 75fa8ebf 5c6dc9e9
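
For reference, building the two executables "as usual" with the Intel compilers would look roughly like the following; the make target (ifort here) depends on your compiler and MPI setup:

> make ifort CORE=init_atmosphere
> make clean CORE=atmosphere
> make ifort CORE=atmosphere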

If there are other fields that you'd like to see in the model history at time zero, just let me know.

Best regards,
Michael

Steven Peckham

Feb 3, 2014, 11:53:05 PM
to du...@ucar.edu, mpas-atmos...@googlegroups.com

Thanks for providing me this code.

Steven

Steven Peckham

Mar 7, 2014, 3:11:39 PM
to mpas-atmos...@googlegroups.com

I have what is probably a simple question, but I cannot figure out the solution at this time.

I have been running with the 60 km grid (x1.163842.grid.nc) for a while without any issues.

I have recently tried building init and static for a 30 km grid (x1.655362.grid.nc).

I compiled MPAS 2.0 with ifort 13 and netCDF 3.6, along with pnetcdf and PIO. It should not be a compiler issue.


When I run init_atmosphere to generate the static.nc file, I get the following error message in the log.0000.err file:


 Reading namelist.input
 Namelist record &vertical_grid not found; using default values for this namelist's variables
 Namelist record &decomposition not found; using default values for this namelist's variables
 Namelist record &restart not found; using default values for this namelist's variables

 Error opening input file 'x1.655362.grid.nc'


MPI: Global rank 0 is aborting with error code 0.
     Process ID: 849, Host: bmem4, Program: /scratch2/portfolios/BMC/rtfim/peckham/MPAS-2.0_init_hires/init_atmosphere_model

MPI: --------stack traceback-------
MPI: Attaching to program: /proc/849/exe, process 849
MPI: [Thread debugging using libthread_db enabled]
MPI: 0x00002aaaab49e26e in waitpid () from /lib64/libpthread.so.0
MPI: Missing separate debuginfos, use: debuginfo-install glibc-2.12-1.132.el6.x86_64 libbitmask-2.0-sgi706r1.rhel6.x86_64 libcpuset-1.0-sgi706r1.rhel6.x86_64 libgcc-4.4.7-4.el6.x86_64 libibverbs-1.1.7-1.el6.x86_64 libmlx4-1.0.5-4.el6.1.x86_64 libmthca-1.0.6-3.el6.x86_64 xpmem-1.6-sgi706r8.rhel6.x86_64
MPI: (gdb) #0  0x00002aaaab49e26e in waitpid () from /lib64/libpthread.so.0
MPI: #1  0x00002aaaaaf803cc in mpi_sgi_system (header=<value optimized out>) at sig.c:89
MPI: #2  MPI_SGI_stacktraceback (header=<value optimized out>) at sig.c:272
MPI: #3  0x00002aaaaaf0bc2b in print_traceback (ecode=0) at abort.c:168
MPI: #4  0x00002aaaaaf0beda in PMPI_Abort (comm=<value optimized out>, errorcode=0) at abort.c:59
MPI: #5  0x00002aaaaaf0bf5d in pmpi_abort__ () from /apps/mpt/2.06/lib/libmpi.so
MPI: #6  0x000000000055ac25 in mpas_dmpar_mp_mpas_dmpar_abort_ ()
MPI: #7  0x000000000056b6f2 in mpas_io_input_mp_mpas_input_state_for_domain_ ()
MPI: #8  0x0000000000405e87 in mpas_subdriver_mp_mpas_init_ ()
MPI: #9  0x0000000000405d57 in MAIN__ ()
MPI: #10 0x0000000000405d0c in main ()
MPI: (gdb) A debugging session is active.
MPI: 
MPI: Inferior 1 [process 849] will be detached.
MPI: 
MPI: Quit anyway? (y or n) [answered Y; input not from terminal]
MPI: Detaching from program: /proc/849/exe, process 849



The file does exist: the script's run log reports a correct long listing of the file. It must be something wrong in my compile or execution. Any suggestions?


Steven




du...@ucar.edu

Mar 12, 2014, 4:35:36 PM
to mpas-atmos...@googlegroups.com, steven....@noaa.gov
Hi, Steven.

I've just run into a similar issue: I ran the init_atmosphere model to create static fields for the 120-km mesh without trouble, but got the message

 Error opening input file '5km.grid.nc'

when I tried to interpolate static fields for a 5-km mesh that I'd generated. The problem here turned out to be that (as far as I can tell) PIO switches to the serial NetCDF library when only a single MPI task is used; however, the 5-km grid was written in pnetcdf's CDF-5 format, which the regular NetCDF library can't read. By recompiling PIO with pnetcdf support only, I was able to force PIO to use pnetcdf, which of course can read CDF-5 files.

I believe that both our 60-km and 30-km meshes are written in the same CDF-2 format (i.e., regular NetCDF with support for large files), so a format issue might not be the culprit in your case. Did you recompile before trying the 30-km mesh, or have you tried the same executables that work for the 60-km mesh? Under the assumption that both the 60-km and 30-km meshes are in CDF-2 format, it would be telling if the same init_atmosphere_model that works for the 60-km mesh did not work for the 30-km mesh.
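
One quick way to check which format a given grid file is actually in (assuming the standard NetCDF utilities are on your path) is:

> ncdump -k x1.655362.grid.nc

This prints the file kind (e.g., 'classic', '64-bit offset' for CDF-2, or 'cdf5'/'64-bit data' for CDF-5, depending on your netCDF version). Older ncdump builds don't recognize CDF-5 files at all, which is itself a hint that the file was written with pnetcdf.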

It's always worth checking file permissions, as well. In the directory where you're running init_atmosphere_model, are you able to successfully run


on the grid file?

Michael

M Sabet

Aug 3, 2018, 3:15:40 PM
to MPAS-Atmosphere Help

Hello,

I am running the MPAS-Atmosphere v5.2 model with a 92-25 km grid. Precipitation is forecast in some areas (especially over the seas), but this rainfall did not actually occur, and the WRF model does not forecast rainfall in these areas. Is there a specific setting to correct this false precipitation forecast, especially over sea and ocean?


Thanks for your attention.