question about HYCOM data


Irina Rypina

Oct 31, 2012, 12:52:51 PM10/31/12
to fo...@hycom.org
Hi, my name is Irina Rypina; I am a physical oceanographer at Woods
Hole Oceanographic Institution. I would like to download 3D velocities
(u, v, w) in the North Atlantic for the years 2010-2011. What would be
the best way to do so? Thanks for your help. ~Irina


Irina Rypina

Nov 1, 2012, 5:25:42 PM11/1/12
to johnath...@gmail.com, fo...@hycom.org
Hi Johnathan, thanks for the quick response.

I am still having trouble accessing the data. I am trying to
download a subset of u-velocities from the data-assimilative
experiment 90.9 for the North Atlantic (0N to 70N and 100W to 0E) from
here

http://ncss.hycom.org/thredds/ncss/grid/GLBa0.08/expt_90.9/dataset.html

and I am getting an error "Variable size in bytes 597498660000 may not
exceed 4294967292."

What should I do?

Thanks for your help,
Irina



Quoting johnath...@gmail.com:

> Hi Irina -
>
> The first step would be to check out the 'Data Server' Link in the left
> menu at http://www.hycom.org. From there, you'd need to decide if you want
> to use Data Assimilative or Non-Data Assimilative runs (i.e. does the model
> use real observational data to adjust the model values - or not). I'm
> guessing you probably want Global Data Assimilative 1/12°. If you take
> that choice, the next step is to pick which experiment you want to use.
> Given your date range, you'd be looking at using experiments 90.8 and 90.9
> or the 'All experiments' choice. Choosing 'All experiments' will bring you
> here:
> http://tds.hycom.org/thredds/global_combined/glb_analysis_catalog.html?dataset=GLBa0.08/glb_analysis.
> From here, you can pick if you want to FTP the data, use OPeNDAP, etc. (the
> 'ideal' one will depend on what software you know how to use and/or are
> planning to use). As far as I know, the available global HYCOM models only
> provide u and v velocity values, not w. The Gulf of Mexico model provides
> w values, but that won't help you much.
>
> Good luck,
>
> Johnathan (non-staff, HYCOM forum lurker)

JDTi...@aol.com

Nov 2, 2012, 12:29:34 PM11/2/12
to fo...@hycom.org, iry...@whoi.edu
It seems the dataset you are downloading is too large for the server to handle. If you are familiar with ftp from a terminal, that might be the best method. Try this for starters...

from a terminal window... "Command Prompt" in Windows, "Terminal" in Mac and Linux... connect to the server:

$ ftp ftp.hycom.org

for the name enter "anonymous", for the password enter your email address

ftp> cd datasets/GLBa0.08/expt_90.9/data

search for and change to the directory containing the subdirectories/files you are looking for, for example

ftp> ls
ftp> cd uvel

once you cd into the directory you want to download files from, switch to binary mode and use mget, for example if you want all files in a directory:

ftp> binary
ftp> mget *

type "a" at the mget prompt to begin downloading all the files.

I hope this helps, but I'm not seeing any w data. I'm not familiar with this dataset, but you might want to verify the 3d data exists. The data may be somewhere else in the directory tree.

Jason

Irina Rypina

Nov 2, 2012, 3:19:06 PM11/2/12
to JDTi...@aol.com, fo...@hycom.org
Hi Jason, I tried what you suggested and it does work, but it seems
extremely impractical and time-consuming to download the whole global
dataset at all vertical levels when I only need North Atlantic
velocities in the top 1 km of the water column.

Is there a way to extract a subset from the global dataset and only
download the selected subset?

Thanks for help.
Irina

JDTi...@aol.com

Nov 2, 2012, 3:56:52 PM11/2/12
to fo...@hycom.org, JDTi...@aol.com, iry...@whoi.edu
I've seen this question a lot in this forum, but have never seen a resolution. Maybe if you dig through some threads you can find something. Yes, it does seem like you would be taking up a lot of extra space having all sigma levels. This might be best accomplished through THREDDS.

Have you tried downloading the data you need in smaller chunks? I know for my particular data needs, I can only download 2 weeks of data at a time; trying to do a month or more results in an error. You also have to wait for the server to process the data, which can take quite a bit of time. It can be hit or miss at times. Maybe since you only need the top 1 km (and no other hydrological data) you can download a longer time frame than that.

Otherwise, you could download as much data as you can handle via ftp, then run a Python script with the netCDF4 library to strip off the unnecessary data. Really, a script with a loop that downloads the data, then strips off the unnecessary parts before moving to the next file, might work best. It would be a big task to create this, but could possibly be worth the effort depending on your situation.

I would try THREDDS first. It might take some time, but I think if you make the temporal subset small enough, it should work. And once again, I'm not sure if w data is available with this dataset.

Jason

Irina Rypina

Nov 2, 2012, 4:04:26 PM11/2/12
to JDTi...@aol.com, fo...@hycom.org
Jason, one more question - when I am trying to choose the "Bounding
Box" option at
http://ncss.hycom.org/thredds/ncss/grid/GLBa0.08/expt_90.9/dataset.html, I am
being transferred to a "page not found" page. Am I doing something
wrong? Thanks again. ~Irina

JDTi...@aol.com

Nov 2, 2012, 4:21:34 PM11/2/12
to fo...@hycom.org, JDTi...@aol.com, iry...@whoi.edu
It's really hard to say without seeing it. I would double check that the data exists in the bounding box and that you have the positive/negative values right, based on this message I am getting with the link provided:

Bounding Box must have east > west; if crossing 180 meridion, use east boundary > 180

I haven't used THREDDS much, so maybe someone else can chime in here.

Michael McDonald

Nov 6, 2012, 2:58:42 PM11/6/12
to fo...@hycom.org, iry...@whoi.edu
Irina,

Since you only need a small subset of data, I would recommend using OPENDAP/THREDDS or NCSS (NetCDF Subset Service).

If you are familiar with Ferret, then it works quite well for generating plots of HYCOM data.

If you want to have a local (physical) copy of the data on your system for offline analysis, then I would recommend using NCSS. This will make your analysis go much quicker/smoother (the thredds/ncss tomcat server is not online 24x7; there are brief downtime windows around noon EST when the catalog is updated and the service is not responsive). As you've discovered (based on your email thread), you cannot request a lot of data at a time. This protection mechanism is there to prevent you from trying to download more than 4 GB of data, which is a lot to stream via HTTP. Ideally, you should only request 1 day of data at a time. If you simply create a for loop in a bash script - looping through the days of the year - then this can be achieved quite efficiently. Would you like me to help you set up this wget script?

If so, then please provide the following details (we can use this as another ncss wget example thread in the forum):

starting URL:
http://ncss.hycom.org/thredds/ncss/grid/GLBa0.08/expt_90.9/dataset.html

Step 1) Select Variable(s):
(do you want u & v only)? ___

note: there is no w velocity in the global output. you need to compute this on your own (offline) using the variables provided.

Step 2) Bounding Box (decimal degrees):
lat/lon bounding box: ___

Step 3) Choose Time Subset:
we will only select 1 day at a time.

Step 4) Add Lat/Lon to file 
[check]


/mike

Irina Rypina

Nov 6, 2012, 5:12:43 PM11/6/12
to Michael McDonald, fo...@hycom.org
Hi Mike, it would be great if you could help with the wget script.

> Step 1) *Select Variable(s):*
> (do you want u & v only)? ___

I need u and v together with the corresponding Time, Lon, Lat, Depth
(but the last 3 variables will be the same for all files so it seems
unnecessary to include them in all of the files).

> note: there is no w velocity in the global output. you need to compute this
> on your own (offline) using the variables provided.

ok

> Step 2) *Bounding Box (decimal degrees):*
> lat/lon bounding box: ___

100W to 0E and 0N to 70N

> Step 3) *Choose Time Subset:*
> we will only select 1 day at a time.

Jan 3, 2011 through Jan 4, 2012

> Step 4) *Add Lat/Lon to file*
> [check]

Yes, I need Lat/Lon and Depth.

Thanks for your help!
Irina



Michael McDonald

Nov 9, 2012, 7:35:43 PM11/9/12
to Irina Rypina, forum
Irina,

> 100W to OE and 0N to 70N

That region (according to NetCDF Subset Service) is primarily over land. Is that really what you are looking for?

Are you interested in the North Atlantic region?


Screen Shot 2012-11-09 at 7.30.11 PM.png
Screen Shot 2012-11-09 at 7.30.07 PM.png

Irina Rypina

Nov 9, 2012, 8:38:45 PM11/9/12
to Michael McDonald, forum
Hi Michael, yes, I am interested in the North Atlantic region, including
the Gulf of Mexico (thus the western limit of 100W). Thanks. ~Irina

Michael McDonald

Nov 9, 2012, 10:39:18 PM11/9/12
to Irina Rypina, forum
Irina,

Reference URL:
http://ncss.hycom.org/thredds/ncss/grid/GLBa0.08/expt_90.9/dataset.html

Ok. Good. North Atlantic it is. The NCSS "Bounding Box" for this region is,
north=70, south=0, east=0, west=-100

The time range I am selecting for these examples is a "single day"
(starting and ending times are equal).

Check the "Add Lat/Lon to file" box (this does not add many bytes).

The next step to understanding NCSS is how the quantity of data
increases with each variable you select.

* selecting 'only' the "u" variable generates a 299.7MB netcdf file
(the source files are 2GB each)

* selecting 'only' the "v" variable generates a 299.7MB netcdf file
(the source files are 2GB each)

* therefore, selecting both "u" & "v" from NCSS generates a 582MB netcdf file.

All of this data is being cached/generated on-the-fly by our data
server (a single server) and streamed directly to you via HTTP.
Therefore, there are obviously limitations imposed at the server level
that prevent users from requesting "too much" data to stream back;
fulfilling a request essentially means stitching together hundreds of
netcdf files we have stored on disk (these files:
ftp://ftp.hycom.org/datasets/GLBa0.08/expt_90.9/data/) into a single
netcdf entity. If you tried to get everything (without server limits),
then a netcdf file several terabytes in size would be generated
(not possible due to memory/disk limitations).

Therefore, a data request for 1 year (365 days) would roughly equate
to 365*582MB=212GB (for all u&v data in your specified bounding box).
Obviously the NCSS is not going to stream all this data to you in a
single request. You need to break down this request into manageable
chunks, while being as "nice" to the server as you can.

e.g., requesting data that spans multiple days will eat up more RAM
and system resources on the server side, resulting in sluggishness for
all other users querying HYCOM data via other methods. Our
recommendation is "single day" requests only.

So, that's the medium-length answer to why you are getting this
message from NCSS:

"Variable size in bytes 597,498,660,000 may not exceed 4,294,967,292."

i.e., you cannot concatenate 597GB worth of hycom data into a single
netcdf file on our data server.

Now that we have an understanding on what we want to get, and how much
disk space this will take up, we can proceed to the advanced scripting
side of things (i.e., plugging in these variables to a wget for loop
with a starting date and ending date). Will get back to you with this
code in my next thread reply.
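For readers following along, the daily-request loop described above might look roughly like the sketch below. This is a hedged sketch, not the script Michael actually attached; the NCSS parameter names are the ones from the dataset form page discussed above, and `echo` stands in for the real `wget` call.

```shell
#!/bin/bash
# Minimal sketch of a one-day-at-a-time NCSS request loop
# (assumed parameter names; not the script attached in this thread).
NCSS='http://ncss.hycom.org/thredds/ncss/grid'
MODEL='GLBa0.08'
EXPT='expt_90.9'
VARS='var=u,v'
BOX='north=70&south=0&east=0&west=-100'

YEAR='2011'; MONTH='01'; DAY='03'

for PlusDay in $(seq 0 2); do
  # GNU date: step forward PlusDay days from the start date
  MyTime=$(date -d "$YEAR-$MONTH-$DAY +$PlusDay days" +%Y-%m-%dT00:00:00Z)
  URL="$NCSS/$MODEL/$EXPT?$VARS&$BOX&time_start=$MyTime&time_end=$MyTime&addLatLon"
  OutFile="${MODEL}_${EXPT}_$(echo "$MyTime" | cut -d'T' -f1)T00Z.nc"
  # Swap 'echo' for: wget -O "$OutFile" "$URL"
  echo "$OutFile"
done
```

Each iteration requests a single day, keeping every response well under the 4 GB server-side limit.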

Michael McDonald

Nov 12, 2012, 4:42:58 PM11/12/12
to Irina Rypina, forum
Irina,
Follow-up question: are you using a Mac or Linux-based system that can
run a /bin/bash script containing wget commands?
--
Michael McDonald
HYCOM.org Administrator
http://hycom.org

Irina Rypina

Nov 12, 2012, 4:47:04 PM11/12/12
to Michael McDonald, forum
Michael, I have a PC with Windows 7. Is that a big problem? Thanks. ~Irina

Michael McDonald

Nov 12, 2012, 5:16:40 PM11/12/12
to Irina Rypina, forum
> Michael, I have a PC with Windows 7. Is that a big problem?

Well that depends.

What is your favorite scripting language supported under windows?

What "in general" do you want to do with the NetCDF files once they
are downloaded locally to your Windows 7 PC?

Do you have the storage space necessary to store all (estimated) 212GB of data?

Irina Rypina

Nov 12, 2012, 6:00:36 PM11/12/12
to Michael McDonald, forum
> What is your favorite scripting language supported under windows?

I use Matlab for most of my analysis.

> What "in general" do you want to do with the NetCDF files once they
> are downloaded locally to your Windows 7 PC?

I plan to load them into Matlab, construct 3d matrices of u(x,y,t) and
v(x,y,t) at different z-levels and then use these velocities to
evaluate simulated particle trajectories (again, at various
depth-levels).

> Do you have the storage space necessary to store all (estimated)
> 212GB of data?

Yes, I have the necessary ~212GB of storage space on my machine.

Thanks again for help.
Irina



Michael McDonald

Nov 13, 2012, 4:05:33 PM11/13/12
to Irina Rypina, forum
Irina,

If there is a Linux box at WHOI (which I'm sure there is), then you
can use the attached script to download your data subset.

I'll work on more examples using Windows shell scripting or using a
more portable language.
hycom-wget-example1.sh

Jason Roberts

Nov 13, 2012, 5:07:28 PM11/13/12
to Michael McDonald, Irina Rypina, forum
Hi all,

I have not been following this conversation extensively but noticed that
Irina is trying to access HYCOM data using MATLAB running on a Windows PC. I
have had good luck doing this in the past with the CSIRO netCDF/OPeNDAP
interface to MATLAB:

http://www.marine.csiro.au/sw/matlab-netcdf.html

Hope that helps. My apologies if someone has already suggested that
approach. (As I said, I have not been paying close attention to this
topic...)

Jason
--
You received this message because you are a member of HYCOM.org To ask a
question, send an email to fo...@hycom.org

Michael McDonald

Nov 13, 2012, 5:39:02 PM11/13/12
to Jason Roberts, Irina Rypina, forum
Jason,
Yes. I considered going off on a tangent to suggest using matlab loaddap
calls (http://www.opendap.org/matlab-loaddap), but I've heard many
users complain about data query timeouts using this method. I'm not sure
if the CSIRO netCDF/OPeNDAP interface is an improvement over the existing
loaddap method. If there is a basic "getting started" script example
you can post, that might help (please start a separate forum topic for
this).

Since the thread started in the direction of downloading a dataset to
do offline analysis, I decided to continue with the ncss+wget
approach.

--
Michael McDonald
HYCOM.org Administrator
http://hycom.org


Irina Rypina

Nov 13, 2012, 6:03:53 PM11/13/12
to Michael McDonald, forum
Hi Michael, I was able to run your script from Xwin on my Windows
machine. It seems to be working fine! Thank you so much for help! ~Irina

Irina Rypina

Nov 14, 2012, 10:36:19 AM11/14/12
to Michael McDonald, forum
Hi Michael,

I left your script running overnight and by morning it downloaded
about 60 files (~2 months of data). However, out of these 60 about 1/3
are empty (0 KB) - what is causing this?

Thanks,
Irina



Michael McDonald

Nov 14, 2012, 11:58:56 AM11/14/12
to Irina Rypina, forum
The data server (hosting ncss.hycom.org) rebooted last night around
3 AM EST, so there were most likely multiple timeouts and your wget
command gave up. Luckily, I coded that bash script to be capable of
stopping and restarting (filling in any missing data gaps, i.e.,
files that are zero-length/missing). Stop and restart the script;
the missing files should be downloaded, and it will continue
downloading the remainder of the year.
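The resume behavior described here boils down to bash's `-s` file test; a minimal sketch (with a hypothetical filename - the real script wraps this check around its wget call):

```shell
#!/bin/bash
# '-s' is true only if the file exists AND has nonzero size, so a
# zero-byte file left by a failed download is treated as missing and
# gets re-fetched on the next run.
OutFile='example_day.nc'   # hypothetical filename

if [ -s "$OutFile" ]; then
  echo "[warning] File $OutFile exists (skipping)"
else
  echo "downloading $OutFile"   # wget -O "$OutFile" "$URL" in the real script
fi
```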
--
Michael McDonald
HYCOM.org Administrator
http://hycom.org


Irina Rypina

Nov 14, 2012, 11:59:15 AM11/14/12
to Michael McDonald, forum
Great! Thank you. ~Irina

Michael McDonald

Mar 13, 2013, 3:22:35 PM3/13/13
to xu.jia...@googlemail.com, fo...@hycom.org, Irina Rypina
Xu,

> thank you for your script. But how do I read the downloaded data? I tried
> to look at the header information with 'ncdump', but could not open it.
> Besides, could you please share the updated script you mentioned?

Can you post/attach the current wget script you are using to download
your target data?

If you cannot perform an ncdump of the file(s) obtained then there is
probably something else wrong.

Michael McDonald

Mar 26, 2013, 12:24:27 PM3/26/13
to fo...@hycom.org, xu.jia...@googlemail.com, Irina Rypina
jiangling,

I too have tested enabling/disabling this "&addLatLon" variable and the file size will not change by much (although it does change). This is probably a question for the Netcdf Subset Service group at Unidata.

Bottom line, the lat/lon grid is not taking up that much space, so I would recommend keeping it.


On Thursday, March 14, 2013 4:54:26 AM UTC-4, xu.jia...@googlemail.com wrote:
But now I have another problem: I do not want the longitude and latitude data in the downloaded file, since I want to reduce the size of the file and the downloading time. How do I change the script?
I tried removing the "&addLatLon" from the URL variable in that script, but that doesn't work; there is still Lat and Lon information in the downloaded file. Hope you can understand me.
Thanks!

Adriano Barroso

Apr 26, 2013, 10:06:01 AM4/26/13
to fo...@hycom.org, xu.jia...@googlemail.com, Irina Rypina
Hello Michael,

I have a question about your wget script.

Is it possible to get just a few layers from the dataset or download them with an interval? And what is the meaning of Subset='spatial=bb'?

Thanks in advance.

Michael McDonald

Apr 26, 2013, 5:22:19 PM4/26/13
to Adriano Barroso, forum, xu.jia...@googlemail.com, Irina Rypina
Adriano,

> Is it possible to get just a few layers from the dataset or download them
> with an interval?

Not with the current version (4.2) of THREDDS we are running. The
latest version has this capability (i.e., choosing certain depth
levels), but we have not yet rolled this out. It is still being
tested.


>And what is the meaning of Subset='spatial=bb'?

This is the lat/lon bounding box of the region you want to download.
http://www.unidata.ucar.edu/projects/THREDDS/tech/interfaceSpec/NetcdfSubsetService.html

adriano barroso

May 10, 2013, 12:55:21 AM5/10/13
to Michael McDonald, forum, xu.jia...@googlemail.com, Irina Rypina
Michael,

I'm having some problems downloading the 2011 dataset from the 90.9 experiment.

I attached my wget script, which is based on yours - the one that you built for Irina.

Is there something wrong with my alterations?

Thanks in advance.



2013/4/26 Michael McDonald <michael....@hycom.org>



--
Adriano Wiermann Barroso

Oceanólogo 
Universidade Federal do Rio Grande - FURG/RS
baixa_hycom_2011.sh

Michael McDonald

May 10, 2013, 9:32:51 AM5/10/13
to adriano barroso, forum, xu.jia...@googlemail.com, Irina Rypina
Adriano,

expt_90.9 only has data in 2011 from Jan-05-2011 to Present.

http://hycom.org/dataserver/glb-analysis/expt-90pt9

So you will need to start your sequence from 4.

Line 8:
StartSeq='4' # {Jan-1}+4=Jan-5

Also, Line 14 & 15

EXPT='expt_90.9'
EXPT1='expt_90.3'

Do you want experiment 90.3 or 90.9? Your output filename (line 33)
is using the EXPT1 variable, so your filenames will be incorrect (if
you care which experiment the data came from).

--
Michael McDonald
HYCOM.org Administrator
http://hycom.org



Michael McDonald

Nov 11, 2013, 4:12:10 PM11/11/13
to Bryan.matamoros, forum
Bryan,
I finally got around to tweaking your script to perform the task you
wanted. You essentially query the same URL every day and this will
return the latest 10 days of global data for your Costa Rica domain.

The script essentially queries this same URL (see below) and saves to
a different filename with "%s" in the name. Note the absence of the
"date/time" argument: omitting it returns all times/days, which is
always 10 for this "GLBa0.08/latest" dataset.

http://ncss.hycom.org/thredds/ncss/grid/GLBa0.08/latest?var=u,v,temperature,salinity&spatial=bb&north=13.5&south=2.5&east=-73.5&west=-92.5&addLatLon

/mike



On Fri, Aug 16, 2013 at 6:05 PM, Bryan.matamoros
<bryan.m...@ucr.ac.cr> wrote:
>
> Michael,
> I'm looking to download the data from the last ten days via scripts. I
> tried to modify yours, using a bounding box around Costa Rica, but I can't
> figure out how to tell the script to detect the latest ten days.
> I attached a version of the wget script.
>
> Thanks in advance.
>
> Bryan Matamoros Alvarado
> Módulo de Información Oceanográfica (MIO)
> Ofic.223, CIMAR Ciudad de la Investigación
> Universidad de Costa Rica
> www.cimar.ucr.ac.cr, www.miocimar.ucr.ac.cr
> Tel. (506) 2511-3146
> Fax. (506) 2511-3280
hycom-wget-Costa-Rica_emm.sh

Michael McDonald

Nov 11, 2013, 4:19:12 PM11/11/13
to Rebecca Ross, forum
Bex,
Never try to guess java/tomcat URLs. You will be wrong most of the
time :) Your best option is to start at the THREDDS server root URLs,
"http://ncss.hycom.org/" or "http://tds.hycom.org/" and then browse
for your desired dataset. When you get to the page listing "Access:"
methods, those are the correct URLs to use.

it should be the following base for all NCSS queries,
NCSS='http://ncss.hycom.org/thredds/ncss/grid'

/mike

On Mon, Nov 11, 2013 at 12:24 PM, Rebecca Ross <bexe...@gmail.com> wrote:
> Hello Michael,
>
> I found this thread a while ago and downloaded your bash script
> designed for Irina, and easily managed to modify and run it successfully for
> my own specified area (with much appreciation for your help there! Thank
> you!). However, I haven't tried this since the server reshuffle and am now
> trying in vain to work out what the correct URL set up is. Can you help?
>
> I think it is just the NCSS='http://tds.hycom.org/thredds/ncss/grid' line
> which needs changing, but I've tried a number of permutations
> (http://tds.hycom.org/thredds, http://ncss.hycom.org/thredds/ncss/grid, the
> same extended to the expt with MODEL and EXPT lines removed...) - mostly
> these come up with a 404 error but the
> http://ncss.hycom.org/thredds/ncss/grid option is coming up with error 500:
> Internal server error - is this actually right and I should try again
> another time, or have I just got it wrong, in which case how should I (and
> others, no doubt) amend your above bash script to work with the new server
> set up?
>
> Thank you so much for your time and your script on behalf of others like me
> whom you have helped without even knowing it!
> All the best,
> Bex

xu.jia...@googlemail.com

Oct 11, 2014, 11:51:11 AM10/11/14
to michael.mcdonald, Bryan.matamoros, forum
Hi, Michael,

Sorry to bother you. Could you please tell me how to set the level parameter in your script?
the other parameters I set as follows:
NCSS='http://ncss.hycom.org/thredds/ncss/grid'
MODEL='GLBu0.08'
EXPT='expt_19.1'

VARS="var=surf_el,water_u,water_v,salinity,water_temp"
SPATIAL='spatial=bb'
# bounding box
Ns='north=0'
Ss='south=0'
Es='east=150'
Ws='west=99'
MyTime=`date -d "$YEAR-$MONTH-$DAY +$PlusDay days" +%Y-%m-%dT%H:%M:%SZ`
TimeStart="time_start=$MyTime"
TimeEnd="time_end=$MyTime"
URL1="$NCSS/$MODEL/$EXPT?$VARS&$SPATIAL&$Ns&$Ss&$Es&$Ws&$TimeStart&$TimeEnd"

My question is: how do I add the level definition?

looking forward to your help.

jiangling xu

 

Michael McDonald

Oct 11, 2014, 11:51:27 AM10/11/14
to xu.jia...@googlemail.com, Bryan.matamoros, forum
> could you please tell me how to set the level parameter in your script?

http://www.unidata.ucar.edu/software/thredds/current/tds/reference/NetcdfSubsetServiceReference.html#Vertical

vertCoord=#

Where # is one of the integer depth levels here,

with Vertical Levels ( depth ) : 0.0 2.0 4.0 6.0 8.0 10.0 12.0 15.0
20.0 25.0 30.0 35.0 40.0 45.0 50.0 60.0 70.0 80.0 90.0 100.0 125.0
150.0 200.0 250.0 300.0 350.0 400.0 500.0 600.0 700.0 800.0 900.0
1000.0 1250.0 1500.0 2000.0 2500.0 3000.0 4000.0 5000.0 m

(*depth levels may vary* refer to the NCSS catalog pages)
http://ncss.hycom.org/thredds/ncss/grid/GLBu0.08/expt_19.1/dataset.html


e.g.,
VARS="var=surf_el,water_u,water_v,salinity,water_temp"
LEVEL='vertCoord=0'
...
URL1="$NCSS/$MODEL/$EXPT?$VARS&$LEVEL&$SPATIAL&$Ns&$Ss&$Es&$Ws&$TimeStart&$TimeEnd"

xu.jia...@googlemail.com

Oct 12, 2014, 9:33:59 PM10/12/14
to Michael McDonald, xu.jia...@googlemail.com, Bryan.matamoros, forum
That's so kind of you! Thanks a lot.

Irina Rypina

Jan 14, 2015, 12:26:37 PM1/14/15
to Michael McDonald, fo...@hycom.org
Hi Michael, I am interested in the global HYCOM-based SSH and surface
u and v velocity fields (from the GLBa0.08 expt_90.9), and I was
wondering if there is a way to only download u and v fields at the
surface without downloading u and v throughout the whole water column.
Thanks. ~Irina

Michael McDonald

Jan 14, 2015, 5:12:20 PM1/14/15
to Irina Rypina, fo...@hycom.org
Yes. You can simply set the "Depth" to 0 (in OPeNDAP requests) or set the "Level" to 0 (in NCSS queries).

 


Irina Rypina

Jan 15, 2015, 4:28:59 PM1/15/15
to Michael McDonald, fo...@hycom.org
Hi Michael, I am able to connect to the thredds/ncss server but the
length of the downloaded .nc file is 0. Do you know why? My bash
script is below. Thanks for help! ~Irina

#!/bin/bash

WGET='/usr/bin/wget'

YEAR='2011'
MONTH='01'
DAY='03'
StartSeq='0'
EndSeq='0'

NCSS='http://tds.hycom.org/thredds/ncss/grid'
MODEL='GLBa0.08'
EXPT='expt_90.9'

VARS="var=u,v"
SPATIAL='spatial=bb'
NORTH='north=70'
SOUTH='south=0'
EAST='east=0'
WEST='west=-100'
Level='level=0'

for PlusDay in `seq $StartSeq $EndSeq`; do

MyTime=`date -d "$YEAR-$MONTH-$DAY +$PlusDay days" +%Y-%m-%dT%H:%M:%SZ`
TimeStart="time_start=$MyTime"
TimeEnd="time_end=$MyTime"

OutFile=$MODEL"_"$EXPT"_`echo $MyTime | cut -d 'T' -f 1`T00Z.nc"


URL="$NCSS/$MODEL/$EXPT/$YEAR?$VARS&$SPATIAL&$NORTH&$SOUTH&$EAST&$WEST&$Level&$TimeStart&$TimeEnd&addLatLon"

if [ -s $OutFile ]; then
echo "[warning] File $OutFile exists (skipping)"
else
wget -O $OutFile "$URL"
fi
done

Michael McDonald

Jan 15, 2015, 5:49:47 PM1/15/15
to Irina Rypina, forum
On Thu, Jan 15, 2015 at 4:12 PM, Irina Rypina <iry...@whoi.edu> wrote:
> NCSS='http://tds.hycom.org/thredds/ncss/grid'

Change the server to this,
NCSS='http://ncss.hycom.org/thredds/ncss/grid'

Irina Rypina

Jan 15, 2015, 5:56:53 PM1/15/15
to Michael McDonald, forum
now it says: "ERROR 400: Bad Request"

Michael McDonald

Jan 16, 2015, 5:18:46 PM1/16/15
to Irina Rypina, forum
> now it says: "ERROR 400: Bad Request"
 
> Change the server to this,
> NCSS='http://ncss.hycom.org/thredds/ncss/grid'

> URL="$NCSS/$MODEL/$EXPT/$YEAR?$VARS&$SPATIAL&$NORTH&$SOUTH&$EAST&$WEST&$Level&$TimeStart&$TimeEnd&addLatLon"


Remove the "&addLatLon" at the end of the "URL" and add "&accept=netcdf" (optional). This is due to the latest NCSS code from Unidata's THREDDS.

Irina Rypina

Jan 16, 2015, 9:03:41 PM1/16/15
to Michael McDonald, forum
Hi Michael, setting 'Level=0' doesn't seem to work because my script
below still downloads all levels. Please let me know how to fix it.
Thanks. ~Irina

URL="$NCSS/$MODEL/$EXPT/$YEAR?$VARS&$NORTH&$SOUTH&$EAST&$WEST&$LEVEL&$TimeStart&$TimeEnd&accept=netcdf"
VARS="var=u"
NORTH='north=10'
SOUTH='south=-10'
EAST='east=10'
WEST='west=-10'
LEVEL='Level=0'

Michael McDonald

Jan 20, 2015, 12:54:34 PM1/20/15
to Irina Rypina, forum
> Hi Michael, setting 'Level=0' doesn't seem to work because my script below
> still downloads all levels. Please let me know how to fix it. Thanks. ~Irina

The variable in wget/manual queries is named differently
(vertCoord) than in the web GUI form (Level).

https://www.unidata.ucar.edu/software/thredds/current/tds/reference/NetcdfSubsetServiceReference.html

i.e.,
vertCoord
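Applied to the script fragment above, the corrected query might look like the following sketch (variable values are the ones from Irina's script; `vertCoord` is the parameter name documented in the NCSS reference):

```shell
#!/bin/bash
# Use the NCSS query parameter 'vertCoord': the web form labels it
# "Level", but manual/wget queries must use 'vertCoord'.
NCSS='http://ncss.hycom.org/thredds/ncss/grid'
MODEL='GLBa0.08'; EXPT='expt_90.9'; YEAR='2011'
VARS='var=u'
BOX='north=10&south=-10&east=10&west=-10'
LEVEL='vertCoord=0'
TimeStart='time_start=2011-01-03T00:00:00Z'
TimeEnd='time_end=2011-01-03T00:00:00Z'

URL="$NCSS/$MODEL/$EXPT/$YEAR?$VARS&$BOX&$LEVEL&$TimeStart&$TimeEnd&accept=netcdf"
echo "$URL"
```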

sudip.m...@noaa.gov

Feb 20, 2015, 10:44:33 AM2/20/15
to fo...@hycom.org
Hi Michael,

Sorry to bother you, I have a similar question. I am using your code to download reanalysis data every three days. How do I add this to your code? I believe the code below downloads everything, but I need a file every 3 days.
I am using the following code.

#!/bin/bash

WGET='/usr/bin/wget'

YEAR='2011'
MONTH='01'
DAY='01'
StartSeq='2'
EndSeq='360'


NCSS='http://ncss.hycom.org/thredds/ncss/grid'
MODEL='GLBu0.08'
EXPT='expt_19.1'
VARS="var=salinity,water_temp,water_u,water_v"
SPATIAL='spatial=bb'
NORTH='north=60'
SOUTH='south=0'
EAST='east=40'
WEST='west=-70'


for PlusDay in `seq $StartSeq $EndSeq`; do
 
  MyTime=`date -d "$YEAR-$MONTH-$DAY +$PlusDay days" +%Y-%m-%dT%H:%M:%SZ`
  TimeStart="time_start=$MyTime"
  TimeEnd="time_end=$MyTime"
 
  OutFile=$MODEL"_"$EXPT"_`echo $MyTime | cut -d 'T' -f 1`T00Z.nc"

URL="$NCSS/$MODEL/$EXPT?$VARS&$SPATIAL&$NORTH&$SOUTH&$EAST&$WEST&$TimeStart&$TimeEnd"

 
  if [ -s $OutFile ]; then
      echo "[warning] File $OutFile exists (skipping)"
  else
      wget -O $OutFile "$URL"
  fi
done

Michael McDonald

Feb 21, 2015, 6:37:37 PM2/21/15
to sudip.m...@noaa.gov, forum
> Sorry to bother you, I have a similar question. I am using your code to download reanalysis data every three days.
>
> StartSeq='2'
> EndSeq='360'
> ...
> for PlusDay in `seq $StartSeq $EndSeq`; do

Rather simple to do.

% man seq
...
seq - print a sequence of numbers
seq [OPTION]... FIRST INCREMENT LAST
...


add in a "3" to the for loop "INCREMENT" part,

e.g.,
`seq $StartSeq 3 $EndSeq`
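A quick sanity check of the increment form, runnable on any system with GNU coreutils, before wiring it into the download loop:

```shell
# seq FIRST INCREMENT LAST: print every third offset (one per line)
seq 2 3 11   # prints 2 5 8 11
```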

Sudip Majumder - NOAA Affiliate

Mar 2, 2015, 5:00:12 PM3/2/15
to Michael McDonald, forum
Hello Michael,

Sorry to bother you again. I am stuck with another issue. I let the code run over the weekend to download data (for expt_19.1) from the beginning of 2000 to the end of 2012. I found the code downloaded empty files from 2002 - 2010, and after that it stopped and could not connect to the server. I tried again this morning but could not connect to the server. Please let me know if the server is down or if I am doing something wrong. The code remained the same; I just added the interval.

Thank you,
Sudip

Michael McDonald

Mar 2, 2015, 5:55:57 PM
to Sudip Majumder - NOAA Affiliate, forum
Sudip,

> Sorry to bother you again. I am stuck with another issue. I let the code run
> for the weekend to download data (for expt_19.1) from the beginning of 2000
> to the end of 2012. I found the code downloaded empty files from 2002 -
> 2010 and after that it stopped and could not connect to the server. I tried
> today morning again but could not connect to the server. Please let me know
> if the server is down or I am doing something wrong. The code remained the
> same. I just added the interval.

We just upgraded our THREDDS servers to the latest stable release to
help fix an issue.

What IP address are/were you connecting from (just the first 3 octets)?

Irina Rypina

Jan 16, 2020, 5:35:17 PM
to Michael McDonald, fo...@hycom.org
Hi Michael, I was wondering if you could help me download HYCOM u and
v at 2 m depth for May-Dec 2019 in the western North Atlantic
(north=52, south=20, east=-52, west=-100).

I tried to do this manually using
https://ncss.hycom.org/thredds/ncss/grid/GLBy0.08/latest/dataset.html
but the netcdf files were too large.

Could you please send me a shell script or even better a matlab code
that would automatically download the files?

Thank you,
Irina

Michael McDonald

Jan 18, 2020, 10:17:43 AM
to Irina Rypina, forum
Irina,
The "GLBy0.08/latest" dataset only contains the full forecast runs
(accessed via the FMRC) for the last 5-6 days. The time range you are
interested in is only available in the experiment #93.0 full/hindcast
collection.

https://tds.hycom.org/thredds/catalogs/GLBy0.08/expt_93.0.html

see the 4 links at the bottom of the above URL.

e.g.,
under the *Access* methods, the one you are looking for is NCSS:
https://tds.hycom.org/thredds/catalogs/GLBy0.08/expt_93.0.html?dataset=GLBy0.08-expt_93.0

e.g.,
NCSS form page,
https://ncss.hycom.org/thredds/ncss/grid/GLBy0.08/expt_93.0/dataset.html

Fill out an NCSS form request for a "Single time" value.

Since you are also looking for data only at 2 m, set the
"Choose Vertical Level:" part of the NCSS form to level = 2.

We also strongly recommend changing the "Output Format" from netcdf to netcdf4.

When you are done, *before* clicking Submit, there will be an "NCSS
Request URL:" in a grey shaded box at the bottom of the page. That URL
can be re-used in scripted code, i.e., a for loop that iterates over
the other single time values within this dataset.

Hope this helps. Let us know if you need additional assistance.
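A scripted version of those steps might look like the sketch below. The query-parameter names (var, north/south/east/west, vertCoord, time, accept) mirror what the NCSS form generates, but treat them as assumptions and verify each against the form's "NCSS Request URL" box before relying on them. It echoes the wget commands as a dry run:

```shell
#!/bin/bash
# Hedged sketch: one NCSS request per day for u/v at 2 m from expt_93.0,
# 1 May - 31 Dec 2019 (offsets 0..244). Parameter names are assumed from
# the NCSS form -- confirm against the "NCSS Request URL" box.
NCSS='https://ncss.hycom.org/thredds/ncss/grid'
DATASET='GLBy0.08/expt_93.0'
VARS='var=water_u,water_v'
BBOX='north=52&south=20&east=-52&west=-100'
LEVEL='vertCoord=2'

for PlusDay in $(seq 0 244); do
  MyTime=$(date -d "2019-05-01 +$PlusDay days" +%Y-%m-%dT00:00:00Z)
  OutFile="GLBy0.08_expt_93.0_${MyTime%%T*}.nc"
  URL="$NCSS/$DATASET?$VARS&$BBOX&$LEVEL&time=$MyTime&accept=netcdf4"
  # Dry run: drop the leading 'echo' to actually download.
  echo wget -O "$OutFile" "$URL"
done
```

Note this uses GNU date's `-d` arithmetic, like the earlier scripts in this thread, so it assumes a Linux environment.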

Irina Rypina

Nov 25, 2020, 3:34:51 PM
to Michael McDonald, forum
Hi Michael, I am looking for a high-resolution HYCOM output for the
Western N. Atlantic, from about 90W to 60W and from 15N to 47N. Is
there a regional or basin HYCOM run available that has resolution
higher than 1/12 deg? Thank you. ~Irina


Michael McDonald

Nov 27, 2020, 11:20:49 AM
to Irina Rypina, forum
The only high(er) resolution model that hycom.org serves is the Gulf
of Mexico domain at 1/25 degree. All other global data is at 1/12-deg.

Irina Rypina

Nov 27, 2020, 12:20:41 PM
to Michael McDonald, forum
Thanks for your answer, Michael. Are there any non-global runs for the
Western N. Atlantic that have higher resolution than 1/12-deg? This page
(https://www.hycom.org/basin) suggests that there is a basin-scale run
for the N. Atlantic with 5 km resolution near the coast, but it's
offline for an upgrade. Do you know when this will come back on-line?
Or maybe there is an older version available? Also, what about the old
retired Atlantic NRL Stennis run - is it still available? Thanks. ~Irina

Michael McDonald

Dec 1, 2020, 4:04:29 PM
to Irina Rypina, forum
Thanks for pointing out this missing status update.
Unfortunately no. That too has been "Discontinued".
The past data is available via the link below.

https://polar.ncep.noaa.gov/ofs/

This system has been superseded by the Global Real Time Ocean
Forecasting System (RTOFS Global)
https://polar.ncep.noaa.gov/global/

IRINA RYPINA

Aug 16, 2021, 1:05:48 PM
to HYCOM.org Forum, Michael McDonald, forum, Irina Rypina
Hi Michael, is it possible to download several vertical levels at once using:
URL="$NCSS/$MODEL/$EXPT?$VARS&$NORTH&$WEST&$EAST&$SOUTH&horizStride=1&$TimeStart%3A00%3A00Z&$TimeEnd%3A00%3A00Z&timeStride=1&$LEVEL&accept=netcdf"

If I use LEVEL='vertCoord=10', it works but downloads only one vertical level (i.e., depth = 10 m).

Thanks,
Irina

Michael McDonald

Aug 18, 2021, 3:27:16 PM
to IRINA RYPINA, HYCOM.org Forum
No. NCSS Subset requests are either "all vertical levels" or "single"
(the one you specify, sans units).

https://www.unidata.ucar.edu/software/tds/current/reference/NetcdfSubsetServiceReference.html#Vertical

You can do this via a client-side tool like 'ncks'.
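For example, a hedged sketch with NCO's ncks, after downloading a file with all vertical levels. The file names are hypothetical, and it assumes the file's vertical dimension is named "depth" (verify with `ncdump -h`); with floating-point bounds, `-d dim,min,max` subsets by coordinate value rather than by index. Shown as a dry run:

```shell
#!/bin/sh
# Hypothetical file names; assumes the vertical dimension is "depth"
# (check with: ncdump -h all_levels.nc).
IN='all_levels.nc'
OUT='depths_0_to_50m.nc'
# Dry run: remove the leading 'echo' to execute (requires NCO installed).
echo ncks -O -d depth,0.0,50.0 "$IN" "$OUT"
```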