concerns over paleoclimate modeling using WorldClim data


Elizabeth J. Sbrocco

Jul 8, 2013, 10:58:02 AM
to max...@googlegroups.com
Hi all,

Can anyone point me to an actual publication that describes how the paleoclimate data on the WorldClim website were created? I have found a rough description of the methods for both the paleo and future climate data on the WorldClim website (http://www.worldclim.org/downscaling), but I am curious about the details: how were the data downscaled, and were the paleoclimate data taken from a single simulation year or averaged over a number of years?

My main concern is that the methods on this page (http://www.worldclim.org/downscaling) state that the anomaly between the paleo time period and the baseline period "is computed as the (absolute or relative) difference between the output of the GCM run for the baseline years (typically 1960-1990 for future climate studies and "pre-industrial" for past climate studies) and for the target years (e.g. 2050-2080)."  These anomalies are then interpolated and added to the "current period" -- i.e., data from the WorldClim database.
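
For concreteness, here is how I read that recipe, as a minimal Python sketch. The function and variable names are mine, and I am assuming the coarse anomaly has already been interpolated to the WorldClim grid:

    def delta_downscale(gcm_target, gcm_baseline, obs_current, relative=False):
        """Add a coarse GCM anomaly (target minus baseline) onto a high-
        resolution observed climatology. All inputs are assumed to be
        numpy arrays already on the same grid, i.e. the coarse anomaly
        has been interpolated beforehand."""
        if relative:
            # Ratio anomalies are the usual choice for precipitation.
            return obs_current * (gcm_target / gcm_baseline)
        # Absolute differences are the usual choice for temperature.
        return obs_current + (gcm_target - gcm_baseline)

    # e.g. lgm_layers = delta_downscale(gcm_lgm, gcm_baseline, worldclim_current)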

For the future climate data this seems fine, because the baseline years (1960-1990) from the GCM overlap with the climate data used to build the WorldClim database. But for the paleoclimate data, the baseline years are the "pre-industrial" period (around 1750). There has been measurable change in climate since then, so I'm wondering whether it's really appropriate to simply add the anomalies between the LGM and this pre-industrial period to the current WorldClim data, which come from the late 20th century, during the downscaling. It seems that the amount of climate change since the LGM would be underestimated, since the anomalies don't account for the change between 1750 and the late 20th century. Does this bug anyone else? Am I misunderstanding the methods? Perhaps there is a reason why the pre-industrial period from a GCM approximates the current period from WorldClim? Or, as biogeographers, are we okay with underestimating the actual amount of climate change? It seems to me that some additional step is required to account for the difference between the pre-industrial period and the climate of today (toy example below).
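
To make the concern concrete, here is a toy example with synthetic numbers and hypothetical names (the 0.5-degree offset is just a stand-in for post-1750 warming):

    import numpy as np

    rng = np.random.default_rng(0)
    shape = (12, 180, 360)                          # monthly fields, 1-degree grid
    worldclim_current = rng.normal(15, 10, shape)   # stands in for WorldClim (~late 20th c.)
    gcm_preindustrial = rng.normal(14, 10, shape)   # GCM "pre-industrial" baseline
    gcm_1960_1990 = gcm_preindustrial + 0.5         # GCM late-20th-century climate
    gcm_lgm = gcm_preindustrial - 5.0               # GCM LGM run

    # As published (as I read it): the anomaly is taken against pre-industrial.
    lgm_published = worldclim_current + (gcm_lgm - gcm_preindustrial)

    # What I would expect: reference the anomaly to the same late-20th-century
    # window that the WorldClim baseline itself represents.
    lgm_expected = worldclim_current + (gcm_lgm - gcm_1960_1990)

    # The two differ by exactly (gcm_1960_1990 - gcm_preindustrial), i.e. the
    # post-1750 warming that seems to be left out.
    print(np.mean(lgm_published - lgm_expected))    # 0.5 by construction here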

I am currently developing paleoclimate layers for marine data to accompany my MARSPEC database (www.marspec.org), so I would appreciate some insight into this issue! I have contacted the WorldClim authors through the "contact form" on their site with these questions but have gotten no response.

Cheers!
Elizabeth

Pierre Sepulchre

Jul 17, 2013, 4:58:58 AM
to max...@googlegroups.com
Hi Elizabeth,

In climate simulations, the main difference between pre-industrial and present-day runs is the prescribed pCO2: 280 ppmv for pre-industrial versus 350-400 ppmv for present-day. Although this can lead to changes in temperature/precipitation patterns around the globe, my feeling as a (paleo)climate modeller is that it is OK to consider it small compared to the differences between pre-industrial and paleoclimate states such as the LGM, the Holocene, or older climates.
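
A back-of-envelope way to see why, using the standard simplified CO2 forcing expression (Myhre et al., 1998); the ppmv values are round numbers, and CO2 is only part of the LGM story:

    import numpy as np

    # dF = 5.35 * ln(C / C0) in W m-2, for a CO2 change from C0 to C.
    f_industrial = 5.35 * np.log(390.0 / 280.0)   # pre-industrial -> ~present
    f_lgm_co2 = 5.35 * np.log(185.0 / 280.0)      # pre-industrial -> LGM (~185 ppmv)
    print(f_industrial, f_lgm_co2)                # ~ +1.8 vs ~ -2.2 W m-2
    # And the LGM adds large ice-sheet and albedo forcings on top of CO2,
    # so the full LGM signal dwarfs the industrial-era change.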
As you might have seen on the WorldClim website, the paleoclimate data come from two GCMs, MIROC and CCSM, that were run within the PMIP2 framework. I guess these outputs come from equilibrated simulations, so the GCMs must have been run for around 1,000 years to get an equilibrated ocean, and then a climatology (i.e., an "averaged" year with 12 monthly timesteps) must have been computed, typically over the last 50 years of the simulation (sketched below).
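
For illustration, computing such a climatology might look like this in Python; the data layout is assumed, and the synthetic usage line just shows the shapes:

    import numpy as np

    def climatology(monthly, last_n_years=50):
        """Average each calendar month over the final `last_n_years` years.
        `monthly` is assumed to be (n_months, nlat, nlon), whole years,
        ordered Jan..Dec."""
        n_years = monthly.shape[0] // 12
        yearly = monthly[:n_years * 12].reshape(n_years, 12, *monthly.shape[1:])
        return yearly[-last_n_years:].mean(axis=0)   # -> (12, nlat, nlon)

    clim = climatology(np.random.default_rng(1).normal(size=(1000 * 12, 10, 10)))
    print(clim.shape)   # (12, 10, 10)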

One could be concerned with two other aspects of the "anomaly" method:

1/ Such a method assumes that the model bias is constant through time... which is not obvious at all. Fully coupled GCMs have so many components and interactions that it is unclear whether the bias observed for the present day still holds for a climate state with a totally different planetary albedo, huge ice sheets, etc. (see the toy example just below).
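
Here is a toy example of that hidden assumption, with synthetic fields and names of my own; rearranging the delta method shows that the paleo field inherits the present-day bias:

    import numpy as np

    rng = np.random.default_rng(1)
    obs_present = rng.normal(15, 10, (180, 360))               # observations today
    gcm_present = obs_present + rng.normal(0, 2, (180, 360))   # model = obs + bias
    gcm_lgm = gcm_present - 5.0                                # a colder simulated LGM

    bias_present = gcm_present - obs_present

    # Rearranging the delta method exposes the assumption:
    #   obs_present + (gcm_lgm - gcm_present) == gcm_lgm - bias_present
    # i.e. the LGM field is corrected with the *present-day* bias only.
    downscaled_lgm = obs_present + (gcm_lgm - gcm_present)
    assert np.allclose(downscaled_lgm, gcm_lgm - bias_present)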

2/ The downscaling approach. I might have misread, but it seems to me that the WorldClim method for paleoclimate applies an interpolation to obtain values at 1 km resolution, while the PMIP2 GCMs are run at roughly 200 km resolution. To me, such a downscaling is really... brutal. I have no problem downscaling from 2° to 1° with a basic bilinear interpolation (first sketch at the end of this message), but going all the way to 1 km is another story. There is a lot to do here to obtain downscaled paleoclimate layers that actually make sense physically.
One way forward might be statistical downscaling (from the WorldClim website I could not figure out whether it was used for the paleo data, but it seems not; second sketch below). There is promising work coming out now for Quaternary paleoclimates (for example, see this).
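
First, the mechanics of a plain bilinear interpolation of a coarse anomaly field onto a ~1 km window, with synthetic data and scipy; note that the sampling gets denser but no new physics appears between the GCM grid points:

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # A coarse (~2 degree) anomaly field, synthetic here.
    lats = np.arange(-90.0, 90.1, 2.0)
    lons = np.arange(0.0, 360.0, 2.0)
    anomaly = np.random.default_rng(2).normal(size=(lats.size, lons.size))

    interp = RegularGridInterpolator((lats, lons), anomaly, method="linear")

    # Resample a 10-degree window at 30 arc-seconds (~1 km).
    fine_lats = np.arange(40.0, 50.0, 1.0 / 120.0)
    fine_lons = np.arange(0.0, 10.0, 1.0 / 120.0)
    glat, glon = np.meshgrid(fine_lats, fine_lons, indexing="ij")
    fine = interp(np.column_stack([glat.ravel(), glon.ravel()])).reshape(glat.shape)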
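
Second, a very rough flavour of what statistical downscaling could look like. This is entirely illustrative, with synthetic data and a simple linear model using elevation as a covariate; a real method would need far more care:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(3)
    n = 5000                                    # hypothetical fine-grid cells
    elevation = rng.uniform(0, 3000, n)         # covariate: elevation (m)
    coarse_present = rng.normal(20, 3, n)       # coarse GCM value at each cell
    obs_present = coarse_present - 6.5e-3 * elevation + rng.normal(0, 0.5, n)

    # Fit fine-scale observations against coarse climate plus covariates.
    X = np.column_stack([coarse_present, elevation])
    model = LinearRegression().fit(X, obs_present)

    # Apply to the paleo run (here just a uniform 5 K cooling). A real
    # application needs LGM topography, ice masks, changed coastlines, and
    # some argument that the present-day relation transfers at all.
    X_lgm = np.column_stack([coarse_present - 5.0, elevation])
    fine_lgm = model.predict(X_lgm)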