Results from calibration differ greatly from validation


claudia gabriela quiroga tamayo

Oct 14, 2015, 3:24:28 PM
to SWAT-CUP
Dear SWATCUP community,

I've tried to calibrate my model using different data sets, applying many techniques to obtain proper parameters and data for my SWAT model. The best results I could obtain in calibration using SWAT-CUP are presented below:

I was then expecting validation results somewhat close to the p-factor, r-factor, NSE, and PBIAS values I obtained in the calibration stage (I know they are not the best), but as you can see, the validation results are far different:

My observations are daily, but with a lot of missing data (e.g., gaps of 3 days, then 5 days, etc.). My simulation period is 13 years, from 1992 to 2004, with a warm-up period of 3 years.

Calibration 


For validation, I used the rest of my observations, for the period that corresponds to the file Sufi2_extract_rch.

Maybe I did something wrong in the validation, or maybe it is reasonable to have these validation results given the values I got in calibration. I would really appreciate your comments.

Regards,

Claudia

Karim Abbaspour

Oct 15, 2015, 10:32:27 AM
to swat...@googlegroups.com
The uncertainty band is too narrow; it appears to me you have iterated too many times, conditioning your parameters too strongly on the calibration period. You should try to get a p-factor > 0.6 or so. Also, it appears that the CN2 values are too low, because your simulated signal is too smooth.
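For anyone unsure what the p-factor and r-factor measure: the p-factor is the fraction of observations that fall inside the 95PPU band, and the r-factor is the average band width divided by the standard deviation of the observations. A minimal sketch in Python with synthetic data (illustrative only, not SWAT-CUP's actual code):

```python
import numpy as np

# Hypothetical ensemble: rows = behavioural simulations, cols = time steps.
rng = np.random.default_rng(0)
obs = rng.normal(10.0, 2.0, size=100)                # observed discharge
sims = obs + rng.normal(0.0, 1.5, size=(500, 100))   # simulated ensemble

# 95PPU band: 2.5th and 97.5th percentiles of the ensemble per time step.
lower = np.percentile(sims, 2.5, axis=0)
upper = np.percentile(sims, 97.5, axis=0)

# p-factor: fraction of observations bracketed by the band (target > ~0.6).
p_factor = np.mean((obs >= lower) & (obs <= upper))

# r-factor: mean band width relative to the spread of the observations
# (target around 1 or less).
r_factor = np.mean(upper - lower) / np.std(obs)

print(f"p-factor = {p_factor:.2f}, r-factor = {r_factor:.2f}")
```

A narrow band from over-conditioning shows up here as a small r-factor together with a falling p-factor.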
Karim
 




--
You received this message because you are subscribed to the Google Groups "SWAT-CUP" group.
To unsubscribe from this group and stop receiving emails from it, send an email to swat-cup+u...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.


claudia gabriela quiroga tamayo

Oct 15, 2015, 12:29:35 PM
to SWAT-CUP, k_abb...@yahoo.com
Thank you Mr. Abbaspour for your reply, 
Well, I did 3 iterations, each of 500 simulations, and every time the p-factor became smaller and smaller. I'll take a look at my CN2 values.

Best regards,

Claudia

Sujeet Desai

Nov 6, 2015, 2:39:26 AM
to SWAT-CUP
Dear Claudia,
Were you able to solve your problem? If yes, then please let me know, as I am facing a similar kind of problem. I did a multi-site calibration of 4 gauging stations in my study area on a daily basis. I ran the calibration for 500 simulations with 20 parameters. The R2 and NSE values were not very good but were in an acceptable range. Using the same set of parameters, I ran SWAT-CUP for the validation period and found that the results were very bad; in fact, the R2 values were 0 and the NSE values were negative. I don't understand where the problem actually is. Kindly help me. Please find attached the parameter_inf and summary_stat files for the calibration and validation periods.
calibration.JPG
parametres.JPG
validation.JPG

Jim Almendinger

Nov 6, 2015, 7:55:35 AM
to SWAT-CUP
All --
I don't know the specifics of this case, but in general -- when your calibration appears to be good but your validation is bad, there are several possible causes.  The first and obvious one is that perhaps something like land use changed in your watershed between the two periods (calibration and validation).  But usually land use doesn't change so drastically that it ruins a model fit completely. 

The second, and I think unappreciated, cause may be that you've set up the model with incorrect initial values, and that your apparently good calibration is an artifact of this improper initialization.  This is less likely for hydrology, as long as you include a "warm-up" or "spin-up" period (I'd use at least 5 years) to allow the model hydrology to stabilize.  But for water quality parameters, especially nutrients, we don't know the proper model warm-up period.  For example, setting initial concentration of nutrients in soil and water bodies too large or too small may result in the model taking decades of model-run years to stabilize.  About the only way to check this is to allow the model to run for many years (with reasonably stationary climate inputs), and look to see when nutrient and sediment loads stabilize (i.e., are not gradually increasing or decreasing over time).  Again, I think this is less likely for hydrology, and perhaps not so much for sediment, but definitely can be a problem for nutrients. 

To summarize -- try to make sure your model has a long-enough warm-up period so that it is stable, and your calibration is not an artifact of initial conditions. 
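The "run for many years and watch when loads stabilize" check can be sketched as a simple trend test on annual model output. This is an illustrative helper, not part of SWAT or SWAT-CUP; the function name, tolerance, and window are assumptions:

```python
import numpy as np

def years_to_stabilise(annual_loads, rel_tol=0.02, window=5):
    """Return the first year index from which the mean annual load in a
    sliding window differs from the long-term (tail) mean by less than
    rel_tol -- a crude test of whether the model has 'spun up'."""
    loads = np.asarray(annual_loads, dtype=float)
    long_term = loads[-window:].mean()           # treat the tail as stable
    for start in range(len(loads) - window + 1):
        w = loads[start:start + window].mean()
        if abs(w - long_term) / long_term < rel_tol:
            return start
    return None

# Synthetic example: annual nutrient loads decaying toward ~100 from a
# too-large initial soil pool; stabilisation takes well over a decade.
years = np.arange(30)
loads = 100.0 + 80.0 * np.exp(-years / 6.0)
print(years_to_stabilise(loads))
```

With slowly decaying pools like this, a 3-year warm-up would leave a strong drift inside the calibration window, which is exactly the artifact Jim describes.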

Best,
-- Jim





--
Dr. James E. Almendinger
St. Croix Watershed Research Station
Science Museum of Minnesota
16910 152nd St N
Marine on St. Croix, MN  55047
tel: 651-433-5953 ext 19

claudia gabriela quiroga tamayo

Nov 6, 2015, 9:24:35 AM
to SWAT-CUP

Hey there,

Just to add to Jim's contribution, I would suggest you check why your PBIAS is so large. I once had such PBIAS values, but it was because I made a mistake with timing.
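To see how a timing (alignment) mistake can inflate PBIAS even when the simulation itself is unbiased, here is a small synthetic illustration using the usual SWAT convention PBIAS = 100 * sum(obs - sim) / sum(obs):

```python
import numpy as np

def pbias(obs, sim):
    """Percent bias; positive means the model underestimates on average."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

# A seasonal record: pairing the simulation against observations that are
# offset by two months (an indexing/date mistake) produces a large PBIAS
# even though the simulation is a perfect copy of the observations.
t = np.arange(365)
obs = 10.0 + 8.0 * np.sin(2 * np.pi * t / 365)
sim = obs.copy()                          # 'perfect' simulation

aligned = pbias(obs[60:], sim[60:])       # correct pairing -> zero bias
shifted = pbias(obs[60:], sim[:-60])      # 60-day misalignment
print(f"aligned PBIAS = {aligned:.1f}%, shifted = {shifted:.1f}%")
```

So before re-calibrating, it is worth checking that the observed file and Sufi2_extract dates line up exactly.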
  
 

claudia gabriela quiroga tamayo

Nov 6, 2015, 9:33:28 AM
to SWAT-CUP
Dear Jim,

I do not have enough data for sediment validation, and as you said, the warm-up period is very important. This can be seen in my model: when I reduce the warm-up period to 2 years (in order to have some years left to validate), my NSE decreases too much. What do you think about validating backwards, I mean using some years prior to the calibration period and doing the validation on them?

Best regards,

Claudia

Jim Almendinger

Nov 6, 2015, 4:01:33 PM
to SWAT-CUP
Claudia --
I don't know that there are any rules about which period you use for validation, and which for calibration, and which comes first.  In my current project, I calibrated to data from 2000-07 and validated from 1990-99, because the monitoring data were better for the 2000-07 period, and the land use is essentially the same as today. 

If you need more years for model warm up, you can always create some "extra" years by simply creating some redundant weather data. 
For example, say you have weather data for only 2000 to 2010.  For model warm-up, you could extract out a five-year period from your existing data and simply add it to the front end of your weather data, and re-label it as going from 1995 to 1999.  Or pick a representative year, and duplicate it five times.  It doesn't have to be exact, since you're going to "skip" all the output from the warm up period, but it should be representative of real weather conditions for your watershed.  The only goal here is to get your model up to some sort of stable condition before you extract the output data for comparison with your monitoring data during calibration.  I suppose this is a little dangerous, since model results always seem to change a little bit with slightly different starting conditions, even after a model apparently reaches a fairly stable state.  But if you have limited data, it can help you use all the data you have as fully as possible. 
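The relabelling trick above can be scripted. This is an illustrative sketch assuming the weather record sits in a simple date/value table; the column names are hypothetical, and note that leap days roll back to 28 February, leaving a harmless duplicate date in the warm-up years:

```python
import pandas as pd

# Hypothetical daily weather table covering 2000-2010 (columns illustrative).
dates = pd.date_range("2000-01-01", "2010-12-31", freq="D")
wx = pd.DataFrame({"date": dates,
                   "pcp": 2.0,       # precipitation, mm
                   "tmax": 20.0,     # daily max temperature, deg C
                   "tmin": 10.0})    # daily min temperature, deg C

# Copy 2000-2004, shift the dates back five years, and prepend the copy,
# giving the model 1995-1999 as extra spin-up years before the real record.
donor = wx[(wx["date"] >= "2000-01-01") & (wx["date"] <= "2004-12-31")].copy()
donor["date"] = donor["date"] - pd.DateOffset(years=5)

warmed = pd.concat([donor, wx], ignore_index=True).sort_values("date")
print(warmed["date"].min(), warmed["date"].max())
```

As Jim says, the duplicated years only need to be representative; their output is skipped along with the rest of the warm-up period.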

Cheers,
-- Jim




Sujeet Desai

Nov 8, 2015, 2:22:09 AM
to SWAT-CUP
Dear Claudia,
Thanks, but could you please let me know how you solved the PBIAS issue? I tried, but my PBIAS values are still high. As per Dr. Jim's suggestions, should I start my first iteration with the initial values given in SWAT-CUP? I have considered a warm-up period of 3 years. I also tried the validation on a monthly basis, but I had the same problem. However, I did get results after reducing the number of validation years, though my PBIAS value is still not acceptable. Please find attached the monthly calibration and validation.
calibration.JPG
validation.JPG

José Alberto Monteiro

Nov 8, 2015, 12:01:45 PM
to SWAT-CUP
Dear Claudia,

One option to complete a sediment series is a software package called LOADEST, by the USGS. It suggests the best of 7 regression models to predict sediment as a function of discharge. To use it, you will need a daily discharge series. Let me know if you need some help.
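At its simplest, the idea behind LOADEST is to fit a log-log rating curve between constituent load and discharge on the sampled days, then fill the gaps from the continuous discharge record. A rough sketch of that idea with synthetic data (this is not LOADEST itself, and the coefficients are made up):

```python
import numpy as np

rng = np.random.default_rng(2)
q_all = rng.lognormal(2.0, 0.6, 365)      # continuous daily discharge
sampled = np.arange(0, 365, 7)            # days with a sediment sample

# Synthetic 'true' rating: load ~ 0.05 * Q^1.8, with sampling noise.
load_sampled = 0.05 * q_all[sampled] ** 1.8 * rng.lognormal(0.0, 0.1, sampled.size)

# Fit log(load) = a + b*log(Q) on the sampled days...
b, a = np.polyfit(np.log(q_all[sampled]), np.log(load_sampled), 1)

# ...then predict a complete daily load series from the discharge record.
load_filled = np.exp(a) * q_all ** b
print(f"fitted exponent b = {b:.2f}")     # should recover a value near 1.8
```

LOADEST additionally handles seasonality terms, retransformation bias, and model selection among its 7 candidate forms, so use the real tool for actual work.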

Kind regards,
José

Ardi Nur Armanto

Jun 3, 2018, 5:23:10 AM
to SWAT-CUP
Hi, I have a problem with my validation results. For calibration, I used the 2003 land use and got NSE 0.54 and R2 0.63. For validation, I used the 2013 land use, but the result is strange: NSE -1.67 and R2 0.62. Can you help me figure out what I should do?
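A negative NSE next to a decent R2, as described above, usually points to a systematic bias or volume error rather than poor timing, because R2 ignores bias while NSE penalises it heavily. A minimal illustration with synthetic data:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 minus error variance over obs variance."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r2(obs, sim):
    """Squared Pearson correlation; insensitive to bias and scale."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

rng = np.random.default_rng(1)
obs = 10.0 + 2.0 * rng.standard_normal(365)
sim = obs + 5.0          # perfectly correlated but biased high by 5 units

print(f"R2 = {r2(obs, sim):.2f}, NSE = {nse(obs, sim):.2f}")
```

Here R2 stays near 1 while NSE goes strongly negative, so with results like yours it is worth checking the water balance (PBIAS) for the validation period before blaming the land-use change.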

Reuben C. Ruttoh

Jul 3, 2023, 7:03:31 AM
to SWAT-CUP
Dear Claudia Gabriela Quiroga Tamayo and Sujeet Desai,
Colleagues, did you solve the problem of good R-square and NSE in calibration but very low values in validation? I did 3 iterations of 500 simulations and obtained good R-square and NSE, but when I ran the validation after changing the recommended settings (observed data, simulation period in file.cio, etc.), the results gave me a low R-square and a negative NSE.