Hello,
Does anybody here do water budget analysis of WRF-Hydro simulations? I am currently running a model of a small basin in Iowa, using AORC forcing data and the NWM model setup; my domain contains the calibrated NWM parameters. I am doing a 5-year spin-up followed by a 3-year simulation.
I am running two simulations of this setup, one with tile drainage and one without. With the tile drainage component turned off, I get significantly more runoff volume than when I run with TD. When I do a mass balance, accumulating evapotranspiration, rainfall, and outlet discharge over the basin, ET plus discharge comes out much higher than precipitation. This is still the case after accounting for changes in storage (soil column volume, aquifer depth, SWE, etc.).
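For reference, the budget I'm checking is P = ET + Q + ΔS, with all terms as basin-average depths accumulated over the analysis period. A minimal sketch of the residual computation (variable names and numbers are placeholders, not actual WRF-Hydro output fields):

```python
# Water-budget residual for the basin; all terms are basin-average depths
# in mm accumulated over the analysis period. Names are placeholders.

def budget_residual_mm(precip_mm, et_mm, discharge_mm, storage_change_mm):
    """Residual = P - (ET + Q + dS); near zero means the budget closes."""
    return precip_mm - (et_mm + discharge_mm + storage_change_mm)

# Made-up example totals (mm over 3 years):
residual = budget_residual_mm(2500.0, 1600.0, 1100.0, -50.0)
# A large negative residual like this means ET + Q + dS exceeds P,
# which is the situation I'm seeing.
```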
I converted outlet discharge to mm by converting m3/s to m3 per timestep and then dividing by the area of the basin. ET and precipitation are averaged over the basin. ET values are from LDASOUT files, precipitation from LDASIN, and outlet discharge from CHRTOUT.
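For clarity, this is the conversion I'm applying to the CHRTOUT series (the timestep and basin area here are placeholder values, not my actual domain):

```python
# Convert an outlet discharge series (m3/s, e.g. from CHRTOUT) to an
# accumulated basin-average depth in mm. DT_S and BASIN_AREA_M2 are
# placeholder values for illustration.
DT_S = 3600.0          # output timestep in seconds (hourly output assumed)
BASIN_AREA_M2 = 5.0e8  # basin area in m^2 (placeholder)

def discharge_to_mm(q_m3s, dt_s=DT_S, area_m2=BASIN_AREA_M2):
    """sum(Q * dt) gives volume in m^3; divide by area for m; x1000 for mm."""
    volume_m3 = sum(q * dt_s for q in q_m3s)
    return volume_m3 / area_m2 * 1000.0

# e.g. a constant 10 m3/s over 24 hourly steps:
depth_mm = discharge_to_mm([10.0] * 24)  # -> 1.728 mm over this basin
```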
ET rates are similar between the simulations. I don't know where I'm losing runoff volume between the two runs, so if anybody has any ideas I'd really appreciate it!
Thank you!
Kaleb
