What does a negative sensitivity value mean?


Collins Owusu

Feb 19, 2021, 9:46:50 PM
to SWAT+
Hello All,

I am running a sensitivity analysis for a SWAT+ model using the Sobol method in the SWAT+ Toolbox v.0.7.4.5.

I am getting positive and negative values for the 1st Order Sensitivity for the parameters. What does the negative mean? A negative correlation between the input and output parameter?

Any explanation on the sensitivity analysis would be appreciated. 

Thank you,
Collins

Christoph Schürz

Feb 22, 2021, 2:15:11 AM
to SWAT+
Hi Collins,

I cannot speak to the implementation of the Sobol method in the SWAT+ Toolbox. But if the method you used really is Sobol, then no negative sensitivities should occur. The first-order sensitivity in the Sobol method is defined as the ratio of the variance in the target variable (a signature measure calculated from the outputs, e.g. the NSE of sim and obs) when only the respective parameter is varied, to the total variance when all parameters are varied. Thus any first-order sensitivity (and in fact also any higher-order sensitivity) is bounded between 0 and 1 (1 would be the case if that parameter is the only influential one). The variance is defined via a sum of squared differences, and a sum of squared values cannot be negative, simply by definition. To your other question: the first-order effect does not account for parameter interactions, but tests the impact of a single parameter on the output simulations (although at many different locations in the parameter space).
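
To illustrate that definition with a toy example (my own sketch, not the SWAT+ Toolbox implementation): for an additive model Y = 2*X1 + X2 with independent uniform inputs, Var(Y) splits exactly into the two first-order contributions, so both indices land in [0, 1] and sum to 1:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Toy additive model: Y = 2*X1 + X2 with X1, X2 ~ U(0, 1), independent.
x1 = rng.uniform(0, 1, n)
x2 = rng.uniform(0, 1, n)
y = 2 * x1 + x2

# Analytically: Var(2*X1) = 4/12, Var(X2) = 1/12, Var(Y) = 5/12,
# so S1(X1) = 0.8 and S1(X2) = 0.2.
s1_x1 = np.var(2 * x1) / np.var(y)
s1_x2 = np.var(x2) / np.var(y)

print(round(s1_x1, 2), round(s1_x2, 2))  # close to 0.8 and 0.2
```

For an additive model the first-order indices sum to 1; with parameter interactions they sum to less than 1, but each index stays within [0, 1].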

So maybe someone can give you more information on the specific implementation in the SWAT+ Toolbox. But from a general perspective of the Sobol method I would assume that there is something off here.

Best
Christoph

Celray James CHAWANDA

Feb 22, 2021, 5:26:02 AM
to SWAT+
Hi Collins,

This is an important question. As Christoph explained, in theory the indices cannot be negative.

However, it happens for numerical reasons. Before I go into details: make sure you have enough samples when running a Sobol sensitivity analysis (I will add a suggested number of samples based on the number of parameters in an update).

The SWAT+ Toolbox Sobol method uses the Saltelli et al. (2010, Table 2, Equation b) estimator (the SALib implementation; here is a direct link to the paper: www.andreasaltelli.eu/file/repository/PUBLISHED_PAPER.pdf, see page 262), which uses vectors of model values evaluated at the matrices A, B, and A_B. There is no guarantee that sum[ f(B)*(f(A_B) - f(A)) ] is always positive.
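
As a sketch of why that estimator can dip below zero (a toy model of my own choosing, not the Toolbox's internals), here is the Equation (b) estimator applied to a model in which the second parameter is almost uninfluential:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 500, 3  # deliberately small base sample

# Toy model: the second parameter (column 1) has almost no influence.
def model(x):
    return np.sin(x[:, 0]) + 0.05 * x[:, 1] + x[:, 2] ** 2

A = rng.uniform(-np.pi, np.pi, (n, k))
B = rng.uniform(-np.pi, np.pi, (n, k))
fA, fB = model(A), model(B)
var_y = np.var(np.concatenate([fA, fB]))

s1 = []
for i in range(k):
    AB = A.copy()
    AB[:, i] = B[:, i]  # A_B^(i): column i taken from B
    # Saltelli et al. (2010, Table 2, eq. b): V_i ~ mean(f(B) * (f(A_B^i) - f(A)))
    s1.append(np.mean(fB * (model(AB) - fA)) / var_y)

# The index of the near-uninfluential parameter sits close to zero and,
# at this sample size, may come out slightly negative.
print([round(v, 3) for v in s1])
```

Increasing the base sample n shrinks that sampling noise, which is exactly why the negative values disappear with more simulations.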

Saltelli et al. (2010, Table 2, Equation c) is less likely to yield negative values, but for small values of the S1 index this formula can also produce negatives due to sampling variability. However, the negative values are expected to always be relatively small, and their confidence intervals should overlap with zero. Right now I am not printing the confidence intervals, but in the next update I will add them (and maybe also second-order sensitivities).
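
A minimal sketch of how such a confidence interval could be obtained with a percentile bootstrap (the numbers below are made up for illustration; this is not the Toolbox's implementation):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 400

# Made-up per-sample contributions to an S1 estimator for a nearly
# non-influential parameter (illustrative values, not Toolbox internals).
contrib = rng.normal(loc=0.001, scale=0.05, size=n)
var_y = 1.0
s1_hat = contrib.mean() / var_y

# Percentile bootstrap: resample the n contributions with replacement
# and recompute the index each time.
boot = np.array([
    rng.choice(contrib, size=n, replace=True).mean() / var_y
    for _ in range(2000)
])
lo, hi = np.percentile(boot, [2.5, 97.5])

print(f"S1 = {s1_hat:.4f}, 95% CI = [{lo:.4f}, {hi:.4f}]")
# For a (nearly) non-influential parameter this CI typically straddles zero.
```

If the interval straddles zero, the small (possibly negative) point estimate is best read as "no detectable sensitivity".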

Best regards,
Celray James

Natalja C.

Feb 22, 2021, 6:33:36 AM
to SWAT+
Hello,
Indeed, I can confirm that negative sensitivity results can occur when using Sobol (Christoph, I sometimes get negative results in R as well).
Try increasing the number of simulations and the negative results should disappear. As James points out: "the negative values are expected to always be relatively small and their confidence intervals should also overlap with zero". Treat them as "zero" sensitivity.
Best,
Natalja


Christoph Schürz

Feb 22, 2021, 6:56:02 AM
to SWAT+
Hi all,

thanks, James and Natalja, for the insights. Yes, I did not consider that some (actually most) of the numerical implementations use combinations of subset matrices where the value pairs are not squared but multiplied. This can occasionally produce negative values.
@James, did you implement bootstrapping or a similar method to calculate the confidence intervals for the Si estimates? As Natalja also suggested, larger sample sizes would be beneficial. My experience is also that Sobol converges very slowly and requires quite large sample sizes. Confidence intervals would indicate how certain a large or small parameter influence is.

Best
Christoph

Collins Owusu

Feb 22, 2021, 12:03:51 PM
to SWAT+
Hi All,

Thank you for your insights. Is there a rule of thumb for choosing the number of samples or simulations to run?

Also, there are four methods available in the SWAT+ Toolbox for the sensitivity analysis (Sobol, FAST, RBD FAST, Delta Moment-Independent Measure). Which of the methods would you recommend for the analysis? Is the choice dependent on the parameters selected, study area characteristics, or something related to the model?

Thank you,
Collins 

Christoph Schürz

Feb 23, 2021, 3:15:38 AM
to SWAT+
Hi Collins,

the appropriate sample size is always a difficult matter. FAST employs a specific sampling design. Thus, the number of simulations is defined by the sampling design and therefore by the number of parameters you want to analyze. For a small number of parameters FAST is quite efficient, but does not scale so well with increasing numbers of parameters.

Sobol is kind of the reference measure in the sensitivity community, so most new methods are typically compared to it. It is considered quite robust, but it also requires a large number of simulations. Often a base sample of n = 1000 is used. Depending on the sampling design (e.g. one proposed by Saltelli) you end up with between n*(k + 1) and n*(2*k + 2) simulations, where k is the number of parameters. This means that when you want to analyze 20 parameters, you already have to run 21000 to 42000 simulations.
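
The run counts for a few common Sobol sampling designs can be sketched as follows (an illustration of the arithmetic; check which design your tool actually uses):

```python
# Run counts for common Sobol sampling designs.
# n = base sample size, k = number of parameters.
def sobol_runs(n: int, k: int, design: str = "radial") -> int:
    if design == "radial":        # A plus one A_B^(i) matrix per parameter
        return n * (k + 1)
    if design == "saltelli":      # A, B, plus k A_B^(i) matrices
        return n * (k + 2)
    if design == "second_order":  # additionally the B_A^(i) matrices
        return n * (2 * k + 2)
    raise ValueError(f"unknown design: {design}")

print(sobol_runs(1000, 20))                  # 21000
print(sobol_runs(1000, 20, "second_order"))  # 42000
```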

Sobol in particular employs the variance as the measure of the sensitivity of an output to changes in a model input. The variance, however, presumes a close-to-normally distributed output (that is, of the output response to input changes). The analyzed distributions are typically highly skewed, though, and therefore the variance might not be the best choice to express sensitivity. This brings us to moment-independent measures, which do not rely on any moment of the output distribution to express the sensitivity of an output to changes of an input.

So as you can see, there is no clear answer as to which method is the right one to use. All have advantages and disadvantages, and all can give you a good first idea of which parameters you have to focus on. A few general thoughts, however:
- Considering the confidence of the calculated sensitivities is a good idea. It can indicate if the sample size was large enough.
- Also, using a dummy variable can make sense: you can compare the calculated sensitivity measures of the model parameters to that of a parameter which you know definitely has no impact. The random part of the sensitivity can also vary strongly depending on the distribution of the output variable and on the sample size.
- My experience with SWAT and sensitivity analysis is to not discard all non-sensitive parameters in the calibration steps after the sensitivity analysis, as higher-order effects (multiple correlated parameters) can be present that are not identified by the first-order sensitivity of a parameter.
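
To illustrate the dummy-parameter idea (a toy sketch with a made-up model, not SWAT+ or the Toolbox's internals): append a parameter the model ignores and use its estimated index as a noise floor. Here I use the classic Sobol estimator V_i ~ mean(f(A)*f(B_A^i)) - f0^2, for which the dummy's index is pure sampling noise around zero:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
k = 3  # two real parameters plus one dummy the model ignores

def model(x):
    return x[:, 0] ** 2 + 0.3 * x[:, 1]   # column 2 is the dummy

A = rng.uniform(0, 1, (n, k))
B = rng.uniform(0, 1, (n, k))
fA, fB = model(A), model(B)
var_y = np.var(np.concatenate([fA, fB]))
f0_sq = fA.mean() * fB.mean()  # estimate of the squared output mean

s1 = []
for i in range(k):
    BA = B.copy()
    BA[:, i] = A[:, i]  # B_A^(i): column i taken from A
    # Classic Sobol estimator: V_i ~ mean(f(A) * f(B_A^i)) - f0^2
    s1.append((np.mean(fA * model(BA)) - f0_sq) / var_y)

# The dummy's index is estimation noise around zero; a real parameter is
# credibly influential only if its index clearly exceeds that noise floor.
print([round(v, 3) for v in s1])
```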

I hope this helps you and does not confuse you even more :)

Best
Christoph

Collins Owusu

Feb 23, 2021, 9:26:16 PM
to SWAT+
Hi Christoph,

Thank you for the thorough explanation and direction.

Best Regards,
Collins 

celray.chawanda

Feb 24, 2021, 4:22:33 PM
to Christoph Schürz, SWAT+
@Christoph,

This is already implemented in the current SWAT+ Toolbox version, but I was not presenting the confidence intervals (I didn't have a good idea of how best to present them in the UI, but I have ideas now). I am working on it and it will be there in v0.8.

I am also thinking of adding plugin functionality so that other external tools can be used within the UI. This still needs some careful brainstorming.

James

Christoph Schürz

Feb 25, 2021, 2:38:23 AM
to SWAT+
@James. This is great! I think it is very important to have either the CIs or a comparison to a dummy parameter (or even both). There are some papers out there that show the CIs and dummies in a very pragmatic way (see e.g. https://doi.org/10.1016/j.envsoft.2017.02.001, Fig. 4). Such a view on the sensitivities gives you a good idea of how significantly different the sensitivity of a model parameter is from that of a random variable.

All the best with improving the SWAT+ toolbox :)

Christoph