new paper and Statistical Horizons course March 8-10

Ken Frank

Nov 20, 2022, 10:29:05 AM11/20/22
to KonFound-it!

Hope you are doing well and will get some quiet time over Thanksgiving. 


I will be teaching a course on sensitivity analysis for Statistical Horizons March 8-10. Description is below, and details of the course are here.


We also have a new paper out: Frank, K.A., Lin, Q., Xu, R., Maroulis, S.J., Mueller, A. (on-line first). Quantifying the Robustness of Causal Inferences: Sensitivity Analysis for Pragmatic Social Science.  Social Science Research. Abstract is at the bottom.




Sensitivity Analysis - Online Course

The phrase “But have you controlled for …” is fundamental to social science, but can also create a quandary. Even after controlling for the most likely alternative explanations for an inferred effect, there may be some alternative explanation(s) that cannot be ruled out with observed data.  Generally, the first response is to develop the best models that maximally leverage the available data. After that, sensitivity analyses can inform discourse about an inference by quantifying the unobserved conditions necessary to change the inference.

This course provides widely accessible ways to quantify the sensitivity of an inference, expressed in terms of correlations or cases. Specifically, in this course you will learn how to generate statements such as "An omitted variable would have to be correlated at ___ with the predictor of interest and with the outcome to change the inference." Or "To invalidate the inference, ___% of the data would have to be replaced with counterfactual cases for which the treatment had no effect."
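To make the second statement concrete, here is a minimal Python sketch of the percent-replacement calculation, based on the published percent-bias-to-invalidate formula (1 minus the ratio of the threshold estimate to the observed estimate). The function name is my own, and I use a normal approximation to the critical t value to keep the sketch dependency-free; the konfound software implements the exact method.

```python
from statistics import NormalDist

def rir_percent(est_eff, std_err, n_obs, n_cov, alpha=0.05):
    """Percent of data that would have to be replaced with zero-effect
    counterfactual cases to bring the estimate below the threshold for
    statistical significance (a sketch of the RIR-style statement)."""
    # degrees of freedom: n - covariates - predictor - intercept
    df = n_obs - n_cov - 2
    # normal approximation to the critical t value (close for moderate df)
    crit = NormalDist().inv_cdf(1 - alpha / 2)
    # smallest estimate that would still be statistically significant
    threshold = crit * std_err
    return 100 * (1 - threshold / abs(est_eff))

# e.g., an estimate of 0.5 (SE = 0.1) from n = 100 with 3 covariates:
print(round(rir_percent(0.5, 0.1, 100, 3), 1))
```

With these illustrative inputs, roughly 60% of the data would have to be replaced with no-effect counterfactual cases to invalidate the inference.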

Rooted in the foundations of the general linear model and potential outcomes, these techniques can be adapted to a range of analyses, including logistic regression, propensity-based approaches, and multilevel models. As a result, they can broadly facilitate discourse among researchers who seek to make an inference, challengers of that inference, and policymakers and clinicians.


The paper below just came out on-line first as part of the 50th anniversary issue of Social Science Research.

Frank, K.A., Lin, Q., Xu, R., Maroulis, S.J., Mueller, A. (on-line first). Quantifying the Robustness of Causal Inferences: Sensitivity Analysis for Pragmatic Social Science. Social Science Research.


Social scientists seeking to inform policy or public action must carefully consider how to identify effects and express inferences because actions based on invalid inferences will not yield the intended results.  Recognizing the complexities and uncertainties of social science, we seek to inform inevitable debates about causal inferences by quantifying the conditions necessary to change an inference.  Specifically, we review existing sensitivity analyses within the omitted variables and potential outcomes frameworks. We then present the Impact Threshold for a Confounding Variable (ITCV) based on omitted variables in the linear model and the Robustness of Inference to Replacement (RIR) based on the potential outcomes framework. We extend each approach to include benchmarks and to fully account for sampling variability represented by standard errors as well as bias. We exhort social scientists wishing to inform policy and practice to quantify the robustness of their inferences after utilizing the best available data and methods to draw an initial causal inference. 
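For readers who want to see the ITCV side of the abstract in computational form, here is a hedged Python sketch: the impact threshold is the product of the confounder's correlations with the predictor and with the outcome that would be needed to move the observed correlation down to the critical value. The function name and the normal approximation to the critical t value are my assumptions (the paper and the konfound software handle standard errors and benchmarks more fully).

```python
from statistics import NormalDist

def itcv(r_xy, n_obs, n_cov, alpha=0.05):
    """Impact Threshold for a Confounding Variable (sketch).
    Returns (impact, component): the impact an omitted confounder must
    have, and the correlation with predictor and outcome each component
    would need if the impact were split evenly between them."""
    df = n_obs - n_cov - 2
    # normal approximation to the critical t value
    crit = NormalDist().inv_cdf(1 - alpha / 2)
    # correlation that corresponds exactly to the critical t value
    r_crit = crit / (crit**2 + df) ** 0.5
    # impact needed to reduce the observed correlation to r_crit
    impact = (r_xy - r_crit) / (1 - r_crit)
    component = impact ** 0.5 if impact > 0 else 0.0
    return impact, component

# e.g., an observed partial correlation of 0.3, n = 200, 3 covariates:
impact, component = itcv(0.3, 200, 3)
```

With these illustrative numbers, an omitted variable would need an impact of about 0.19, i.e., correlations of roughly 0.43 with both the predictor and the outcome, to change the inference — the kind of statement the course teaches you to generate.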
