upcoming talks and workshops


Ken Frank

Jan 29, 2024, 3:23:01 PM
to KonFound-it!

Hope you are doing well.  Lots of news re: Konfound-it.

Check out the new konfound-it app.  More functionality, improved interface.

We also have a new konfound-it website, with lots of resources including new published papers, FAQs, etc. Some parts are still under development, but you can see what’s coming.

Upcoming Workshops on Sensitivity Analysis in 2024 

(all times are eastern unless otherwise specified)

Feb 14 (2:30-5pm, Mountain time). University of Arizona, computation in social science.

Feb 29-March 1 (2-day mini course). Statistical Horizons. Virtual.

April 12 (7:45 am – 11:45 am): Philadelphia. American Educational Research Association. In Person.

April 17 (9am-noon, central time). Society for Research on Adolescent Development. In person.

May 24 (TBD). University of Gothenburg, Sweden. School of Public Health. In person.

July 24 (11am-1:30 pm): Austin, Texas. Society for Epidemiological Research. In person.

Stay tuned for details of ICPSR mini course in Ann Arbor, possibly Aug 5-8. Hybrid.

Related Talks

March 15 (8am-9:45, Baltimore Marriott Waterfront, Room 12 – Falkland). “Quantifying Sensitivity to Selection on Unobservables: Refining Oster’s Coefficient of Proportionality.” Association for Education Finance and Policy. In Person.

March 21 (8:30 p.m.). Emory University. Workshop on Advanced Research Methods. “Quantifying the Robustness of Causal Inferences: Sensitivity Analysis for Pragmatic Social Science.” Virtual.

April 11 (3:30pm). Michigan State University Biostats (E111 Fee Hall). “Robustness of Inference to Replacement & Fragility for Logistic Regression and Hazard Functions.” In person.

May 23 (1p.m.-2:45p.m. Sweden time). University of Gothenburg, Sweden. “Robustness of Inference to Replacement & Fragility for Logistic Regression and Hazard Functions.” School of Public Health. In person.

Workshop description (for AERA; others are similar).

What Would It Take to Change Your Inference? Opening the Discourse about Causal Inferences to a Range of Stakeholders

 

The workshop promotes a discussion about the basis of evidence for educational decisions of policy and practice. By imagining the conditions necessary to change an inference in concrete terms (how different the data would have to be, or what the properties of an omitted variable would have to be), the course can inform scientific discourse among a broad set of stakeholders familiar with conventional statistical techniques. Furthermore, the techniques make explicit the role of the researcher in choosing how to estimate and interpret quantitative analyses, and thus invite careful examination of the positionality of researchers and consumers of quantitative research. Specifically, those challenging racial injustice can engage in discourse on equal footing with those who have historically used quantitative analysis to enforce racial inequities.

In Part I we will use Rubin’s causal model to interpret how much bias there must be to invalidate an inference, in terms of replacing observed cases with counterfactual cases or cases from an unsampled population. This supports statements such as “XX% of the cases would have to be due to bias to invalidate the inference.” In Part II, we will quantify the robustness of causal inferences in terms of correlations associated with unobserved variables or in unsampled populations. This supports statements such as “an omitted variable would have to be correlated at qqq with both the predictor and outcome to change the inference.” Calculations for bivariate and multivariate analyses will be presented in the konfound-it app as well as in Stata and R.
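As a preview of the Part I and Part II calculations, here is a minimal sketch in R using the konfound package (the same quantities are available in the konfound-it app and in Stata). The effect estimate, standard error, sample size, and covariate count below are hypothetical values chosen only for illustration, and argument names follow the package documentation as of this writing; they may differ slightly across versions.

library(konfound)  # install.packages("konfound") if needed

# Part I: robustness of the inference to replacement -- what share of cases
# would have to be replaced (e.g., with cases for which the effect is zero)
# to invalidate the inference. Hypothetical inputs: estimate = 2, SE = 0.4,
# n = 100, 3 covariates.
pkonfound(est_eff = 2, std_err = 0.4, n_obs = 100, n_covariates = 3)

# Part II: impact threshold for a confounding variable -- how strongly an
# omitted variable would have to be correlated with both the predictor and
# the outcome to change the inference.
pkonfound(est_eff = 2, std_err = 0.4, n_obs = 100, n_covariates = 3, index = "IT")

For a model already fit in R, the package’s konfound() function can be applied directly to the fitted object (e.g., from lm()) along with the name of the tested variable to report the same quantities.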
