Speaker: Andrew Gelman (Columbia University)
Discussants: Elizabeth Tipton (Northwestern), Avi Feller (Berkeley), Jonathan Roth (Brown), Pedro Sant'Anna (Emory)
Title: Better Than Difference in Differences
Abstract: It is not always clear how to adjust for control data in causal inference, balancing the goals of reducing bias and variance. We show how, in a setting with repeated experiments, Bayesian hierarchical modeling yields an adaptive procedure that uses the data to determine how much adjustment to perform. The result is a novel analysis with increased statistical efficiency compared with the default analysis based on difference estimates. The increased efficiency can have real-world consequences in terms of the conclusions that can be drawn from the experiments. An open question is how to apply these ideas in the context of a single experiment or observational study, in which case the optimal adjustment cannot be estimated from the data; still, the principle holds that difference-in-differences can be extremely wasteful of data. The talk follows up on Andrew Gelman and Matthijs Vákár (2021), "Slamming the sham: A Bayesian model for adaptive adjustment with noisy control data."
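The core idea can be illustrated with a minimal empirical-Bayes sketch (an analogue of the hierarchical model, not the paper's actual implementation). Across repeated experiments, the default difference-in-differences estimator subtracts the full noisy control signal, while an adaptive estimator learns from the data how much of it to subtract. All variable names and simulation parameters here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
J = 50                       # number of repeated experiments (illustrative)
sigma_t, sigma_c = 1.0, 3.0  # control measurements are much noisier
true_effect = 2.0
drift = rng.normal(0, 0.5, J)  # shared nuisance signal the controls measure

y_t = true_effect + drift + rng.normal(0, sigma_t, J)  # treatment-arm differences
y_c = drift + rng.normal(0, sigma_c, J)                # control-arm differences

# Default difference-in-differences: subtract the full control signal.
did = y_t - y_c

# Adaptive adjustment: estimate the optimal shrinkage coefficient from the
# repeated experiments, subtracting only as much of y_c as the data warrant.
b = np.cov(y_t, y_c)[0, 1] / np.var(y_c, ddof=1)
adaptive = y_t - b * y_c

print("DiD variance:     ", np.var(did, ddof=1))
print("Adaptive variance:", np.var(adaptive, ddof=1))
```

When the controls are noisy relative to the drift they track, the estimated coefficient is close to zero, so the adaptive estimator discards most of the control noise and its variance is far below that of the full difference-in-differences analysis, matching the abstract's claim about statistical efficiency.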