SEMINAR 2024-04-05

R. Clifton Bailey Statistics Seminar Series

Identifying Optimal Methods for Addressing Confounding Bias When Estimating the Effects of State Policies

Beth Ann Griffin

Senior Statistician

RAND Corporation


Friday, April 5th, 2024

11:00 A.M. – 12:00 P.M. Eastern Time

Nguyen Engineering Building, Room 1109

4511 Patriot Circle, Fairfax, VA

The seminar talk is also live-streamed. Please register here to receive the link.

Abstract

Policy evaluation studies that assess how state-level policies affect health-related outcomes are foundational to health and social policy research. The relative ability of newer analytic methods to address confounding, a key source of bias in observational studies, has not been closely examined. We conducted a simulation study to examine how differing magnitudes of confounding affected the performance of four methods used for policy evaluations: (1) the two-way fixed effects difference-in-differences (DID) model; (2) a one-period lagged autoregressive model; (3) the augmented synthetic control method; and (4) the doubly robust DID approach with multiple time periods from Callaway and Sant'Anna. We simulated our data to have staggered policy adoption and multiple confounding scenarios (i.e., varying the magnitude and nature of the confounding relationships). We found that bias increased for each method: (1) as the magnitude of confounding increased; (2) when confounding was generated with respect to prior outcome trends rather than levels; and (3) when confounding associations were nonlinear rather than linear. The autoregressive model and the augmented synthetic control method had notably lower root mean squared error than the two-way fixed effects and Callaway-Sant'Anna approaches in all scenarios except nonlinear confounding by prior trends, where the Callaway-Sant'Anna approach performed best. Coverage rates were unreasonably high for the augmented synthetic control method (e.g., 100%), reflecting large model-based standard errors and wide confidence intervals in practice. Overall, no single method consistently outperformed the others, so a researcher's toolkit should include all of these methodologic options. Our simulations and associated R package can help researchers choose the most appropriate approach for their data. I will share findings from this work and highlight R code for running the same type of simulations on user-provided data.
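As a rough illustration of the kind of workflow the abstract describes (and not the speaker's actual package or code), the sketch below simulates a small state-year panel with staggered policy adoption and fits two of the four estimators discussed: a two-way fixed effects DID model via the fixest package and the Callaway-Sant'Anna estimator via the did package. All variable names, the adoption rule, and the simulation settings here are hypothetical.

## Illustrative sketch only: simulate a state-year panel with staggered
## adoption in which states with higher baseline outcome levels adopt earlier,
## then compare two of the estimators mentioned in the abstract.
library(fixest)  # feols() for the two-way fixed effects DID model
library(did)     # att_gt()/aggte() for the Callaway-Sant'Anna estimator

set.seed(42)
n_states <- 40
years    <- 2000:2015

panel <- expand.grid(state = 1:n_states, year = years)
state_level <- rnorm(n_states)                      # time-invariant state effect
# Confounded adoption: higher-baseline states adopt in 2008, middling states
# in 2011, and the rest never adopt (Inf).
adopt_year <- ifelse(state_level > 0.3, 2008,
              ifelse(state_level > -0.3, 2011, Inf))
panel$adopt_year <- adopt_year[panel$state]
panel$treated    <- as.integer(panel$year >= panel$adopt_year)

true_effect <- -0.5
panel$y <- state_level[panel$state] + 0.1 * (panel$year - 2000) +
           true_effect * panel$treated + rnorm(nrow(panel), sd = 0.5)

## (1) Two-way fixed effects DID estimate
twfe <- feols(y ~ treated | state + year, data = panel)
summary(twfe)

## (2) Callaway-Sant'Anna group-time ATTs, aggregated to an overall effect
## (the did package codes never-treated units with gname = 0)
panel$gname <- ifelse(is.infinite(panel$adopt_year), 0, panel$adopt_year)
cs <- att_gt(yname = "y", tname = "year", idname = "state",
             gname = "gname", data = panel)
aggte(cs, type = "simple")

Repeating a sketch like this across many simulated datasets, while varying how strongly adoption depends on prior outcome levels or trends, is the general idea behind the bias and coverage comparisons described in the abstract.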

About the Speaker

Beth Ann Griffin is a senior statistician at the RAND Corporation and codirector of the NIDA-funded RAND/USC Opioid Policy Tools and Information Center (OPTIC), whose goal is to foster innovative research, tools, and methods for tackling the opioid epidemic. Her statistical research has focused on methods for estimating causal effects using observational data. Her public health research has primarily fallen into three areas: (1) the effects of state gun and opioid policies on outcomes, (2) substance use treatment evaluations for adolescents, and (3) the impact of nongenetic factors on Huntington's disease. She was co-founding director of the RAND Center for Causal Inference from 2013 to 2018 and has served as principal investigator on four grants sponsored by the National Institute on Drug Abuse (NIDA), each devoted to developing new tools and methods for estimating causal effects from observational study data. Beth Ann became a Fellow of the American Statistical Association in 2023.

Event Organizers

Nicholas Rios

David Kepplinger