John V. Kane, New York University
Yamil R. Velez, Columbia University
Jason Barabas, Dartmouth College
Respondent inattentiveness threatens to undermine experimental studies. In response, researchers incorporate measures of attentiveness into their analyses, yet often in a way that risks introducing post-treatment bias. We propose a design-based technique—mock vignettes (MVs)—to overcome these interrelated challenges. MVs feature content substantively similar to that of experimental vignettes in political science, and are followed by factual questions (mock vignette checks [MVCs]) that gauge respondents’ attentiveness to the MV. Crucially, the same MV is viewed by all respondents prior to the experiment. Across five separate studies, we find that MVC performance is significantly associated with (1) stronger treatment effects and (2) other common measures of attentiveness. Researchers can therefore use MVC performance to re-estimate treatment effects, allowing for hypothesis tests that are more robust to respondent inattentiveness and yet also safeguarded against post-treatment bias. Lastly, our study offers researchers a set of empirically validated MVs for their own experiments.