Coe, R. (2009) 'Unobserved but not unimportant: the effects of unmeasured variables on causal attributions', Effective Education, 1(2), pp. 101-122.
The objective of the present study was to estimate how much difference the inclusion of plausibly important but unmeasured variables could make to estimates of the effects of educational programmes. Two examples of policy-relevant research in education were identified. A sensitivity analysis using Monte Carlo simulation was conducted to estimate the size of a possible spurious 'effect' that could actually be entirely due to the failure to incorporate a plausible unobserved variable. In both examples the effect size reported in the original study was within the range of possible spurious effects. What appeared to the original researchers to be substantial and unequivocal causal effects were reduced to tiny and uncertain differences when the effects of plausible unobserved differences were taken into account. Evaluators who rely on statistical control should be more cautious in making causal claims, consider possible effects of unmeasured variables and conduct sensitivity analyses. Alternatively, stronger designs should be used.
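The kind of sensitivity analysis the abstract describes can be sketched in a few lines: repeatedly simulate data in which an unobserved variable drives both programme selection and the outcome, with the true programme effect set to zero, and record the range of spurious 'effects' a naive comparison would report. The parameter values below (sample size, confounder strength, number of simulations) are illustrative assumptions, not the settings used in the original study.

```python
import numpy as np

rng = np.random.default_rng(42)

def spurious_effect_range(n=1000, n_sims=2000, confounder_strength=0.5):
    """Monte Carlo sensitivity analysis: distribution of spurious
    standardised mean differences produced purely by an unobserved
    variable U that influences both selection and outcome.
    All parameter values are illustrative, not the paper's own."""
    effects = []
    for _ in range(n_sims):
        u = rng.normal(size=n)                       # unobserved variable
        # probability of joining the programme rises with U
        p = 1.0 / (1.0 + np.exp(-confounder_strength * u))
        treated = rng.random(n) < p
        # outcome depends on U but NOT on the programme (true effect = 0)
        y = confounder_strength * u + rng.normal(size=n)
        d = (y[treated].mean() - y[~treated].mean()) / y.std(ddof=1)
        effects.append(d)
    # 95% interval of 'effects' that are entirely artefacts of U
    return np.percentile(effects, [2.5, 97.5])

lo, hi = spurious_effect_range()
```

If a reported effect size falls inside the interval `[lo, hi]`, it could plausibly be explained by the unmeasured variable alone, which is the comparison the study makes for its two examples.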
Keywords: Selection bias, unobserved variables, causal inference, sensitivity analysis, Monte Carlo simulation, education policy.
Full text: Accepted Manuscript (PDF, 794 KB)
Publisher web site: http://dx.doi.org/10.1080/19415530903522519
Publisher statement: This is an electronic version of an article published in Coe, R. (2009) 'Unobserved but not unimportant: the effects of unmeasured variables on causal attributions', Effective Education, 1(2), pp. 101-122. Effective Education is available online at: http://www.informaworld.com/smpp/content~db=all?content=10.1080/19415530903522519
Record created: 10 Feb 2010 15:05
Last modified: 06 Jul 2016 13:51