Coe, R. (2009) 'Unobserved but not unimportant: the effects of unmeasured variables on causal attributions', Effective Education, 1(2), pp. 101-122.
The objective of the present study was to estimate how much difference the inclusion of plausibly important but unmeasured variables could make to estimates of the effects of educational programmes. Two examples of policy-relevant research in education were identified. A sensitivity analysis using Monte Carlo simulation was conducted to estimate the size of a possible spurious 'effect' that could actually be entirely due to the failure to incorporate a plausible unobserved variable. In both examples the effect size reported in the original study was within the range of possible spurious effects. What appeared to the original researchers to be substantial and unequivocal causal effects were reduced to tiny and uncertain differences when the effects of plausible unobserved differences were taken into account. Evaluators who rely on statistical control should be more cautious in making causal claims, consider possible effects of unmeasured variables and conduct sensitivity analyses. Alternatively, stronger designs should be used.
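The kind of sensitivity analysis described above can be illustrated with a small simulation. The sketch below is not Coe's actual procedure; it is a minimal, assumed setup in which a programme has zero true effect, but an unobserved variable raises both the chance of joining the programme and the outcome. Repeating the simulation over a grid of plausible strengths for the unobserved variable yields a range of entirely spurious standardised 'effect sizes', which is the quantity the original study compares against reported effects. All parameter names and values here are hypothetical.

```python
import random
import statistics

random.seed(42)

def spurious_effect(n=10_000, u_selection=0.5, u_outcome=0.5):
    """Simulate a programme with ZERO true effect, where an unobserved
    variable U both increases the probability of joining (u_selection)
    and raises the outcome (u_outcome). Returns the naive standardised
    mean difference between joiners and non-joiners, which is entirely
    spurious selection bias."""
    treated, control = [], []
    for _ in range(n):
        u = random.gauss(0, 1)                             # unobserved variable
        joins = random.gauss(0, 1) + u_selection * u > 0   # selection depends on U
        y = random.gauss(0, 1) + u_outcome * u             # outcome depends on U, not on joining
        (treated if joins else control).append(y)
    pooled_sd = statistics.pstdev(treated + control)
    diff = statistics.mean(treated) - statistics.mean(control)
    return diff / pooled_sd

# Monte Carlo over a grid of plausible strengths for the unobserved variable
effects = [spurious_effect(u_selection=s, u_outcome=o)
           for s in (0.3, 0.5, 0.7) for o in (0.3, 0.5, 0.7)]
print(f"spurious effect sizes range from {min(effects):.2f} to {max(effects):.2f}")
```

If a reported programme effect falls inside the printed range, the simulation shows it could be produced entirely by a single unmeasured variable of plausible strength, which is the paper's central point.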
Keywords: Selection bias, Unobserved variables, Causal inference, Sensitivity analysis, Monte Carlo simulation, Education policy.
Full text: Accepted Manuscript (PDF, 794 KB)
Publisher Web site: http://dx.doi.org/10.1080/19415530903522519
Publisher statement: This is an electronic version of an article published in Coe, R. (2009) 'Unobserved but not unimportant: the effects of unmeasured variables on causal attributions', Effective Education, 1(2), pp. 101-122. Effective Education is available online at: http://www.informaworld.com/smpp/content~db=all?content=10.1080/19415530903522519
Date accepted: no date available
Date deposited: 1 April 2011
Date of first online publication: September 2009
Date first made open access: no date available