Simpson, A. (2017) 'The misdirection of public policy: comparing and combining standardised effect sizes', Journal of Education Policy, 32(4), pp. 450–466.
Increased attention to 'what works' in education has led to an emphasis on developing policy from evidence based on comparing and combining a particular statistical summary of intervention studies: the standardised effect size. It is assumed that this statistical summary estimates the educational impact of an intervention, and that combining such estimates through meta-analyses and meta-meta-analyses yields more precise estimates of impact, which can then be ranked. From these rankings, it is claimed, educational policy decisions can be driven. This paper demonstrates that these assumptions are false: the standardised effect size is open to researcher manipulations which violate the assumptions required for legitimately comparing and combining studies in all but the most restricted circumstances. League tables of types of intervention, which governments point to as an evidence base for effective practice, may instead be hierarchies of openness to research design manipulations. The paper concludes that public policy and resources are in danger of being misdirected.
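To illustrate the kind of sensitivity the abstract describes, the sketch below (my own illustration, not taken from the paper) computes Cohen's d, a common standardised effect size, for two hypothetical studies with an identical 5-point raw gain. The sample values are invented: the second study simply restricts the sample to a narrower ability range, which shrinks the pooled standard deviation and so inflates the standardised effect size without any change in the underlying impact.

```python
# Illustrative sketch (hypothetical data): the same raw gain yields
# different standardised effect sizes depending on sample spread.
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Standardised mean difference using a pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    pooled_var = ((n1 - 1) * stdev(treatment) ** 2 +
                  (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2)
    return (mean(treatment) - mean(control)) / pooled_var ** 0.5

# Broad-ability sample: raw mean difference of 5 points.
control_broad = [40, 50, 60, 70, 80]
treatment_broad = [x + 5 for x in control_broad]

# Range-restricted sample: identical 5-point raw gain, narrower spread.
control_narrow = [55, 58, 60, 62, 65]
treatment_narrow = [x + 5 for x in control_narrow]

print(cohens_d(treatment_broad, control_broad))   # ≈ 0.32
print(cohens_d(treatment_narrow, control_narrow)) # ≈ 1.31, same raw gain
```

A league table built from such figures would rank the range-restricted study far higher, despite both interventions producing exactly the same raw improvement.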
Full text: Accepted Manuscript (AM)
First live deposit: 11 January 2017
Publisher web site: https://doi.org/10.1080/02680939.2017.1280183
Publisher statement: This is an Accepted Manuscript of an article published by Taylor & Francis Group in Journal of Education Policy on 15/01/2017, available online at: http://www.tandfonline.com/10.1080/02680939.2017.1280183.
Record created: 11 Jan 2017 12:43
Last modified: 15 Jul 2018 00:52