The misdirection of public policy: comparing and combining standardised effect sizes

Authors

Simpson, A.

Abstract

Increased attention on ‘what works’ in education has led to an emphasis on developing policy from evidence based on comparing and combining a particular statistical summary of intervention studies: the standardised effect size. It is assumed that this statistical summary provides an estimate of the educational impact of interventions, and that combining these through meta-analyses and meta-meta-analyses results in more precise estimates of this impact which can then be ranked. From these, it is claimed, educational policy decisions can be driven. This paper demonstrates that these assumptions are false: the standardised effect size is open to researcher manipulations which violate the assumptions required for legitimately comparing and combining studies in all but the most restricted circumstances. League tables of types of intervention, which governments point to as an evidence base for effective practice, may instead be hierarchies of openness to research design manipulations. The paper concludes that public policy and resources are in danger of being misdirected.
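As an illustration of the quantity at issue (not taken from the paper itself, and using Cohen's d only as a representative standardised effect size), the summary in question is typically of the form

d = \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{control}}}{s_{\text{pooled}}}

Because the denominator depends on the spread of scores in the particular sample and outcome measure, design choices such as restricting the sample or using a narrower, researcher-designed test can change d without any change in the underlying educational impact, which is the comparability problem the abstract describes.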

Citation

Simpson, A. (2017). The misdirection of public policy: comparing and combining standardised effect sizes. Journal of Education Policy, 32(4), 450-466. https://doi.org/10.1080/02680939.2017.1280183

Journal Article Type: Article
Acceptance Date: Jan 5, 2017
Online Publication Date: Jan 15, 2017
Publication Date: Jul 4, 2017
Deposit Date: Jan 10, 2017
Publicly Available Date: Jul 15, 2018
Journal: Journal of Education Policy
Print ISSN: 0268-0939
Electronic ISSN: 1464-5106
Publisher: Taylor and Francis Group
Peer Reviewed: Peer Reviewed
Volume: 32
Issue: 4
Pages: 450-466
DOI: https://doi.org/10.1080/02680939.2017.1280183
