
Durham Research Online

The misdirection of public policy: comparing and combining standardised effect sizes.

Simpson, A. (2017) 'The misdirection of public policy: comparing and combining standardised effect sizes.', Journal of Education Policy, 32 (4), pp. 450-466.


Increased attention on 'what works' in education has led to an emphasis on developing policy from evidence based on comparing and combining a particular statistical summary of intervention studies: the standardised effect size. It is assumed that this statistical summary provides an estimate of the educational impact of interventions, and that combining these through meta-analyses and meta-meta-analyses yields more precise estimates of this impact, which can then be ranked. From these rankings, it is claimed, educational policy decisions can be driven. This paper demonstrates that these assumptions are false: the standardised effect size is open to researcher manipulations which violate the assumptions required for legitimately comparing and combining studies in all but the most restricted circumstances. League tables of types of intervention, which governments point to as an evidence base for effective practice, may instead be hierarchies of openness to research design manipulations. The paper concludes that public policy and resources are in danger of being misdirected.
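To make the abstract's central point concrete, here is a minimal illustrative sketch (not taken from the paper) using the standard definition of the standardised effect size, Cohen's d: the raw mean difference divided by the pooled standard deviation. The sample data are invented for illustration; the point is that the same raw gain produces a much larger d when the study design restricts the spread of scores, which is one kind of design sensitivity the paper describes.

```python
import statistics

def cohens_d(treatment, control):
    """Standardised effect size: raw mean difference over the pooled SD."""
    n1, n2 = len(treatment), len(control)
    m1, m2 = statistics.mean(treatment), statistics.mean(control)
    v1, v2 = statistics.variance(treatment), statistics.variance(control)
    pooled_sd = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return (m1 - m2) / pooled_sd

# Hypothetical scores: both pairs of groups show the same raw gain
# (5 points), but the second pair comes from a much more homogeneous
# sample, so its standardised effect size is far larger.
broad_t = [60, 70, 80, 90, 100]
broad_c = [55, 65, 75, 85, 95]
narrow_t = [73, 74, 75, 76, 77]
narrow_c = [68, 69, 70, 71, 72]

print(round(cohens_d(broad_t, broad_c), 2))    # small d despite 5-point gain
print(round(cohens_d(narrow_t, narrow_c), 2))  # large d for the same gain
```

Because d depends on the pooled standard deviation as well as the raw difference, design choices that narrow the outcome distribution (restricted samples, researcher-designed tests) change d without changing the underlying educational impact, which is why, as the abstract argues, comparing or averaging d across differently designed studies can mislead.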

Item Type: Article
Full text: (AM) Accepted Manuscript
Publisher Web site:
Publisher statement: This is an Accepted Manuscript of an article published by Taylor & Francis Group in Journal of Education Policy on 15/01/2017, available online at:
Date accepted: 05 January 2017
Date deposited: 11 January 2017
Date of first online publication: 15 January 2017
Date first made open access: 15 July 2018

