Press release: A digital divide?

New research questions results from the OECD PISA 2015 study

Validity of some of the more “surprising” findings questioned by leading academic

The Centre for Education Economics (CfEE) today publishes a new paper [1] that questions the findings from the latest wave of the OECD’s flagship Programme for International Student Assessment (PISA), due to important changes made to its methodology.

In 2015, PISA was conducted on computer in 58 countries, while 14 others used the standard paper test [2]. This was a significant departure from previous cycles, in which all countries assessed pupils using paper-based tests.

The new research [3] was led by Professor Jerrim of the UCL Institute of Education. Using data from three countries [4] that took part in the PISA 2015 pilot, the paper illustrates how this change could have had a significant impact upon the results: students tended to perform worse when taking a computer-based (rather than a paper-based) test, by the equivalent of around six months of schooling.

Professor Jerrim questions whether the methodology the OECD used to “adjust” for this problem worked sufficiently well, and whether results from the PISA main study are truly comparable with those from previous cycles.

“Taking a test on computer is very different to the standard procedure of taking a test using paper and pencil. Yet the OECD has provided scant evidence on the impact this is likely to have had upon the PISA 2015 results,” said Professor Jerrim.

“Could this have driven some of the more surprising findings from the PISA 2015 study, such as Scotland’s plummeting performance in reading and science compared to 2012, the significant drop in science performance in Ireland and Germany over the same period, or the significant decline in several East Asian countries’ mathematics scores? I certainly don’t think we can currently rule out such possibilities.”

James Croft, founder and chair of CfEE, said: “It is vital that there is clarity around the methodology of these assessments, as governments clearly rely on them when setting education policy. We hope that, by publishing this paper today, governments across the world will carefully reflect upon how comparable the 2015 results are, both to other countries and to those from previous PISA assessments.”

The research is based upon data from more than 3,000 15-year-olds across three countries. These pupils were randomly assigned to complete either a paper or a computer version of the PISA test in a pilot study conducted in 2014. By comparing test scores across the two groups, the research team was able to establish that students who took the computer version found the PISA questions more difficult to answer correctly.

The paper explains that there are several reasons why this may have occurred, including the computer assessment not allowing students to return to questions they had skipped, along with potential differences in the test environment.

Notes for editors

1. A digital divide? Randomised evidence on the impact of computer-based assessment in PISA. London: CfEE.

2. A list of all participating countries can be found on the PISA website: http://www.oecd.org/education/pisa-2015-results-volume-i-9789264266490-en.htm

3. The CfEE report summarises results from a peer-reviewed academic paper led by Professor Jerrim: Jerrim, J., Micklewright, J., Heine, J., Salzer, C., and McKeown, C. (forthcoming). “PISA 2015: how big is the ‘mode effect’ and what has been done about it?” Oxford Review of Education.

4. Germany, Sweden and the Republic of Ireland.

5. The Centre for Education Economics (CfEE) is an independent think tank working to improve policy and practice in education through impartial economic research.