PISA 2018: does reading on screen make a difference?

Posted on 19-12-2019

In my last blog, I showed how the PISA tests have been updated to take account of the fact that we do more reading online. Not only are the tests computer-based, but they also feature tasks that mimic the type of reading we do online: for example, finding the right link to click on a page of search results, or flicking between three tabs to work out where different authors disagree.

This raises some interesting questions about the nature of reading and assessment. Are PISA right to make these changes? Is reading online really different to reading on paper? And what does this mean for how we should teach reading?

What is different about reading in the 21st century

We do spend more time reading on screen now than in the past. Unlike other international assessments, PISA has always tried to be a test of real-world skills, not more advanced or specialist content. Its science and maths assessments don’t involve using much algebra or applying scientific formulas, and the reading assessments have never involved the analysis of literature. PISA have always been very clear about this, and given this focus on the real world, it’s sensible that they would want to update their tests to reflect the changing nature of reading.

One problem with creating such real-world tasks is that you can often end up sacrificing the reliability of the assessment – this is the case with portfolio assessments, for example. But the types of digital reading questions PISA have designed are very reliable, so that’s no problem here.

Still, you might argue that whilst we do more reading on screen now, ultimately that’s a fairly cosmetic change – reading is reading, after all. Well, yes and no. The PISA results themselves, along with lots of other data, show that most students tend to do worse on on-screen assessments than on paper-based ones. The most striking finding here is the ‘mode effects’ research of John Jerrim, who carried out an experimental analysis of nearly 6,000 students in Germany, Ireland and Sweden. Half took the 2015 PISA test on paper, half on computer. In the reading, science and maths assessments alike, students did worse on the computer versions than the paper versions.

There’s also more general research on the differences between how we read on screen and on paper. There is evidence that suggests when we read on screen we tend to skim and scan more than we do when reading on paper, and that, for longer texts, we find it harder to orient ourselves and more difficult to integrate information from different parts of the text.

What is not different about reading in the 21st century

Still, whilst reading online may well be more difficult than reading on paper, that doesn’t mean it is completely different.

Here’s an analogy: suppose you are driving on a traffic-free road in good daylight. Suppose you drive on the same road in the same car the next day, but in heavy traffic and dark, icy conditions. The second task is more challenging, but you will still be using lots of the same underlying skills. And if you were teaching someone to drive, you’d begin by teaching them to steer, change gears, brake, etc., regardless of the exact conditions they would be driving in later.

Similarly, one possible interpretation of the ‘mode effects’ research is that reading on screen is simply more challenging than reading on paper. Rather than come up with a brand new way of teaching reading, we probably just need to ensure that we teach the fundamentals really well, and aim to get everyone to a higher standard of reading. These fundamentals include knowing how letters combine to form words, knowing what words mean, and having the background knowledge that helps you infer deeper meanings. This is a large part of what schools do, and there is no reason why it should change.

Changing the assessment doesn’t mean you should change the teaching

So, whilst I’m supportive of PISA’s changes, and find the results from them fascinating, I’m wary of assuming that these changes, or the changes we see in real-world reading, should lead to similar changes in the classroom. Often, the best way to prepare for real-world challenges is not to directly mimic those challenges in the classroom.

Unfortunately, I think the PISA report itself falls into this trap. For example, it says:

In the past, students could find clear and singular answers to their questions in carefully curated and government-approved textbooks, and they could trust those answers to be true. Today, they will find hundreds of thousands of answers to their questions online, and it is up to them to figure out what is true and what is false, what is right and what is wrong.

This sounds very plausible, but if you unpick it, it doesn’t really make sense. Yes, it is true that students – and indeed adults – can find hundreds of thousands of answers to their questions on the internet. Why does that mean we should no longer use textbooks in the classroom? Why does it mean that carefully curated textbooks have stopped being useful? Surely, in a world full of unreliable sources of information, the carefully curated textbook is even more useful? What if the best way to prepare students for the firehose of unverified information in the real world is not to extend that torrent of misinformation into schools, but to make schools a place where students don’t have to encounter fake news and misinformation all the time – a place where they can acquire a body of knowledge that will help them spot the fake news for themselves?

PISA’s own data seems to suggest that this might actually be a more fruitful strategy. Singapore, for example, is a high performer on PISA’s reading tests, and is famous for its well-planned textbooks. Finland, too, before it began its slide down the rankings, relied on high-quality textbooks. PISA also provides us with information about how much time students spend online at school on a school day. The variation is enormous: 26 minutes in Japan and Korea, 48 minutes in the UK, 165 minutes in Denmark. Of course, ‘spending time online’ is pretty vague and could cover a whole range of valuable and less valuable activities. But again, it is striking that a number of the top performers in reading – that is, the students who did best at answering these digital reading questions – are in countries where students do not spend much time online at school. So you clearly don’t need to spend time reading online to be good at reading online.

Is there anything we should change?

Whilst ‘practising on-screen reading’ might not be a helpful strategy, we should certainly look at techniques that might be useful, and we need more research on this. Books and paper-based reading tools have evolved over several centuries. Computer screens and online reading are very new, and there are probably ways they need to evolve too. E-readers are an interesting innovation – e-ink is easier to read than backlit tablet or computer screens, and causes less eye strain. There might already be some simple habits we could try to develop in older readers, such as switching off notifications and going full screen whenever they want to read something lengthy on a screen. This is certainly an area where we need more research, but in the meantime, abandoning the textbook and expecting students to pick things up via Google is not the right way to go.

The impact of technology on education is also the topic of my new book, Teachers vs. Tech, which will be published by Oxford University Press next year. You can sign up to my mailing list here for more updates about it.