Why 21st century skills are not that 21st century

Posted on 08-01-2012

Whenever I hear anyone talk about preparing students for the 21st century, I am always sceptical. Partly this is because it is never made clear exactly what is so different about the 21st century that requires such different preparation. For the American organisation Partnership for 21st Century Skills (P21), which is sponsored by a number of multinational corporations, the four important 21st century skills are ‘critical thinking and problem solving; communication, collaboration; and creativity and innovation’.[i] For the Royal Society of Arts, the skills that are needed for the future are: ‘citizenship, learning, managing information, relating to people and managing situations’.[ii] For Sir Ken Robinson, in the 21st century people need to be able to ‘adapt, see connections, innovate, communicate and work with others’.[iii]

Of course, I would agree that these skills are important. But I fail to see what is so uniquely 21st century about them. Mycenaean Greek craftsmen had to work with others, adapt and innovate. It is quite patronising to suggest that no-one before the year 2000 ever needed to think critically, solve problems, communicate, collaborate, create, innovate or read. Human beings have been doing most of these things for quite a long time. The alphabet, a fairly late development of civilisation, was invented in the 21st century BC.

It probably is true that the future will require more and more people to have these skills, and that there will be fewer economic opportunities for people who lack them. But that would suggest to me that we need to make sure that everyone gets the education that was in the past reserved for the elite. That’s not redefining education for the 21st century; it’s giving everyone the chance to get a traditional education.

And that is where my real problem with the concept of 21st century education lies. To the extent that it says that creativity and problem solving are important, it is merely banal and meaningless; to the extent that it says such skills are unique to the 21st century, it is false but harmless; to the extent that it proposes certain ways of achieving these aims, it is actually pernicious. This is because very often, the movement for ‘21st century skills’ is a codeword for an attack on knowledge.

Of course, one way the 21st century really is different to other eras is in the incredible power of technology. But this difference, whilst real, tends to lead on to two more educational fallacies. Firstly, it is used to support the idea that traditional bodies of knowledge are outmoded. There is just so much knowledge nowadays, and it is changing all the time, so there is no point learning any of it to begin with. The Association of Teachers and Lecturers argue, for example, that: ‘A twenty-first century curriculum cannot have the transfer of knowledge at its core for the simple reason that the selection of what is required has become problematic in an information rich age’.[iv]

The popular YouTube video ‘Shift Happens’ tells us that 1.5 exabytes of unique new information are generated each year, and that the amount of new technical information is doubling each year.[v] It then concludes that this flow of new information means that, for students starting a four-year college or technical degree, half of what they learn in their first year will be outdated by their third year of study. This is simply not true. Of course people make new discoveries all the time, but a lot of those new discoveries don’t disprove or supersede the old ones – in fact, they are more likely to build on the old discoveries and to require intimate knowledge of them. The foundations of most disciplines are rarely, if ever, completely disproved. Universities can turn out as many exabytes of information as they like – they are unlikely to disprove Pythagoras’s theorem or improve on Euripides’s tragedies. And there are very many such ancient, fundamental ideas and inventions which have stood the test of time: perhaps more than we are willing to admit. The alphabet and the numbering system, for example, are two of the most valuable inventions we have. As far as we know, they were invented in about 2000 BC and 3000 BC respectively. So far they show no signs of wearing out or being superseded. All of the most modern and advanced technological devices depend on them in one way or another. Indeed, if anything the sheer proliferation of knowledge should make selective bodies of knowledge more important, as mechanisms for sorting the wheat from the vast amounts of chaff.

Secondly, advances in technology are used to do down knowledge because it is said that they remove the need for pupils to memorise anything. This is the ‘Just Google It’ fallacy, which I dealt with briefly here and here, and which E.D. Hirsch deals with comprehensively here.[vi] Put simply, looking things up effectively on the internet requires a great deal of knowledge to begin with.

What I think you can see from this is that too often the idea of 21st century skills is just a codeword for an attack on knowledge and memory. This is ironic because, as I now want to explain, the message of late 20th century and 21st century science is that knowledge and memory are unbelievably important.

As Kirschner, Sweller and Clark put it:

‘our understanding of the role of long-term memory in human cognition has altered dramatically over the last few decades. It is no longer seen as a passive repository of discrete, isolated fragments of information that permit us to repeat what we have learned. Nor is it seen only as a component of human cognitive architecture that has merely peripheral influence on complex cognitive processes such as thinking and problem solving. Rather, long-term memory is now viewed as the central, dominant structure of human cognition. Everything we see, hear, and think about is critically dependent on and influenced by our long-term memory.’[vii]

You will see that Kirschner et al. say that our understanding of human cognition has altered dramatically over the last few decades. A large part of this is down to the work of the pioneers of artificial intelligence. In the 1950s and 1960s, scientists set out to create artificial intelligence in computers, and as they did so they realised that their understanding of real, human intelligence was incredibly hazy. The research they did to try to understand real intelligence is fascinating and has huge implications for the classroom. And as Kirschner et al. suggest, one of their strongest findings was that knowledge plays a central part in all human cognition. The evidence for this is solid. Much of the early research involved chess players, including one striking experiment by Adriaan de Groot; the electronic chess games that can beat you are based on the research these AI pioneers did. Research in other fields confirms the same conclusion. Dan Willingham sums it up with this line:

Data from the last thirty years lead to a conclusion that is not scientifically challengeable: thinking well requires knowing facts, and that’s true not just because you need something to think about. The very processes that teachers care about most – critical thinking processes such as reasoning and problem solving – are intimately intertwined with factual knowledge that is stored in long-term memory (not just found in the environment).[viii]

And yet, as we have seen, many advocates of ‘21st century skills’ speak disparagingly of knowledge and want to marginalise its place in the curriculum. This is despite the fact that there is no research or evidence backing up their ideas. Indeed, the guilty secret of the 21st century skills advocates is that it is their own ideas which are rather old hat. Diane Ravitch notes how, at the beginning of the 20th century, many educators wanted to throw away traditional knowledge and embrace ‘20th century skills’.[ix]

The most depressing thing about all of this, therefore, is that old ideas which have been thoroughly discredited are being warmed over and presented as being at the cutting edge. And it is particularly ironic that the actual cutting-edge science is telling us to do the complete opposite of what most of the ‘21st century skills’ advocates want.

 


[i]   ‘Shift Happens’, http://www.youtube.com/watch?v=ljbI-363A2Q Accessed 19 July 2011.

[ii] Royal Society of Arts, Opening Minds. ‘What is RSA Opening Minds?’ http://www.rsaopeningminds.org.uk/about-rsa-openingminds/ Accessed 19 February 2011.

[iii] National Advisory Committee on Creative and Cultural Education. All Our Futures: Creativity, Culture and Education. 1999, p. 14. http://www.cypni.org.uk/downloads/alloutfutures.pdf Accessed 19 February 2011.

[iv] Association of Teachers and Lecturers. Subject to Change: New Thinking on the Curriculum. London, 2006. http://www.atl.org.uk/Images/Subject%20to%20change%20-%20curriculum%20PS%202006.pdf Accessed 19 July 2011.

[v] Fisch, Karl. ‘Shift Happens’. http://www.youtube.com/watch?v=ljbI-363A2Q Accessed 21 January 2011.

[vi] Hirsch, E. D., ‘You Can Always Look It Up – Or Can You?’, American Educator, Spring 2000. http://www.aft.org/pdfs/americaneducator/spring2000/LookItUpSpring2000.pdf

[vii] Kirschner, P. A., J. Sweller and R. E. Clark, ‘Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching’, Educational Psychologist 41:2 (2006), 75-86, p. 76. http://www.cogtech.usc.edu/publications/kirschner_Sweller_Clark.pdf

[viii] Willingham, Daniel T., Why Don’t Students Like School?, San Francisco: Jossey-Bass, 2009, p. 28.

[ix] Ravitch, Diane. ‘21st Century Skills: An Old Familiar Song’. Common Core. http://commoncore.org/_docs/diane.pdf