Evidence and Learning Styles

Posted on 26-02-2012

In my previous post I spoke about the different types of evidence used in education. Here is a concrete example of what I mean.

The Education Endowment Foundation has been set up to review the effectiveness of certain educational reforms. It provides an overview of a range of different reforms on its website, together with an estimate of their effectiveness. One of these is learning styles, which they say has a ‘low impact’ of +2 months. The further detail on this suggests that whilst the impact is low, ‘one or two pupils in a class of 30 might benefit from being taught in this way.’ They draw this conclusion from meta-analyses of the effect of learning styles.

They do link to a good article explaining the science behind learning styles more fully. But they don’t summarise this article, nor do they mention that the general scientific consensus is that learning styles don’t exist. Let me repeat that: learning styles do not exist. And yet the EEF is considering their effectiveness as a teaching strategy.

Although their summary makes it clear that learning styles lack impact, and that the evidence for this is robust, the fact that they are given a +2 month rating is confusing, as is the suggestion that one or two students in a class might benefit from the approach. If you didn’t know any of the research, you could easily conclude that learning styles are a valid if low-impact intervention that might help a couple of students, and that since they are fairly low-cost they might be worth retaining.

This is exactly what I meant in this post about scientific evidence being neglected in the English educational establishment. To my mind, this page should focus far more on the scientific reasons why we know learning styles don’t exist.

And a part of me wonders whether the EEF should even be considering learning styles as a candidate for inclusion in these studies, especially if conducting these meta-analyses costs a lot of money. I wouldn’t expect them to do meta-analyses of Brain Gym, for example. The significant thing about Brain Gym wasn’t that the research showed it had no impact; it was that it was scientifically implausible in the first place.