I think DT Willingham has something to say about education. He has identified, or at least publicised, a real issue for teachers: the relationship between skills and knowledge. In particular, how we teach skills and such ephemeral things as critical thinking, and indeed whether it is possible to teach these things at all. I think you can, and we must, but that is perhaps a topic for another blog.
I think, though, that DT Willingham has gone too far, particularly with the Core Knowledge Foundation, and has left scientific evidence way behind.
DT Willingham makes a lot of assertions. He says things like:
Most of the teachers I know entered the profession because they loved school as children
Perhaps this is true. I know a lot of teachers and they have never mentioned their own school experience, but anyway, the point is that as a consequence teachers are ill-prepared for life in classrooms, being simply bemused by students’ lack of desire to learn. I think this trivialises the problems teachers face, particularly in the UK.
It stems from an ideology which suggests that the problems in education lie with teachers and their approach to classroom practice, in particular a type of teaching loosely (very loosely) described as progressive.
What teachers haven’t realised is that the reason their students don’t like school is because:
Contrary to popular belief, the brain is not designed for thinking
You do wonder what it is designed for, and what DT Willingham compares it to. As far as I am aware, the brain is the only thing that does any thinking. Of course, you begin to realise that DT Willingham thinks that thinking is solving parlour games. Somewhat reminiscent of Kirschner et al., Willingham likes to use puzzles to highlight his views:
In an empty room are a candle, some matches, and a box of tacks. The goal is to have the lit candle about five feet off the ground. You’ve tried melting some of the wax on the bottom of the candle and sticking it to the wall, but that wasn’t effective. How can you get the lit candle to be five feet off the ground without your having to hold it there?*
Apparently the answer to the problem is not easy. People, presumably even smart people, can’t solve it. The reason being that we have not stored the answer in long-term memory. Or rather, in cognitive-scientific terms, we haven’t stored the answers to similar problems in long-term memory. I suppose even I can’t argue with the view that having the answer to a problem makes solving the problem easier.
The solution is that you use the tacks to secure the box to the wall, then balance the candle on top of the box. As DT Willingham explains, the difference between “real life” and the artificial world of education is that in real life most people would have picked up the box of tacks, and made the connection between it and the tacks inside.
Often, when we are engaged purely in problem solving, our visual system isn’t interacting properly with the physical properties of the objects we are studying, such as the fact that the box has tacks inside it. It seems to me this has little to say about long-term memory and more about the kind of evidence used by cognitive scientists to make their points.
In the “parlour game” above, we visualise the box and not the tacks. Our cognition is tricked until, of course, previous experience of the trick makes us aware of it. Often DT Willingham’s examples don’t really seem to vindicate the point he is trying to make, and I’m not even sure what this “parlour game trick” type of research really achieves. Often knowledge is implicit to solving these puzzles, and then it is discovered that, well, knowledge is implicit to solving them. I wonder whether this is really the answer to all our educational woes. It is, I suppose, if solving parlour games is how you define education.
DT Willingham also says things like this:
Data from the last 30 years leads to a conclusion that is not scientifically challengeable: thinking well requires knowing facts, and that’s true not simply because you need something to think about. The very processes that teachers care about most—critical thinking processes like reasoning and problem solving—are intimately intertwined with factual knowledge that is in long-term memory (not just in the environment).
A somewhat bemusing statement. I wonder if driving a car can be achieved without a car. I mean, I’m not sure that anyone ever said that you could think well without facts. Long-term memory is implicit to cognition.
I actually think that, substantively, the statement is unchallengeable because it states the obvious, but on another level it is entirely wrong, because it conflates expertise with “thinking well”. I think it is perfectly possible to “think well” and not be an expert in the subject. That does not mean you reach the correct answer, but you can still “think well” and get it wrong. Or, at least, someone somewhere thinks you have got it wrong. In fact, philosophically speaking, is a fact still a fact if it is wrong? If, at some future time, Einstein’s theory is debunked, can Einstein be described as not having “thought well”?
As most of social life exists external to causal events, and to the synthetic languages (maths, etc.) used to describe them, right and wrong are a subjective matter. “Thinking well” is an epistemic perspective, not an empirical one. Quite simply, you can “think well” with little knowledge and be completely wrong. Let’s be honest, we all do it all the time. But you can’t think well and not use long-term memory, because that’s not how cognition works, nor can you drive a car without a car. You don’t need to be a cognitive scientist, nor an AA roadside engineer, to know these things.
DT Willingham also says things like this:
The human mind does not work that way. When we learn to think critically about, say, the start of the Second World War, that does not mean that we can think critically about a chess game, or about the current situation in the Middle East, or even about the start of the American Revolutionary War. The critical thinking processes are tied to the background knowledge.
I honestly cannot see why someone who can think critically about the Second World War would be unable to think critically about the current situation in the Middle East. It seems to me that this, yet again, conflates critical thinking with expertise. I think someone could think critically about either, depending on the definition of “thinking well”, and I certainly think that knowing about one can help you think better about the other, unless, of course, you persist in conflating “thinking well” with expertise, or at least have a definition of “thinking well” that is the same as expertise.
Here is another example:
Here’s a classroom-based example. Take two algebra students—one is still a little shaky on the distributive property, whereas the other knows it cold. When the first student is trying to solve a problem and sees a(b + c), he’s unsure whether that’s the same as ab + c or b + ac or ab + ac. So he stops working on the problem, and substitutes small numbers into a(b + c) to be sure that he’s got it right. The second student recognizes a(b + c), and doesn’t need to stop and occupy space in working memory with this subcomponent of the problem. Clearly, the second student is more likely to successfully complete the problem.
This seems to me to be back to the same problem. One student knows more about a problem than another and solves it more easily. That does not say much about who is “thinking well” and who isn’t, and would seem to have little to do with working memory. In fact, you can go on forever finding different variants of the same point and using them as examples.
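For what it’s worth, the “substitute small numbers” check that Willingham’s shaky student performs is easy enough to make concrete. This is just an illustrative sketch of that check (the function name and the particular numbers are my own, not Willingham’s):

```python
def distributes(a, b, c):
    """Return True if a*(b + c) equals a*b + a*c for these values."""
    return a * (b + c) == a * b + a * c

# The check the first student runs with small numbers:
print(distributes(2, 3, 4))  # 2*(3+4) = 14 and 2*3 + 2*4 = 14 -> True

# The two misremembered alternatives fail the same check:
a, b, c = 2, 3, 4
print(a * (b + c) == a * b + c)  # ab + c = 10, not 14 -> False
print(a * (b + c) == b + a * c)  # b + ac = 11, not 14 -> False
```

Which rather makes the author’s point for him: the check is trivial, it just costs the student time and attention that the fluent student doesn’t spend.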
Here is another one:
Until about 20 years ago, most researchers seemed to have the sense that the range of intelligence was mostly set by genetics, and that a good or poor environment moved one’s intelligence up or down a bit within that range.
A real turning point in this work came during the 1980s with the discovery that IQ scores over the last half century have shown quite substantial gains. For example, in Holland, scores went up 21 points in just 30 years (1952–1982), based on scores from Dutch military draftees. This is not an isolated case. The effect has been observed in over a dozen countries throughout the world, including the United States.* Not all countries have data available to be tested—you need very large numbers of people to be sure that you’re not looking at a quirky subset—but where the data are available, the effect has been found.
These increases in IQ scores are much too large to have been caused by changes in genes. Some of the increase may have come from better nutrition and health care. Some of it may have come from the fact that our environment has gotten more complex, and people are more often called on to think abstractly, and to solve unfamiliar problems—the exact sorts of things you’re often asked to do on IQ tests. Whatever the cause, it must be environmental. But how does that fit with the previous conclusion that intelligence is mostly determined by genetics?
This seems to be little more than a re-iteration of the nature/nurture debate. Surely a cognitive scientist isn’t suggesting that IQ tests measure intelligence, and yet that’s what he seems to be doing. IQ tests do not reflect intelligence, or anything else for that matter, other than IQ tests; we are back to solving parlour games. Again we are in the realms of pseudo-science, like Learning Styles and Cognitive Load Theory, so what’s the point? The likelihood is that fifty years ago most people were semi-literate and working in manual labouring jobs. No one doubts that we can learn. If I suggested that more people are computer literate now than 50 years ago, would it be meaningful?
So what’s my point? I think that DT Willingham makes some good points, but he doesn’t do it on the basis of science; rather, he is socially constructing a position based on a lot of very scanty pieces of research, leveraged to the point where his position bears little resemblance to the science.
Of course thinking requires long-term memory and knowledge; that is implicit to cognition. “Thinking well” is an epistemic question, not an empirical one, and Willingham’s position is based upon conflating “thinking well” with expertise.
It seems to me that expertise requires a lot of knowledge, but “thinking well” does not. As fellow Core Knowledge conspirator Ed Hirsch suggests:
“How much do I really need to know about DNA in order to comprehend a newspaper text directed to the common reader?”
The answer Hirsch gives: “Not much”. Presumably we can “think well” about newspaper articles even if we are not experts.
Most of us, most of the time, are thinking on the basis of a limited amount of knowledge, or pre-conceptualised knowledge. Often that knowledge is wrong, or means something other than what it has come to represent; often a claim that is wrong gets commonly re-iterated, or is attributed to a person who never, in fact, said it.
Education does not exist to deliver expertise to learners. In some cases it needs to deliver enough knowledge to facilitate a career in academia or other knowledge-based industries, but for the most part, most of us are not experts in anything and never will be. We can still “think well”, though, or at least well enough, or nothing would ever get done.
Often the most intelligent people don’t know the most; they are, however, intelligent enough to listen, to be open to ideas, and to take the time to consider positions. All these seem to be cognitive skills that have little to do with knowledge or facts. My guess is that these skills rely on personality, genetic inheritance and non-cognitive attributes as much as on long-term memory or facts.
Reducing cognition to the issue of long-term memory or facts is, in my view at least, about as much use as a chocolate teapot. In the longer term we need to look at cognition holistically, even if, in the here and now, progressives might say: point taken, too much about skills and not enough focus on knowledge.