
Do multiple choice questions assess “higher order thinking”?

I’ve been pondering whether multiple choice questions assess “higher order thinking”.

Daisy Christodoulou thinks they do:

‘It is true that many selected-response questions do measure only shallow learning, but well-designed selected-response items can probe student understanding in some depth.’

Citing Dylan Wiliam:

Wiliam gives some good examples of these types of question.

This example is from Dylan Wiliam’s site:


What can you say about the means of the following two data sets?

Set 1: 10 12 13 15
Set 2: 10 12 13 15 0

A. The two sets have the same mean.
B. The two sets have different means.
C. It depends on whether you choose to count the zero.

Daisy Christodoulou says this about it:

As he says, ‘this latter option goes well beyond assessing students’ facility with calculating the mean and probes their understanding of the definition of the mean, including whether a zero counts as a data point or not.’ I would add that these types of questions can offer better feedback than open-response questions. If you deliberately design a question to include a common misconception as a distractor, and the pupil selects that common misconception, then you have learnt something really valuable – far more valuable than if they simply don’t answer an open-response question.

Don’t get me wrong: attempting to improve multiple choice questions for “learning” is no bad thing. But mention “higher order thinking” and the alarm bells start ringing.

In my view, this type of assessment encourages teaching to the test. A good teacher knows that questions are often repeated and the same tricks used year in, year out. Exam boards get lazy, and the cost of maintaining quality must be high.

The question above is a good example. Whether a zero counts as a data point is an “erroneous proposition”. If you know the definition of “mean” you do not need to be told whether a zero is included; it is implicit in the definition.
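
To make the point concrete, here is a quick check of the two means. This is my own minimal Python sketch, not something from the original post or from Wiliam’s site; applying the definition directly, with no separate decision about the zero, gives different means, so the answer is B:

    # Mean = sum of the data points divided by their count; a zero is
    # just another data point, so no extra rule about it is needed.
    from statistics import mean

    set_1 = [10, 12, 13, 15]
    set_2 = [10, 12, 13, 15, 0]

    print(mean(set_1))  # 12.5 (50 / 4)
    print(mean(set_2))  # 10   (50 / 5) -> the means differ: answer B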

Even so, teachers will have to teach the “erroneous proposition” when it crops up on a past paper, just in case. Even good students can simply think themselves into errors. Students who are insecure in their knowledge, or simply prone to challenging their own beliefs, could start to try to game the question, looking for complexity that isn’t there.

I think we can disregard the notion of a “misconception”. If someone believes something incorrectly (as opposed to being confronted by an erroneous proposition) then they will get it wrong. They may know the definition of the mean, and how to calculate it, but the misconception is their undoing. Testing incorrect beliefs (misconceptions) does not test higher order thinking. It just tests for the misconception.

Another example is this:

15. How did the Soviet totalitarian system under Stalin differ from that of Hitler and Mussolini?

A. It built up armed forces.
B. It took away human rights.
C. It made trade unions illegal.
D. It abolished private land ownership.

Daisy Christodoulou says this:

I think this is an excellent question which very definitely asks for higher order thinking. The reason why I think this question is so good is that it tests a finer gradation of understanding. Everyone knows the Nazis and Soviets were evil, and because they were evil, it is easy for pupils to just think that their regimes were the same. And of course the regimes were very similar. But they were different in interesting ways too, and this question probes that. A pupil who got this question right would have understood something important. A pupil who didn’t would have misunderstood something quite important.

This question seems to me to be entirely dependent upon knowledge of propositions, not higher order thinking. I know both regimes did “A” and “B”. Mussolini did “C” (arguably so did Soviet Russia), and I suspect Soviet Russia did “D”, based on Marx’s view of private ownership, so I presume the answer is “D”. Even so, I’m not sure that any higher order thinking is going on. I know Marx’s view of private ownership; I don’t need to know why he held that view in order to answer the question. A more sophisticated answer would suggest that all totalitarian regimes do all four but use different discursive structures to justify what they are doing.

More problematically, a teacher, without the accompanying thought process, cannot know how, or why, a student arrived at an answer. Multiple choice questions are blunt instruments. They do not differentiate between those who don’t know the relevant definition or proposition, those who hold common misconceptions, those who are insecure in their knowledge and those who get lucky. The ability to explain how an answer was derived is surely valuable. A teacher cannot rely on the assumption that someone got it wrong because of the misconception. It would be necessary to find out why someone got it wrong, which essentially means reverting to open questions.

Multiple choice questions are a quick and dirty form of assessment: a very useful, and cheap, way to triangulate data generated by qualifications that are largely institutionally assessed, such as BTECs. As a teacher I like them because they can be gamed.

The problem for me is that multiple choice questions seem symptomatic of the neo-traditionalist desire to reduce knowledge to simple formulaic propositions: assessment easily replicated by computers, purposeful for checking that “the best that has been said and done” in the past has been transmitted faithfully. “Higher order thinking”? I remain somewhat unconvinced.


4 thoughts on “Do multiple choice questions assess ‘higher order thinking’?”

  1. Nice post.

    I made the very same points when Daisy C blogged on this issue. She did not respond, so don’t hold your breath here.

    I think there are more benefits to multiple choice questions than you suggest and I am quite a fan.

    I believe the issue is actually quite a straightforward one. I can ask a question, without any cues, that requires students to recall information from memory, analyse information and/or apply procedural knowledge. I would simply ask the student to write the answer and explain why they gave the answer they did.

    I can ask the same question but offer 4 alternative answers from which the student will choose one. Possibly the student will recall the answer or recognise the answer from the cues.

    I can use any sort of assessment I wish and the information I am able to glean from the response will differ depending on the design of the question and the type of response.

    If the student has no idea of the answer they can guess, or indeed they may be able to recognise the correct response even though they couldn’t recall it.

    A free response question can also be guessed but the range of possibilities is greater.

    I can pose a problem and ask students to answer the question and provide working. This will give me an idea of whether they have guessed or used an appropriate process for solving the problem.

    You can never guarantee that a student has used HOT skills in solving an MCQ if they simply choose a response. Having said that, you can pose a number of problems that require the same basic problem-solving process/knowledge, and maybe the statistics will give you a clue as to whether they can use HOT in this context.

    Well-designed questions can, I believe, test HOT skills reliably. One of the main issues with the Daisy C blog was that I thought the questions she used as exemplars were actually quite dire and did not support her assertions.

    I will monitor this post with interest to see what others think.

    1. Thanks for the response, appreciate it. Firstly, I agree the questions used are poor, particularly the “mean” one. I think MCQs are useful, but in terms of HOT skills I reckon we could sit here all day and struggle to agree what they are. I go back to my Derrida post (a couple of blogs ago) and say that, as often as not, we can recognise HOT skills by what they aren’t as much as by what they are.

      I don’t write off MCQs; in truth, when you are putting forward a counter-argument you tend to come across as less positive than you are. Having said that, I’m not much in favour of them, as much as anything because I’ve never seen really good questions, and as often as not I feel sorry for the students because the questions on MCQ papers are often incomprehensible.

      But you know I’m happy to change my mind.

  2. Interesting post. I too made comments on the DC blog post but have found that there is often a lack of willingness to enter into debate on areas that do not concur with her thinking. I would argue that MCQs have some place, for example in the use of software for quick and simple feedback on knowledge acquisition, which can be very useful. I use an app called Nearpod which allows me to give a series of questions to students, who can then see the feedback very quickly; the marking is all done by the machine. But I would want to recognise the cognitive limits of this.

    DC sits firmly in the “knowledge is best” camp, so I am not surprised at her reasoning, though like you I am unconvinced.
