Is multiple choice really so bad?
For the humanities, we answer unequivocally: “sorta.” If any of the truly important questions about the human condition had unambiguously correct answers, human history would have been a long boring tale of comity and well-being.
As a result, any attempt to assess subjects in the humanities through multiple choice cannot and does not broach the questions dearest to humanists. Instead, it must skirt and skulk around, looking for secondary signs of an engaged and thinking mind. We ask our students to identify metaphors and supporting arguments without also asking: How evocative is the metaphor? How convincing is the argument?
Why? Because multiple choice questions, as we all know, must by definition contain at least one unambiguously correct answer. And as we also know, the more interesting the question, the less clear the answer.
Still, is there a way to do multiple choice that’s open-ended and allows for questions with no right answer? We think we have a (nuanced, textured) answer to that question. We have embraced multiple choice, sorta… by turning it inside out and upside down.
In Ponder multiple choice, the questions are predefined and the possible answers are infinite. The questions are not questions for the students (e.g., Can you identify the topic sentence of this paragraph?); they are questions for the author of the text, opportunities to “talk back” to the reading, and it’s up to the student to figure out where and how to ask them.
After all, we learn by asking questions, not answering them.
Our readers have a more important task than simply finding examples of Metaphor versus Simile, Hubris versus Pride, Compromise versus Conciliation (themes predefined by teachers). They are also asked to identify examples that perplex them, intrigue them, shock, disgust, inspire, agitate, make them wonder whether someone might be a touch hysterical or whether someone else is oversimplifying something that deserves more serious consideration (reactions predefined by us). They don’t simply observe and identify; they analyze and evaluate, both the author’s ideas and their own reactions to those ideas. In a word, they think.
Still, assessment is simple. Though Ponder is not a Scantron machine that can tell you automatically who was right and who was wrong, it’s concise, data-rich, and designed to call attention to good work. It is easy for both teachers and classmates to evaluate each response.
- Does it have substance?
- Is the reaction apt?
- Are the themes apt?
If not, let’s talk about it!
And Ponder is only getting smarter. While we can’t pass absolute judgment on student responses, what we can do automatically is build a nuanced profile of each student over time.
- Who’s taking the time to read in-depth articles?
- Who’s expanding beyond their comfort zone to read about new subject areas?
- When confronted with something confusing, who’s able to identify exactly how they’re confused?
- Who’s reacting emotionally? Who’s able to evaluate the soundness of logic?
- Who’s figured out how to get their classmates interested in what they’re interested in?
- Who’s good at starting conversations?
What we’re interested in is the ability to paint a portrait of readers that reflects their level of curiosity, comprehension, self-awareness and awareness of others.
We can’t produce a number with the finality of a Scantron machine.
But really, what does a 67 versus an 83 mean when we’re talking about the Bill of Rights or Leaves of Grass?
You’ll be able to explain that number better with the insights Ponder affords you into your students’ thinking. Multiple choice really isn’t so bad if you don’t let it kill the open-ended nature of intellectual inquiry. We like it, sorta.