Assessment Feedback: How Much Should You Give?
By: Annette Wisniewski, MS, CPT
When just starting out as an instructional designer, one of my colleagues had been told, “All learners should be getting the exact same content in their feedback. We don't want to single out the people who got it wrong.” Ever since then, she has “been providing the same information in both correct and incorrect feedback, and hating it because it [feels] wrong; it feels like it's limiting the learning. It's not prompting the learners to think.” She asked me for guidance on how to provide feedback.
I had never heard that "We don't want to single out the people who got it wrong." That's the whole point of assessments – to differentiate those who "can" from those who "can't." People who answered a question correctly might find explanations of why they got the answer right excruciating; then again, by their very nature, multiple-choice questions supply the right answer – respondents only have to identify which answer(s) is correct. If a question is poorly written, respondents might have guessed the right answer, in which case they might appreciate the additional explanation. But I would rather write good questions, assume that respondents weren't just guessing, and honor their accomplishment by NOT tormenting them with a regurgitation of what they have just demonstrated they already know.
But how much feedback you provide to right or wrong responses really depends on the purpose of the assessment, the learning goals, and the budget.
Pre-assessment: If the goal of the assessment is to determine what a person knows BEFORE any training, limit the amount of information you provide in feedback to wrong answers. Entice them into taking optional training, or help them realize the value of upcoming mandatory training. Give them a teaser answer, and then direct them to the training that will help them fill in their knowledge gaps. For example, if you were trying to assess how much a new Pharma employee knew about the effects of artificial sweeteners on gut health, your feedback to an incorrect response might be: “Incorrect. A packet of one specific artificial sweetener can negatively affect gut flora for two years. To learn more, take [insert training module name].”
Certification: If the goal of the assessment is to award certification for achieving mastery, do not provide any corrective feedback, other than indicating whether a response is correct or incorrect. For this type of assessment, your goal is not to help respondents learn from their mistakes, but to determine if they are worthy of the credential. And you don’t want to make it easy for them to retake the assessment by spoon-feeding them the correct responses.
Learning: If the goal of the assessment is to help respondents recognize what they learned from a training program and what they didn’t, this is where you would provide correct answers, as well as inform learners where they can get more information or opportunities to practice. If you want to allow them two attempts to answer correctly, then wait to provide detailed feedback until after the second attempt. Allowing more than two attempts is usually pointless, as learners will have already eliminated two of the possible answers.
Note: Assessments are NOT the time to teach learners something new. That’s the purpose of training. If you feel compelled to provide new content during an assessment, go back and add it to the associated training material too.
Sometimes, learning goals influence the type of feedback you should provide to a learner. Think about the purpose of the course and your company culture. Are you trying to encourage learners to take initiative? Conduct research? Feel comfortable with failure as a learning strategy? Depending on the circumstances, you may not want to provide learners with corrective feedback, but make them find the answers on their own (or not).
Writing up corrective feedback tailored to individual responses takes time and resources. It is much easier and cheaper to provide the same feedback for both correct and incorrect answers. Think about what you are trying to accomplish with your assessment in terms of respondents and your organization. Count up the number of questions involved. Then, determine if the extra expense is worth it.
I wrote this blog mostly thinking about multiple-choice/multiple-response questions, but these principles apply to any type of assessment instrument. Sometimes, you need to use more than multiple choice to elicit the desired behavior. As one of my mentors is known to say, “If you’re testing swimming skills, there better be water involved.”
Whether you are measuring knowledge, skills, performance, or proficiency, Judge Learning Solutions knows how to create well-written, performance-focused, subjective assessments tailored to your business needs. Contact Judge at JLS@judge.com to learn more.