One of the two great challenges of online formative assessment is how to actually get the students to do it if it does not contribute directly to a final grade. Assessment drives behaviour. Students will do almost anything (bar study!) to get grades. The rational utilitarian human, beloved of economists, fills our campus, if not our lecture halls. A cunningly devised 1000-question MCQ bank might be a great tool for them to assess their level of knowledge, but if there is no grade attached to it, and they know the final exam is two essay questions, it'll never see a click. They may agree with their lecturers that the formative assessment would help them learn and be of great benefit, but they still won't do it.
Medical students, of course, are seldom far from their MCQ practice books, always testing their knowledge against practice questions - self-directed formative assessment. This is because so much of their assessment appears to rest on questions of this type (if not here in UCC, then beyond it). The practice questions are a valid simulation of the real assessment event - they have authenticity for the students. That doesn't mean you need to turn your final exam into an MCQ fest. Questions about the content of an audio file might be great prep for an oral exam in languages. Questions about an image could be useful prep for a practical exam in mineralogy or pathology.
Another approach is to get them to do the formative tests in class. This is tricky for an online assessment, as they may not have easy web access in class. If you revert to paper, you are back to marking again (unless you take the time in class to do a peer marking exercise).
Feedback is also important. If students know that you will take note of the results to guide your teaching - revisiting topics where students scored poorly - they are more likely to treat the assessments as a worthwhile expenditure of their precious time.
Bear in mind that machine-gradable questions are not all MCQs - 'Who Wants to Be a Millionaire'-style fact-recall tests. Blackboard alone supports 14 different question types. With creativity and thought, they can be genuine tests of high-level reasoning and critical thinking. I'm currently collating an example set of such 'smart questions' for a workshop on the topic (contributions welcome!).
There is also no reason you can't mix non-machine-gradable questions into your online assessment. Most systems will support questions with free-text responses, and even two or three in the mix could lift the test to a higher level. For example, in the humanities, you might ask some MCQ-style questions about whether a given historical text (or audio of a speech) is incomplete, biased or unbalanced in one direction or another, to test a reader's critical reading skills. Then add a short free-text question where they elaborate on why they imagine the bias exists. It will slow down the assessment, but not as much as running a full pen-and-ink test.