Enabling open‐ended questions in team‐based learning using automated marking: Impact on student achievement, learning and engagement
Published in: Journal of Computer Assisted Learning, Vol. 38, No. 5, pp. 1347-1359
Format: Journal Article
Language: English
Published: Chichester, UK: John Wiley & Sons, Inc., 01-10-2022
Summary:
Background
Different types of assessments influence learning and learning behaviour. Multiple-choice questions (MCQs) reward partial knowledge and encourage surface learning, while open-ended questions (OEQs) promote deeper learning. Currently, MCQs are part of the team-based learning (TBL) curriculum, and it is challenging to implement OEQs because immediate feedback is necessary.
Objectives
We asked whether MCQs and OEQs affect student achievement, learning and engagement differently in a TBL classroom.
Methods
The MCQ and OEQ test scores of N = 66 students were automatically captured in the Learning Activity Management System (LAMS) and compared using a switching-replications quasi-experimental design with pre- and post-tests. Student learning approaches and engagement in team activities were assessed using the Study Process Questionnaire and the Structure of Observed Learning Outcomes (SOLO) taxonomy, respectively.
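The abstract does not specify how the automated marking of open-ended responses was implemented. As a minimal sketch only (not the authors' method, and not LAMS's actual API), a free-text answer could be matched against an instructor-defined key of accepted regular-expression patterns to produce the immediate verdict that TBL phases require; every question ID, pattern and function name below is hypothetical.

```python
import re

# Hypothetical answer key: each OEQ maps to a list of accepted
# answer patterns (regular expressions). A real system would need a
# far richer key, but the matching principle is the same.
ANSWER_KEY = {
    "q1": [r"\bmitochondri(a|on)\b", r"\bpowerhouse of the cell\b"],
    "q2": [r"\bosmosis\b"],
}

def mark_open_ended(question_id: str, response: str) -> bool:
    """Return True if the free-text response matches any accepted pattern."""
    patterns = ANSWER_KEY.get(question_id, [])
    normalised = response.strip().lower()
    return any(re.search(p, normalised) for p in patterns)

# Marking on submission gives the immediate feedback TBL relies on.
if __name__ == "__main__":
    print(mark_open_ended("q1", "The mitochondria generate ATP"))  # True
    print(mark_open_ended("q2", "Diffusion of sugar"))             # False
```

Pattern matching is only one design choice; it trades recall (synonyms or paraphrases the key does not anticipate are marked wrong) for transparency and speed, which may also relate to the student scepticism about automated grading noted below.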
Results and Conclusions
Students scored significantly higher on MCQs than on OEQs for the same set of questions, but the reverse was true for application exercises (AEs), which focus on higher-level application. Most students significantly deepened their learning approaches before OEQs, while poorly prepared students were less engaged during OEQ discussions. Interestingly, students subjected to OEQs took less time and scored higher in AE discussions, suggesting better focus on higher-level thinking.
Implications
This project is significant because it deepens our understanding of the value of OEQs in TBL. Our approach is transferable to other courses and can therefore improve the quality of teaching and learning in tertiary education.
Lay Description
What is already known about this topic?
Multiple-choice questions (MCQs) do not promote deep learning as much as open-ended questions (OEQs) do.
Team‐based learning (TBL) requires MCQs because of the need for immediate feedback.
Automated marking can make it more efficient to provide immediate feedback on OEQs as required by the TBL approach.
What this paper adds?
OEQs changed student learning approaches in the TBL context.
OEQs changed the quality of team discussions during the team readiness assessment test (tRAT) and application exercise phases.
Students were concerned about the validity of automated grading.
Implications for practitioners
Consider using OEQs in TBL context for deepening student learning.
Consider how to defuse students' scepticism of new technology.
Consider forming teams with a mix of deep and surface learners.
Bibliography: Funding information: Nanyang Technological University, Grant/Award Number: EdeX. Sophia HueyShan Tan and Guillaume Thibault should be considered joint senior authors.
ISSN: 0266-4909; 1365-2729
DOI: 10.1111/jcal.12680