Transparency of Cognitive Complexity in Performance Assessments: A Validity Study

Bibliographic Details
Published in: The Journal of Competency-Based Education, Vol. 6, No. 2
Main Authors: Hayes, Heather; Demeter, Marylee; Morris, John G.; Trajkovski, Goran
Format: Journal Article
Language: English
Published: Wiley, 01-06-2021
Description
Summary: Performance assessments (PAs) offer a more authentic measure of higher-order skills, making them well suited to competency-based education (CBE), especially for students who are already in the workplace and striving to advance their careers. The goal of the current study was to examine the validity of undergraduate PA score interpretation in the college of IT at an online CBE higher education institution by evaluating (a) the transparency of the cognitive complexity, or demands, of a task as communicated through the task prompt versus the expected cognitive complexity based on its associated rubric aspect, and (b) the impact of cognitive complexity on task difficulty. We found a discrepancy between the communicated and expected cognitive complexity of PA tasks (i.e., prompt vs. rubric): rubric complexity is, on average, higher than task prompt complexity. This discrepancy negatively impacts reliability but does not affect the difficulty of PA tasks. Moreover, the cognitive complexity of both the task prompt and the rubric aspect significantly impacts the difficulty of PA tasks when measured with Bloom's taxonomy but not with Webb's Depth of Knowledge (DOK), and this effect is slightly stronger for the rubric aspect than for the task prompt. Discussion centers on how these findings can inform and improve PA task writing and review procedures for assessment developers, and on how the difficulty level of PAs can be customized to different course levels or individual students to improve learning.
ISSN: 2379-6154
DOI: 10.1002/cbe2.1244