Date of Award

Spring 2013

Document Type

Thesis

Degree Name

Master of Arts (MA)

Department

Department of Graduate Psychology

Abstract

This thesis investigated the assessment of student résumés by a career services office. Specifically, the dependability of assessment scores was examined before making inferences about the value added by the career office's résumé appointment program. Systematic errors in performance assessment ratings of student résumés were examined to determine the overall dependability of the assessment scores and the precision with which raters score student performance. The absolute dependability of scores was excellent when rubric elements were fixed. Given information about rater precision around rubric element scores, recommendations were provided for rater training and measurement tool improvement. Such evidence adds to the assessment scores' validity argument and, specifically, to the validity of inferences regarding the generalizability of student performance assessment scores (Kane, Crook, & Cohen, 1999). The value-added analyses revealed medium to large standardized effect sizes across most résumé elements. Measurement information revealed that the one area showing no improvement, the objective statement, was affected by inconsistent scoring rules.

Included in

Psychology Commons
