Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Date of Graduation

Spring 2013

Document Type

Thesis

Degree Name

Master of Arts (MA)

Department

Department of Graduate Psychology

Abstract

This thesis investigated the assessment of student résumés by a career services office. Specifically, the dependability of assessment scores was examined before making inferences about the value added by the career office’s résumé appointment program. Systematic errors in performance assessment ratings of student résumés were examined to determine the overall dependability of the assessment scores and the precision with which raters score student performance. The absolute dependability of scores was excellent when rubric elements were treated as fixed. Recommendations for rater training and improvement of the measurement tool were provided based on information about rater precision around rubric element scores. Such evidence adds to the assessment scores’ validity argument and, specifically, to the validity of inferences regarding the generalizability of student performance assessment scores (Kane, Crooks, & Cohen, 1999). The value-added analyses revealed medium to large standardized effect sizes across most résumé elements. Measurement information revealed that the one area showing no improvement, the objective statement, was affected by inconsistent scoring rules.
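
The abstract's reference to absolute dependability corresponds to the index of dependability (Φ) from generalizability theory. The sketch below is illustrative only, assuming a simple persons × raters G-study design with hypothetical variance components and rater counts; the thesis's actual design and estimates are not reproduced here.

def phi_coefficient(var_person, var_rater, var_residual, n_raters):
    # Absolute dependability (phi) for the mean of n_raters ratings:
    # person variance divided by person variance plus absolute error variance.
    absolute_error = (var_rater + var_residual) / n_raters
    return var_person / (var_person + absolute_error)

# Hypothetical variance components for persons, raters, and the
# person-by-rater residual; two raters per résumé.
print(round(phi_coefficient(0.50, 0.05, 0.20, n_raters=2), 3))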

Included in

Psychology Commons

