Date of Award

Spring 2015

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Department

Department of Graduate Psychology

Advisor(s)

Donna L. Sundre

Abstract

Assessment practitioners in higher education face increasing demands to collect assessment and accountability data to make important inferences about student learning and institutional quality. The validity of these high-stakes decisions is jeopardized, particularly in low-stakes testing contexts, when examinees do not expend sufficient effort to perform well on the test. This study introduced planned missingness as a potential solution. In planned missingness designs, data on all items are collected, but each examinee completes only a subset of items, thus increasing data collection efficiency, reducing examinee burden, and potentially increasing data quality. The current scientific reasoning test served as the Long Form test design. Six Short Forms were created to serve as the planned missingness design, which incorporated 50% missing data. Examinees mid-way through their educational career were randomly assigned to complete the test as either a Long Form or a Short Form. Multiple imputation was used to estimate parameters for both conditions. Group mean test performance was higher in the Short Form condition than in the Long Form condition. Although slightly higher examinee motivation and less shared variance between test-taking effort and test performance were observed in the Short Form condition, these effects were not statistically significant. Internal consistency coefficients and item parameter estimates were also similar across the form conditions. Although effect sizes were small, the implications of these results for assessment practice are substantive. This study supported the use of planned missingness designs for accurate estimation of group student learning outcomes without jeopardizing psychometric quality. The synthesis of the planned missingness design and examinee motivation literatures provides several opportunities for new research to improve future assessment practice.
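The core idea of the design described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the study's actual materials: the item count, sample size, and response model are invented, and six Short Forms are built as three complementary pairs of item halves so that each examinee sees exactly 50% of the items while every item is still administered to a random subgroup.

```python
import numpy as np

rng = np.random.default_rng(0)

n_items = 12       # hypothetical test length (not the study's actual test)
n_examinees = 600  # hypothetical sample size

# Simulated item responses (1 = correct, 0 = incorrect).
responses = rng.binomial(1, 0.6, size=(n_examinees, n_items)).astype(float)

# Six Short Forms built as three complementary half-splits of the items,
# so each form covers 50% of items and each item appears on 3 of 6 forms.
forms = []
for _ in range(3):
    perm = rng.permutation(n_items)
    forms.append(perm[: n_items // 2])
    forms.append(perm[n_items // 2:])

# Randomly assign each examinee one Short Form and mask the items
# that form does not administer (the planned missingness).
assigned = rng.integers(0, 6, size=n_examinees)
observed = np.full(responses.shape, np.nan)
for i, f in enumerate(assigned):
    observed[i, forms[f]] = responses[i, forms[f]]

# Group-level item statistics remain estimable despite 50% missing data,
# because missingness is by design (completely at random).
item_means = np.nanmean(observed, axis=0)
```

In practice the masked responses would be handled with multiple imputation, as in the study; the simple complete-case item means here only show that the planned-missing layout leaves every item observed for a random subset of examinees.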
