Creative Commons License
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.
Date of Award
Doctor of Philosophy (PhD)
Department of Graduate Psychology
Dena A. Pastor
Christine E. DeMars
In higher education, we often collect data to make inferences about student learning and, ultimately, to make evidence-based changes that attempt to improve it. The validity of those inferences, however, depends on the quality of the data we collect. Low examinee motivation compromises these inferences: research suggests that low examinee motivation can lead to inaccurate estimates of examinees’ ability (e.g., Wise & DeMars, 2005). To obtain data that better represent what students know, think, and can do, practitioners must consider, and attempt to counteract, the effects of low examinee motivation. The primary purpose of this dissertation was to compare three methods for addressing low examinee motivation after data collection (i.e., “post-hoc” methods): (1) leaving the data as observed (rapid responses intact), (2) motivation filtering (listwise deleting examinees with more than an acceptable number of rapid responses), and (3) using multiple imputation with auxiliary variables to impute plausible solution-behavior responses in place of rapid responses. The data analyzed in this study came from the Natural World Test (NW-9; Sundre, 2008), administered to James Madison University students before and after they completed coursework designed to improve their quantitative and scientific reasoning skills (and thus their NW-9 scores). After each of the three methods was applied, mixed ANOVAs were performed to investigate the main effects of time and number of courses completed, and their interaction, on examinees’ scores. These analyses addressed one overarching question: Do the inferences we make about student learning depend on the post-hoc method used to address low examinee effort?
Of the three methods, motivation filtering produced the highest estimates of examinee ability, leaving the data as observed produced the lowest, and multiple imputation produced estimates between the two. Although the estimates differed by post-hoc method, the same substantive conclusions were reached: regardless of post-hoc approach, examinees’ scientific and quantitative reasoning abilities changed over time, and examinees who completed more relevant courses did not change significantly more than examinees who completed fewer relevant courses.
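The three post-hoc treatments the abstract compares can be sketched on toy data. Everything below is illustrative: the data, the 0.25 rapid-response threshold, and the single deterministic fill (which stands in for true multiple imputation with auxiliary variables) are assumptions for the sketch, not the dissertation’s actual procedure.

```python
# Hypothetical item-score data; RAPID marks a rapid (non-effortful) response.
RAPID = None

examinees = [
    [1, 1, 0, 1, 1],                  # effortful throughout
    [1, 0, 1, 1, 0],
    [1, RAPID, RAPID, 0, 1],          # some rapid responding
    [RAPID, RAPID, RAPID, RAPID, 0],  # mostly rapid responding
]

def mean(xs):
    return sum(xs) / len(xs)

# (1) Leave the data as observed: rapid responses are scored as incorrect (0).
observed_scores = [sum(0 if r is RAPID else r for r in ex) for ex in examinees]

# (2) Motivation filtering: listwise-delete examinees whose proportion of
#     rapid responses exceeds a threshold (0.25 here, an arbitrary choice).
def rapid_rate(ex):
    return sum(r is RAPID for r in ex) / len(ex)

filtered_scores = [s for ex, s in zip(examinees, observed_scores)
                   if rapid_rate(ex) <= 0.25]

# (3) Imputation stand-in: replace each rapid response with a plausible
#     solution-behavior value -- here, the examinee's own accuracy on
#     effortful items, rounded to 0/1. A crude proxy, not real MI.
def impute(ex):
    effortful = [r for r in ex if r is not RAPID]
    plausible = round(mean(effortful)) if effortful else 0
    return [plausible if r is RAPID else r for r in ex]

imputed_scores = [sum(impute(ex)) for ex in examinees]

print("observed mean:", mean(observed_scores))
print("filtered mean:", mean(filtered_scores))
print("imputed  mean:", mean(imputed_scores))
```

On this toy data the means reproduce the qualitative pattern the abstract reports (filtering highest, observed data lowest, imputation in between), because filtering removes the low-scoring rapid responders while imputation upgrades only some of their zeros.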
Foelber, Kelly J., "Using multiple imputation to mitigate the effects of low examinee motivation on estimates of student learning" (2017). Dissertations. 148.