Preferred Name
Juste Mehou
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Date of Graduation
5-15-2025
Semester of Graduation
Spring
Degree Name
Master of Arts (MA)
Department
Department of Graduate Psychology
First Advisor
Brian Leventhal
Abstract
International assessments, such as the Programme for International Student Assessment (PISA), are instrumental in understanding educational progress, assessing teaching effectiveness, and ensuring institutional accountability (OECD, 2023). Central to the validity of these assessments is the assumption that students exert sufficient effort and perform to the best of their abilities (AERA, APA, & NCME, 2014). This assumption is essential to ensure that scores accurately reflect students' true abilities (Wise & Kong, 2005). However, there is a documented concern that students often exhibit low effort on low-stakes tests, such as the PISA, where individual results have no direct consequences (Wise & DeMars, 2010). This lack of personal stake can lead to disengagement, introducing construct-irrelevant variance into the scores. Research examining predictors of students' engagement at the test design and item levels in international assessments like the PISA is limited; the only notable study on the PISA is by Rios and Soland (2022), who employed a probit link function to model disengagement as a dichotomous variable. However, this approach may not account for the correlation between disengagement and ability, potentially biasing estimates of both. This research addresses that gap by employing a latent variable approach that accounts for the potential correlation between ability and disengagement, leveraging the flexibility of the Item Response Tree (IRTree) model for disengagement developed by Leventhal and Pastor (2024). Specifically, I integrate an explanatory model within the IRTree framework, incorporating test design and item-level characteristics hypothesized to influence disengagement. Overall, the results confirm previous findings while also introducing new insights regarding test design and item position. I discuss these results in detail and explore avenues for future research.
Included in
Educational Assessment, Evaluation, and Research Commons, Educational Methods Commons, Educational Psychology Commons, Science and Mathematics Education Commons
