Preferred Name

Rory A. Lazowski

Creative Commons License

This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.

Date of Graduation

Spring 2015

Document Type


Degree Name

Doctor of Philosophy (PhD)


Department of Graduate Psychology


Dena A. Pastor


Abstract (Paper 1)

After initially being used in the physical sciences, meta-analysis entered psychology and education only in the early 1970s, and its use in these fields has grown steadily in recent years. Meta-analysis is often lauded as an effective analytic tool to inform practice and policy, disentangle conflicting results among single studies, and identify areas that require additional information for a certain topic. However, because routine use of meta-analysis is relatively recent, there remain methodological issues that require clarity. In addition, as more advanced analytical and statistical techniques emerge, there is a need to examine how these techniques can be applied to meta-analysis and how they differ from more traditional approaches. Using data from a recent meta-analysis conducted by Lazowski and Hulleman (2015), this work is intended as a tutorial examining some of the methodological issues associated with meta-analysis. More specifically, the tutorial first examines the use of effect sizes in meta-analysis, the choice of analytic technique (fixed- versus random-effects models using traditional approaches), and comparisons of traditional approaches with a more recent approach to meta-analysis, multilevel modeling. The tutorial highlights differences in results that can be obtained depending on whether a fixed-effects or random-effects model is adopted. The tutorial also largely demonstrates similarities between the results obtained from traditional approaches to meta-analysis and those from the multilevel approach, although some differences are discussed in notation, output, and initial models used, along with the additional flexibility afforded by the multilevel analyses. Next, the issue of publication bias is discussed, and methods to detect publication bias (the funnel plot, Orwin's fail-safe N, and the trim-and-fill method) are presented and subsequently illustrated using the Lazowski and Hulleman (2015) data. Finally, the present investigation concludes with an examination of best practices related to the inclusion of both published and unpublished (grey) literature in meta-analyses.
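The fixed- versus random-effects distinction discussed in the abstract can be sketched in a few lines. The effect sizes and sampling variances below are hypothetical, chosen only for illustration (they are not the Lazowski and Hulleman, 2015, data), and the between-study variance is estimated with the standard DerSimonian-Laird moment estimator.

```python
import math

# Hypothetical standardized mean differences (d) and their sampling
# variances -- illustrative values only, not data from any real study.
d = [0.2, 0.5, 0.8]
v = [0.04, 0.05, 0.06]
k = len(d)

# Fixed-effects model: weight each study by its inverse sampling variance.
w_fe = [1 / vi for vi in v]
pooled_fe = sum(wi * di for wi, di in zip(w_fe, d)) / sum(w_fe)
se_fe = math.sqrt(1 / sum(w_fe))

# Random-effects model: estimate between-study variance tau^2 with the
# DerSimonian-Laird estimator and add it to each study's sampling variance.
q = sum(wi * (di - pooled_fe) ** 2 for wi, di in zip(w_fe, d))
c = sum(w_fe) - sum(wi ** 2 for wi in w_fe) / sum(w_fe)
tau2 = max(0.0, (q - (k - 1)) / c)
w_re = [1 / (vi + tau2) for vi in v]
pooled_re = sum(wi * di for wi, di in zip(w_re, d)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))

print(f"fixed:  d = {pooled_fe:.3f} (SE = {se_fe:.3f})")
print(f"random: d = {pooled_re:.3f} (SE = {se_re:.3f}, tau^2 = {tau2:.3f})")
```

When the studies are heterogeneous (tau^2 > 0), the random-effects standard error exceeds the fixed-effects one, which is one source of the differing results the tutorial highlights; when tau^2 = 0, the two models coincide.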

Abstract (Paper 2)

Intervention studies are a particularly important and valuable facet of educational research. This paper first discusses how intervention work can help inform theory, research, and policy/practice in a multitude of ways. Despite these benefits, however, intervention research in the field of education has been on the decline over the past two decades (Hsieh et al., 2005; Robinson et al., 2007). The field of academic motivation research is no different: notwithstanding the considerable volume of theoretical, qualitative, observational, and correlational studies, there have been fewer experimental tests of motivation theory in the field of education (Wentzel & Wigfield, 2007). To systematically evaluate what has been done to date, Lazowski and Hulleman (2015) conducted a meta-analysis examining motivation interventions conducted in authentic educational field settings (e.g., classrooms, workshops) and found that the interventions in this meta-analytic review were promising, with an average effect size of approximately half a standard deviation (d = 0.49; 95% CI [0.42, 0.56]). However, although formal meta-analytic techniques provide a quantitative analysis useful for summarizing the interventions, one limitation is that there is often not enough space to also provide a comprehensive narrative review of the included studies. A narrative review can thus offer qualitative insight that complements the quantitative analyses produced by meta-analysis. Toward this end, in this paper I offer a more thorough narrative review of the studies included in our meta-analysis. Given the conceptual overlap among the theories and constructs therein, the expectancy-value framework is proposed as a means to organize the various intervention studies. In accordance with this organization, theories are categorized based on whether the studies primarily target student (a) expectancies, (b) values, or (c) cost. In addition, within these general categories I identify specific sources or pathways of expectancies, values, and cost that can be targeted by interventions. These sources or pathways refer to the underlying psychological processes that both serve as antecedents and are potentially amenable to intervention by educational practitioners, including teachers, parents, and administrators (Hulleman et al., in press).


