Extreme responses: Impacts and interventions
Faculty Advisor Name
Brian C. Leventhal
Department
Department of Graduate Psychology
Description
When researchers and evaluators collect information on people's attitudes, they often assume that individual responses represent the respondent's true attitude. However, attitude-irrelevant factors such as response styles (RSs) may contribute to individuals' responses. An RS is the tendency for a respondent to use a response scale in a systematic way regardless of the scale's content (Leventhal & Stone, 2018; Paulhus, 1991; Plieninger & Meiser, 2014). For example, Extreme Response Style occurs when respondents consistently choose the most extreme response option on a scale (e.g., Strongly Disagree or Strongly Agree) even when it does not reflect their true attitude toward the question. This is a problem because the presence of a response style hinders measurement of the attitude. Thus, we may not be able to draw accurate conclusions about student attitudes from attitudinal measures if response styles are present.
For the current study, I investigated whether a different item format can affect response tendencies. Six hundred first-year undergraduate students responded to Task Value (TV) and Self-Efficacy (SE) subscales on Fall 2018 Assessment Day. These subscales address student perceptions of the General Education program. I presented the TV and SE subscales to two separate groups, with each group receiving a different item format. One group received items manipulated to use a two-stage forced-decision format (Figure 1 and Figure 2); the other group received a traditional item format (Figure 3).
At the first stage, students responded to the question with the following response options: Disagree, Neutral, Agree (Figure 1). This first stage requires respondents to specify whether they agree or disagree with a question; that is, it represents the direction of the response. If the student chooses Neutral at the first stage, the student moves on to the next question. If the student expresses agreement or disagreement at the first stage, they then provide a response at the second stage. The response options at this second stage depend on the response given at the first stage. For example, if a respondent chose Disagree at the first stage, their second-stage options would be Strongly Disagree, Disagree, and Neutral (Figure 2). Thus, this stage represents the strength of the response. This two-stage process can be depicted as a decision tree (see Figure 4): the arrows depict the direction of the response options, and the circular nodes depict the stages of the response process. Note how this decision-making process leads to the final response at the bottom of Figure 4.
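The two-stage mapping described above can be illustrated with a short sketch. This is a hypothetical illustration, not the study's instrument code: the option labels follow the description, and the function name `final_response` is invented for this example.

```python
# Sketch of the two-stage forced-decision format described above.
# Stage one captures direction; stage two captures strength.

STAGE_ONE = ("Disagree", "Neutral", "Agree")

# Stage-two options depend on the stage-one direction (per Figure 2).
STAGE_TWO = {
    "Disagree": ("Strongly Disagree", "Disagree", "Neutral"),
    "Agree": ("Neutral", "Agree", "Strongly Agree"),
}

def final_response(direction, strength=None):
    """Return the final recorded response for a two-stage item."""
    if direction not in STAGE_ONE:
        raise ValueError(f"Unknown stage-one option: {direction!r}")
    if direction == "Neutral":
        # Choosing Neutral at stage one ends the item; no second stage.
        return "Neutral"
    if strength not in STAGE_TWO[direction]:
        raise ValueError(f"{strength!r} is not offered after {direction!r}")
    return strength

print(final_response("Neutral"))                        # Neutral
print(final_response("Disagree", "Strongly Disagree"))  # Strongly Disagree
```

Note that every path through the tree still ends at one of the five traditional response options; only the route to that option changes.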
In sum, this two-stage item presentation allows researchers to investigate how different item presentations may affect response styles. I will present the differences in response frequencies between the traditional item presentation and the new two-stage format. Depending on the results of the current and future studies, this proposed item format may transform the way we collect data at JMU, particularly on Assessment Day.