American Educational Research Association (AERA) Annual Meeting
Theme: Leveraging Educational Research in a “Post-Truth” Era: Multimodal Narratives to Democratize Evidence
April 5-9, 2019
Title: Language and Gender Equity in Online Classrooms (Paper)
Event: Gender Inequality in Achievement, Self-Efficacy, and Classroom Interactions in US Education
Authors: Lily Fesler, Thomas Dee, Brent Joseph Evans, Rachel Baker
Abstract: A long-standing body of evidence indicates that female students experience bias in education, starting in early elementary school and continuing through college and into the workplace (e.g., Bian, Leslie, & Cimpian, 2017; Moss-Racusin, Dovidio, Brescoll, Graham, & Handelsman, 2012). However, we still know little about how teachers and students in online classes interact with male and female students. A recent study has shown that teachers in online classrooms are twice as likely to respond to white males as to other students (Baker, Dee, Evans, & John, 2018), but differences in response rates could mask additional biases in the content of the responses. The increasing availability of text data in educational contexts, combined with innovative analytic methods that can be applied to such data, enables a large-scale analysis of the content of interpersonal interactions between instructors and students and between student peers.
We investigate whether female students experience qualitatively different interactions, from both students and teachers, in Massive Open Online Courses (MOOCs). To identify the causal impact of a student’s perceived gender on the language of peers and instructors, we rely on a field experiment in which fictitious students with gendered names post comments in the discussion forums of 124 MOOCs. We then compare responses from real students to the randomized fictitious male and female posters to investigate whether the language of instructor and student responses varies by the initial poster’s gender in online education forums.
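The logic of the random-assignment step described above can be sketched in a few lines. This is a hypothetical illustration only: the name lists, course identifiers, and one-comment-per-course structure are placeholders, not the study's actual protocol.

```python
import random

# Illustrative sketch: each planned forum comment is posted under a
# fictitious name drawn at random, so differences in responses can be
# attributed to the poster's perceived gender. Names and course IDs
# below are made-up placeholders.
MALE_NAMES = ["James", "Michael"]
FEMALE_NAMES = ["Emily", "Sarah"]

def assign_posters(course_ids, seed=0):
    """Randomly assign a gendered fictitious name to a comment in each course."""
    rng = random.Random(seed)  # fixed seed for a reproducible assignment plan
    assignments = {}
    for course in course_ids:
        gender = rng.choice(["male", "female"])
        name = rng.choice(MALE_NAMES if gender == "male" else FEMALE_NAMES)
        assignments[course] = {"gender": gender, "name": name}
    return assignments

# One randomized poster per course across 124 hypothetical MOOCs.
plan = assign_posters([f"mooc_{i}" for i in range(124)])
```

Because assignment is random with respect to course and content, any systematic difference in how instructors and peers respond to "male" versus "female" posters can be interpreted causally.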
To measure instructor and student language, we utilize both traditional qualitative coding and text-as-data methods. We use qualitative coding to measure the amount of assistance, acknowledgement, individual attention, and disconfirmation in each response (Johnson & LaBelle, 2016). We then use dictionary-based (word-count) methods to analyze the sentiment and linguistic style of each response, and use an unsupervised machine learning method to investigate whether respondents discuss different topics with the male and female randomized posters.
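The two automated approaches above can be sketched as follows. This is a minimal illustration, not the study's pipeline: the tiny word lists stand in for a validated dictionary (e.g., LIWC-style categories), the example responses are invented, and the specific topic model shown (scikit-learn's latent Dirichlet allocation) is an assumption about what "unsupervised machine learning method" might look like in practice.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy dictionaries standing in for a validated sentiment lexicon.
POSITIVE = {"great", "good", "helpful", "thanks", "nice"}
NEGATIVE = {"wrong", "bad", "confusing", "unclear"}

def sentiment_score(text):
    """Dictionary-based score: share of positive tokens minus share of negative tokens."""
    tokens = text.lower().split()
    if not tokens:
        return 0.0
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos - neg) / len(tokens)

# Invented example responses to forum posts.
responses = [
    "great question thanks for posting",
    "that answer is wrong and confusing",
    "the gradient update uses the learning rate",
    "set the learning rate before the gradient step",
]

# Word-count sentiment for each response.
scores = [sentiment_score(r) for r in responses]

# Unsupervised topic model: do respondents raise different topics?
vec = CountVectorizer()
X = vec.fit_transform(responses)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)  # each row is a topic distribution summing to 1
```

In an analysis of this kind, the per-response sentiment scores and topic shares would then be compared across the randomized male and female posters.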
Although real male and female students discuss the same topics with the randomized male and female posters, we find that female posters are 28 percentage points less likely to receive assistance and 22 percentage points more likely to receive acknowledgment from the instructional team. This is consistent with qualitative research from the 1980s, which also found that teachers gave more assistance to male students than to female students (Sadker & Sadker, 1986). Additionally, instructors use more positive language with female posters than with male posters, and use linguistic styles with less analytical thinking and clout (although these stylistic differences are not statistically significant). Instructors' provision of less substantive feedback to female students may discourage females from continuing their online studies (Ho et al., 2015). Additionally, in STEM courses, female students receive eight percentage points more disconfirming responses than male students, as well as eleven percent less acknowledgment. This result may help explain why female students often avoid STEM courses in college.