"Improving Academic Outcomes through Time Management: A Randomized Control Trial of a Scheduling Intervention to Improve Performance in Online Postsecondary Courses"
Association for Education Finance and Policy (AEFP) 43rd Conference
March 15-17, 2018
Authors: Bianca Cung, Qiujie Li, Brent Evans, Rachel Baker
Online education is taking an increasingly prominent role at postsecondary institutions. Public, non-profit, and for-profit colleges offer for-credit online courses that fulfill requirements in certificate and degree programs. Research has found that success in online courses varies by student characteristics such as motivation, interaction in the course, proclivity toward self-regulated learning, and time management skills (Cochran et al., 2014; Hart, 2012; Park & Choi, 2009; Rostaminezhad et al., 2013). Several studies suggest that time management skills are critical to success in online courses (Elvers, Polzella, & Graetz, 2003; Michinov, Brunot, Le Bohec, Juhel, & Delaval, 2011; Nawrot & Doucet, 2014; Roper, 2007), and one meta-analysis found a positive relationship between time management and academic outcomes (Broadbent & Poon, 2015).
In prior work, we experimentally studied the effect of a time-management scheduling intervention on academic performance in a for-credit online course (Baker, Evans, Li, & Cung, 2017). Students who received the scheduling treatment were asked, before the first two weeks of class, to plan and indicate when they would watch the course lecture videos. Students in the control condition received a theoretically inert survey. We found statistically significant treatment effects on the first week’s quiz scores, but not on other outcomes. We further examined two potential mechanisms that could explain the scheduling nudge’s effect on achievement: procrastination (defined as the weekly average of the times at which students start watching lecture videos) and cramming (defined as the weekly standard deviation of those start times). Although both procrastination and cramming were correlated with quiz scores, we found no significant treatment effects on either mechanism.
We hypothesized that the results of our first study could reflect two design features: the intervention was offered for only the first two weeks of the course, and students were given incentives for creating a schedule but no incentive to comply with it.
To test these hypotheses, we extended our research to a similar scheduling intervention administered to students across all five weeks of the same course in a subsequent year. The course was offered during the summer term at a selective four-year public institution. Students assigned to the treatment survey were asked before the start of each week to schedule when they would watch the course lecture videos. Unlike in our previous study, students in the treatment group received extra credit only if they both completed the survey and followed through as scheduled, which we observe on the course platform. Students assigned to the control condition received a theoretically inert survey asking about their experience taking an online course, and they received full extra credit for completing it.
We have access to achievement outcomes, clickstream data from the course management platform, and survey responses of enrolled students. The survey data includes a number of validated measures of time management, effort regulation, and self-efficacy. These measures allow us to examine heterogeneous effects of the intervention across subgroups that we theorize might be differentially affected by scheduling structure.
We conducted regression analyses using weekly quiz and final exam scores as outcomes. Assignment to treatment did not have a statistically significant effect on achievement. However, we found suggestive evidence of treatment heterogeneity across time management characteristics. Specifically, we focused on three covariates from the pre-course survey: whether students expected to work more than 11 hours per week on the course, whether students perceived themselves to be good at time organization and scheduling, and whether students usually procrastinated, postponing their studying until close to the deadline. Among students who reported fewer expected working hours, lower ability in time organization and scheduling, and higher procrastination, treatment students scored consistently lower than control students on quizzes in the first four weeks of the five-week course. While consistent in sign, only some of these estimates reached marginal significance.
Ongoing analyses will investigate how the treatment affects students' study habits in the online course, including whether students start work earlier and whether they spread out their study sessions. We will analyze student clickstream data and compare the times at which students start course lecture videos and post-video questions, using regression analyses to compare the weekly average and spread of video start times.
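As an illustrative sketch of these mechanism measures, the weekly average (a procrastination proxy) and spread (a cramming proxy) of video start times could be computed from clickstream records along the following lines. The record layout and field names here are hypothetical; the actual platform schema is not described in this abstract.

```python
from collections import defaultdict
from statistics import mean, pstdev

def weekly_study_patterns(events):
    """Summarize each student's weekly video start times.

    `events` is a list of (student_id, week, start_hour) tuples, where
    start_hour is hours elapsed since the start of that course week
    (hypothetical layout, for illustration only).

    Returns {(student_id, week): (avg_start, spread)}, where a later
    avg_start suggests procrastination and a smaller spread suggests
    cramming (sessions bunched closely together).
    """
    by_cell = defaultdict(list)
    for student_id, week, start_hour in events:
        by_cell[(student_id, week)].append(start_hour)
    return {
        key: (mean(hours), pstdev(hours))
        for key, hours in by_cell.items()
    }
```

These per-student-week summaries would then serve as outcomes in the regression analyses described above.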
Many educational institutions are adopting and expanding online education, and improving student success in those courses is a pressing goal. Our study investigates an inexpensive, easy-to-implement intervention that could affect achievement and persistence in online courses.