Data Collection Helps Improve MOOC Experiences

Massive open online courses (MOOCs) are evolving, thanks to the extensive data collected during early efforts to offer college-level classes to a worldwide online audience for free. According to MIT Technology Review, as the number of MOOCs grows, course designers are already looking at ways to build a better MOOC 2.0.

Andrew Ng, co-founder of Coursera, one of the largest MOOC providers, explains that each course the company offers is an opportunity to collect information on how students learn, yielding insight on an unprecedented scale. All of that information is being used to tailor courses so that more students succeed in completing them.

How detailed is the data being collected? According to Ng, the company tracks “every mouse click” – every student interaction with course materials, including which parts of lecture videos get watched and which get fast-forwarded. Data collection has become so important that it is quickly emerging as the chief focus of MOOC technology efforts, taking priority even over infrastructure improvements to accommodate ever-greater numbers of students.
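The article does not describe Coursera’s internal systems, but as a rough illustration of what “tracking every mouse click” can mean, a clickstream logger for video interactions might record events like the following. The event names and fields here are hypothetical, not the company’s actual schema:

```python
import json
import time

def log_video_event(user_id, course_id, video_id, event, position_sec):
    """Append one clickstream event (play, pause, seek, etc.) to a log file.

    Field names are illustrative only; they are not Coursera's real schema.
    """
    record = {
        "timestamp": time.time(),
        "user_id": user_id,
        "course_id": course_id,
        "video_id": video_id,
        "event": event,                # e.g. "play", "pause", "seek_forward"
        "position_sec": position_sec,  # where in the video the event occurred
    }
    with open("clickstream.log", "a") as f:
        f.write(json.dumps(record) + "\n")

# Example: a student fast-forwards past minute three of a lecture video.
log_video_event("u123", "ml-001", "lecture-05", "seek_forward", 180)
```

Aggregating records like these across thousands of students is what makes it possible to see which parts of a video get watched and which get skipped.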

Some recent findings have vindicated aspects of MOOC design. Princeton researchers used data from Coursera to show that the company’s system of peer grading, which calculates grades for coursework from feedback provided by other students, is effective. Other findings have challenged assumptions about how an online course can successfully cater to hundreds of thousands of students or more.
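The article does not spell out how peer feedback is combined into a grade. As a minimal sketch, assuming each submission receives several numeric peer scores on the same rubric and the final grade is their median (an aggregation rule chosen here purely for illustration, not Coursera’s actual formula), the calculation might look like this:

```python
from statistics import median

def peer_grade(scores, min_reviews=3):
    """Combine peer scores for one submission into a single grade.

    Assumes all scores use the same rubric scale; the median is used as an
    illustrative way to blunt the effect of outlier reviewers.
    """
    if len(scores) < min_reviews:
        raise ValueError("not enough peer reviews to assign a grade")
    return median(scores)

# Example: five classmates score the same essay on a 0-10 rubric.
print(peer_grade([7, 8, 6, 9, 7]))  # -> 7
```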

The information being collected is already having an impact on course design at another MOOC provider – Udacity. From the beginning, the company’s courses relied chiefly on short videos to convey information. But according to the data, students seem to get bored even with such bite-sized chunks, preferring to skip videos rather than watch them.

As a result, the company’s focus has shifted. Sebastian Thrun, who left Stanford University to found Udacity, explains that since that discovery, the courses offered on the site rely much less on video. Even in classes that still use video, many lessons have been re-recorded to make them more engaging to watch.

Some of the analyses taking place at MOOC companies appear to be answering more-modest questions. “A/B testing,” a methodology common at Internet companies, is being used to try out small design tweaks that might nudge students to do better. A/B testing shows different versions of a service to different segments of a site’s audience to see how they react.

Through A/B testing, says Ng, Coursera recently found that its practice of e-mailing people to remind them of upcoming course deadlines actually made students less likely to continue with the courses. But sending e-mails summarizing students’ recent activity on the site boosted engagement by “several percentage points.” One A/B test by Udacity pitted a colorized version of a lesson against a black-and-white version. “Test results were much better for the black-and-white version,” says Thrun. “That surprised me.”
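As a rough illustration of the methodology (not either company’s actual system), A/B assignment is often done by hashing a user ID so that each student consistently lands in the same group. The experiment name, variants, and 50/50 split below are assumptions made for the example:

```python
import hashlib

def assign_variant(user_id, experiment, split=0.5):
    """Deterministically assign a user to variant 'A' or 'B'.

    Hashing user_id plus the experiment name gives a stable pseudo-random
    value in [0, 1); users below `split` see variant A, the rest see B.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map hash to [0, 1]
    return "A" if bucket < split else "B"

# Example: decide whether a student gets the activity-summary e-mail (A)
# or the deadline-reminder e-mail (B), then compare completion rates later.
print(assign_variant("student-42", "email-style-test"))
```

Because the assignment is deterministic, a student sees the same version of the course every time they log in, and the two groups’ completion rates can be compared once enough data has accumulated.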