Measuring MOOC Success Continues to Evolve

How do you measure whether a Massive Open Online Course is actually accomplishing anything when only a small percentage of students ever complete the material? John Markoff's New York Times blog reports that researchers are working on ways to define success and to compare MOOC progress against traditional instruction.

When we think of MOOCs, we usually think of open-enrollment courses that present lecture material and optional homework to thousands of anonymous students around the world. Coursera's service is structured this way, and each semester more universities announce partnerships to teach or promote classes. San Jose State University just announced a new pilot program with Coursera, and the American Council on Education has recommended five of Coursera's offerings for possible "real" college credit. A new component of San Jose State's plan, funded by the National Science Foundation, is to use online data to evaluate the effectiveness of its pilot courses and so inform the debate over whether MOOCs should be accepted for credit. Although the students may come in unimaginable numbers and never meet the instructors, they all pass through the university's web portal and leave a great deal of data behind.

Advocates of MOOCs point out that while online options may not do some things as well as classrooms, there are also many things they can do better. The key may be to capture these advantages with new ways to teach.

Udacity, along with other MOOC designers, is moving rapidly away from the video lecture model of teaching toward an approach that is highly interactive and based on frequent quizzes and human "mentors" to provide active online support for students.

Moreover, there are early indications that the high interactivity and personalized feedback of online education might ultimately offer a learning structure that can't be matched by the traditional classroom.

The best-studied learning platform so far is Duolingo, a set of language courses developed at Carnegie Mellon University. Markoff explains that the program uses "crowd-sourcing" as a learning incentive, which isn't surprising considering that its senior academic adviser is Luis von Ahn, the inventor of "reCaptcha." Von Ahn's insight behind reCaptcha was that when people type words into boxes to prove they are using human vision rather than robotic text-reading, they expend effort that can be redirected toward retyping historical newspapers and books.

Von Ahn's team designed Duolingo to work like a game, with "lives" and points, so that students advance in rank while learning more of their target language. The service is free because businesses can submit texts they need translated; sentences are "crowd-sourced" to students, who vote for the best translation. With at least one million users, texts can be translated quickly.

CMU also commissioned a research study to see if Duolingo was teaching language as efficiently as a classroom.

Conducted this fall by Roumen Vesselinov, a visiting assistant professor of economics at Queens College, City University of New York, and John Grego, chairman of the statistics department at the University of South Carolina, the study compared Duolingo users to classroom learners and found that a person with no knowledge of Spanish would need an average of 34 hours to cover the material of a first college semester.

In a standard 15-week semester, students spend at least 3 hours per week in the classroom, for a total of 45 class hours, excluding extra time in language labs and on homework. If 34 hours online can teach the same material, the online approach is probably more efficient. Because of Duolingo's game-like structure, its students may also be more motivated to put in extra time than students who are working for a grade. Game structure is much harder to replicate in a classroom, so this innovative type of online learning could turn out to be a hands-down winner.

EducationNews © 2020