Do online courses really work? Only 5% of registrants complete MOOCs

Researchers from MIT and Harvard have published a study on MOOC completion statistics. They analyzed data from 17 MOOCs offered by the two universities in 2012 and 2013, and found the following:

– 841,687 people registered for the 17 MOOCs from Harvard and MIT.
– 5 percent of all registrants earned a certificate of completion.
– 35 percent never viewed any of the course materials.
– 54 percent of those who “explored” at least half of the course content earned a certificate of completion.
– 66 percent of all registrants already held a bachelor’s degree or higher.
– 74 percent of those who earned a certificate of completion held a bachelor’s degree or higher.
– 29 percent of all registrants were female.
– 3 percent of all registrants were from underdeveloped countries.
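For a sense of scale, those percentages translate into rough headcounts. This is back-of-the-envelope arithmetic of my own; the study reports the percentages, not these exact absolute numbers:

```python
# Convert the study's reported percentages into approximate headcounts.
registrants = 841_687

certificates = round(0.05 * registrants)   # earned a certificate of completion
never_viewed = round(0.35 * registrants)   # never viewed any course materials

print(f"Certificates earned:    ~{certificates:,}")
print(f"Never viewed materials: ~{never_viewed:,}")
```

In other words, roughly 42,000 certificates out of more than 840,000 registrations, with close to 300,000 people never opening the materials at all.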

So, only 5% of registrants complete. And even among those who “explored” at least half of the course content, barely more than half earned a certificate.

The rest of the article tries to argue that completion rates aren’t the best way to judge a MOOC:

A MOOC is more of a blank canvas, said Mr. Ho. Some students who register for MOOCs have no intention of completing, and some instructors do not emphasize completion as a priority. Success and failure take many forms.

I don’t buy it. Most of the people I know who registered for MOOCs would have loved to complete them, but did not have the discipline or motivation to do so. (That list includes me.)

Read the full article

Giving teachers in India bonuses for performance really works, says research

Alex Tabarrok has an interesting article which points to a very large randomized experiment by Karthik Muralidharan and others on giving teachers monetary incentives based on the performance of their students in specific subjects. It reports that this significantly improves performance not only in those subjects, but in other subjects as well.

Here, it is important to note that we are talking about Indian schools, run by the government, mostly in rural India. It is also important to note that this is not an armchair philosopher spouting opinions, but actual data from a large, randomized experiment with controls.

Here is some data that really needs to be shouted from the rooftops:

Students who had completed their entire five years of primary school education under the program scored 0.54 and 0.35 standard deviations (SD) higher than those in control schools in math and language tests respectively. These are large effects corresponding to approximately 20 and 14 percentile point improvements at the median of a normal distribution, and are larger than the effects found in most other education interventions in developing countries (see Dhaliwal et al. 2011).

Second, the results suggest that these test score gains represent genuine additions to human capital as opposed to reflecting only ‘teaching to the test’. Students in individual teacher incentive schools score significantly better on both non-repeat as well as repeat questions; on both multiple-choice and free-response questions; and on questions designed to test conceptual understanding as well as questions that could be answered through rote learning. Most importantly, these students also perform significantly better on subjects for which there were no incentives – scoring 0.52 SD and 0.30 SD higher than students in control schools on tests in science and social studies (though the bonuses were paid only for gains in math and language). There was also no differential attrition of students across treatment and control groups and no evidence to suggest any adverse consequences of the programs.

… Finally, our estimates suggest that the individual teacher bonus program was 15-20 times more cost effective at raising test scores than the default ‘education quality improvement’ policy of the Government of India, which is reducing class size from 40 to 30 students per teacher (Govt. of India, 2009).
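The conversion from SD gains to percentile points at the median is just the standard normal CDF. A quick sketch of the arithmetic (my own, using the effect sizes quoted above and assuming normally distributed scores):

```python
from math import erf, sqrt

def percentile_gain(effect_sd: float) -> float:
    """Percentile-point gain at the median for a shift of `effect_sd`
    standard deviations, assuming normal scores: 100 * (Phi(d) - Phi(0))."""
    phi = 0.5 * (1 + erf(effect_sd / sqrt(2)))  # standard normal CDF
    return 100 * (phi - 0.5)

print(f"math:     +{percentile_gain(0.54):.1f} percentile points")  # roughly 20
print(f"language: +{percentile_gain(0.35):.1f} percentile points")  # roughly 14
```

That is, a median student shifted up by 0.54 SD lands around the 70th percentile of the control distribution, which is where the article’s “approximately 20 percentile points” comes from.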

Read the full article