His subject is an ambitious public-health project in Mumbai, India, run by the Municipal Corporation of Greater Mumbai and University College London. Despite trying a number of approaches, the project did not drive meaningful improvement in infant health in the city. The researchers, to their credit, published their findings on the web, concluding, in part, "Facilitating urban community groups was feasible, and there was evidence of behaviour change, but we did not see population-level effects on health care or mortality."
What is noteworthy is that when the project did not work as planned, the team reported it openly and in detail, providing potentially valuable information for other researchers.
The risk is that too few will follow this example. Especially in tough economic times, aid organizations face pressure to show donors they are getting bang for their buck. Last year an Obama administration official called on the aid community to adopt a "permanent campaign mind-set," in which fund-raising and promotion are on the front burner. This creates an incentive to go for easy victories, highlight successes and bury failures. Even amid the aid world's new enthusiasm for metrics and impact assessments, public reports are rarely forthcoming about missteps.
Here's a quotation we used in our previous post: "Science is very inefficient. You try an experiment, fail, try again, fail, try again, it works. And what works is what you publish. All the data about failure is wasted."
Very true. Let's hope others follow the lead of the Municipal Corporation of Greater Mumbai and UCL, and do not reserve publication only for projects that succeed.