
Development Social Science in medical journals: diagnosis is caveat emptor

Aid Watch has complained before about shaky social science analysis or shaky numbers published in medical journals, which were then featured in major news stories. We questioned creative data on stillbirths, a study on health aid, and another on maternal mortality.

Just this week, yet another medical journal article made headlines by giving us the number of women raped in the DR Congo (standard headline: a rape a minute). The study extrapolated country-wide a 2007 estimate of the rate of sexual violence from a small sample (of unknown and undiscussed bias). It did this using female population by province and age cohort — in a country whose last census was in 1984. (Also see Jina Moore on this study.)

We are starting to wonder, why does dubious social science keep showing up in medical journals?

The medical journals may not have as much capacity to catch flaws in social science as in medicine. They may also wish to advocate for more action on tragic social problems. And the news media understandably assume the medical journals ARE vetting the research.

We could go on and on with examples. The British Medical Journal published a study of mortality by five-year age bands for both men and women from birth to age 95 for 126 countries — an improbably detailed dataset. (The article searched through all the age groups to see if any group's mortality was related to income inequality.) Malaria Journal published a study of nationwide decreases in malaria deaths in Rwanda and Ethiopia, even though the study itself admitted that its methods were not reliable for measuring nationwide decreases (a small caveat left out later when Bill and Melinda Gates cited the study as progress of their malaria efforts).

The Lancet published a study that tested an "Intervention with Microfinance for AIDS and Gender Equity (IMAGE)" in order "to assess a structural intervention that combined a microfinance programme with a gender and HIV training curriculum." The conclusion: "This study provides encouraging evidence that a combined microfinance and training intervention can have health and social benefits." This was a low bar for "encouraging": only 3 of the 31 statistical tests run in the paper showed any effect — when 1 out of every 20 independent tests of this kind will show an effect by pure chance. (The Lancet was also the culprit in a couple of the links in the first paragraph.) Economics journals are hardly foolproof, but it's hard to imagine research like this getting published in them.
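To make the multiple-comparisons point concrete, here is a minimal sketch (our illustration, not a calculation from the study itself, and assuming the 31 tests are independent, which tests within one paper rarely are) of what pure chance produces across 31 tests at the conventional 5% significance level:

```python
import math

# Under the null hypothesis (no real effect), each independent test at the
# 5% significance level has a 0.05 chance of a false positive.
alpha, n_tests = 0.05, 31

# Expected number of spurious "effects" among 31 tests: 0.05 * 31 = 1.55
expected = alpha * n_tests

# Probability of 3 or more significant results by chance alone,
# from the binomial distribution: 1 - P(0, 1, or 2 false positives)
p_3_or_more = 1 - sum(
    math.comb(n_tests, k) * alpha**k * (1 - alpha) ** (n_tests - k)
    for k in range(3)
)

print(f"expected false positives: {expected:.2f}")
print(f"P(>=3 significant by chance): {p_3_or_more:.2f}")
```

In other words, chance alone would hand a researcher between one and two "significant" results in a batch of 31 tests, and 3 or more roughly a fifth of the time — which is why 3 out of 31 is such weak evidence.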

Medical journals would presumably not tolerate shaky medical science in the name of advocacy; why in social science? We also care about rape, and stillbirths, and dying in childbirth. That’s why we also care about the quality of social science applied to these tragic problems.

Postscript: we are grateful to Anne Case and Angus Deaton for suggestions and comments on this article, while not attributing to them any of the views expressed here.

This entry was posted in Academic research. Both comments and trackbacks are currently closed.


  1. Mozza wrote:

    Interestingly enough, your tweet announcing this blog post was next to a tweet by @timharford announcing this article about a Harvard economist helping doctors.

    “The experience was as illuminating to Roth as it was to some of the doctors who had been skeptical that an economist would have any clue how to help them.”

    And further down, he says that to get his ideas about kidney transplants accepted, he'll need to publish them in a medical journal.

    Maybe the solution is having social scientists review the articles with social science submitted to medical journals, rather than isolating the two disciplines.

    Posted May 16, 2011 at 11:34 am | Permalink
  2. I coincidentally ended up blogging about a related issue on the Development Impact blog today, about a study in the journal Science that got headlines this week.

    However, I also wonder whether in our quest to nail all the arrows in your napkin drawing (and let's not forget there are more steps than you show, like "then provide theory, and mechanisms, and discuss external validity, and, and, and…") we sometimes miss some interesting proofs of concept or ideas …

    But totally agree with the point about shoddy statistics being used for great headlines.

    Posted May 16, 2011 at 11:37 am | Permalink
  3. Not to put too fine a point on it, but the last census was actually taken in 1982, not 1984. That said, voter registration for the 2006 elections did provide for some better estimations of populations in the DRC.

    Posted May 16, 2011 at 12:12 pm | Permalink
  4. Cathy Watson wrote:

    For a really good study that is not getting any attention on the development blogs, please see The Height of Nations study by Professor SV Subramanian et al at Harvard. Using data from 54 countries, 31 of them in Africa, they show that women in 14 African countries are getting shorter and that women in 15 African countries are stagnating in height — when the worldwide trend is to get taller. This shows that conditions are getting worse for women and children. We need to discuss what this means. I wrote this up for The Star in Nairobi and The New Vision in Kampala.

    Posted May 16, 2011 at 1:21 pm | Permalink
  5. Gabo wrote:

    Great point. Well, the medical-pharmaceutical complex has been pulling a fast one on us here at home too. The WSJ and NYT have both repeatedly written about the dubious claims of psychotropic drugs and their relationship to depression. And the list is long. It's just endemic to the field given the rampant corruption and influence of drug lobbies. And they carry that on with international stuff. Very good point.

    Posted May 16, 2011 at 3:07 pm | Permalink
  6. Ash S wrote:

    I feel these criticisms are strangely vague. It seems to me like most of the potential biases (old population data, the fact that it's 2006-7 instead of now, and obviously the under-reporting of sexual assault to unknown questioners) are downward biases — that the estimates are more likely to be under- than over-estimates.

    Difficult to tell about the sample size and biases since I don’t have the article, but since it’s not really that difficult to do random sampling, and it has been done in a few excess mortality studies of the DRC conflict, I don’t see why this would necessarily be a problem.

    This is the first study to try to put some kind of figure on the wider problem. It provides a shockingly high figure that seems to me to be probably an underestimate. Yes it creates headlines, but it seems to me that you are taking exception to it for precisely that reason rather than anything really methodological.

    Posted May 17, 2011 at 7:33 am | Permalink
  7. Neal wrote:

    This problem is also found in the environment field, with very dodgy economics getting published in journals like Science

    Posted May 18, 2011 at 3:36 am | Permalink
  8. Grant Rhodes wrote:

    I think there are more than 6 or 7 thousand medical journals, are there not? What a lot of paper this must be. It must serve some other use than the NYT? Presuming this has been going on a while, let's say at least 20 years, how many PhDs in 'economics and/or finance' from medical and public health schools would this make? And what jobs might such people be interested in going on to do? I wouldn't fancy the chances of an average MOF or PFM person facing such massed ranks of highly qualified expertise in a policy meeting.

    Maybe development economists should have been paying a bit more attention to the ‘action research’ theories and practices of their (medical anthropology and )sociological colleagues. Luckily, Social Network Analysis makes quantitative analysis of the application of such ‘soft’ skills more possible these days.

    Posted May 18, 2011 at 12:09 pm | Permalink
  9. Canada sucks wrote:

    I’ve been thinking about lobbies and how they are partially equivalent to dictatorships corruption-wise, in that central bank money is being siphoned to loud brats. I think investors look at corruption indexes and penalize corruption. Much of inefficient or evil lobbying is similar yet is untouched or is even encouraged (future floods and famines) by NATO and the CIA. A partial solution would be for captured countries to be penalized. The standard lobby voice I’m aware of in this hemisphere is obviously rich people since Reagan and precursor Chicago School. These countries should depreciate currencies or pay more to borrow.
    In the developing world, foreign aid can be tied to best practises. But in the developed world there isn’t a similar incentive to make a happy healthy population. The Christian Right has plateaued quality-of-living in North America. At the very least things like the recent coffee and prostate cancer reduction link, the social cost-estimate of university… these should be subject to rigorous studies if novel, and the research should be conducted in a way that ports easily to other countries. The researchers have a bit of this networking, but the granting Crowns should immediately be able to follow up the research.
    The 2006 GPI had a flaky valuation of $16000 for the social benefit of an individual university degree. Where there is that much uncertainty the ROI must be higher. Long-term Bonds are the most efficient funding mechanism, indexed to pay off a year after the new middle class enjoys higher earnings. There is more to developing a tax base than Crowns, and if some cheaper future funding avenue comes along there will need to be an early payout penalty….
    For example, if university were assured to return around 16%/yr for the developing world… the bonds for a university construction might start paying off in eight years (construction, classes and time until first degree job), maxing out at highest expected wage. These are expensive loans. But the Bonds would merely need to be marketed and demoed for mainstream finance the way microfinance was absorbed after 15 years of Yunus demos. Getting info and data is expensive and would only be cost-effective if future funds were dependent on data. Another example, some USA States have healthy lunch programmes. Why not track grade scores and earnings potentials? Is surely worthwhile if the world uses the data. Maybe a fund for the data and some sort of Thomson-Reuters database Crown ROI researchers (not lobbyists, people looking at the science of longevity and happiness).

    Posted May 18, 2011 at 2:58 pm | Permalink
  10. Reuben Waltz wrote:

    Nice post… I just wonder if you’re as critical about your research as you are with the other research.

    In any case, I must agree with the overall picture.


    Posted May 19, 2011 at 12:00 am | Permalink
  11. Chike wrote:

    @Cathy Watson.

    I don’t think you can conclude that women in Africa are getting shorter. My hunch is that women born in the 80s and 90s (during the IMF mandated “structural adjustment programmes”) would have less access to medical services and nutrition. So they will tend to be shorter than their mothers.

    I was brought up in a middle class family during 80s and 90s and my parents could barely afford milk. It must have been much worse for poorer people.

    The middle class in Africa is growing so I suspect, the next generation of women will be taller.

    Posted May 20, 2011 at 12:25 pm | Permalink
  12. Michael wrote:

    I find the idea that the economic research is better to be amusing. Better obfuscated, don’t get me wrong. But it’s hardly like uncritical acceptance of a dataset is a non-economics problem.

    Posted May 22, 2011 at 12:06 am | Permalink
  13. bengoldacre wrote:

    i think journals publishing things outside of their field of expertise is risky, but i wld caution against developing a world view that economics journals are in a better shape overall than medical ones. as someone who flits into both, there are lots of things that are routine in medical journals, to a greater or lesser extent, but notably almost unheard of in economics. stuff like declarations of conflict of interest, structured write-ups, registering a protocol in advance of doing a study, etc. all of which wld be great to see more of outside medicine.

    Posted May 24, 2011 at 1:13 pm | Permalink

6 Trackbacks

  1. […] Link to the original site. Filed in Aid by Mark Oppenneer […]

  2. […] Visit link: Development Social Science in medical journals: diagnosis is caveat emptor […]

  3. […] If your methods and data are crap but the message of your paper is progressive, just get it accepted in a medical journal. […]

  4. […] Yesterday, one of my wonderful PhD students directed me to this brilliant blog post from aid watch: shaky research to solid headlines via medical journals: […]

  5. By International Health Policies » IHPNews #118 on May 23, 2011 at 9:29 am

    […] blog which closes this week, medical journals like BMJ and the Lancet are accused of consistently publishing dubious social sciences studies using unduly creative data. Even clinical studies in high impact journals are not without […]

  6. […] William Easterly and Laura Freschi have determined the decision tree for publishing crappy social science research: […]

  • About Aid Watch

    The Aid Watch blog is a project of New York University's Development Research Institute (DRI). This blog is principally written by William Easterly, author of "The Elusive Quest for Growth: Economists' Adventures and Misadventures in the Tropics" and "The White Man's Burden: Why the West's Efforts to Aid the Rest Have Done So Much Ill and So Little Good," and Professor of Economics at NYU. It is co-written by Laura Freschi and by occasional guest bloggers. Our work is based on the idea that more aid will reach the poor the more people are watching aid.

    "Conscience is the inner voice that warns us somebody may be looking." - H.L. Mencken