Tuesday, October 17, 2006

Fundamentally Dishonest Codswallop

I don't link to Iraq Body Count and I don't cite their figures. There's a reason for that, and the reason is that I regard them as so flawed as to be worse than useless. Now they're at it again, defending their turf, that is. Nobody, but nobody, should trespass on their turf. Especially not people who know what they're doing. I've long regarded Iraq Body Count as little better than fraudulent. This fundamentally dishonest press release merely confirms me in that opinion. They don't dare try to attack the methodology, about which one could raise questions, so instead they resort to "examining a number of implications" which they describe as "anomalous." Ergo, the study can be dismissed. This type of thing is called "circular reasoning" and is the sort of logical fallacy for which, aged 14, I would have been sent out of the classroom in disgrace had I attempted it.

I'm very short of time this week, but this sort of fundamentally dishonest codswallop and shameful parasitism upon the misery of the people of Iraq shouldn't be allowed to stand. There will be no postings on Friday, and probably not on Saturday either; I'll be too busy going over this deeply dishonest document and pointing out what's wrong with it. In the meantime, if you want to know what an epidemiologist with wide experience of conflict mortality thinks of the study, I'll refer you to this piece from Francesco Checchi, published on Reuters AlertNet, which I reproduce below in full.

markfromireland


IRAQ DEATH TOLL

Francesco Checchi, an epidemiologist at the London School of Hygiene and Tropical Medicine, looks at the lambasting a new report on Iraq deaths has got from hostile governments. He has worked on mortality surveys in Angola, Darfur, Thailand and Uganda, and written a publication "Interpreting and using mortality data in humanitarian emergencies" for the Humanitarian Practice Network.

Reaction to the latest estimate of the conflict-related death toll in post-invasion Iraq - about 655,000 according to a study published by a joint U.S.-Iraqi team in the eminent medical journal The Lancet - reconfirms a worryingly unscientific trend in the reporting and discussion of the effects of modern conflict on human health.

Commenting on mortality estimates, especially when they suggest that a war initiated on disputed grounds has resulted - and continues to result - in catastrophic loss of life, should be done with the greatest caution.

Unfortunately, this is far from the case. As soon as the paper went online in The Lancet, U.S. critics dismissed the study, designed by reputable academics, as "politics".

Meanwhile, the U.S. president stated that he believes the study methodology "is pretty well discredited", and added: "I stand by the figure. A lot of innocent people have lost their life. Six hundred thousand - whatever they guessed at - is just not credible."

The British government issued a response very much similar to that which followed release of the first Lancet survey in 2004: "The problem with this is they're using an extrapolation technique from a relatively small sample from an area of Iraq which isn't representative of the country as a whole".

The Australian PM stated that the estimate is "not plausible, it's not based on anything other than a house to house survey".

Answering back

Much of the above is either arbitrary or in urgent need of rectification. For example:

  • The choice of method is anything but controversial. In theory, representative household surveys are always a better approach than body counts, which, as Burnham et al. point out in their interesting discussion, have always turned out to significantly under-estimate true death tolls. There's nothing wrong with estimation per se, so long as one provides a confidence range (which in this case we have); provided that the sample size is reasonable (it is), the only risk is incurring some error unrelated to sample size (bias), such as, for example, systematically interviewing households that were particularly affected by violence, or getting distorted information from interviewees.
  • The Lancet survey does not perform "extrapolation" from a small sample, as the British government claims. It estimates a death rate and merely applies it to the time period and population within which that death rate was measured - a statistically transparent procedure, given that the survey covered the entire country with the exception of two Governorates.
  • It is not the case that every point in the confidence interval is equally likely. In fact, assuming that there was little bias, the true death toll is much more likely to be close to the point estimate (655,000) than to the lower (393,000) and upper (943,000) bounds of the confidence range. It isn't a dartboard. (A short numerical sketch after this list illustrates the last two points.)
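
The last two points lend themselves to a short numerical sketch, given below in Python. Only the point estimate (655,000) and the confidence bounds (393,000 and 943,000) are taken from the article; the population, time period and death rates are illustrative placeholders, and the normal approximation of the confidence interval is a simplification of mine rather than the authors' actual method.

```python
import math

# (1) "Estimates a death rate and applies it to the population and time period":
#     excess deaths = (post-invasion rate - pre-invasion rate) x person-years.
#     All four inputs below are illustrative placeholders, NOT the study's figures.
population = 26_000_000      # assumed population covered by the survey
years = 40 / 12              # assumed length of the post-invasion period, in years
pre_war_rate = 5.0 / 1000    # hypothetical baseline deaths per person-year
post_war_rate = 13.0 / 1000  # hypothetical post-invasion deaths per person-year

excess_deaths = (post_war_rate - pre_war_rate) * population * years
print(f"Illustrative excess-death estimate: {excess_deaths:,.0f}")

# (2) "Not every point in the confidence interval is equally likely": under a
#     crude normal approximation, the sampling distribution puts far more
#     probability density near the point estimate than near either bound.
point, lower, upper = 655_000, 393_000, 943_000   # figures quoted in the article
se = (upper - lower) / (2 * 1.96)                 # half-width treated as 1.96 standard errors

def normal_density(x: float, mean: float, sd: float) -> float:
    """Probability density of a normal distribution at x."""
    z = (x - mean) / sd
    return math.exp(-0.5 * z * z) / (sd * math.sqrt(2 * math.pi))

peak = normal_density(point, point, se)
for label, x in [("point estimate", point), ("lower bound", lower), ("upper bound", upper)]:
    print(f"Relative likelihood at {label:>14}: {normal_density(x, point, se) / peak:.2f}")
```

Under this rough approximation the density at either bound comes out at less than a fifth of the density at the point estimate, which is all the "dartboard" remark is saying.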

I too find the survey's estimates shockingly high: while the trend of increasing violence seems indisputable, the sheer death toll seems at first glance inconsistent with even the most pessimistic rule-of-thumb guesses, at least from an outsider's perspective.

However, dismissing Burnham et al.'s work simply on gut feeling grounds seems more than irrational. A very similar methodology is routinely applied in many other settings for the same purpose.

The Lancet's publication process, while obviously prone to human error, is designed to identify only the most scientifically solid medical research, thanks to anonymous review by recognised experts in the field.

The survey has many compelling aspects - for example, reported deaths were certifiable; non-violent death rates were broadly consistent with pre-war conditions, suggesting no over-reporting by families; the profile and typology of violent deaths reflect what would be expected; and, crucially, the findings closely mirrored those of a previous Hopkins/Mustansiriya survey in 2004.

No study is perfect

Of course, no study is perfect, all the more so when conducted in the most insecure country on Earth - so insecure, in fact, that carrying around a harmless GPS unit in order to randomly select households to interview puts one at risk of having it mistaken for a bomb detonator, as Burnham et al. point out.

Indeed, unable to use GPS for sampling, the research team settled for a less ideal approach based on random selection of residential streets, which is probably more prone to bias (the sketch below illustrates one way such bias can arise).
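
To make that concern concrete, here is a toy simulation in Python. Everything in it - the town, the share of households lying on listed streets, the two risk levels - is invented for the purpose of illustration; it is not a description of the survey team's actual procedure, only a sketch of the general mechanism: households that a street-based frame can never reach may face a different level of risk from those it can.

```python
import random

random.seed(1)  # reproducible toy example

# Hypothetical town of 10,000 households. "on_listed_street" marks households
# that a street-based sampling frame could ever reach; the 60% share is invented.
households = [{"on_listed_street": random.random() < 0.6} for _ in range(10_000)]

# Suppose, purely for the sake of argument, that exposure to violence differs
# between the two groups (the direction and size in reality are unknown).
for h in households:
    risk = 0.03 if h["on_listed_street"] else 0.02
    h["violent_death"] = random.random() < risk

true_rate = sum(h["violent_death"] for h in households) / len(households)

# A street-based sample can only ever contain listed-street households.
reachable = [h for h in households if h["on_listed_street"]]
sample = random.sample(reachable, 500)
sampled_rate = sum(h["violent_death"] for h in sample) / len(sample)

print(f"True rate across the whole town : {true_rate:.4f}")
print(f"Rate seen by street-based sample: {sampled_rate:.4f}")
```

Whether the real sampling frame over- or under-covered the most violent areas is unknown; the point of the sketch is only that a frame which cannot reach some households will be biased whenever those households face a different level of risk.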

But therein lies the rub: in Iraq today, insecurity has made it almost inhumanly difficult to conduct proper research on the harms and benefits of war. Indeed, what both media and pundits never seem to highlight as a deeply troubling anomaly is that, were it not for the work of a few courageous researchers such as the Hopkins/Mustansiriya University team, or the painstaking work of concerned members of the citizenry such as the Iraq Body Count project, quantifying the effects of the U.S.-led intervention on human health would largely be a matter of divination.

Twenty-four hours later, the Lancet study is fast disappearing from the news headlines. Dismissing and, worse, ignoring this and other alarming findings simply because "they sound wrong" is no way to move forward - if they can't be proven wrong (or partly wrong) on scientific grounds, they must certainly stand until better evidence emerges.

Indeed, coalition powers should, in the interest of public accountability and the very success of their mission in Iraq, promote and facilitate more accurate and transparent monitoring of all humanitarian law violations, and of the true effects of violence on Iraqi civilian health.

Any views expressed in this article are those of the author and not of Reuters.