Page last updated at 12:29 GMT (13:29 UK), Thursday, 21 May 2009

Death rates - stick, twist or bust?


Michael Blastland
GO FIGURE
Different ways of seeing stats

When you see that a hospital has higher mortality rates than is expected, you might choose to avoid it at all costs. Or you might sensibly see it as a blip, writes Michael Blastland in his weekly column.

Hundreds of hospitals in England are triggering alarms over how many of their patients are dying.

The news follows the scandal at Staffordshire General Hospital where about 400 people might have died unnecessarily. Among a list of failings, the hospital asked receptionists to assess patients arriving at casualty.

But there's always a danger of overhyping a sensitive story.

And now there is a clamour to know which other hospitals have triggered alarms. The alarm system set up to monitor patient mortality, which detected something awry in Staffordshire, currently has about 100 other alarms outstanding.

But are these hospitals dangerous, or just unlucky? And until we know for sure, what do we say about them?

Here's some data from a report last year.

MORTALITY ALERTS FOR HOSPITALS IN ENGLAND
Source: Healthcare Commission

                  Aug  Sep  Oct  Nov  Dec  Jan  Feb  Mar  Apr  May  Jun  Jul
                  -07  -07  -07  -07  -07  -08  -08  -08  -08  -08  -08  -08
Cases dealt with    4    1    5    7    1    9    9    0    2    2    9    5
Cases escalated     0    0    0    0    2    5    0    0    3    0    0    0
Open cases         11   22   26   21   30   15   13   30   20   33   18   34
Total              15   23   31   28   33   29   22   30   25   35   27   39

It shows:

  • mortality rates can rise far enough outside what's expected to cause an alert (the "Total" cases row)
  • how many were easily dismissed ("Cases dealt with") - usually because there was a quick, simple and innocent explanation
  • how many prompted further investigation ("Cases escalated"), and
  • how many were left unresolved ("Open cases"). The cumulative total soon climbs into the hundreds.
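For anyone who wants to check the table's arithmetic, a minimal Python sketch (with the figures typed in from the table above) confirms that each month's "Total" is the sum of the three rows above it:

```python
# Figures from the Healthcare Commission table, Aug 2007 - Jul 2008.
dealt_with = [4, 1, 5, 7, 1, 9, 9, 0, 2, 2, 9, 5]
escalated  = [0, 0, 0, 0, 2, 5, 0, 0, 3, 0, 0, 0]
open_cases = [11, 22, 26, 21, 30, 15, 13, 30, 20, 33, 18, 34]
totals     = [15, 23, 31, 28, 33, 29, 22, 30, 25, 35, 27, 39]

# Each month's "Total" is the sum of the three rows above it.
for d, e, o, t in zip(dealt_with, escalated, open_cases, totals):
    assert d + e + o == t

print(sum(totals))  # → 337
```

Adding the monthly totals across the year gives 337 alert-months, bearing out the point that the numbers soon climb into the hundreds.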

Our first reaction - after shock, perhaps - is to want to know of any alert in our neighbourhood. But there's a statistical objection at this point.

You can come down in favour of the story, or in favour of those hesitant statisticians.

The argument for the story is simple - what if you are about to go to one of these hospitals? What if there is an increased risk that it will kill you when by going elsewhere you might live? Why should that risk be kept secret from those whose lives it most affects? And even if it is a mere suspicion without proof, isn't any evidence worth having to help us in these judgements?

The statistical objection takes a little longer, but only a little. To see it, we need:

  • some graphs
  • a deck of cards, and
  • a street of parked cars

The problem is that people die in hospital all the time and usually, we hope, for unavoidable reasons. What's more, they don't die in the same quantities every day. Chance varies the numbers. The lines on our graphs go up and down anyway. How do we know if the numbers are high by chance, or high because something is wrong?
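That chance variation is easy to demonstrate. The sketch below is not the actual monitoring method, and the average of 30 deaths a month is an invented figure for illustration; it simply shows how far a count driven by chance alone wanders from month to month:

```python
import math
import random

random.seed(1)

# Hypothetical hospital: deaths average 30 a month (an invented figure).
# Counts of independent rare events are standard Poisson territory.
EXPECTED = 30

def poisson(mean):
    # Knuth's method: count how many uniform draws it takes for their
    # running product to fall below exp(-mean).
    limit = math.exp(-mean)
    k, product = 0, random.random()
    while product > limit:
        k += 1
        product *= random.random()
    return k

# Two years of monthly death counts from pure chance - no scandal needed.
months = [poisson(EXPECTED) for _ in range(24)]
print(months)
print("lowest month:", min(months), " highest month:", max(months))
```

Even with nothing wrong, some months come in well above 30 and some well below: exactly the up-and-down lines an investigator has to read.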

You could just wait until it's persistent and obvious. But waiting might be killing people.

Here's the mortality rate for emergency admissions in the Mid Staffordshire NHS Trust in which Staffordshire General is the main hospital, compared with the average for hospitals serving similar populations.

[Graph: Mortality 1 - Mid Staffordshire v the average for similar hospitals]

Pretty incriminating, I think.

But what we know at the time of an alert is more typically this...

[Graph: Mortality 2 - the shorter run of data typically available at the time of an alert]

...a shorter run of data that might turn out to be a Mid Staffs scenario or a chance blip that soon goes the other way altogether, such as this:

[Graph: Mortality 3 - a chance blip that soon goes the other way]

Maybe this hospital dealt, briefly, with the casualties of a local fire. We might have to watch for some time before the underlying pattern emerges. We might make elaborate checks that turn out to have been unnecessary. Meanwhile, should we publish the alert?

Looking for unusual numbers of deaths is a little like looking for aces in a pack. If you draw one at the first attempt, do you think anything of it? Probably not, it's bound to happen now and then. What about twice in succession?

OK, now to push the playing cards analogy further.

Let's say you have thousands of decks of cards each representing just one of dozens of areas and procedures in each of the hundreds of hospitals in England, any of which might trigger an alarm. What would you think if one of those yielded two aces in a row?

How about three?

Cheat or chance?

In other words, how high should the threshold be before an alert is triggered and how high before we decide something is wrong? That's the investigators' dilemma. They want to know about every ace in order to check it, understanding that it might be critical. But they also know that aces turn up all the time and sometimes in bunches. When are the aces down to chance and when are they suspicious?
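The arithmetic behind that dilemma can be sketched in a few lines. The probability of two aces on top of one shuffled deck is small; across thousands of decks a double ace somewhere is all but guaranteed. The figure of 3,000 decks below is an invented illustration, standing in for every ward and procedure in every hospital:

```python
# One deck: chance the top two cards are both aces.
p_two_aces = (4 / 52) * (3 / 51)   # = 1/221, about 0.0045
print(f"one deck: {p_two_aces:.4f}")

# Now suppose 3,000 decks are being watched - an invented figure
# standing in for every ward and procedure in every hospital.
n_decks = 3000
p_somewhere = 1 - (1 - p_two_aces) ** n_decks
print(f"at least one pair of aces across {n_decks} decks: {p_somewhere:.4f}")
```

With that many decks in play, the chance of a double ace turning up somewhere is above 0.999 - a near-certainty. That is exactly why a single alarm, on its own, proves very little.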

[Image: a hand of cards. What are the odds of drawing two consecutive aces?]

What if every card game was stopped if anyone was dealt three aces because it was assumed to imply cheating? What if it meant you were hauled outside and accused and frisked on the TV news? Would we doubt the pack, or the player, or put it down to chance?

If we publicise every alarm and the hospitals concerned are shunned, it increases pressures elsewhere and those pressures might contribute to someone's death. If we don't and it turns out there was a real problem, we failed to act when we might have saved lives.

An investigation of Papworth Hospital in 2007, after seven of 20 heart-transplant patients had died and another was seriously ill at the time, took the view that the mortality rate was so high that the unit should temporarily close. But it transpired that what had happened was analogous to a bad run of cards.
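Whether seven deaths in 20 operations is beyond a bad run is a binomial tail calculation. The 20% baseline mortality rate below is an assumption chosen purely for illustration, not Papworth's actual expected rate:

```python
from math import comb

def binom_tail(n, k, p):
    """Chance of at least k deaths among n patients if each dies with probability p."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Assumed baseline of 20% mortality per transplant - an illustrative
# figure, not the unit's actual expected rate.
tail = binom_tail(n=20, k=7, p=0.2)
print(f"P(7 or more deaths in 20) = {tail:.3f}")
```

Under that assumed baseline, a run at least that bad arises by chance a little under one time in ten - unlikely, but well within the territory of a bad deal rather than proof of a failing unit.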

Another analogy. Winning the lottery is so unlikely that anyone who does must have cheated. Yes?

Obviously, no. Chance alone produces remarkable outcomes.

Or, finally, imagine your street full of cars with alarms that ring every night, but most are faulty. It usually turns out there's no problem. Would you want them heard or silenced? Or does it depend whether one of the cars is yours?

Or would you want someone else woken first who could investigate and then tell you the result?

* For those interested in a fascinating case study of how to sift chance from real cause, the Care Quality Commission's document on following up mortality outliers is here.


Below is a selection of your comments.

It is important to note that the mortality rates at this hospital don't seem to be completely unpredictable - they rise in line with colder and darker months, something we might expect. Furthermore, there are various statistical tests that can be employed to see whether the "excess" mortality rates are significant at a certain level of confidence, or whether they should be deemed part of an ordinary probability distribution.
Antony, London

This is why proper auditing is a necessity and why the paperwork behind it helps. Whereas the Bristol heart scandal showed a deficiency in the skills of some surgeons, proper auditing can show that what seems like a deficient service is actually caused by a run of bad luck.
Mike, Newcastle-upon-Tyne

The card analogy is a false analogy. There is a clear statistical reason why you could get four aces in a row. A more realistic problem would be when a deck produced five, six or seven aces over the course of a long run of cards. Alerts for mortality rates are given after a series of unexpected numbers of deaths - not over a short period. Giving people the information is never a bad thing when it is explained properly. Making the assumption that people will not be able to understand the subtleties of this information is to underestimate the intelligence of the public.
Andrew Kliman, London

The other message from the mortality rates graph is presumably "try not to end up in A&E in the middle of any year". Presumably the reasons for this cycle are known - is it something to do with taking on newly-trained staff?
Ed, Oxford


