Wednesday, February 14, 2007

Lies, Damned Lies, And Unicef Reports

Time to break out Statistics Made Simple

The BBC and the rest of the media have been having a field day with the Unicef Report on Child Poverty in Rich Countries. Apparently we are the worst country in the developed world at looking after our children. And even more disgraceful- as Newsnight reported- it's all down to Mrs T (the first one, that is).

As always, it's useful to look at the actual report behind the hysteria (here). According to the authors:

  • European countries dominate the top half of the overall league table, with Northern European countries claiming the top four places.

  • All countries have weaknesses that need to be addressed and no country features in the top third of the rankings for all six dimensions of child well-being (though the Netherlands and Sweden come close to doing so)

  • The United Kingdom and the United States find themselves in the bottom third of the rankings for five of the six dimensions reviewed

Once more, European Model good, Anglo-American Model bad. And as the York Uni Prof who's been the Report's main spokesman here has been telling us all day, since 1979- ie the first Mrs T- we have hugely "underinvested" in childcare. Much MUCH more taxpayers' money is required.


OK, there has been some debate- the politicos have been biffing each other over who, after a decade of Labour, is really to blame. But the underlying stats have barely been challenged by anyone.

And they should be.

First, as the Report itself admits, some of them are distinctly flakey:

"Findings that have been recorded and averaged may create an impression of precision but are in reality the equivalent of trying to reproduce a vast and complex mountain range in relatively simple geometric shapes. In addition, the process of international comparison can never be freed from questions of translation, culture, and custom."

It's a standard problem with international comparisons in the area of social policy- especially those that use opinion surveys, as this one does.

With such uncertainties, the responsible and professional approach is to be very circumspect about interpreting apparent differences. Especially if those differences are relatively small.

But that's not what Unicef has done. They've taken relatively small differences and blown them out of all proportion.

Consider the measures they used. There are forty of them, ranging from relative income stats to poll data on cannabis use. For each measure, they rank countries from best to worst. Those rankings are then aggregated to produce the overall rankings that have been so widely quoted today.
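To see the mechanics, here's a quick sketch in Python of that rank-and-aggregate procedure. The countries and numbers are invented for illustration- they are not the Report's actual data:

```python
# Illustrative sketch of a rank-and-aggregate procedure
# (made-up numbers, not the Report's actual data; lower score = better).

scores = {
    "relative_poverty_pct": {"UK": 16.2, "Sweden": 3.6, "Greece": 12.4},
    "infant_mortality":     {"UK": 5.3,  "Sweden": 2.8, "Greece": 4.1},
}

def rank(measure):
    """Rank countries 1 (best) to N (worst) on a single measure."""
    ordered = sorted(measure, key=measure.get)   # sort by score, ascending
    return {country: pos + 1 for pos, country in enumerate(ordered)}

ranks = {name: rank(vals) for name, vals in scores.items()}
print(ranks)
# {'relative_poverty_pct': {'Sweden': 1, 'Greece': 2, 'UK': 3},
#  'infant_mortality':     {'Sweden': 1, 'Greece': 2, 'UK': 3}}
```

Note what ranking throws away: a country two points adrift of the leader and a country twenty points adrift both get exactly the same one-place rank gap. That loss of information is where the trouble starts.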

Well, straightaway you start asking questions. Like, have they chosen the right measures? For example, their first is a measure of relative income distribution, with the UK and US marked down for having a less equal distribution than, say, Greece, and therefore ranked as worse places, even though in reality their people are much richer.

And what about the weights used to combine the individual measures? Because obviously falling short on some measures is likely to be much more significant than falling short on others. And the answer is they didn't use any proper weighting system at all. That is to say, they simply assumed equal weighting within each of the six dimensions, and then equal weights to combine the dimensions into their overall league table.
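Continuing the sketch above, the whole "weighting system" amounts to taking an unweighted average twice over. The dimension names below follow the Report's six dimensions, but the rank numbers are hypothetical:

```python
# "Equal weighting" in practice: a plain average of ranks within each
# dimension, then a plain average across the six dimensions.
# (Dimension ranks below are hypothetical.)

dimension_rank = {
    "material_wellbeing": {"UK": 18, "Sweden": 2},
    "health_and_safety":  {"UK": 12, "Sweden": 1},
    "education":          {"UK": 17, "Sweden": 5},
    "family_and_peers":   {"UK": 21, "Sweden": 15},
    "behaviours_risks":   {"UK": 21, "Sweden": 1},
    "subjective":         {"UK": 20, "Sweden": 7},
}

def league_table_score(country):
    # Every dimension counts exactly the same, however important
    # (or flaky) its underlying measures happen to be.
    ranks = [d[country] for d in dimension_rank.values()]
    return sum(ranks) / len(ranks)

print(round(league_table_score("UK"), 1))      # 18.2
print(round(league_table_score("Sweden"), 1))  # 5.2
```

So a dimension built on soft survey answers counts for exactly as much as one built on hard mortality statistics.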

Pretty crude.

But that's nothing compared to the distortion involved in blowing up relatively small observed differences on the raw measures into big differences in the derived rankings.

We can see how that works by looking at their summary data tables in the Report's appendix. There, for each of the underlying measures, they report not just each country's score, but also the mean across all countries, and the standard deviation of the measure (ie its "average variability" across countries). That's important because if a country falls a long way short of the average on a measure that is generally not widely dispersed, it's much more significant than if it falls only a little way short on a measure that is widely dispersed anyway.
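The standard way to put that on a single scale is the z-score: how many standard deviations a country sits from the cross-country mean. A quick sketch, again with invented numbers rather than the Report's tables:

```python
# Rough z-score check: how many standard deviations is each country
# from the cross-country mean on a given measure?
# (Illustrative numbers, not the Report's actual tables.)

from statistics import mean, stdev

measure = {   # hypothetical "% of children reporting X"
    "UK": 31.0, "US": 23.0, "Sweden": 18.0, "Greece": 20.0,
    "Netherlands": 17.0, "France": 22.0, "Italy": 21.0,
}

mu = mean(measure.values())
sigma = stdev(measure.values())   # sample standard deviation

for country, score in sorted(measure.items()):
    z = (score - mu) / sigma
    # |z| < 1: within the ordinary country-to-country spread, so a rank
    # difference here tells you little. |z| > 2: a genuine outlier.
    print(f"{country:12s} z = {z:+.2f}")
```

With these made-up figures the UK comes out around z = +2 while everyone else sits within roughly one standard deviation of the mean, which is precisely the distinction a bare rank ordering flattens away.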

And guess what. Although the UK is ranked worse than average on most of the 40 measures, in almost all cases, the differences are statistically insignificant. Specifically, on only 10 of the measures is the UK even outside of one standard deviation (see here for a serious formula headache).

And the only measures where we see Big Jump-Off-The-Page (two standard deviation) Differences are:

  • the percentage of 15 year olds who claim to have had sex- at 38% we're off the scale

  • the percentage of 11, 13, and 15 year olds who claim to have been drunk two or more times- again, at 31% we're off the scale

  • the percentage of 11, 13, and 15 year olds who rate their health as only "fair or poor"- at 23% the highest in the OECD

Just three- and as you will have spotted, all three are opinion poll answers rather than cold hard stats. And on all the stuff you can measure- like childhood obesity- statistically, we're more or less in the OECD pack.

Not for the first time (cf Stern), we have a hugely bigged-up report based on the flimsiest of foundations.

Of course, we can all understand why it's had so much lib media coverage, playing as it does to all the themes so beloved of Big Government supporters everywhere.

But couldn't Newsnight etc have found just one person to go on and point out some of these obvious facts, and ask the real questions?

Like why is an international agency funded by us supposedly to help the world's real poor, wasting our money attempting to spin up support for Big Government in rich countries?
