Official statistics

Written 24 June 2021

The story this week in the BBC on the EU Settlement Scheme seems to me to have pretty wide-ranging implications. Maybe it’s just me though: I find stories in the statistics/economics/public policy/Brexit space pretty interesting, but I accept that this puts me in a small minority!

Anyway, the story concerns the number of EU nationals applying for settled status in the UK as the deadline approaches, and the BBC coverage is here. It appears that as at 31 May 2021 some 5.6 million applications had been made under the scheme. The UK government is currently advertising quite heavily to publicise the looming deadline, so it would appear that this number will increase further.

5.6 million compares to an official estimate from March 2019 that there were 3.9 million (non-Irish) EU nationals in the country. Looking at the published figures from March here (the data tables for April and May don’t seem to be up yet), the figures at March 2021 show 5.3 million, which includes 11 thousand from Ireland: Irish citizens don’t need to apply under the scheme, as their rights are protected under long-standing agreements that predate EU membership. So I think Ireland can be ignored and the comparison of 5.6 million (actual) to 3.9 million projected in 2019 is essentially valid.

It’s a big difference: 44% or 1.7 million people (which Google tells me is the population of the cities of Birmingham and Manchester combined).
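For what it’s worth, the 44% figure checks out. A quick back-of-the-envelope calculation, using only the numbers quoted above:

```python
# Sanity check of the gap between the March 2019 ONS estimate and the
# applications figure as at 31 May 2021 (figures in millions, from the post).
estimated_2019 = 3.9   # ONS estimate of non-Irish EU nationals, March 2019
applications = 5.6     # EU Settlement Scheme applications, 31 May 2021

gap = applications - estimated_2019
pct_over = gap / estimated_2019 * 100

print(f"Gap: {gap:.1f} million ({pct_over:.0f}% above the estimate)")
# → Gap: 1.7 million (44% above the estimate)
```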

Reading the BBC coverage, it appears that “the system relied on the International Passenger Survey, which involved the Office for National Statistics (ONS) asking travellers at UK ports whether they were planning to enter or leave the country for more than six months.” 

I’m not going to get into the rights and wrongs of freedom of movement and Brexit; that is separate from the points that interest me. Firstly, there is the issue of basic government and civil service competence, in its broadest sense of being capable of usefully measuring things. Immigration has been an important topic of (sometimes rancorous) political debate for a number of years, particularly since the accession of the Eastern European countries to the EU in 2004. The Government’s job is to make policy, but how can this be done if there is such uncertainty about the basic facts of the debate? How has the ONS got this so wrong? And for how long?

Secondly, the BBC story referred to the fact that issuance of National Insurance (NI) numbers has run ahead of the immigration figures for some time, a discrepancy the ONS felt was explained by temporary immigration. There are good reasons to be sceptical of using NI numbers as a sole measure, but as one measure amongst many, and one that threw a question mark over the accuracy of the ONS’s own data, they pointed to significant cause for scepticism about the official figures. Has the ONS really tested its own data as it should have? Has an institutional mindset been a factor in the story?

Thirdly, and on a very practical point, I have long been highly sceptical of ONS data for the simple reason that I have seen how it is compiled. Working in finance departments, I have filled in quite a few requests for data (recently with threats of punishment if I ignore the request). The questions are vague and often ask for breakdowns using classifications that we don’t use in our business. Of course I always respond with data that is as accurate as I can manage. Not everyone is the same: I remember working with someone years ago who, on a slow Friday afternoon, would simply make up that month’s figures; it provided a bit of fun.

Fourthly, if something as basic as how many people live in the country has been so badly measured, and likely for so long, what else is wrong? The ONS provides information across a vast array of fields that inform political debate and decision making. How reliable is this data? I don’t want to be too down on the ONS; I am sure that in comparison with similar agencies in other countries it is one of the very best. But there’s a more fundamental point: we are told we must develop “evidence based” policies. If the evidence is produced by agencies that get this sort of thing so wrong, then the evidence should be treated with very large pinches of salt. The people in the gilded circle of politicians, civil servants, academics, think tanks, lobbyists, NGOs and campaigners who assure us that they know what’s best and have a handle on things are emperors with no clothes. How many policies, how much academic research and how many official measures are based on data that is this shonky? Regarding data that informs our politics, what if this is as good as it gets?

Fifthly, there are a great number of people in this country entirely unsurprised that there appear to be 1.7 million more EU nationals in the country than the ONS thought there would be. These people generally, though not always, will have lower levels of education than those who work for the ONS, and they will certainly have fewer credentials. If you told them the official estimates were arrived at on the basis of interviewing some people at airports, they would laugh and wonder how anybody could be so stupid as to think that could work. They are not politically ideological, they don’t take assurances that policies are good for them at face value, and on the whole they tend to prefer things as they are rather than signing up for huge change advocated by political radicals. They take official data with a pinch of salt. Thankfully they all have a vote, and they are the people I am happy to rely on to choose our political leaders, usually on the basis that one lot are less bad than the others. They are also the people who sit on juries and who I would generally trust to keep me safe. Perhaps it’s time we started listening to them a bit more.
