Big Data and Government, Covid19

Misinformation, Covid and the BBC

Since my appearance on BBC Radio Ulster Talkback on Friday I’ve had to face a fair amount of criticism on social media – but, thankfully, I’ve also received lots of messages of support. The criticism tended to focus on the fact that I clearly lost my temper on-air. My defence is that it’s tricky to stay calm when the interviewer (in this instance, William Crawley) was hostile from the outset. But I also wanted to present an alternative to the calm, clerical-sounding Chief Scientific Advisor, who was allowed to ramble, unchallenged, about the need for yet another draconian lockdown in Northern Ireland from Boxing Day.

The day after that broadcast we heard from the Prime Minister that London and the SE of England were to have Christmas effectively cancelled as an opportunity for families to get together – as London moves into “Tier 4” of a tier system that starts at Tier 1/Medium and ends at Tier 4/Be Scared, Be Very Scared. The Covid Scare Taxonomy is running out of severity descriptors.

My attempts, on Friday, to provide an antidote to the scary stuff from public health “experts,” were stifled at every opportunity by Crawley. When I attempted to present evidence to counter the fear narrative, I was closed down with accusations that my evidence essentially wasn’t allowed. It was, apparently, misinformation. Only the correct sort of evidence is allowed by the BBC these days.

But there were three central points I was trying to get across on Crawley’s programme that were stifled.

The first relates to lockdowns themselves. They don’t work. An excellent article to this effect was published by Nick Hudson (of PANDA) on Medium today. And there’s a growing body of evidence that viruses will do their thing until populations achieve near herd immunity. There’s good evidence that we’re there already in the UK. Winter seasonal flu, which has nearly overwhelmed A&Es in the past (e.g. in December 2017/January 2018), is now being reclassified, it would appear, as Covid (see chart, below). Bed occupancy in London has been pretty consistent since mid-September, but patients are being reclassified as Covid to fit the narrative.

NHS Bed Capacity, London, September to December, 2020

The second is that the PCR test is a duff test. I’m not going to re-hash a huge volume of work and a back catalogue of research. There are plenty of resources that justify the view that the testing is creating a false narrative of “cases” – when they are, in fact, simply false positives. The PCR test produces large volumes of false positives simply because it is not fit for purpose for testing large populations of asymptomatic people.
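The arithmetic behind that claim is just Bayes’ theorem: when prevalence is low, even a highly specific test produces many false positives for every true positive. A minimal sketch in Python – the sensitivity, specificity and prevalence figures here are illustrative assumptions, not numbers from this post:

```python
# Positive predictive value (PPV) of a screening test via Bayes' theorem:
# the share of positive results that are true positives.

def positive_predictive_value(prevalence: float,
                              sensitivity: float,
                              specificity: float) -> float:
    true_pos = prevalence * sensitivity            # infected and testing positive
    false_pos = (1 - prevalence) * (1 - specificity)  # healthy but testing positive
    return true_pos / (true_pos + false_pos)

# Screening asymptomatic people at an assumed 0.5% prevalence with a test
# that is 95% sensitive and 99% specific: most positives are false.
ppv = positive_predictive_value(0.005, 0.95, 0.99)
print(f"{ppv:.0%}")  # 32%
```

In other words, under these assumed figures roughly two out of three positives would be false – which is the mechanism behind the “false narrative of cases” argument.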

Because PCR is the test of choice for NHS personnel, perfectly healthy healthcare workers are being sent home to isolate at the very time they are most needed, i.e. during a Winter respiratory disease spike.

And the final point I wanted to make relates, inevitably, to the social cost of all of this – especially when the concept of asymptomatic transmission must be questioned. Covid-19 is not a particularly dangerous disease for most people. Many have innate immunity (probably around 50% from T-cell immunity). Many have acquired immunity from having had the disease, even in a mild form. Therefore, people who are concerned that they may suffer severe ill-health if exposed to it can choose to isolate or take precautions. But those of us prepared to take the risk should be allowed to get out and socialise or engage in trade. The restraint of trade, and restraint of education, that are the cornerstones of lockdown are extraordinarily damaging – especially to business owners, children and young people.

These are not extraordinary claims or misinformation. These are valid concerns about a policy that has been tried, repeatedly, in all four home nations since March. It is a policy that is hugely damaging to our society and our way of life. If we are not allowed to question government policy, it starts to feel conspiratorial. Freedom of speech is at the heart of this. And curtailing that freedom – by the BBC or government – is the stuff of misinformation-spreading and propaganda.

Big Data and Government

Check out this BIG data dump…

When Grant Shapps, the Transport Secretary, was asked by Julia Hartley-Brewer on TalkRadio if the cost-benefit analysis would be presented to MPs later in the week, before they vote on another new-variant lockdown, Shapps retorted with a killer answer. Not only has such analysis been done, apparently, but a “huge data dump of a lot of analysis” would be delivered. This great dump, he suggested, could be pored over by MPs and, after their foraging in the dump, they would be super-wise to make a decision. 

Shapps’ answer says it all. Government policy, regardless of what it is, can be justified by obfuscation and lots of references to the bigness of the data. He made clear that data analysis would be part of the smoke and mirrors process to make sure that MPs were no wiser at all about the rationale for locking people down from meeting, drinking, eating and shopping before Christmas.

Shapps, apparently, got his big job back in government (after being side-lined by previous administrations) because he was seen to be a good media performer. But his past, littered with get-rich-quick schemes and dodgy pyramid-selling, has been all about saying one thing and meaning something else entirely. Many of his dealings with the media in the past have been about justifying behaviour that was incompatible with high public office.

But being good with the media, whatever form that takes, seems to be all that matters. The Shapps approach involves knowing just enough about “data” to be able to evade what the data mean – what story they tell.

However, the arguments against the lockdown-based Covid response of the government have been made very well by scientists and very effective number-crunchers outside of government – notably Ivor Cummins, Carl Heneghan and Michael Yeadon.

Between them, they have meticulously destroyed the arguments for lockdown based on claimed Covid threat (centred around the R number). Instead, they have argued that PCR tests result in high levels of false positives, that the pandemic is probably over (based on reduced hospital admissions and evidence indicating T-cell immunity in a large percentage of the population).  They have also made compelling cases indicating that lockdowns just don’t work.  Along the way they have used precise and relevant data – not data dumps – to provide evidence for their assertions. But the government ministers responsible for implementing policies that are increasingly seen as damaging can run away from empirical evidence and use arguments that are based, frankly, on mumbo jumbo or just plain obfuscation.

The cost-benefit analysis of lockdown is likely to be complex and to require considerable evidence. But the result we are seeking is an answer to this question: is continued lockdown justified if the result is massive economic destruction, huge curtailment of non-Covid treatment in hospitals, and significant damage to our civil society – even if a few more people get infected with what is, for most, a mild or symptomless disease? The answer cannot be a “huge data dump of a lot of analysis”. That’s just not good enough anymore.

Big Data and Government

Northern Ireland’s “Pandemic”

Every day people die. It’s an unfortunate fact. But people die. We’ll all die. And, when we do, our deaths – little data-points in the aggregate – will get added to the mix. Northern Ireland, like every administrative region in every part of the developed world, has a statistics agency that counts up all the deaths every day and collates them by month. And this year, 2020, is no different to any other year. Deaths have been tallied. There’s no denying the data. Or is there? 

In this, a ‘pandemic’ year, the month-by-month death data are more interesting than most. I’ve been looking at the numbers. And one month really jumps out. The month? January 2018. Why is this month particularly noteworthy? Well, in most Winter months, around 1,200, 1,300 or even 1,400 people die. But in January 2018, 2,101 people died. In fact, more than 500 more people died in January 2018 than the average number of deaths for the five years previously. This is the excess deaths number – 506, to be precise.
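The excess-deaths figure quoted above is simply the observed count minus the average for the same month over the previous five years. A minimal sketch in Python – the implied baseline of 1,595 follows from the stated figures (2,101 − 506), but the flat per-year list below is illustrative, not actual registry data:

```python
# Excess deaths: observed deaths in a month minus the mean for the
# same month over the previous five years.

def excess_deaths(observed: int, previous_years: list[int]) -> int:
    """Observed count minus the multi-year baseline, rounded to a whole number."""
    baseline = sum(previous_years) / len(previous_years)
    return round(observed - baseline)

# January 2018 in Northern Ireland: 2,101 deaths observed against an
# implied five-year January average of 1,595 (individual years illustrative).
print(excess_deaths(2101, [1595, 1595, 1595, 1595, 1595]))  # 506
```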

But, we’re told, 2020 is a pandemic year. So presumably we’ve knocked that previous record out of the park? Well, no. In no month of 2020 have we seen that number of deaths. In April we came close, with 1,933 deaths. In January this year we had far fewer deaths than in January 2018.

In 2018 it was well publicised that the health service came close to being overrun. There was a crisis. Seasonal deaths were very high. But, of course, no convoluted tests were being used to determine what people were dying of. No doubt respiratory diseases played a big part in these deaths, especially among older people. But no new test was conceived. After all, if people are dying from chronic respiratory failure the symptoms are obvious and the diagnosis easy. No fancy tests are needed for what is normally described as Winter flu – or perhaps a particularly virulent form of cold.

Now let’s focus on April 2020 again. This is the month this year that was nearly as bad as January 2018 in terms of death count. If the health service was completely overrun in 2018, surely that was the case again in April, in a pandemic year? Well, no. That’s not the case. Because in 2020 – in March and April and May – the chronically ill patients, mostly elderly, were sent to care homes to die – for fear of hospitals being overwhelmed. So there was no real crisis in the hospitals. And, of course, elderly people with co-morbidities aren’t eligible for the limited ICU beds (there are fewer than 100 of them in Northern Ireland). So ICU beds never reached capacity in April.

So what about the so-called “second wave” in 2020 in Northern Ireland? Well, we’re told, hospitals, many of them, are at more than 100% capacity. But in September deaths were pretty average for the time of year at 1,384. This number isn’t significantly worse than the death number in September 2015. And yet, in 2015, most of the economy hadn’t been closed down. People weren’t on state-funded furlough. We could all still get out for a meal or a pint, and still get our hair cut. But in 2015 we didn’t have the PCR test – a test contrived to define people as sick who clearly aren’t. And one result of this PCR test is that perfectly healthy medical staff are being sent home to “self-isolate” – meaning that they can’t help with the spike in Winter patients – a spike of the kind that has occurred frequently in the past.

And remember, the most severe spike was in January 2018, not in any month, so far, in 2020. 

Big Data and Government, Open Data

The People and Government Data

In one of his most recent posts on Citizen 2015 my colleague, Larry Larkin, provides an overview of a recent study undertaken by the Pew Research Center. The study outlined how Americans were using government data and information.

The study showed that people tended to use government data – and relatively simple data at that – only from time to time and to address a relatively simple need (like getting library opening times). But often it’s the lack of available data that results in citizen frustration – and citizens often aren’t even aware that this is the case.

One of the issues that governments face in terms of providing “service” to citizens is that citizens don’t consume government services in quite the same way as they consume commercial services. They tend to consume services on an as-needed basis. And they often don’t readily appreciate the relationship between data and service.

To date, attempts to make government more open and accountable have focused on the provision of information – giving data (or information) to people that want it.  Opening up data is often the result of a citizen movement and many government bodies haven’t been entirely keen to let go of their monopolistic ownership of data.  But there’s evidence that this war is being won.

But the next step for government is allowing data to be used to do things in better ways. Because often when citizens most need data they aren’t actually seeking it. Data is simply the means of providing service. In the commercial world data isn’t such a big deal. Rather it’s simply the enabler of service. There is an expectation that if one calls a contact center, for example, the contact center staff will be able to access data and answer questions quickly. Often this simply isn’t the case when citizens attempt to avail of government service.

For example, let’s assume a citizen makes a planning application for an extension to a house. Despite attempts to make the planning application process easier it’s often the case that a lack of data in the right place at the right time makes the overall service experience miserable for the applicant. The ability to submit all information via a self-help portal may be missing. The system may not be sufficiently ‘intelligent’ to be able to guide the user through all of the necessary processes for filing – resulting in incomplete or non-compliant applications. The work-flow may not create appropriate or timely communications. Contact center staff may not have the necessary information in order to deal with queries about applications. The contact center may constantly defer to planning specialists – resulting in bottle-necks.

Citizens who have to deal with these frustrations may not identify data as the main reason why a government service fails to deliver or results in frustration. But it clearly is a data problem if workflows are stunted, contact staff can’t deal with queries or systems contain fundamental bottle-necks. Data – or lack of it – results in poor performance.

It’s for this reason that the ‘government as a platform’ (GaaP) movement has to be the next big thing in government. GaaP is all about getting the data where it needs to be by thinking about processes and data calls. This is a poor definition of GaaP – and not strictly accurate. But I’m trying to make the point that without data in the right place at the right time services can be highly frustrating and utterly inefficient.

On the subject of GaaP, John Jackson of Camden Council in London was featured on the GDS website recently, discussing how the concept is now very relevant at local government level too. John spoke at our Citizen2013 conference.

Big Data and Government, Open Data, United States

Americans and Open Government Data

In April of this year, the Pew Research Center published a report titled Americans’ Views on Open Government Data. This very interesting study provides a measure of how the public views federal, state and local governments’ efforts to become more open and transparent through the dissemination of their data. The report is based on a late 2014 survey of 3,200+ individuals. The study examined:

  1. how aware the public is of governments’ data-sharing initiatives,
  2. if these initiatives are actually resulting in people using the data to monitor government performance,
  3. the public’s view of whether these efforts have been – or have the potential to be – successful in making government perform better or become more accountable, and
  4. how the public is using this data.

Some of the study’s key findings:

  • Figure 1

    While most (65%) of the individuals surveyed are using the internet to find government information or data, they are using it to perform simple tasks such as finding out public library hours or paying a traffic ticket (Figure 1).

  • Just a minority of respondents indicated they paid much attention to how governments share data – and only a relative handful said they were aware of instances where government had done a good or bad job of sharing data.
  • Less than one-quarter of those surveyed use government-generated data to track the performance of services such as hospitals, healthcare providers, school systems, etc.
  • Figure 2

    People were divided on whether the sharing of data has the potential to improve government transparency, accountability and performance – it’s also not clear to them that this will even happen (Figure 2).

  • Only 23% of respondents indicated that they trusted the government to do the right thing – at least most of the time. Of this group – the “Trusting Minority” – roughly three quarters believe open government data is beneficial and contributes to better government (Figure 3)
  • Figure 3

    The study found that smartphone users (68% of the individuals surveyed) have embraced apps that are based on government-generated data or capabilities, such as weather and GPS – what I call government-enabled applications:

    • 84% have used weather apps
    • 81% have used map apps
    • 66% have used apps that provide information about nearby stores, bars or restaurants
    • 31% have used apps to get public transit information
    • 14% have used apps to hire transportation such as Uber or Lyft

Interestingly – but not surprisingly, I suppose – only 9% of all survey respondents believed that government-provided data helped the private sector “a lot” to develop new products and services (41% felt it helped “somewhat”).

I found the results of this study to be consistent with the findings of Dr. Donald Norris and Dr. Christopher Reddick: Citizens will use government data when it fulfills a need and won’t when it doesn’t.

Note: all figures Copyright 2012, Pew Research Center. All rights reserved.

Big Data and Government, Digital Policy, Government Cloud

New Citizen Expectations: Smartphone Generation

In this interview with Bill Annibell of Sapient Government Services, Bill outlines how citizen-to-government engagement processes have had to change. In part it’s about the smartphone; in part it’s about much greater citizen expectations.

But delivering service, given these expectations, is challenging.

Bill outlines how too many government requests are for technology solutions rather than solutions that enrich the relationship between citizen and government entity. He talks about how so many government processes are separated and distinct rather than joined-up.

Bill makes a strong case for joined-up thinking and about all levels of government getting together to think about things from citizen perspectives.

Bill covers quite some ground in this interview…how he uses research to better understand customer need, how some levels of government ‘get it’ better than others. He talks about 18F, FedRAMP, and Generation Y.

Great insight from someone who understands how citizens and government can work better together.

Big Data and Government, Digital Policy, United States

Data and a $32B Travel Operation

When a government office is as significant as the Defense Travel Management Office of the US DoD, insights and analytics are critical. One fact highlights the magnitude of the operation. The office oversees annual spending of $32B.

Last week we interviewed Harvey Johnson, SES, Director of the Travel Management Office. Harvey is responsible for one of the world’s largest commercial travel operations – moving civilian and military personnel from one place to another using a vast network of travel providers.

We’ll feature more video content from the interview in the coming weeks. But to whet your appetite, in this short sequence, Harvey discusses the importance of data and insight in providing quality service and in setting the policy agenda.