Ditchley’s virtual programme is designed in response to the pandemic and the dramatic effects it is having on our lives. The programme considers the impact on our personal lives, our communities and changes in wider society. We start with a focus on individuals, communities, and, in this session, on the value we attribute to individual freedom, privacy and social responsibility as we work through the pandemic.
The focus of May is on communities and the economy as an aggregation of communities connected by systems. In time this will progress to a broader focus on the systems underpinning societies, from those we can shape, such as the economy, to those we depend upon, such as the climate. At the core of this programme is an ambition to understand the challenges and questions the pandemic raises for democratic societies. What is the pandemic surfacing about what people care about and why, and how can democracies respond?
First Half: Civil liberties and rights
When discussing civil liberties in the age of coronavirus, at a simple level we need to strike a balance between individual freedoms and the responsibility to the collective. In response to an extreme challenge, determining how much the former are compromised by the latter is immensely difficult—what is the basis for this balance and what guardrails do we need to keep this in check?
There was general acceptance in the group that extraordinary times warrant extraordinary measures, but also an insistence that these must be necessary and proportionate, effective and rational. Rights are important in limiting powerful states, but rights are written for normal circumstances; in extreme circumstances, limitation and coercion may be required and can be consistent with a belief in rights. Duration really matters. Most dictatorships emerge out of an extended state of emergency in response to “threats to the nation”, and we have to make sure that creeping authoritarianism does not become the norm.
The initial crisis response of lockdown was clearly justified to win time and to regain a degree of control over the progression of the pandemic. But we are now moving into more grey and nuanced decisions and this is happening just as it is becoming very clear that some parts of society are suffering more than others from both the effects of the disease and the effects of the countermeasures.
Protecting the most vulnerable in society is important but at what point should such decisions be placed in individuals’ own hands? Urging the elderly to stay inside was an interesting case – the argument that they were more likely to become seriously ill and therefore potentially more responsible for the possible overwhelming of the health service was on the edge of acceptable in its chilling of the freedom of an individual to choose which risks to accept and which to avoid.
When death is suddenly brought to the fore it is natural for people to react to this, but is its sudden visibility skewing the debate and obscuring the risks inherent in normal everyday life? What does it mean for someone to die “before their time” and who decides what the limit of medical expense on preservation of a life should be and who should carry the liability of that expense? Could the reaction to this extend to measures to reduce workplace risk that turn out to be discriminatory? For example, could we see discrimination against the obese, those with existing health conditions, or those with disabilities, with an expectation that they should shelter at home and be “safe”? Will employers be allowed or required to determine who is safe to work and who not? Will this have to be mandated by law?
With the pandemic and the lockdown affecting different parts of society to different degrees, trust in the good intentions and reliability of leaders and experts becomes crucial. The British government had tried to rely on common sense and trust the people (in the style of Sweden), but that requires a national contract of trust with institutions and expertise. It had proved hard to turn on voluntary common sense at scale. Good leadership needs to balance clarity and honesty, acknowledging the complexities and uncertainties of the situation transparently but also giving clear and simple instructions.
What new ways of acting are required for trust to be maintained and, ideally, deepened? What does this say about how our institutions should act? For example, when authorities impose a period of self-quarantine for a person, how is this enforced and will this be followed if the person does not have trust in the body issuing the edict? We perhaps need new ways to build trust between people and institutions, and thus compliance with advice, so that we avoid the trap of having to enforce draconian laws.
There is strain in the contract between scientific advice and the community, but it is vital that this contract is not destroyed. Governments make decisions in the face of uncertainty and there are always risks. People can get used to a certain level of risk, particularly when the immediate threat to life is not high for most people, but how do we balance these risks and who adjudicates? Some countries have taken an open approach, holding discussions with experts from a range of backgrounds and making these discussions public, which can help to build trust. Leaders have to take decisions on incomplete and sometimes unreliable evidence. How much should the nuances and uncertainties of scientific advice be exposed to the public? How can trust be sustained when decisions are on complex issues, for example the prioritisation of the coronavirus response in hospitals at the expense of existing needs like cancer treatment?
Second Half: Can technology save us?
Data for good?
The case was made for a more expansive use of data in the interests of the public good. The wide reach of the British National Health Service allows for coordinated data collection, providing an invaluable resource for tackling epidemiological crises. Marshalling resources like these could give us great insights but will need public support, resting in turn on trust. The ground is rapidly shifting as the crisis evolves and there might be an opportunity to make a clear shift in public opinion, with appropriate safeguards. There is huge potential in such data, with the chance to deliver real public-infrastructure benefits, and significant efforts are underway in the private sector to help government. It should be noted that this potential is not new. We could have put data to better use to improve our modelling efforts on the pandemic. Better use of data would fuel our ability to innovate faster and be more agile.
But potential obstacles and concerns were also aired by participants. The public distrusts company motivations and fears that the data will be misused. There is even more concern about government mishandling and misuse of personal data, particularly from sectors of society that have suffered from abuse, or perceived abuse, in the past. Sections of the public are not convinced that there is sufficient accountability and doubt that “data for good” will outweigh the risks. The existing debate around whom we trust to handle our data, whether the government or the private sector, is set to intensify, and the question arises of who will be able to regulate this. During this crisis privacy regulators have been put on the back foot. They are struggling to deal with questions of how to appropriately use data to solve massive societal problems, e.g. in health or employment.
This challenge of regulation is especially hard when the level of technological literacy within government can sometimes be lacking. The complex systems built around personal data require expertise. Partly as a result, the trend has been to push liability and the responsibility for control and policing back on to the companies. This has been going on for a few years but is now accelerating. We perhaps need a new set of institutions that can deal with moral and technocratic issues at the same time. There are currently bodies with the remit of dealing with these issues, but it is unclear if they have the requisite muscle. There is a risk that for much of the public the technological detail will obscure the underlying ethical dilemmas. We must not lose sight of first principles: accountability, transparency, legitimacy.
In some ways technology is already playing a successful role in dealing with the crisis, particularly for services that were already established before the crisis hit. Connectivity through technology (e.g. through videoconferencing or online collaboration software) has become essential to daily life, without which many millions more would have been pushed out of work. What this means for the large companies that provide such services is not yet fully clear. In some senses they are becoming “too big to fail” and too big to control through traditional anti-trust and monopoly action. There is speculation that this could lead in the longer term to much more intrusive and regular government inspection and regulation. It is not clear how this could be successfully combined with the multinational nature of the companies.
Technology is not a panacea. When it becomes clear that technology cannot provide immediate and comprehensive solutions, then this may well result in a rapid backlash against the “technology will save us” viewpoint. For example, contact tracing apps need a broad uptake to be successful but mandating their use could bring its own problems. Even with universal uptake, they will still require extensive support from manual tracing efforts. An over-reliance on, and over-selling of, technology should be avoided.
In this time of crisis, who is taking the long view? This is increasingly rare amongst democratic governments focused on the immediate crisis and the next election. It is possible that we will see a reversal where it is the giant technology companies that look to the longer-term future, confident that they have the momentum and resources to survive deep into the future. There are obvious downsides to monopolies in terms of competition, but the point was noted that companies that might previously have focused solely on their quarterly returns are now considering their long-term roles in society and developing biodiversity and sustainability plans. As argued most provocatively by Peter Thiel, monopolies have the rare luxury of time to consider such issues without worrying about short-term profits and competition. Everyone expected technology companies to rise yet further as a result of the crisis and to be yet more deeply woven into the fabric of society and business. The question remained, though: who can regulate them and through what means?
One fear expressed was that as soon as the high point of the crisis is passed, there will be no longer-term thinking. Investment in resilience for the next crisis will soon fade, along with deeper reflection on what we – human societies – should want to achieve and should value. It was noted that the countries that appear to have dealt best with this crisis are those which learnt the lessons from earlier pandemics such as SARS and MERS. We need to keep our crisis experiences in mind and not allow collective relief to turn into collective amnesia, kicking the long-term issues into the long grass in the rush to resume our focus on day-to-day life.
Participants: Rohit Aggarwala, Head of Sidewalk Urban Systems, Sidewalk Labs; James Arroyo, Director, The Ditchley Foundation; Nigel Biggar, Regius Professor of Moral and Pastoral Theology, University of Oxford; Julie Brill, Corporate Vice President, Deputy General Counsel and Chief Privacy Officer for Global Privacy and Regulatory Affairs, Microsoft; Paul Clarke, Chief Technology Officer, Ocado; Rebecca Finlay, Vice-President of Engagement and Policy, Canadian Institute for Advanced Research; Sir Nigel Shadbolt, Principal of Jesus College Oxford, Professor of Computer Science, University of Oxford; Martin Smith, Data Scientist, The Ditchley Foundation; Charles Songhurst, Technology Investor, Formerly General Manager of Global Corporate Strategy, Microsoft; Martha Spurrier, Director, Liberty; Dr Catherine Wills, Art Historian and Trustee of the Dulverton Trust, the HDH Wills Charitable Trust, Rendcomb College and Farmington Trust.
The text is a summary of the discussion. No participant is in any way committed to its content or expression.