Context and why this was important
The Internet has become essential to modern life. The Internet of Things is extending the risks of the Internet, from information assurance to issues of life and limb. Cyber crime is eclipsing all other forms of crime in terms of economic value. Cyber warfare and digital age information warfare are set to become a central instrument of state power. The question of regulation will not go away and is becoming more insistent. We sought to explore how we could make progress on cyber security and the reduction of unacceptable behaviour and content on the Internet (with all the problems of definition that implies), whilst not destroying the value of the Internet to human connection, the economy and innovation. We aimed to avoid as far as possible existing cul-de-sac discussions between states and companies, for example on encryption.
American science legend Vint Cerf, one of the original fathers of the Internet, chaired an eclectic group that brought together strong representation from the major tech companies and tech entrepreneurs, alongside extensive national security and political experience on the one hand, and privacy and liberty campaigners on the other.
The original vision of a global Internet, in which any device could try to connect to any other and it would be up to the individual receiver of the communication to decide whether or not to accept it, was almost a thing of the past, with China and Russia increasingly creating separate realms online.
There was extensive discussion but no consensus on whether Tor (free software for enabling anonymous communication) or other forms of identity obfuscation and virtual private networks were a net gain or loss to the global community. Everything depended on context: a net gain for those living in repressive regimes, a net loss for victims of cyber crime. The value of, and the right to, anonymity were debated. The right to identity on the Internet, with the underlying right of access to the Internet, was also posited as increasingly crucial to citizens being able to take full advantage of the new economy and to interact effectively with government. The point here is that there are conditions under which Internauts (Internet users) need to be able to strongly authenticate their identities, to assure that the expected parties are involved in a transaction, for example. At the same time, it was acknowledged that users should be able to make use of the Internet without having to identify themselves if they wish to remain anonymous. It was pointed out that anonymity is becoming more and more difficult to assure.
This linked to another major theme of the discussion. A safer and accessible Internet was seen as an increasingly important factor in driving equality and inclusion in society and an unsafe or inaccessible Internet as a motor for inequality and exclusion, both within western societies and globally between the richer north and poorer south. There was a risk that an Internet where cyber crime was rampant would hurt most the people living on the edge of the economy, or those who were struggling to adapt to the technological revolution, often the elderly. An Internet full of hate speech and trolling would tend to marginalise and silence the voices of those less willing to shout and to trade insults online.
AI was going to make a complex system even more complex. AI in cyber offence will need to be matched by AI in cyber defence. Attacking will remain easier than defending, but AI could be very helpful in mapping vulnerabilities in systems. It was clear to all that more concerted international action is needed to improve coordination on cyber defence. There remains cautious scepticism as to the effectiveness of AI and machine learning, as these technologies have also been shown to be brittle and subject to unexpected failures. There are, however, dramatic examples of success, not least self-driving cars and natural language translation.
The role and potential advantages of China were debated without resolution. China has the advantage of the unfettered aggregation of data, and this will deliver significant advantages in developing certain types of AI – those suited, for example, to an authoritarian state in applications like China’s social credit scoring of its citizens, and for weapons systems. But others argued that western pluralism would retain its innovation advantage in the long run and that the “single stack” Chinese model would not be able to keep pace. This fed into a discussion of strategic advantage – were we willing to take the risk that this would play out to our benefit in the end? Should our approach be to maximise the openness and agility of our intellectual landscape and markets, or should we be looking to close down the free flow of intellectual property in China’s direction? Should we be maximising the numbers of Chinese students studying in the West or minimising them? Was there a risk that AI combined with cyber would become a singularity of power, a space race that China was determined to win? Was this really what we should be worrying about, or was the race on bio-engineering a bigger issue?
There was a sense that the cyber security conversation was more important ultimately than a discussion on unacceptable content. Making the Internet safe enough to reap the advantages of the Internet of Things is going to be crucial to economic prosperity and technological progress. It was argued that there can be no international agreement on a definition of unacceptable content because of the different views of China and Russia on the one hand and the US First Amendment on the other.
Governments were far from match fit for countering the growing tide of ever more sophisticated cyber crime. Some argued that this was because politicians, the police and agencies focused too much on a few high profile crimes (terrorism and child abuse) and not enough on industrial petty crime (which was not petty for the victims who were often already vulnerable members of society). As yet, states had no answer for international and industrialised petty crime – for example German criminals selling fake accommodation in Cambridge to Chinese students. Some saw a need for the capabilities of the intelligence agencies to be extended to the police. Others worried about the mission creep risks inherent in this. AI might help provide an answer to industrialised Internet crime, bringing automation to defence, attribution and pursuit, which at present were still very manual processes.
The dominant mood in the group was that regulation was coming to the Internet, like it or not. Western governments saw the companies as having lost their gloss and mythic status – vulnerable to political pressure because they had lost the underlying assumption of the people that they were well intentioned. Unfair and inaccurate comparisons were beginning to be made with the tobacco industry. What was at stake was whether the regulation that was coming would be useful, or damaging to the vitality of the Internet.
GDPR, much contested in gestation, was now seen as having set the standard simply by virtue of its existence. There was a big first mover advantage as companies were looking for clarity and standards which they could track.
Done well, effective regulation and governance could be attractive to business, giving clarity and a level playing field. But this was not guaranteed. There remained a risk that GDPR, as an example, would inhibit rather than enable innovation. It would depend how it was implemented in practice. It could benefit large established technology companies, particularly those with lots of data, and raise the bar and costs for start-ups in Europe.
We needed better definitions of Internet harm and useful academic work on this was referenced. Talk of a cyber Pearl Harbor or 9/11 was not helpful when it conflated what might be minor inconveniences to individuals with what could be devastating attacks on critical national infrastructure. We needed a categorisation of different forms of risk, along with different possible solutions and a sense of whether each risk was high or low probability. We also needed to be realistic about the extent to which we could use a fundamentally open network to connect everything from the banal transactions of everyday life to the functioning of a nuclear power station.
It was already clear that information warfare was going to be as important as, and probably much more often resorted to than, actual cyber warfare. In both cases there was no established model for deterrence, which meant uncertain risks.
Until now most of the Internet’s development had been about connecting people through devices, but the Internet of people would soon be overwhelmed by the transactions and connections of the Internet of Things. This might prove an advantage, as the industrial Internet was emerging out of already heavily regulated industries that had evolved efficient systems for setting globally accepted standards without central control, such as the International Organization for Standardization (ISO).
Recommendations – not necessarily consensus, but ideas arising
The concept of a right to identity on the Internet was put forward as counterweight to a qualified right to anonymity and was compelling to many. People should be able to prove who they were when they needed to. Companies, services and governments had a right to ask for identity in certain circumstances, as in the physical world.
Anonymity should not be treated as a zero-sum concept but qualified by the questions: anonymous to whom? And for how long? In real life we could walk anonymously through a crowd as far as the people around us were concerned, but we accepted that we were not permanently anonymous if a police officer had reason to review the CCTV footage of the area. If we went into a bar then we could be asked to prove our age, but we should not expect to have to provide an address. Building on this we could imagine different degrees of anonymity applying to different situations on the Internet, whereby we reveal only as much of our identity as is appropriate to the situation. There was interest in pursuing this as a design solution. It had been conceived of before but technology had moved on and it might now be possible. One formulation of this notion was expressed as differential traceability. The need to track down parties harming the Internet, other infrastructure, or the users of the Internet leads to the notion that parties on the network should be traceable under the proper conditions. By way of analogy, licence plates on cars are generally just random strings as far as other motorists are concerned, but law enforcement has the authority to penetrate this surface anonymity to discover who owns the car with that licence. We accept this limited breach of anonymity as necessary in a society in which law breakers should be brought to justice. This has to be a part of making the Internet safer.
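The licence-plate analogy can be made concrete. The sketch below is a hypothetical toy model (all names and the warrant mechanism are invented for illustration) of differential traceability: peers see only an opaque random identifier, while the issuing authority can reverse it only with explicit authorisation, and every de-anonymisation is itself logged.

```python
import secrets

class PlateAuthority:
    """Toy model of differential traceability: peers see only an opaque
    'plate'; the issuing authority can reverse it, but each lookup
    requires explicit authorisation and is recorded in an audit log."""

    def __init__(self):
        self._registry = {}   # plate -> real identity (held only by the authority)
        self.audit_log = []   # record of every de-anonymisation

    def issue_plate(self, identity: str) -> str:
        # A random string, meaningless to other 'motorists'
        plate = secrets.token_hex(8)
        self._registry[plate] = identity
        return plate

    def trace(self, plate: str, warrant: bool) -> str:
        if not warrant:
            raise PermissionError("tracing requires lawful authorisation")
        # The breach of anonymity is itself traceable
        self.audit_log.append(plate)
        return self._registry[plate]

authority = PlateAuthority()
plate = authority.issue_plate("alice@example.net")
# Peers see only the random plate, never the identity behind it.
assert plate != "alice@example.net"
# With authorisation, the surface anonymity can be penetrated.
assert authority.trace(plate, warrant=True) == "alice@example.net"
```

The point of the sketch is the asymmetry: anonymity towards peers, traceability towards an accountable authority, with the act of tracing itself leaving a record.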
This differential anonymity or traceability could be a component in what needed to be a systemic approach to cyber security that increased the friction for cyber criminals in pursuing crime on an industrial and automated scale. This would need to involve education of citizens and companies as well as improvement of the capabilities of the police. Cyber security resilience is becoming an important facet of societal and state resilience to attack from foreign governments. Poor cyber security and unfettered access of information warfare agents (whether human or bot) to social media could undermine public confidence in the society and system. A key observation was that the Internet infrastructure can amplify the digital voices and actions of individuals through the use of botnets and that this amplification creates an asymmetric condition in which small amounts of effort can create disproportionate damage. Cyber-terrorists are taking advantage of this persistently.
The dominant view was that it would be a mistake to focus too single-mindedly on the Dark Web (specifically Tor) as a one-stop shop for reducing cyber crime. It was only one facet of obfuscation of identity on the Web. But it was right that law enforcement and the agencies should patrol the Dark Web and take action when necessary. There was wide scope to do this under existing laws which was not being exploited by state agencies. Others felt that on balance Tor was not a net gain to humanity and the ungoverned areas of the Internet should be suppressed as far as possible.
We should talk to ISO (International Organization for Standardization) and industrial companies about standards setting, as the Internet of Things presents an opportunity to be seized. Internationally recognised certification of IoT (and other) devices for strong encryption and basic safety and security should be considered and would be another element of a systemic approach to making the Internet safer and more resilient to exploitation by bad actors.
One approach to the security challenges of the Internet of Things might be an expanded role for routers and firewalls as guardians of devices behind a firewall in the home. Platforms too might have a role to play in blocking malware. In the UK, the National Cyber Security Centre aims to play this role on a national scale with the explicit intention of conferring advantage on the UK as a safer place to do online business.
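As a minimal illustration of the router-as-guardian idea, the sketch below (device names, endpoints and policy are all invented) shows a default-deny egress filter: each IoT device behind the home firewall may only contact the endpoints declared for it, so a compromised device cannot reach an arbitrary command-and-control server.

```python
# Hypothetical declared endpoints per device, e.g. supplied by the
# manufacturer at onboarding (all names here are invented).
ALLOWED_ENDPOINTS = {
    "thermostat-01": {"updates.vendor-a.example", "telemetry.vendor-a.example"},
    "camera-02": {"stream.vendor-b.example"},
}

def permit(device: str, destination: str) -> bool:
    """Default-deny egress policy: unknown devices and undeclared
    destinations are blocked by the home router/firewall."""
    return destination in ALLOWED_ENDPOINTS.get(device, set())

# Declared traffic passes; anything else, including botnet
# command-and-control traffic, is dropped.
assert permit("thermostat-01", "updates.vendor-a.example")
assert not permit("thermostat-01", "evil-botnet.example")
assert not permit("unknown-device", "anywhere.example")
```

The design choice here is default-deny: the burden is on the device (or its manufacturer) to declare legitimate destinations, rather than on the router to enumerate bad ones.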
We should look for societal solutions to behaviours on the Internet that fall short of illegality but are nonetheless damaging, such as trolling or other abuse. This meant education but also expectations of standards of behaviour (norms) from communities and companies on platforms. The (often illusory) sense of anonymity was a problem when it encouraged people to act online in ways they would not in real life. We are accustomed to different expectations of behaviour, dress and comportment in analogue life, depending on the situation and the community we are in at that moment. It was questioned why this should be different in digital life. In this context, another notion surfaced: international norms for the protection of the public core of the Internet. The public core is the basic transport infrastructure (routers, Domain Name Servers/Resolvers, transport media) upon which the Internet is based. The idea is to establish international norms that such infrastructure should not be the target of deliberate attack by nation states or others, not unlike the Geneva Convention’s protection of schools and hospitals from deliberate attack under warfare conditions.
Strong cryptography is the basis of trusted transactions. We should not take it for granted but begin preparing now for quantum breakthroughs by moving to quantum-resistant protocols. Governments should be arguing for stronger encryption in this context.
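One commonly discussed transition path is hybrid key establishment: combine a classical shared secret with a post-quantum one, so that the derived session key remains safe unless both schemes are broken. The sketch below uses only the Python standard library, with random stand-ins for the two shared secrets (a real deployment would obtain them from an actual ECDH exchange and a post-quantum KEM); it illustrates only the combination step, an HKDF-style extract over the concatenated secrets, and is not a vetted protocol.

```python
import hashlib
import hmac
import secrets

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract (RFC 5869): condense input keying material into a
    fixed-length pseudorandom key using HMAC-SHA256."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

# Stand-ins for the two shared secrets; in practice these would come
# from a classical key exchange (e.g. ECDH) and a post-quantum KEM.
classical_secret = secrets.token_bytes(32)
pq_secret = secrets.token_bytes(32)

# The session key depends on both secrets: breaking only one of the
# two underlying schemes does not reveal it.
session_key = hkdf_extract(b"hybrid-v1", classical_secret + pq_secret)
assert len(session_key) == 32
```

The salt label (`b"hybrid-v1"`) and the simple concatenation order are illustrative assumptions; real hybrid schemes pin these down precisely so both sides derive the same key.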
We should look at the potential regulation of design of software systems. Could we conceive of “equality by design”, for example?
There was not agreement on the right approach to China – openness and engagement or more control of intellectual property. But there was consensus that we should not be complacent and, whatever strategy we favour, we should have in mind the first mover advantage to shape a new territory according to one’s own assumptions, interests and values.
This Note reflects the Director’s personal impressions of the conference. No participant is in any way committed to its content or expression.
CHAIR: Dr Vinton G. Cerf
Vice President and Chief Internet Evangelist, Google Inc.; co-designer, TCP/IP protocols and the architecture of the Internet; Chairman, American Registry for Internet Numbers; Visiting Scientist, Jet Propulsion Laboratory (1998-); Foreign Member, British Royal Society and Swedish Academy of Engineering; Fellow of IEEE, ACM and American Association for the Advancement of Science, the American Academy of Arts and Sciences, the International Engineering Consortium, the Computer History Museum, the British Computer Society, the Worshipful Company of Information Technologists, the Worshipful Company of Stationers and a member of the National Academy of Engineering.
Ms Monik Beauregard
Senior Assistant Deputy Minister, National Security and Cyber Security Branch, Public Safety Canada.
Professor Abby Goodrum PhD
Founder and Program Co‑ordinator, User Experience Design; Director, Experience Lab; Full Professor, Faculty of Liberal Arts, User Experience Design, Wilfrid Laurier University, Ontario. Formerly: Vice President Research and Innovation, Wilfrid Laurier University; Founding Director Social Science Research for the Network of Centres of Excellence for Graphics, Animation and New Media (GRAND); Rogers Research Chair in News, Media, and Technology at Ryerson University, Toronto, Ontario.
Professor Michele Mosca
Co‑Founder, Institute for Quantum Computing, University of Waterloo, Ontario; Full Professor and University Research Chair, Combinatorics & Optimization Department, Faculty of Mathematics; Founding member, Perimeter Institute for Theoretical Physics; co‑founder and Director, CryptoWorks21; co‑founder, evolutionQ Inc.; author, 'An Introduction to Quantum Computing' (OUP). Formerly: Robin Gandy Junior Research Fellow, Wolfson College, Oxford, 1998‑99.
Mr Tom Jenkins OC, CD, FCAE, CBHF, LLD
Chair of the Board, OpenText™, Waterloo; Chair, National Research Council of Canada; tenth Chancellor, University of Waterloo; Chair, Ontario Global 100 (OG100); co‑founder, Communitech, Waterloo; board member, Manulife Financial Corporation; Advisory Council member, Royal Canadian Air Force; honorary colonel, RCAF 409 Tactical Fighter Squadron. 2017 Companion of the Canadian Business Hall of Fame; Fellow, Canadian Academy of Engineering.
Mr Arjun Gupta
Director, Advisory Corporation, Inc. Canada. Formerly: Projects Officer, Brookfield Institute for Innovation and Entrepreneurship; President's Scholar, University of Waterloo; World Economic Forum Global Shaper; Canada Millennium Scholar. A Member of the Advisory Committee, The Canadian Ditchley Foundation.
Ms Isabel Skierka
Cybersecurity and digital policy analyst, Digital Society Institute, ESMT Berlin; PhD Candidate in e‑governance, innovation and cybersecurity, Tallinn University of Technology, Estonia; non‑resident fellow, Global Public Policy Institute (GPPi), Berlin; co‑Chair, IGF Germany Steering Committee.
Professor Bogdan Warinschi
Professor of Computer Science, University of Bristol.
Professor Ross Anderson FRS FREng
Professor of Security Engineering, University of Cambridge.
Mr Richard Bach
Co‑founder and Director, XQ Cyber, London.
Professor Paul Cornish
Director, Coracle Analysis Ltd; Chief Strategist, Cityforum Public Policy Analysis Ltd; participant in UK‑China Track 1.5 discussions on cyber security and global cyber governance (2013‑); member: Cyber Futures Council, GLOBSEC Policy Institute, Bratislava; Programme Committee, CYBERSEC; European Cybersecurity Forum, Kraków; advisory board, Journal of Cyber Security; editor, Oxford Handbook of Cyber Security (forthcoming, Oxford University Press, 2018). Formerly: Associate Director, Global Cyber Security Capacity Building Centre, University of Oxford (2013‑18).
Mr Ian Hogarth
Angel investor. Formerly: co‑Founder and CEO, Songkick.com (2007‑16); Masters in Engineering specialising in Computer Vision, University of Cambridge.
Mr Edward Lucas
Writer and consultant specialising in European and transatlantic security; Times columnist; Senior Vice‑President, Center for European Policy Analysis; author, 'Cyberphobia: identity, trust, security and the Internet' (Bloomsbury, 2015). Formerly: Senior Editor, The Economist.
Mr Jack Maxton
Programme Director, UK Cyber Security practice, KPMG (2017‑). Formerly: strategy and operational roles across a range of central government departments; lead engineer for a large cross government cyber programme; implementation of Finance and Military Capability Transformation and Information Operating Model transformation, Niteworks team, Ministry of Defence.
Mr Paddy McGuinness CMG OBE
UK Deputy National Security Adviser for Intelligence, Security and Resilience (until January 2018).
Dr Victoria Nash
Deputy Director and Senior Policy Fellow, Oxford Internet Institute, University of Oxford. Formerly: Research Fellow, Institute of Public Policy Research; Lecturer, Trinity College, Oxford.
The Rt Hon. Baroness Neville‑Jones of Hutton Roof DCMG
Member of the House of Lords; Member, House of Lords Science and Technology Committee (2015‑); Chair, Digital Policy Alliance working Group on IOT (2017‑); member, Engineering and Physical Sciences Research Council (2013‑). Formerly: Chair, Bank of England Advisory Committee on Cyber Security; Member, Joint Parliamentary Committee on National Security Strategy (2012‑15); Special Representative to Business on Cyber Security, House of Lords (2011‑14); Minister of State for Security and Counter Terrorism (2010‑11); Shadow Minister for Security and National Security Adviser to the Leader of the Opposition (2007‑10); Chair, Information Assurance Advisory Council (2005‑07); Chair, QinetiQ (2000‑05); a Governor, BBC (1998‑2004); HM Diplomatic Service (1963‑96): Chairman, Joint Intelligence Committee (1992‑94). A Governor, The Ditchley Foundation.
Ms Chi Onwurah MP BEng, MBA
Member of Parliament (Labour) for Newcastle upon Tyne Central, House of Commons (2010‑); Shadow Minister for Industrial Strategy, Science & Innovation (2016‑); Vice President, Party of European Socialists (PES); Fellow, Institution of Engineering & Technology and City & Guilds of London Institute. Formerly: Shadow Minister (Culture, Media and Sport; and Business, Innovation and Skills) (2015‑16); Shadow Cabinet Office Minister (2013‑15); Shadow Minister for Innovation, Science and Digital Infrastructure (2010‑13); Head of Telecommunications Technology, Ofcom; Partner, Hammatan Ventures; Director of Market Development, Teligent; Director of Product Strategy, GTS; Engineer, Project and Product Manager, Cable & Wireless and Nortel, UK and France.
Ms Cate Pye FIMechE, CEng, MEng, MA(Cantab)
Leads Security and Government Cyber Sector and is part of the UKI Cyber leadership team, Ernst & Young (2010‑). Formerly: advising at senior levels in the government and private sectors on policy, strategy and delivery for large programmes and change initiatives, including on security and cyber; advised various organisations in the security sector; Ministry of Defence; Cabinet Office.
Dr Jamie Saunders
Strategic security consultant, government advisor and visiting professor, University College London.
Professor Sir Nigel Shadbolt FRS FREng
Principal, Jesus College, Oxford, and Professor of Computing Science, University of Oxford; Visiting Professor of Artificial Intelligence, Southampton University; Chairman and co‑Founder, Open Data Institute; member, UK Data Advisory Board; past President, British Computer Society; Emeritus Editor in Chief, IEEE Intelligent Systems. Formerly: Information Advisor to the UK Government; Founder and Chief Technology Officer, Garlik Ltd; Director, AI Group, and Allan Standen Professor of Intelligent Systems, University of Nottingham.
Mr Richard Spearman CMG, OBE
Group Corporate Security Director, Vodafone (2015‑). Formerly: Her Majesty's Diplomatic Service (1989‑2015); Save the Children Fund (1984‑89).
Mr Jeffrey Thomas
Entrepreneur, imagineer and serial investor in public and private enterprises. Co‑Founder and Chairman, Corsham Institute (2013‑); co‑Founder and Non‑Executive Chairman, UKCloud Limited (2012‑); Founder and Non‑Executive Director, ARK Data Centres (2005‑); Founder and Chairman, Hartham Park, (1997‑).
Mr Rob Wainwright
Senior Cyber Partner (Europe), Deloitte (June 2018‑).
Mr Michael Williams
CEO, Pushfor (2016‑). Formerly: Founder (2001), edge IPK User Experience Platform (acquired by Temenos in 2012); Founder (1997), Entranet; Next Software (1996).
Mr Ian Wallace
Director, Cyber Security Initiative, New America; Member, Advisory Board, Global Forum on Cyber Expertise. Formerly: Visiting Fellow for Cybersecurity, Center for 21st Century Security and Intelligence, The Brookings Institution, Washington, DC; Counsellor, Defence Policy and Nuclear, British Embassy, Washington, DC (2009‑13); Fellow, Weatherhead Center for International Affairs, Harvard University; Ministry of Defence, UK: Policy Adviser to Deputy Commander, Multi‑National Force, Iraq (Baghdad, 2007‑08); Deputy Director, Capability Scrutiny; Chief Policy Adviser, Multi‑National Division South East, Iraq (Basra, 2005); Assistant Director, Defence Resources and Plans; Head of Policy, Permanent Joint HQ (2002‑03); Policy Adviser to Commander, Multi‑National Brigade Centre, Kosovo (2001‑02); Assistant Private Secretary to Defence Secretary (2000‑01).
Mr Peter Bass
Chairman, board of directors, Ember.ai; board member, Quberu. Formerly: Managing Director, Promontory Financial Group, LLC, Washington, DC; Executive Assistant to the National Security Adviser, The White House; Deputy Assistant Secretary of State for Energy, Sanctions and Commodities; Senior Adviser, Office of the Secretary, U.S. Department of State; Vice President, Chief of Staff to President and co‑COO, Goldman Sachs & Co.; Treasurer and Trustee, Freedom House; Treasurer and Director, The American Ditchley Foundation.
The Hon. John Bellinger III
Partner, Arnold & Porter LLP, Washington, DC; Adjunct Senior Fellow in International and National Security Law, Council on Foreign Relations. Formerly: Legal Adviser to the U.S. Department of State, Washington, DC (2005‑09); Senior Associate Counsel to the President and Legal Adviser to the National Security Council (2001‑05); Counsel for National Security Matters, Criminal Division, U.S. Department of Justice (1997‑2001); Of Counsel, Senate Select Committee on Intelligence (1996); Special Assistant to the Director of Central Intelligence (1988‑91). A Member of the Board of Directors, The American Ditchley Foundation.
Ms Julie Brill
Corporate Vice President and Deputy General Counsel for Global Privacy and Regulatory Affairs, Microsoft Inc. (2017‑). Formerly: Partner & co‑Director, Privacy and Cybersecurity, Hogan Lovells US LLP, Washington, DC (2016‑17); Commissioner, Federal Trade Commission (2010‑16); Senior Deputy Attorney General and Chief of Consumer Protection and Antitrust, North Carolina Department of Justice; Lecturer‑in‑Law, Columbia University School of Law; Assistant Attorney General for Consumer Protection and Antitrust, State of Vermont; Associate, Paul, Weiss, Rifkind, Wharton & Garrison, New York.
Mr Scott Charney
Vice President for Security Policy, Microsoft; Vice Chair, National Security Telecommunications Advisory Committee; Commissioner, Global Commission on the Stability of Cyberspace; Chair, Global Cyber Alliance. Formerly: led Microsoft's Trustworthy Computing Group; Chief, Computer Crime and Intellectual Property Section, U.S. Department of Justice; Chair, G8 Subgroup on High‑Tech Crime; Vice Chair, Organisation for Economic Co‑operation and Development Group of Experts on Security and Privacy; led U.S. Delegation to the OECD on Cryptography Policy; co‑Chair, Center for Strategic and International Studies Commission on Cybersecurity for the 44th Presidency.
Dr Alissa Cooper
Fellow, Cisco Systems, Washington, DC (2017‑); Chair, Internet Engineering Task Force (2017‑). Formerly: Distinguished Engineer, Cisco Systems (2013‑17); Applications and Real‑Time Area Director, Internet Engineering Task Force (2013‑17); Chair, IANA Stewardship Transition Coordination Group (2014‑17); Internet Architecture Board (2011‑13); Chief Computer Scientist, Center for Democracy and Technology (2008‑13).
Mr Nathaniel Gleicher
Head of Cybersecurity Policy, Facebook, Inc. (2018‑); Senior Associate, Strategic Technologies Program, Center for Strategic and International Studies (2016‑). Formerly: Head of Cybersecurity Strategy, Illumio (2015‑18); Director for Cybersecurity Policy, National Security Council, The White House (2013‑15); Senior Counsel, Computer Crime and Intellectual Property Section, U.S. Department of Justice (2010‑13).
Ms Yvonne Johnson
MSc Candidate in Cyber Security, University of Lancaster. Formerly: Baden‑Württemberg Scholarship: cyber security classes and independent security‑related research, Karlsruhe Institute of Technology.
Ms Tara L. Lemméy
CEO & Founder, LENS Ventures, San Francisco; Founder, AirNext, Alpha Bravo; Fast Company 'Most Creative People in Business MCP 1000'; Inventor, 20+ issued patents; co‑author, 'America's Moment', as member of Rework America, Markle Economic Future Initiative; Fellow, Bipartisan Policy Center; Faculty member, Innovation, University of Arizona Center for Integrative Medicine. Formerly: Founder & CEO, multiple technology startups; Technology co‑Chair, Markle National Security Task Force post 9/11; Embassy of the Future Commission, Center for Strategic and International Studies; Founding Chair of Policy, Law & Ethics, Singularity University; President, Electronic Frontier Foundation; Board member, TRUSTe.
Ms Yin Yin Lu
DPhil Candidate in Information, Communication and the Social Sciences, Oxford Internet Institute, University of Oxford (2014‑); Founder and Convenor, #SocialHumanities research network, The Oxford Research Centre in the Humanities (2016‑); Clarendon Scholar (2014‑). Formerly: Visiting Researcher, Information Visualisation and Visual Analytics Competence Centre, Fraunhofer Institute for Computer Graphics Research IGD, Darmstadt (2018); Teaching Assistant for Statistics Core and Accessing Research Data from the Social Web, Oxford Internet Institute (2015‑18); Tutor in Internet Theory, Worcester College, University of Oxford (2017); Researcher, Rockefeller Foundation Data Financing for Social Good project, Oxford Internet Institute (2016); Doctoral Intern, Alan Turing Institute, London (2016).
Professor Joseph Nye
Distinguished Service Professor, John F. Kennedy School of Government, Harvard University (2004‑); member, Global Commission on the Stability of Cyberspace; author, 'The Future of Power' (2011). Formerly: Dean, John F. Kennedy School of Government, Harvard University (1995‑2004); Assistant Secretary of Defense for International Security Affairs (1994‑95); Chairman, National Intelligence Council (1993‑94); Deputy Under‑Secretary of State for Security Assistance, Science and Technology. A member of the Board of Directors, The American Ditchley Foundation.
Ms Nuala O'Connor
President and CEO, Center for Democracy and Technology, Washington, DC. Formerly: Vice President of Compliance and Consumer Trust and Associate General Counsel for Data and Privacy Protection, Amazon.com; Global Privacy Leader, General Electric; Deputy General Counsel and Chief Privacy Officer, Emerging Technologies, DoubleClick; Deputy Director, Office of Policy and Strategic Planning, Chief Privacy Officer and Chief Counsel for Technology, U.S. Department of Commerce; (first) Chief Privacy Officer, Department of Homeland Security.
Mr James Pavur
Doctoral Researcher, Cybersecurity Centre for Doctoral Training, University of Oxford; Rhodes Scholar. Formerly: Director of Information Security, Students of Georgetown, Incorporated, Washington, DC; intern, Booz Allen Hamilton ‑ Defense and Intelligence Group, Maryland.