Ditchley has held conferences on intelligence issues before, but has effectively kept them confidential. As befits these more transparent times, this conference was as open as any other Ditchley conference, and this Note will appear on the website like other conference notes. This did not seem to prevent participants from speaking freely and openly. The result was a fascinating and potentially productive debate on how the twin needs of security and privacy can be met in modern democracies, and the principles which should underpin the bargain between the State and the citizen in this area. Another difference from previous conferences on this subject was the presence around the table of representatives of some of the major private companies whose role is now so indispensable. As we met, the US was grappling with the consequences of recent legal judgments questioning interception operations by their agencies, and Congressional reactions to them (the Freedom Act was subsequently passed), and the UK was considering new powers for its agencies.
Thanks are due to American Ditchley, whose generous support made this conference possible.
The need to review the relationship between security and privacy had come particularly from the changing nature of intelligence surveillance following the 9/11 attack, and the new threats and opportunities arising from the internet, as technology moved faster and faster. The Snowden revelations had not revealed intelligence agencies to be out of control or acting without authority, but had shocked the publics in many countries because they had been unaware of the nature of much of modern intelligence work, as well as uninformed about the authorisation and oversight arrangements already in place. More transparency, perhaps better characterised as translucency, was now needed about both, on a regular and proactive basis, not only in response to revelations or supposed scandals. The boundaries of what could be revealed also needed to be pushed further, to show convincing evidence of what surveillance was for and could achieve, while preserving essential operational secrecy. We were conscious throughout that we were only talking about how to tackle these issues in law-based, liberal democratic societies. There was no chance of such a discussion in the authoritarian states where so much of the world’s population lived.
It was impossible to agree comprehensive definitions of either privacy or security, since cultural sensitivities varied from country to country, and also as threats waxed and waned. Individual privacy was agreed not to be an absolute right, and the balance between its needs and collective security requirements was a matter of judgment, but it was also agreed to be a fundamental democratic value which had to be respected and preserved. We did not believe that the younger generation valued privacy less, despite the way that they used social media, and engaged in the internet culture. Indeed privacy was now becoming a selling point for service providers, through encryption. In the case of public security, it was now necessary to explain more precisely what we meant by the term ‘national security’.
Effective oversight of intelligence agency activities was vital. It had to be integrated and based on clear legislation. A mixture of parliamentary scrutiny, independent reviewers, and judicial recourse seemed best. Authorisation of surveillance activities by intelligence agencies was probably most effectively done by elected politicians, who could weigh proportionality, and political and other risks, as well as legality. Oversight bodies needed full access to the agencies, but should be mindful of the need to preserve operational secrecy, as well as the ability of the agencies to do their jobs. They should where possible be non-partisan politically, and have some teeth. They also needed to focus not just on data collection but also on its use and how it should be shared. The question of international data-sharing was particularly contentious, slow at best, and messy. Companies which held data were bound by the law of the countries in which they were based, but since this was mostly US law, others were frustrated. Better arrangements were needed, and greater reciprocity.
Bulk data collection was seen by the agencies as essential to uncover networks and identify and pursue targets on whom they could then focus. It was not the same as so-called mass surveillance, though without proper oversight it could be misused for that purpose. Was there a clear distinction between the ethical consequence of computerised filtering of bulk data, and human scrutiny of the resulting extracted data or message content? Some thought so, but others argued that the distinction was weakening or invalid, because of the ways in which bulk data could increasingly be used to extract personal information. It could not therefore be relied on to justify bulk surveillance activities. Regardless, all agreed that considerations of proportionality and necessity applied as much to computerised methods as to human analysis. In any case the issue of data retention was as important in many ways as its collection, though we could not reach clear agreement on the criteria for how long different categories of data should be retained, or by whom. There was a preference for it to be retained by the private companies who generated it, but they were reluctant to be in the front line in this way, and it was unclear who would pay.
What was the role of private companies in general? Optimism was expressed that satisfactory arrangements could be found between the agencies and the companies, despite recent arguments following the Snowden revelations. The companies were willing to give access to their data, but this could only be through the front door, on the basis of properly authorised, legally sound and specific warrants, not through the back door. Nothing else would be acceptable to their customers. Governments and companies needed to work together closely in any case, including on the threats from cyber attack. There were also questions about how the companies themselves used their data, since they too had power deriving from it, though it was different from that of governments. The pace of technological change and the rising tide of data meant that governments and regulators could never keep up. Influencing behaviours and setting norms was a better approach than trying to control developments.
We agreed that there was a need for a set of principles, based on legality, necessity and proportionality, which could govern relationships in this area within and between countries, and drew up a preliminary list which might help as a basis for bilateral treaties and international processes between like-minded, democratic countries. This was also an opportunity to renew the vital consent between citizens and states, and create renewed trust, without which there were bound to be fresh storms in the future as governments tried to tackle new threats while advancing technology brought new threats and new opportunities for data gathering.
One prompt for the organisation of this conference was the Snowden revelations, and the concern they had caused in many quarters about the extent of intelligence agency surveillance activity. However, we agreed that the issues were more fundamental than this. In the past, intelligence agencies’ main concerns had been espionage and counter-espionage and their principal focus had been on what foreign powers and their nationals were up to, even though internal subversion had at times been an interest too. The 9/11 attack had led to a radical change of emphasis onto counter-terrorism, which had in turn led to a focus on the unlawful activities of individuals, including domestic citizens. Operations which had previously impinged on a very small proportion of the population were now therefore crucial to the security of the public as a whole. Intelligence agencies had in the past been able to concentrate their efforts on known organisations, threats and individuals. Now they had to make a massive effort to try to identify potential targets who were often hiding in plain sight among the rest of the population. This was not so much looking for a needle in a haystack, as looking for a piece of hay in a haystack. This had led in turn to the increased use of the collection of bulk data on which particular filters could be placed, in order to try to trace individuals who were in contact with known suspects or might be using various communication methods for their nefarious purposes.
The internet had introduced at the same time new possibilities for the ill-intentioned, and new potential opportunities for the agencies. The speed of technological change, and the amount of data now available, were both astonishing, and continuing to accelerate. All concerned were in danger of being overwhelmed by this, and struggling to keep up. The interests of the agencies were in reducing the volume of unsifted data they had to deal with, rather than acquiring and keeping more haystacks.
One basic question was whether Snowden had revealed western intelligence agencies to be out of control. Participants did not think so. While the general public had not known about what the programmes revealed by Snowden had been doing, the technological capacities they had acquired, and the close links between some countries (Five Eyes), the various oversight bodies had been aware and there had been proper authorisation for the operations which had been revealed. The issue was not therefore illegal activity. Agencies were careful to obey the law, and had lawyers and codes of conduct to ensure that happened. Agency staff had no wish to find themselves before the courts in person.
Did that mean everything was fine? Again participants did not think so. There had been genuine shock in many countries at what had been revealed, including the monitoring of the communications of leaders of friendly countries. Part of this had no doubt been reinforced by anti-American sentiment in some quarters, but it had also revealed the extent to which most people did not understand either the activities of the intelligence agencies or the oversight mechanisms which were in place to control them. This was unhealthy from many points of view.
One of our main conclusions was therefore that, although publicly available information about our intelligence agencies was already light years ahead of where it had been (it was after all not so long ago that the existence of some of the UK agencies was not officially acknowledged at all), still greater transparency about these activities and capabilities, and about the way in which they were authorised and controlled, was not only desirable but essential if credibility was to be preserved and enhanced. The current system in many countries was simply messy and confused. This meant, for example, greater proactivity in ensuring that information was brought to public attention in a timely and controlled way, rather than only acknowledged after the fact in defensive mode after a so-called scandal had occurred, or tucked away in long, specialised reports which no member of the public was ever likely to read. It was quite possible to reveal a good deal about the purposes of intelligence activity and how authorisation and oversight operated while ensuring that operational secrecy was maintained (which everyone would understand). This cold wind of transparency could also be a useful internal discipline for the agencies themselves.
The best approach was described as “translucency” rather than full transparency, based on the analogy of a partial view of what was going on behind frosted glass. Great care was needed to ensure that those seeking to commit crimes, whether terrorist acts, or other forms of crime, were not given too many clues about how they could avoid detection, and thus the limitations as well as the detailed methods of intelligence gathering had to be safeguarded, but this was not impossible. At the same time it was suggested by participants outside the intelligence world that concealing or redacting all operational details on secrecy grounds inevitably meant in today’s world that credibility was sacrificed. People were no longer prepared, if they ever had been, to take such things on trust. There had to be “stories” about real operations and successes out there which people could understand and relate to, and a greater effort needed to be made to provide such stories. In other words, the boundaries of what could safely be made public needed to be pushed further. The burden of proof should be on those arguing for secrecy, not the other way round, just as it should not be assumed that everything which could be done in technological terms therefore should be done.
Privacy and security
We quickly concluded that finding a comprehensive definition of privacy which satisfied everyone was not possible, any more than it was possible to define security. Different societies and different cultures had different sensitivities, sometimes for historical reasons. It was accepted that privacy was not and never had been an absolute right; it had always been constrained by state requirements of one sort or another, usually concerned with security. However, privacy was still a fundamental value of democracy and the rule of law, and had to be maintained. What was important was to try to define practices which unacceptably impinged on privacy in ways which would be generally recognisable.
The question we were trying to answer was not therefore whether some degree of personal privacy could and should be given up to ensure collective security, but how far both could be maintained, under what circumstances, and with what constraints. Again there could be no simple and universal answer. Citizens in some countries were much more sensitive about some aspects of surveillance than in other countries, as varying public reactions between the UK and US to the Snowden revelations and state activity, not to mention elsewhere, had clearly demonstrated. Nevertheless, in any democratic country, there should be an open debate about the right balance to be struck, and certain broad principles such as proportionality should always be respected. We came back to this question of principles a good deal.
We discussed the assertion, often heard, that the younger generation no longer cared about privacy in the same way as their elders, as evidenced by their readiness to put details of their personal lives on Facebook and other sites, and by the general philosophy of a sharing culture. We did not accept this at face value. While it was clearly the case that the internet generation had implicitly accepted a certain trade-off between the convenience of the devices they used and the generation of available details and data which could then be commercialised, the furious reaction to publicity of abuses by some companies, and the increasing insistence by consumers on encryption, showed that worries about privacy were far from dead. Indeed, privacy was rapidly becoming a selling point.
Oversight of intelligence agencies
We did not have time to look comprehensively at oversight arrangements in different countries, but considered in some detail what happened in the US and the UK, and how they compared with arrangements in other like-minded countries. The starting point was the view that there was already a good deal of oversight, through legislature-based committees, as well as independent bodies and reviewers and judicial mechanisms. However, these arrangements were often complex and overlapping, hard to understand from the outside, and indeed largely unknown to the general public. There was therefore a strong case for modernising and streamlining these mechanisms, and ensuring that they were better-integrated and brought under a single, clear legislative umbrella. This would help to give the public confidence, and reduce concerns that particular bodies or individuals could be captured or manipulated by the agencies.
At the same time, we also thought it was important that overseers should see their task as including the preservation of the legitimate interests of the agencies and of national security, as well as the protection of the public. Their actions had to be about striking the right balance between the two – a balance which might vary from time to time in a particular country, for example because of increases or decreases in threats or public anxiety levels.
Our general view was that elected politicians and ministers were better-placed to provide the necessary warrants and authorisations for intelligence activities than lawyers or judges. They could take a broader look at the justification of the proposed activities, taking into account their wisdom, their political and other risks, and their proportionality, as well as their legality. They could and should also be able to defend and justify their decisions in parliament and other public fora, in ways which would be more difficult for, say, judges.
On the oversight front itself, we saw a case for both legislature-based committees and independent reviewers, but these should be integrated and “layered” in a rational way, without creating so much scrutiny and competing demands for information that they hampered the ability of agencies to do their legitimate jobs. Parliamentary committees were more likely to be effective if they were non-partisan, and included members who had not only an interest but also relevant experience, and who were in a position, for example through being out of front-line politics, to be seen as genuinely independent and able to withstand governmental and other pressures. At the same time it had to be accepted that no oversight mechanism was ever likely to be able to reveal all it did or knew. Indeed the purpose of having special committees in legislatures rather than open scrutiny was to provide a degree of trust by proxy. Citizens ultimately had to have some level of trust in the mechanisms in place.
One thing governments and parliaments could do to increase trust and improve awareness was to hold more debates about these issues, rather than trying to avoid them. The reaction of the governments most concerned after the Snowden revelations had often appeared as ducking and weaving, trying to reveal as little as possible, or attempting to change the subject, rather than getting ahead of the argument. This had only made things worse and increased public suspicions.
It was clearly vital that overseers should have full access to the activities and operations of the agencies, as they did these days in many, though not all, democracies, but also important that great care should be taken about what was revealed, particularly on the operational side. While the need for more “translucency” was accepted, as suggested above, there were still differing views about how far details of “successes” could and should be made public. Doing so could increase support for the agencies, and add reality to otherwise dry accounts of processes, thereby increasing credibility. However, this was seen by others as a dangerous game, given the sensationalist instincts of the press, and the high risks of revealing too much about how results had been achieved. The overall view nevertheless seemed to be, again, that greater effort to disclose more about operations, and demonstrate their value, was now necessary to overcome a degree of public scepticism about claims for operations which were unsupported by evidence.
Should overseers have “teeth”, and, if so, what kind? Again there were different views. American participants in particular thought some ability to impose sanctions was necessary, for example in the budgetary area, or in terms of disciplinary action against individuals. Others were inclined to believe that the threat of public criticism and political disapproval should be enough. The general take seemed to be that some teeth were necessary if oversight mechanisms were to be taken as seriously as they should be.
We also discussed the role of the press in supervision. Clearly journalists were interested and important. Links with the press should be maintained, not avoided. More transparency meant more press coverage. One important test of the acceptability of a particular activity or operation was how defensible it would be if the details did get into the press at some stage. But it was an illusion to think that the press could be controlled in any way, while the press themselves were always likely to be wary of being manipulated or captured.
There was a strong view that oversight could no longer be confined to the collection of intelligence and data. It now had to extend to how these were used, how they could be combined with other information, and how they could and should be shared with others. It also needed to embrace the co-operation with the private sector corporations that understood and used the technology, and increasingly held the data to which agencies wanted access.
The issue of data sharing across national boundaries occupied a good deal of our time, in one way or another. It was accepted that such sharing was desirable, not least since most of this would be about regular criminal activity, not terrorism. It was also accepted that current sharing arrangements were often cumbersome and unsatisfactory in an internet age, for example given the frequency of cyber attacks. Timeliness was often of the essence, but also very hard to achieve when the mechanisms were so clumsy. There had been efforts to improve the situation, for example to speed up the operation of Mutual Legal Assistance Treaties (MLATs), but much more was needed.
This gave rise to difficult issues of principle as well as practice. How much could or should countries share of the information they had collected about their own citizens, for example? This led to questions about whether it was right or reasonable for countries to apply different standards, for example higher levels of justification, to the collection of data on their own citizens than on citizens from other countries. Some thought it was obviously right to do so. There was bound to be more worry about what US intelligence agencies did with regard to US citizens than, for example, citizens of North Korea. Others argued that in principle there ought to be wider minimum standards at international level, even if only among like-minded countries; and that in practice justification and legal authorisation were needed to conduct operations against foreign citizens just as much as against Americans.
It was also pointed out by several participants that in practice the key distinction in this area was not between citizens and non-citizens, but between residents and non-residents, because the distinction depended on a particular country’s geographical jurisdiction.
The underlying problem about sharing was that data held in a particular country was subject to the laws of that country, and those laws were often highly restrictive. US companies holding data in the US were for example heavily restricted in what they could share because of US law. US legislation, notably the Electronic Communications Privacy Act (ECPA), would need to be revised to change this, which would be extremely problematic. The practical outcome was often that the US wanted access to data held by others but was able to give very little access to its own data, which was creating resentment. The principle of reciprocity had to be part of sharing arrangements. The Passenger Name Record (PNR) system could be a model for sharing of bulk data in some ways, but again better reciprocity was needed.
We debated extensively the collection of bulk data. This was seen by the agencies as essential to finding hitherto unidentified dangerous individuals who could then become targets for other operations. There was no other way of doing it. It was all about triage. This was becoming increasingly important as end-to-end encryption spread. We agreed that bulk data collection was not of itself the same as the “mass surveillance” beloved of the media, although it could of course without proper regulation and oversight be misused for that purpose. Some argued that there was a clear difference between high-powered computers scanning communications data and metadata and looking for particular characteristics which might reveal where a target might be found, on the one hand, and a human being examining the content of messages on the other. It was reasonable and legitimate to worry about the extent of the latter, but much less so about the former. This was an important justification for some major activities by agencies like GCHQ and NSA.
Others warned strongly that these distinctions between bulk data/metadata and content, and between computers and human access, were less and less valid, and should not be relied on as the prime argument for the ethical acceptance of these activities. As more and more data became available, and one set of bulk data could be combined with others, the idea that any set of data was safely anonymous was also steadily being discredited. Moreover, not all metadata was the same (and there was no agreed definition of metadata). Some could be as sensitive as more obviously personal details.
This was clearly an important argument, with huge consequences for future agency activity, which we were not able to resolve in the time available. The implications of the removal of the distinction between metadata and content, as a building block for public acceptability of agency activities, were potentially profound.
Meanwhile, the broader question of how to reconcile data collection with data protection also needed to be examined carefully. After all, good data protection was part of national security too. However, there was a fear that national data protection agencies were often too weak.
We heard strong arguments that the main concern was not data collection as such but data retention. People were seriously concerned that data collected for one purpose, properly authorised, could then be retained for long periods or even indefinitely, and be accessed by other agencies for other purposes, without this being transparent or known to the individuals concerned. It was therefore vital to insist that collected data should be destroyed after a short period of, say, three months. Others, mainly practitioners, contested having a single limit to apply to very different types of data, while recognising that there was an issue about retention. They argued that data which could be of use for criminal cases of great public interest could often not be accessed quickly because of unwieldy international arrangements for data sharing. Destroying this too quickly made no sense. In areas like counter-proliferation, data collected at one time might only become of interest at a later stage. Again, destroying it too quickly could be dangerous to national security.
One way to help calm fears about data retention was for this to be done in respect of communications data by the private companies who collected or were able to access it, rather than by the government agencies who were interested in having access to it. Access would then be possible only with the requisite authorisation. (This is of course what the new US Freedom Act now mandates in some areas.) However, the companies themselves had reservations about this, on two grounds. First, while not shirking their public duties, they did not want to be put into the frontline of liability and potential controversy in this way; and second, retaining such vast quantities of data was expensive, and they were not willing to foot the bill for something which was unrelated to their own commercial interests. Some limited government offers of finance had been made but so far they fell well short of what could actually be needed.
Another suggestion was that organisations which wanted data to be retained beyond a minimum period would have to justify this to an independent advisory body.
There was concern in all this that the notions of national controls and national borders were rapidly losing meaning in this area. The cloud now really was a cloud, in which the geographical location of server banks where data was stored meant little or nothing. Politicians and regulators were refusing to look this reality in the face. This was ultimately increasing the risks all round.
The role of the private sector
This brought us to the wider role of the private sector. In many ways, this was not new. Governments had long sought access to messages being sent via private companies, from letters to phone calls. However, the scale of private sector involvement in the internet had changed the picture dramatically. This was likely to increase further as the quantity of data continued to rise exponentially.
We were reminded that there was a significant difference between licensed telecom companies, which had always had a relationship with governments through licensing conditions, and the newer internet service providers which did not need licences to operate. The vast majority of information asked for by agencies still came from the licensed telecoms companies, on the basis of normal warrants and authorisations, and there was little problem about this, although as internet communications expanded it would become less valuable as a source. The difficulty had arisen with the relationship between the internet service providers, whether providing only network capacity or content as well, and what the Snowden revelations had brought out into the open. Many consumers had been shocked to find out that agencies could have access to what was in their internet accounts, with or without the knowledge of the companies. The companies now had to be able to say that such apparent complicity was a thing of the past, and to offer secure encryption services to their customers, the keys to which they could not and would not share with government agencies.
Despite the friction this had caused, and public spats, both agencies and companies round the table expressed confidence that mutually satisfactory ways forward could be found, though this would not be easy. The key to this was acceptance by all concerned that access to information held by companies had to be through front doors, via legitimate, authorised warrants for specific purposes, not through back doors, or on the basis of bulk requests with scant justification. The companies were quite willing to share data in this way – they wanted to cooperate, and did not wish to be seen as unpatriotic or unwilling to help in the fight against terrorism or crime. The agencies could live with this approach as long as the right mechanisms and relationships could be established and the front doors were opened when necessary. There was no doubt a question over the reduced opportunity for the agencies to trawl bulk data usefully if they had to be specific about what they were looking for up front, but that would just have to be lived with.
The bottom lines for the companies were that they had to obey the laws in the countries in which they were based; and that they could not afford to alienate their customers, who were now alert to the risks to the information they were putting on to the internet. It was also of course important that they should not be forced to open their data to governments of undemocratic or hostile countries.
A wider question was the companies’ own ability and freedom to use the data they possessed. They argued that they could not use this in the same way as government agencies, for example to arrest people, or prevent certain activities. In any case, the market itself was always a corrective mechanism. Others noted that the powers given by possession of this data to the companies were nevertheless extensive, and in many ways as worrying as those in the hands of governments. They had more information than governments, and a greater interest in the private lives of their customers than at least democratic governments were ever likely to have. Their data, especially if sold on and used in combination with other data sets, could have profound influences over individual lives, for example in terms of creditworthiness. While in principle users of internet services gave their consent to how their data could be used, in practice few people bothered to read the small print or to challenge the conditions in it if they did. To what extent could this be said to constitute genuine consent? Of course big data could be and was used in positive ways for society, for example in the area of public health, but there were plenty of negative possibilities too. The problem was not with the existence of the data but how it could be used or abused.
Legislators and regulators were already concerned about these things, but were overwhelmed by the pace of technological change and the tsunami of data. They would always be a long way behind the game. The key could therefore be to aim to influence and control ethics and behaviours, rather than to engage in a vain attempt to regulate particular bits of technology or services which would have moved on by the time the legislation was passed, still less implemented. All this would require a good deal of trust all round – from the public in the companies and the regulators, and from the companies in the regulators and vice versa. This was asking a great deal. It also raised the question of whether anyone could or would be in a position to call a halt if data services began to head in directions which actually threatened democracy or public safety/confidence. One analogy was with controls over cloning. Governments and companies needed to work together to look at the risks and trade-offs involved, and make sure these were being assessed in an effective and integrated way. Joint advisory boards at national or international levels were one possible way forward.
Governments and companies were in any case bound to need to work together increasingly closely in the battle against cyber-crime and cyber espionage and sabotage. The dependence of almost every organisation on the internet was creating huge vulnerabilities at national, company and personal level which we were struggling to comprehend, let alone deal with effectively. Neither governments nor companies were capable of taking this on alone.
There was widespread agreement throughout our discussion that a set of principles or norms was needed to guide behaviour within and between countries which shared values in an area where change was so rapid and the ground was shifting beneath our feet even as we thought about the problems. Within countries a new bargain between citizens and the state was required. Internationally, at least between like-minded countries, some basic assurances of behaviour were essential to facilitate trust and information-sharing, not least as the idea of national security became less and less easy to define. Principles were also needed to guide relationships and create trust and mutual accountability between governments and private companies.
Some key principles needed to underlie action in this area were summarised as follows:
- respect for the rule of law and international human rights at all times, and for the concept of lawful purpose;
- authorisation of intelligence activity up a recognised chain of command to permit effective oversight;
- necessity – states in open societies should only conduct surveillance and collect information reluctantly and when they had to, and with a reasonable prospect of success;
- proportionality – the likely impact and intrusion into privacy of a proposed operation had to be justified by a significant and specific risk, and in proportion to it in terms of reach and scope, with a hard-headed assessment as to the risk of collateral damage;
- integrity had to be assured throughout the system;
- legislative clarity about the aims and activities of agencies;
- effective, transparent, and integrated oversight mechanisms (preferably with teeth), governed by clear legislation;
- maximum translucency and where possible transparency about intelligence surveillance activities;
- recognition of a justifiable minimum of operational secrecy for agencies;
- no surveillance of friendly leaders without a clear and pressing security rationale;
- no use of surveillance for the benefit of commercial companies;
- indispensability of high levels of international collaboration and information-sharing between like-minded countries sharing basic values and principles, on the basis of reciprocity.
The underlying prize in all this was citizen consent, and its renewal. Without that we were only heading for more trouble in this sensitive area.
We thought that if we could get agreement on such principles between a small group of countries in the first place, that would then lay the foundation for a network of bilateral agreements to strengthen both protection of the public and essential information-sharing, and for other processes to be put in place nationally and internationally. We were mindful that, while we could not and would not realistically want to extend this to undemocratic countries, we would in due course want to include countries like India.
Recommendations for the future other than these principles emerge from the detail of the Note, but have not been listed separately here. In the end, we thought that a satisfactory balance between the demands of security and the requirement for individual privacy is a question of good governance. Democracies may always struggle to deal with this because, as one participant put it, democracy requires openness, and its defence requires secrecy. But there is no comparison to be made between what democracies are trying to do, however imperfectly, and what happens in autocratic societies where there is no transparency, and no checks and balances exist. Some authoritarian states too have advanced technical capabilities which can impact us all and which we cannot and should not ignore.
This Note reflects the Director’s personal impressions of the conference. No participant is in any way committed to its content or expression.
CHAIR: Sir John Scarlett KCMG OBE
Senior Advisor, Morgan Stanley; Chairman, Strategy Advisory Council, Statoil ASA; Chairman, Bletchley Park Trust; Chief, Secret Intelligence Service 2004-09.
Mr David Irvine AO
Formerly: Director-General of Security/Director, Australian Security Intelligence Organisation (2009-14); Director-General, Australian Secret Intelligence Service (2003-09); Ambassador to China (2000-03).
Mr Richard Fadden
National Security Adviser, Government of Canada (2015) and Deputy Minister, Department of National Defence (2013-). Formerly: Director, Canadian Security Intelligence Service (2009-13); Deputy Minister, Citizenship and Immigration Canada (2006-09); Deputy Minister, Natural Resources Canada (2005-06); President, Canadian Food Inspection Agency (2002-05); Deputy Clerk and Counsel, Privy Council Office (2000-02); Security and Intelligence Coordinator (2001-02).
Professor Jacques Frémont
President, Québec Human Rights and Youth Rights Commission; Professor Emeritus, University of Montreal. Formerly: Director, International Higher Education Support Program, Open Society Foundations USA (2010-13); Provost (2005-10), Dean, Faculty of Law (2000-04) and Professor of Constitutional Law and Human Rights, University of Montreal. Member of the Program Advisory Committee, The Canadian Ditchley Foundation (2009-11).
Professor Kent Roach FRSC
Professor of Law and Prichard-Wilson Chair in Law and Public Policy, University of Toronto; Director, Legal Research, Commission of Inquiry into the Investigation of the Bombing of Air India Flight 182; Fellow, Royal Society of Canada (2002-); Editor-in-Chief, Criminal Law Quarterly (1998-). Formerly: Professor of Law, University of Toronto; Member, Working Group drafting anti-terrorism laws in Indonesia.
Hon. Lt Col. Scott Shepherd MA, MBA, CITP, KSJ
Founder (1994), President and CEO, Northstar Trade Finance Inc., Vancouver; Lt Col., British Columbia Regiment, Reserve Force of the Canadian Armed Forces; Chairman, BC, Canadian Forces Liaison Council; Advisor, Dalhousie University; Chairman of Board, MDS Aero Support Corporation. Formerly: President, Dalhousie Financial Corporation, British Columbia; Assistant Trade Commissioner, Department of External Affairs, Manila. Member of the Canadian Ditchley Advisory Committee.
Mr Gilles de Kerchove
EU Counter-Terrorism Coordinator, General Secretariat of the Council of the European Union.
Ambassador Christophe Bigot
Director of Strategy, Directorate General for External Security. Formerly: Ambassador to Israel; Advisor on North Africa and Near and Middle East to the Minister of Foreign and European Affairs (2007-09).
Mrs Isabelle Falque-Pierrotin
Chair, EU Article 29 Working Party (2014-); Member, Executive Committee, International Conference of Data Protection and Privacy Commissioners (2014-); Chair, French Data Protection Authority (CNIL) (2011-); State Counsellor (2001-). Formerly: Member (2004-14) and Vice President (2009-14), National Commission on Information Technology and Personal Freedoms; Board President, Forum on Internet Rights (2001-10); Rapporteur, State Council report on 'Internet and Digital Networks' (1997-98); President, Interministerial Commission on the Internet (1996).
Mr Ernst Uhrlau
Global Risk Senior Advisor, Deutsche Bank (2012-). Formerly: President, German Federal Intelligence Service (2005-11); Head of Intelligence Coordination Department, German Chancellery (1998-2005); President of Police, Federal State of Hamburg (1996-98); President, Office for the Protection of the Constitution (1991-96).
Adv. Yoram Hacohen
Senior Fellow (Cyber, Technology and Law), The Institute for National Security Studies (INSS), Israel; Founder and Managing Partner, Hacohen & Co. Technology and Information Regulation Experts, Israel. Formerly: Head, Israeli Law, Information and Technology Authority, Ministry of Justice, Vice President, Secure-net Ltd, Israel.
Dr Giuseppe Busia
Secretary General, Italian Data Protection Authority, Rome (2012-). Formerly: Secretary General, Authority for the Supervision of Public Procurement Contracts (2008-12); Director, State-Regions Conference and Secretary to the Joint Conference for relations between State, Regions and local authorities, Office of the Prime Minister (2006-08); Deputy Head of Cabinet to Minister of Cultural Heritage and Activities (2006).
Mr Peter Hustinx
Formerly: European Data Protection Supervisor (2004-14); Chairman, Dutch Data Protection Authority (1991-2004); Chairman, Article 29 Working Party (1996-2000); Chairman, International Conference of Data Protection Commissioners (1994-95); Legal Adviser, Dutch Ministry of Justice (1971-91); Chairman, Committee of Experts on Data Protection, Council of Europe (1985-88); Deputy Judge, Court of Appeal, Amsterdam (1985-2014).
Mr Jacob Kohnstamm
Chairman, Dutch Data Protection Authority. Formerly: Vice-Chairman (2008-10) then Chairman (2010-14), Article 29 Working Party; Chairman, Executive Committee of the International Conference of Data Protection and Privacy Commissioners (2011-14).
Mrs Phillipa McCrostie
Global Head of Corporate Finance, EY (formerly Ernst & Young), London (2008-); Member, Global Executive Board, EY (2008-); Board Member, Peterson Institute of International Economics, Washington, DC (2014-).
Mr Ingvar Åkesson
Senior Adviser, Secana, Stockholm. Formerly: Director General, National Defence Radio Establishment (2003-13); Permanent Under-Secretary, Ministry of Defence (1996-2003); Secretary, Committee on the Constitution (Parliament) (1988-96).
The Rt Hon. Lord Butler of Brockwell KG GCB CVO
Life Peer (1998-); Member, Intelligence and Security Committee of Parliament. Formerly: Master, University College, Oxford (1998-2008); Chairman, Review of Intelligence on Weapons of Mass Destruction (2004); Secretary of the Cabinet and Head of the Home Civil Service (1988-98). A Governor, The Ditchley Foundation.
Mr Duncan Campbell BA FRSA
Investigative journalist, author, consultant, television producer; forensic expert witness on computers and communications data; Visiting Fellow, Media School, Bournemouth University (2002-). Formerly: Staff Writer, New Statesman; Co-Founder, Stonewall; Co-Founder (1991), Investigation and Production Television; Founder Member, International Consortium of Investigative Journalists.
The Rt Hon. The Lord Carlile of Berriew CBE QC
Member of the House of Lords (1999-); Consultant on political risk issues. Formerly: Independent Reviewer of Terrorism Legislation (2001-11).
Mr Simon Case
Director, Strategic Policy and External Relations, Government Communications Headquarters.
Mr Peter Clarke CVO OBE QPM
Board Member, Charity Commission of England and Wales (2013-); Senior Adviser, Kellogg Brown and Root Limited (2011-); Partner, Peter Clarke Associates (2010-). Formerly: Metropolitan Police Authority (1977-2008), latterly Head, Counter Terrorism Command.
Mr Gordon Corera
Security Correspondent, BBC News.
Professor Timothy Garton Ash CMG
Professor of European Studies, University of Oxford; Isaiah Berlin Professorial Fellow, St Antony's College, Oxford; Senior Fellow, The Hoover Institution, Stanford University; Columnist, The Guardian; Founding Member, European Council on Foreign Relations. A Governor, The Ditchley Foundation.
Mr Robert Hannigan CMG
Director, Government Communications Headquarters (2014-). Formerly: Director General, Defence and Intelligence, Foreign and Commonwealth Office (2010-14); Prime Minister's Security Adviser (No.10) and Head of Intelligence, Security, and Resilience, The Cabinet Office; Director General (Political), Northern Ireland Office.
Mrs Verity Harding
Lord Hennessy of Nympsfield FBA
Crossbench Peer, House of Lords; Attlee Professor of Contemporary British History, Queen Mary College, University of London (1992-); Fellow of the British Academy. Formerly: Chairman, Kennedy Memorial Trust (1995-2000); Co-Founder, Institute of Contemporary British History (1986); Leader Writer and Columnist, The Times (1982-84); Whitehall Correspondent, The Times (1976-82). A Governor, a Member of the Council of Management and of the Programme Committee, The Ditchley Foundation.
Mr Steven Hill
Counsellor, UK Mission to the United Nations, New York.
Ms Helen Judge
On secondment to the Foreign and Commonwealth Office from the Ministry of Justice.
Mr Matthew Kirk
Group External Affairs Director and Executive Committee Board Member, Vodafone Group Services Ltd, London (2006-). Formerly: HM Diplomatic Service (1982-2006): HM Ambassador to Finland (2002-06); Head, ICT Strategy, Foreign and Commonwealth Office, London (1999-2002); Head of Division, EU Secretariat, Cabinet Office, London (1998); Head, EU Presidency Department, FCO (1997).
Mr Lewis Neal
Director, Intelligence Policy, Foreign and Commonwealth Office (2015).
The Rt Hon. Baroness Neville-Jones of Hutton Roof DCMG
Member of the House of Lords; Chair, Bank of England Advisory Committee on Cyber Security; Member, Joint Parliamentary Committee on National Security Strategy. Formerly: Special Representative to Business on Cyber Security, House of Lords (2011-14); Minister of State for Security and Counter Terrorism (2010-11); Shadow Minister for Security and National Security Adviser to the Leader of the Opposition (2007-2010); HM Diplomatic Service (1963-96): Deputy Under Secretary of State and Political Director, Foreign and Commonwealth Office (FCO) (1994-96); Chairman, Joint Intelligence Committee (1992-94). A Governor, The Ditchley Foundation.
Professor Sir David Omand GCB
Visiting Professor, King's College London; Author, 'Securing the State' (2010). Formerly: A Governor, The Ditchley Foundation (2006-11); Security and Intelligence Co-ordinator and Permanent Secretary, Cabinet Office (2001-05); Permanent Under Secretary of State, Home Office (1998-2001); Director, GCHQ (1996-98); Deputy Under Secretary of State Policy, Ministry of Defence (1992-96).
The Rt Hon. Sir Malcolm Rifkind KCMG QC
Formerly: Member of Parliament (Conservative) for Kensington (2010-15); Chairman, Intelligence and Security Committee (2010-15); Member of Parliament (Conservative) for Kensington and Chelsea (2005-10); for Edinburgh Pentlands (1974-97); Secretary of State for Foreign and Commonwealth Affairs (1995-97), Defence (1992-95), Transport (1990-92), Scotland (1986-90); Minister of State, Foreign and Commonwealth Office (1983-86). An Honorary Governor, The Ditchley Foundation.
Dr Jamie Saunders
Director, National Cyber Crime Unit, National Crime Agency, London (2014-).
The Rt Hon. John Spellar MP
Member of Parliament (Labour) for Warley (1997-), for Warley West (1992-97); Shadow Foreign and Commonwealth Office Minister (2010-). Formerly: Government Assistant Chief Whip (2008-10); Minister of State, Northern Ireland Office (2003-05); Minister for Transport (2001-03); Minister of State for the Armed Forces (1999-2001); Undersecretary for Defence (1997-99).
The Rt Hon. Sir Mark Waller
Intelligence Services Commissioner (2011-); Arbitrator and Mediator, Serle Court, London. Formerly: Vice-President Court of Appeal Civil Division (2006-10); Lord Justice of Appeal (1996-2010); President, Council of the Inns of Court (2003-06); Chairman, Judicial Studies Board (1999-2003).
The Baroness Shields OBE
Member, House of Lords; Parliamentary Under Secretary of State, Department for Culture, Media and Sport (2015-); Prime Minister's Adviser on the Digital Economy; Member, Ministerial Digital Taskforce; Chair, Tech City UK. Formerly: Vice President and Managing Director (Europe, Middle East and Africa), Facebook; Managing Director (Europe, Russia, Middle East and Africa), Google, Inc.
The Hon. Rachel Brand
Member, US Privacy and Civil Liberties Oversight Board (2012-). Formerly: Assistant Attorney General for Legal Policy, US Department of Justice; Associate Counsel to the President of the United States; Vice President and Chief Counsel for Regulatory Litigation, US Chamber Litigation Center.
Ms Sue Halpern DPhil
Scholar-in-Residence, Middlebury College.
Mrs Jane Horvath
Senior Director of Global Privacy, Apple Inc. (2011-). Formerly: Global Privacy Counsel, Google, Inc. (2007-11); First Chief Privacy and Civil Liberties Officer, US Department of Justice (2006-07).
The Hon. John McLaughlin
Distinguished Practitioner-in-Residence, The Philip Merrill Center for Strategic Studies, School of Advanced International Studies, Johns Hopkins University (2004-); Member, Council on Foreign Relations; Member, Board of Trustees, Noblis Corporation; Foreign Affairs Columnist, www.ozy.com. Formerly: Acting Director, Central Intelligence Agency (2004); Deputy Director, Central Intelligence Agency (2000-04); Deputy Director for Intelligence, CIA (1997-2000); Vice Chairman and Acting Chairman, National Intelligence Council (1995-97).
Ms Jami Miscik
President and Vice-Chairman, Kissinger Associates Inc. (2009-); Member, President's Intelligence Board (2009-); Board of Directors: Morgan Stanley, EMC Corporation, In-Q-Tel, Council on Foreign Relations. Formerly: Global Head of Sovereign Risk, Lehman Brothers (2005-08); Deputy Director for Intelligence, CIA (2002-05); Director Intelligence Programs, National Security Council (1995-96). A Director, The American Ditchley Foundation.
Mr Erik Neuenschwander
Product Security and Privacy Manager, Apple Inc., Cupertino, California (2011-). Formerly: Data Analysis and iOS Performance Manager, Apple Inc. (2007-11).
Mr George M Newcombe
Member, Board of Visitors, Columbia University School of Law; Board of Overseers, New Jersey Institute of Technology; Member, The American Law Institute; Advisor, American Law Institute Privacy Principles Project. Formerly: Senior Partner, Simpson Thacher & Bartlett LLP (1983-2012). Director, The American Ditchley Foundation.
Mr Richard Salgado
Director for Law Enforcement and Information Security matters, Google Inc.
Mrs Mona Sutphen
Partner, Macro Advisory Partners, New York; Member, President's Intelligence Advisory Board; Board Member: International Rescue Committee, Human Rights First; Member, Council on Foreign Relations. Formerly: Managing Director, UBS AG; White House Deputy Chief of Staff for Policy (2009-11); Managing Director, Stonebridge International (2001-08); US Diplomatic Service (1991-2000): National Security Council, US Mission to the United Nations, Office of the High Representative in Sarajevo, State Department, and US Embassy, Bangkok.