
MIND #7: Privacy and Internet Governance


Volume 7 of the Multistakeholder Internet Dialog (MIND) on Privacy and Internet Governance, featuring a proposition essay by Peter Schaar, former Federal Data Protection Commissioner of Germany. The publication is freely available under a Creative Commons BY license.


Page 1: MIND #7: Privacy and Internet Governance

Privacy and Internet Governance

BERLIN – JUNE 2014

A publication by the Internet & Society Collaboratory

Editor - Wolfgang Kleinwächter

# 7

PROPOSITION

Peter Schaar – Chairman of the European Academy for Freedom of Information and Data Protection

Internet & Society Collaboratory

Jan Malinowski – Head of Information Society, Council of Europe
Georg Apenes – Former Norwegian Data Inspectorate
Petra Sitte – Member of the German Bundestag (The Left)

GOVERNMENT & PARLIAMENT

Stephanie Perrin – Non-Commercial Stakeholders Group at ICANN
Rafik Dammak – Member of the Steering Committee, Internet Rights & Principles Coalition, Tokyo
Lorena Jaume-Palasí – Collaboratory, Berlin

CIVIL SOCIETY

George Salama – SAMENA Telecommunications Council, Dubai
Nick Ashton-Hart – Computer & Communications Industry Association, Geneva
Susanne Dehmel – BITKOM Germany

PRIVATE SECTOR

Michael Komarow – National Research University Higher School of Economics, Moscow
Jonne Soininen – Internet Engineering Task Force
Richard Hill – Author, former ICT manager

TECHNICAL & ACADEMIC COMMUNITY


Privacy and Internet Governance # 7
A publication by the Internet & Society Collaboratory
Editor · Wolfgang Kleinwächter
1st Edition · ISBN 978-3-00-046186-6

Page 3: MIND #7: Privacy and Internet Governance

Contents

05 Internet Governance for the Cloud Society · Preface
06 Wolfgang Kleinwächter · Editorial

PROPOSITION
10 Abstract
14 Peter Schaar · The Internet and Big Data – Incompatible with Data Protection?

RESPONSES GOVERNMENT & PARLIAMENT
20 Jan Malinowski · Big data: a challenge to privacy, a threat to society, an opportunity. Should we trust businesses with our privacy online or look to the state for protection?
25 Georg Apenes · Switching Off the Age of Enlightenment?
27 Petra Sitte · Big Data and Big Government necessitate a paradigm shift

RESPONSES PRIVATE SECTOR
33 Nick Ashton-Hart · The Internet is not incompatible with data protection, but the debate we currently have about privacy largely is
37 Susanne Dehmel · Modernizing data protection along with data processing technologies
40 George Salama · Big Data: An Opportunity Combined With Privacy Concerns. A Regulatory Perspective

RESPONSES CIVIL SOCIETY
44 Stephanie Perrin · The Internet and big data – incompatible with data protection? We don’t think so! A civil society perspective
49 Rafik Dammak · The need for versatility in data protection
51 Lorena Jaume-Palasí · Is data protection becoming incompatible with communication?

RESPONSES TECHNICAL & ACADEMIC COMMUNITY
56 Jonne Soininen · The Current State of Internet Security From A Technical Perspective
59 Michael Komarow · Big Data leads to new international data processing policies
61 Richard Hill · Schaar is both prophetic and mainstream

64 Authors
68 About the Internet & Society Collaboratory
69 MIND needs your support
70 Previous Issues and Authors of MIND
72 Imprint


Credit: Kmeron | https://flic.kr/p/aSg8g6 | CC BY-NC-ND 2.0 | https://creativecommons.org/licenses/by-nc-nd/2.0/


PREFACE

Internet Governance for the Cloud Society

The discourse on Internet Governance has reached an inflection point. It has become clear what is really at stake for our societies. Billions of individuals spend a considerable part of their lives online: we communicate and work, we shop and study, we discuss and argue via the Internet. This development is unstoppable and it is changing our society. However, the Internet not only has an ever-growing impact on the lives of individuals; it increasingly shapes the life of organizations as well. Nations and corporations are also online, administering, governing, doing business and asserting their interests. Internet Governance is therefore subject to a complex global struggle for power in the information age. Our challenge is to reconcile the desire of industry and government for user data with fundamental rights and the basic principles of civil rights and privacy. At the same time, the open and unrestricted character of the Internet needs to be preserved, as this openness drives knowledge, progress, growth and vital infrastructures, and not just in the so-called First World.

Privacy, however, is impossible to protect and strengthen without true global cooperation and willingness by governments and corporations alike. Internet Governance used to be about domain names and IP addresses. Nowadays, we need to build a global consensus on how to translate concepts of privacy, democracy, freedom and security into a world where big data, ubiquitous access, and the Internet of Things transform how we live. With this issue, we hope to contribute to this debate. We must ensure that the Internet benefits society.

The Collaboratory steering group: Martin G. Löhe (chair), Dr. Marianne Wulff (vice chair), Dr. Michael Littger, Lena-Sophie Müller, Dr. Philipp S. Müller


EDITORIAL

Internet Governance and Privacy
PROF. DR. WOLFGANG KLEINWÄCHTER, EDITOR

Was there privacy in ancient times and in the Middle Ages? Whole tribes lived under one roof, and in a village everybody knew everything about everybody. If you go to the ruins of the old Roman city of Pompeii, you will learn that even the restrooms were public spaces.

Today, privacy is seen as a fundamental individual human right, protected by Article 12 of the Universal Declaration of Human Rights, which states: “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.”

However, since the beginning of the Internet Age, we have seen growing, virtually unlimited access to all kinds of small and big personal data by transnational private corporations and governmental security agencies. Individual privacy is eroded and undermined. Private correspondence is checked by authorized or non-authorized parties. As soon as you are connected to the Internet via a fixed or mobile end device – whether in your private home or your hotel room, walking in the street or riding in a car – somebody on the other end of the line will know where you are, what you are doing, and what your plans are. It is not only the usual skeptics who argue that the 21st century will see the “end of privacy”. Are we moving backwards into something like the “digital Middle Ages”?

HISTORY OF PRIVACY

The understanding of privacy as a legal right has its own history. It goes back to a case from the 17th century – known as Semayne’s Case, from 1604 – when the British lawyer Sir Edward Coke stated: “The house of every one is to him as his castle and fortress, as well for his defence against injury and violence as for his repose.” Semayne’s Case acknowledged that the king did not have unbridled authority to intrude on his subjects’ dwellings, but recognized that government agents were permitted to conduct searches and seizures under certain conditions when their purpose was lawful and a warrant had been obtained.

This was later taken as a blueprint by James Madison when he introduced the 4th Amendment to the US Constitution in 1789: “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the


persons or things to be seized”. Later, in 1890, Samuel D. Warren and Louis D. Brandeis described privacy as the “right to be let alone”.

The word “privacy” comes from the Latin privatus, which means “separated from the rest”. The whole idea of the Internet is that we are connected, not separated, and that everybody can communicate with everybody anytime, anywhere. In the new virtual global village, we are all under one roof. Can we remain alone in cyberspace? Do we want to remain alone? How can protection work in a borderless space so that we as individuals are safe against unreasonable searches and seizures? How can we use the freedom we have won in the virtual world without risking our privacy? This is a big question, and finding the right answer is not easy.

As we have seen in the last decade, technology always develops faster than our legal system. Code makers work at a higher speed than law makers. In the information age, it is the code that defines the space in which law makers now operate. This brings a lot of new flexibility to the system. On the other hand, social values, individual rights, and personal freedoms do not change overnight when new technologies are introduced. Our legal system has a high degree of stability, which is needed in a democratic society. What we have learned in recent years is that a lot of new Internet-based services and applications offer new opportunities but very often do not need new regulations. They can be managed and dealt with on the basis of our existing legal system, both nationally and internationally.

From a legal point of view, there is no difference between stealing money offline and stealing money online. Stealing money is a crime, and a crime is a crime is a crime, offline as well as online. Doing harm to other people remains illegal whether it is done in the real or in the virtual world.

Yes, there are new problems in borderless cyberspace. If providers and users of Internet-based services operate under different jurisdictions, there is a pressure to “harmonize” national regulations or to decide which jurisdiction is relevant in a concrete controversial case. And yes, there are some new problems which have not yet been clearly defined in our traditional legal system, such as cloud computing or the linkage of objects to the Internet via interactive RFID chips. But neither cloud computing nor the Internet of Things leads to the disappearance of universal values or human rights. In this respect, it was very natural that the UN Human Rights Council stated in a resolution from June 2012 that “the same rights that people have offline must also be protected online”.

THE UN RESOLUTION ON PRIVACY IN THE DIGITAL AGE

This is also relevant for the right to privacy, as it was reaffirmed in the UN Resolution on the right to privacy in the digital age, initiated by Brazil and Germany and adopted at the 68th UN General Assembly in December 2013. The resolution notes inter alia that “the rapid pace of technological development enables individuals all over the world to use new information and communication technologies and at the same time enhances the capacity of Governments, companies and individuals to undertake surveillance, interception and data collection, which may violate or abuse human rights, in particular the right to privacy, as set out in article 12 of the Universal Declaration of Human Rights and article 17 of the International Covenant on Civil and Political Rights, and is therefore an issue of increasing concern”.

This brings us to the question of whether all technologies that are invented and available should be used in an unlimited way. There is a real question whether we need ethical, moral, and legal barriers for the use of certain types of technology. A person who owns a gun is not free to use it for whatever he or she likes: the owner has to respect concrete laws, and anyone who ignores them and uses the gun against human beings will be punished and jailed.


Credit: Ministerio TIC Colombia | https://flic.kr/p/bEjYer | CC BY 2.0 | https://creativecommons.org/licenses/by/2.0/deed.de


In other words, we need restrictions on the use of communication technology which allows interference into our private homes, intrusion into our private communications, and surveillance of our day-to-day behavior by private or public parties, corporations, governments, or our unfriendly neighbors.

There can be reasons for a justified interference. But this has to be the exception and cannot be the rule. And it needs to go through a legal procedure where a neutral third party, based on evidence of a clear and present danger, checks the necessity and proportionality of such interference. In other words, there will be no one-size-fits-all solution. It has to be decided on a case by case basis, taking into account the specific circumstances.

THE CHALLENGE TO FIND THE RIGHT BALANCE

The big challenge here is to find the right balance. But one thing is also clear: this can’t be left to the “free market”, where the individual Internet user has no adequate negotiating power against big corporations or big governments. For a fair balance, we need the protection of the law. As Jean-Baptiste Lacordaire, the French philosopher, stated nearly two hundred years ago: “Between the strong and the weak … it is freedom that oppresses and the law that liberates”.

The 2013 UN Resolution on Privacy in the Digital Age is moving in the right direction here. The resolution reaffirms “the human right to privacy, according to which no one shall be subjected to arbitrary or unlawful interference with his or her privacy, family, home or correspondence, and the right to the protection of the law against such interferences”. It recognizes that “the exercise of the right to privacy is important for the realization of the right to freedom of expression and to hold opinions without interference, and one of the foundations of a democratic society”, and it emphasizes that “unlawful or arbitrary surveillance and/or interception of communications, as well as unlawful or arbitrary collection of personal data, as highly intrusive acts, violate the rights to privacy and freedom of expression and may contradict the tenets of a democratic society”.

Furthermore, the resolution also notes that “while concerns about public security may justify the gathering and protection of certain sensitive information, States must ensure full compliance with their obligations under international human rights law”. And it expresses its deep concern about “the negative impact that surveillance and/or interception of communications, including extraterritorial surveillance and/or interception of communications, as well as the collection of personal data, in particular when carried out on a mass scale, may have on the exercise and enjoyment of human rights”. It concludes that “States must ensure that any measures taken to combat terrorism are in compliance with their obligations under international law, in particular international human rights, refugee and humanitarian law”.

This is clear and balanced language, adopted by UN member states and supported by a wide range of non-governmental stakeholders, in particular from civil society. To find the right balance not only among governments and stakeholders but also between justified security concerns and individual privacy rights is not easy, but we have to face this challenge in the digital age. The right answer can be found only in a bottom-up, open and transparent multistakeholder policy development process.

In this respect, it is good that the resolution invites the governments of the UN member states “to review their procedures, practices and legislation regarding the surveillance of communications, their interception and the collection of personal data, including mass surveillance, interception and collection, with a view to upholding the right to privacy by ensuring the full and effective implementation of all their obligations under international human rights law” and to “establish or maintain existing independent, effective domestic oversight mechanisms capable of ensuring transparency, as appropriate, and accountability for State surveillance of communications, their interception and the collection of personal data”. However, such a call should go beyond the relevant activities of the governments of the UN member states and should also include the private sector, civil society, and the technical community.

TOWARDS A MULTISTAKEHOLDER MODEL IN THE DEVELOPMENT OF PRIVACY POLICIES

A lot of personal data and surveillance capacity is now in the hands of the private sector. While private corporations are obliged to respect the legislation of the country in which they operate, they often try to escape national legislation by “jurisdiction shopping” – that is, picking the country with the lowest standard of privacy laws as the place for starting business in borderless cyberspace. An inclusion of the private sector in a multistakeholder



process to develop policies to respect individual privacy rights is as important as bringing civil society directly to the negotiation table. Networks like Privacy International, Human Rights Watch, Reporters Without Borders, Article 19, Transparency International, Consumers International and others have to have a voice and a vote when it comes to global mechanisms which will enhance the protection of privacy in the digital age. Even more important is the inclusion of the technical community. This community has developed standards which enabled surveillance and enhanced control capacities. The same community, in bodies such as the IETF or W3C, is now challenged to offer standards which will allow a higher level of protection of individual privacy. Privacy by design is a very concrete challenge for the Internet standard-setting organizations, in particular when it comes to the next wave of services and applications relating to the Internet of Things.

Furthermore, we need an enhanced understanding of the various elements of privacy protection, and more specific rules. When the Global Business Dialogue on eCommerce (GBDe) discussed privacy concerns in 1999, it differentiated between “sensitive” and “non-sensitive” data. For “sensitive” data (data related to health, finances, sexual orientation, religion, and political affiliation), it proposed that a corporation should ask the individual for permission before using the data (opt-in). For “non-sensitive” data (such as shopping behavior, travel, open chats, searches), it proposed that corporations could use the data as long as the individual did not express an explicit reservation (opt-out). This approach was not further investigated or translated into concrete legislation. But it shows that a multistakeholder approach widens the perspective and can bring more, and more reasonable, arguments to the negotiation table.

To take another example: the German constitution has recognized, since the 1980s, the right to informational self-determination, which gives the individual all rights regarding the use of his or her personal data. In the 1990s, the right to access the secret files of the East German secret service (Stasi) was seen as a constitutional right. Can such an approach be globalized? Is it the right of an individual to know what information secret services around the world have collected about her or him? May I ask the NSA whether they have looked at my private communication and, if yes, what they have in their database?

NETMUNDIAL

In this respect, the final document adopted at the recent Global Multistakeholder Meeting on the Future of Internet Governance (NETmundial) can be a good guideline

on how to enhance the multistakeholder model when it comes to policy development and decision making with regard to privacy issues in the Internet Governance Ecosystem. Principle 1.3 of the NETmundial Declaration says very clearly: “The right to privacy must be protected. This includes not being subject to arbitrary or unlawful surveillance, collection, treatment and use of personal data.” And the roadmap section of the São Paulo declaration states: “Mass and arbitrary surveillance undermines trust in the Internet and trust in the Internet governance ecosystem. Collection and processing of personal data by state and non-state actors should be conducted in accordance with international human rights law. More dialogue is needed on this topic at the international level using forums like the Human Rights Council and IGF aiming to develop a common understanding on all related aspects.”

This is a process, and it will not be settled overnight. The next concrete step will be the report by the United Nations High Commissioner for Human Rights on “the protection and promotion of the right to privacy in the context of domestic and extraterritorial surveillance and/or interception of digital communications and the collection of personal data, including on a mass scale, to the Human Rights Council at its twenty-seventh session and to the General Assembly at its sixty-ninth session (2014), with views and recommendations, to be considered by Member States”, as decided by the UN General Assembly in 2013.

There is still a long way to go. But the first steps have been taken. Do not expect big jumps. Let’s go forward by taking more small steps, but let’s move in the right direction.


Peter Schaar recognizes the urgency of looking for convincing solutions to the emerging challenges regarding data protection. Nowadays we face a high risk that the fundamental right to privacy and other core values of Western democracies will be lost.

PROPOSITION


JAN MALINOWSKI focuses on issues where privacy is minimised by obscuring the relevance of the individual or citizen, presenting persons as mere data subjects. He sees states as the duty bearers of human rights.

GEORG APENES identifies the opportunities that personal information can offer to projects and plans generally accepted as serving the common good. Still, there are as yet no tools that can protect individual privacy.

PETRA SITTE foresees fatal impacts of Big Data and Big Government unless all EU citizens are treated as domestic residents, a step that would lend the EU a much stronger position in negotiations with the US Administration on dismantling the surveillance system.

NICK ASHTON-HART explains why the debate about data protection is incomplete and identifies false assumptions about the role of government, the economy and competition law. He draws a line between governmental and commercial use of personal data.

SUSANNE DEHMEL raises the need for new methods of processing in order to cope with the existing and ever-growing amounts of data we produce. She is positive that the Internet and big data are compatible with data protection.

GEORGE SALAMA demands that a Big Data policy framework not hinder innovation and investment by operators and Internet service providers. At the same time, privacy settings should be simplified and redesigned.

STEPHANIE PERRIN discusses why the value of big data should not be accepted as a given and why its societal value still has to be proven. She proposes five initiatives on how to meet this challenge and respond to the “need for a broad coalition to defend” the values we cherish in the information society.

RAFIK DAMMAK sees big data as an evolution bringing new opportunities for businesses, but without a clear benefit for users. He explains how data protection and privacy can borrow the scalability principle in order to handle the next technological threat to privacy.

LORENA JAUME-PALASÍ refers to the human impetus to store: our acquisitiveness towards information has not changed, but our access to it has. Instead of concentrating on data minimization, we should concentrate on the values resulting from data that are in need of juridical protection.

JONNE SOININEN states that the Internet is more secure than ever, since the Internet’s technical community is developing technologies for increased security while the public has become more aware of privacy on the Internet.

MICHAEL KOMAROV declares that technological development has overtaken the policy-making process, and that Web 3.0 applications are likely to be far more effective at piecing together personal data than even traditional search engines.

RICHARD HILL demands that parliaments take action, both to stop mass surveillance by governments and to curtail the power of dominant service providers to obtain data from customers and use it as they see fit to generate large profits.

GOVERNMENT AND PARLIAMENT · PRIVATE SECTOR · CIVIL SOCIETY · TECHNICAL & ACADEMIC COMMUNITY

RESPONSES


PROPOSITION

Proposition
PETER SCHAAR, CHAIRMAN OF THE EUROPEAN ACADEMY FOR FREEDOM OF INFORMATION AND DATA PROTECTION


The Internet and Big Data – Incompatible with Data Protection?
PETER SCHAAR, CHAIRMAN OF THE EUROPEAN ACADEMY FOR FREEDOM OF INFORMATION AND DATA PROTECTION

Credit: Heinrich Böll Stiftung | https://creativecommons.org/licenses/by-sa/2.0/ | CC BY-SA 2.0


The Internet is frequently used as a synonym for digital globalization. Today, data travels around the world within a split second. And big data represents a concept based on the idea of collecting as much data as possible – the more data that is collected, the better the concept works.

Ideas and rules which have grown over decades and centuries must be scrutinized to determine whether they still fit into the brave new cyber-reality. This also applies to the protection of privacy. The current concepts of data protection date back to the 1960s. Since then, the world has changed dramatically, particularly in the field of information processing. Fifty years ago, most data was still processed manually. The few computers there were had very limited processing capabilities. Automated processing was carried out in data processing centers separated from offices and other workplaces. And cross-border data transfer was the absolute exception.

There is no question that – given rapid technological development – we urgently need to look for convincing solutions for the emerging challenges regarding data protection. Otherwise, the world will witness a further erosion of privacy. The advocates of privacy and fundamental rights must deal with the fact that some traditional rules of data protection are no longer effective in a world of ubiquitous and globalized information processing. In particular, the proponents of privacy have a vital interest in re-examining the current data protection regimes.

Most of the current data protection rules and regulations focus on the individual procedure used for data processing. The starting point of legal assessment has been the specific purpose. Personal data may be collected “for specified, explicit and legitimate purposes and not further processed in a way incompatible with those purposes”, says Art. 6 of the European Data Protection Directive of 1995. The processing of data is permitted as long as it is adequate, relevant and not excessive in relation to the purposes for which the data is collected. The traditional key question is: which data is needed to fulfill a legitimate purpose? The main criteria are relevance and adequacy. In other words, data protection regulations consider data processing from a micro perspective: single pieces of data, an individual algorithm, a specific purpose. Today, companies and public bodies see data processing more and more from a macro perspective: how can data coming from various sources be used to better understand what’s going on and to optimize procedures? In particular, mass data is seen as an asset; the “new oil” of the information society. Big data approaches look primarily for correlations, not for reasons. No big data apologist would even understand the question of which purpose a specific piece of data is to be collected for.

The privacy community faces the challenge of integrating the macro perspective into a modern legal, political and economic framework. More systemic, technological and procedural instruments should be added to the data protection toolbox without forgetting that the fundamental rights of the individual remain indispensable. The protection of privacy remains an expression of human dignity – it derives from a basic European constitutional understanding. That is not all: the individual has a right to informational self-determination so that he or she can freely develop his or her own personality. It is the individual who should basically determine which of his or her personal data is disclosed to whom and for which purposes it should be processed.

Personal data must not be seen as the property of the controller (or processor) who has the practical means to access the information. Up to now, data processing has been based on the consent of the individual “data subject”, who always has the right to withdraw his or her consent. After withdrawal, the controller must stop the processing. Even if data is collected for a specific purpose, in particular within the framework of a contractual relationship, this is no carte blanche for the data controller.

Purpose limitation of personal data remains a legal requirement even in a new technological environment. On the other hand, changes of purpose might be associated with smaller risks if the data is anonymized or pseudonymized. Regulators should consider setting up incentives for data controllers to remove identifiers from data sets. One example of this approach is the distinction between personalized and pseudonymized profiles in the German Telemedia Act. If a provider intends to carry out profiling with personal data including personal identifiers, he needs the explicit consent (opt-in) of all individuals concerned. If the profiling is to be carried out without identifiers, the provider needs to inform the data subjects and offer them the opportunity to object (opt-out).

The basic data protection principles – privacy and informational self-determination – need to be preserved and protected in a changing world. Legislators, notably parliaments, are called upon to draw the exact boundaries between acceptable and unacceptable use of personal data. At least in the first instance, this is not primarily the task of the courts. Parliaments are the first port of call for interpreting the requirements set up by constitutions. They also need to define concepts and concrete rules regarding the implementation of the right to data protection. This applies to the public sector as well as to non-public actors, especially where huge, almost market-dominating companies are concerned.

Mass data is seen as an asset; the "new oil" of the information society.

What does that mean in detail? The "core area of private life" enjoys absolute protection under constitutional law, as the German Federal Constitutional Court pointed out in several rulings. Nobody has the right to cross this red line. It is not only eavesdropping on the bedroom by police or secret services that must be prohibited, but also the monitoring of highly personal, confidential electronic communications, as far as the core area of private life is affected. In this highly sensitive sphere, there is no justification for surveillance measures carried out by public authorities. Private companies are certainly not allowed to cross this red line either. Highly sensitive personal health data must not be used for advertising or for other commercial purposes without the explicit and voluntary consent of the individual concerned. Criminal penalties remain essential for the defense of these red lines.

Fundamental rights must also be observed in the processing of data that, at first glance, appears less sensitive. For instance, in cases where data on purchasing behavior is used to determine the individual's health status, such as possible pregnancy, or political and religious views. These are not just the nightmares of a data protector; it is reality in the big data context, as shown by examples of American supermarket chains that operate in exactly this way. In addition, it is not an entirely new phenomenon. Even credit scoring systems used to assess the creditworthiness of a customer are mainly fed with "harmless" data – but the results might have serious consequences for the loan applicants. Big data approaches will raise more and more questions like this. These questions cannot be solved by mere prohibition of the generation of additional knowledge. The risks for privacy and individual self-determination coming from statistical analyses of mass data should be limited by setting up procedural requirements. Privacy Impact Assessments (PIA) – the preventive, systematic assessment of the impact of specific technologies and the definition of protective measures – might be an important tool to prevent, or at least diminish, negative consequences. The proposed basic regulation for data protection tabled by the European Commission more than two years ago contains interesting approaches for PIA that are worth developing further.

More transparency can help individuals to exercise their right to informational self-determination and to protect their privacy. Data controllers are obliged to provide potential users and customers with the relevant information. Extensive privacy policies which are as long as a novel, and which nobody reads, provide pretended transparency only. Instead, the core information must be easily accessible and understandable. Who is responsible? What data will be processed? For what purpose and where? The data protection authorities need to continue to verify that the information provided by the data controllers meets these requirements. In addition, violations and false promises could be punished by individual claims for damages and secured by a right to mount class actions, such as those well known in the area of consumer protection.

In a networked world, users need genuine alternatives much more than customers in other markets. If you want to buy a car, you have the choice between a lot of different types and brands and make your purchasing decision independent of other people's choices. In the market of interactive social networks, the situation is completely different: if all your friends are members of a specific service it is hard for you to leave, even if the provider changes his privacy settings in a way you do not agree with. In particular, the functioning of those markets that are fed by personal data is of great importance from a data protection perspective. The successful models of "free" services, mainly financed by targeted, personalized advertising, need to be brought under effective competition control. The user needs a free choice about which data he or she gives away to which provider under which conditions. The challenge of preventing monopolistic structures restricting the user's autonomy in Web 2.0 services has not yet been tackled by law-makers. Even in this field, the EU data protection package might provide an element of the solution by constituting a "right to data portability". The user of interactive services, such as social networks, needs the opportunity to retrieve and extract his or her data from one provider in order to process it on his or her own computer or to transfer it to another provider. In order to prevent discrimination, a "right to connectivity" should be considered as well. Big Internet services – at least if they are quasi-monopolists in a specific class of services – should be obliged to accept those users who comply with the respective rules. Moreover, Web 2.0 services should provide open interfaces to enable communications between members and non-members.



Furthermore, there is the fundamental question of the integration of data protection in information technology – in software as well as in hardware. Hardware identifiers allowing third parties to track users without their consent must be avoided. Nobody can dispute the fact that it is incompatible with data protection and with fair information principles that some apps suck almost all the data stored on smartphones, even if this data is not required for the apps' functionalities. This is only one example of the shortcomings in the field of technological data protection. There are many more: if you become a member of a social network, many privacy settings are switched off by default. If you start a web search, most search engine providers get much more personal information than they need for carrying out the search or optimizing the algorithms. This needs to be changed if we want to prevent privacy fading away. IT systems need to be designed in a privacy-friendly way, giving the individual maximum control of his or her own data. But it is not sufficient to provide the user with privacy controls. Another important question is how products and services are delivered. Today, nobody would accept cars with weak safety settings or with airbag sensors not activated. In the IT market, however, many products and services are delivered without, or with a low level of, privacy precautions. The keywords for new thinking in this area are "privacy by design" and "privacy by default". Technological data protection may help to bridge the gap between conflicting interests. Many tasks may be realized with anonymous, or at least pseudonymous, data.

Last but not least, it is unacceptable that governments and intelligence agencies are abusing the increasing international data transfer for bulk access to the transmitted data. As we have learned from the Snowden leaks, the NSA, GCHQ and other secret services collect as much data as they can. Legal safeguards focused on the protection of a country's own people and limited to national territory do not protect personal data in the increasingly globalized world. As a consequence of the secret services' "hunger for data", the trust in Internet services has collapsed. In particular, some providers of services that are global by nature fear substantial losses. Private and commercial customers fear that the data they give to these services would be secretly accessible to intelligence services without sufficient safeguards. Technical and legal changes may help to reconstruct trust. Data encryption and secure routing of data packets on the Internet should be promoted. On the other hand, there is a pressing need to change the legal frameworks for transborder data processing. The "national interests" exemption in almost all legal instruments for data protection concerning international transfers of personal data cannot be accepted any more. Additionally, there is a need to establish binding international data protection standards guaranteeing the protection of private life, as laid out in Art. 12 of the Universal Declaration of Human Rights.

The legal framework for data protection should be both strong and flexible. Without robust enforcement, the legal requirements would remain theory. Without flexibility, every new technological development would undermine the rules and create the need for legal changes. The current German system of data protection is weak regarding enforcement, at least on the federal level. The Federal Commissioner for Data Protection has no power to fine, and his decisions are not binding for providers of telecommunications and postal services falling under his supervision. On the other hand, there are a lot of federal laws with specific data protection provisions. This system is neither consistent nor flexible. The reform of data protection legislation on a European level should be seen as a chance to overcome these shortcomings.

If the approaches to improve data privacy and data security on a European and an intercontinental level fail, there is a high risk that the fundamental right to privacy and other core values of Western democracies will be lost. As a reaction to the excessive surveillance programs of the NSA and GCHQ, several governments have started activities for strengthening national control over network infrastructures, as well as over Internet services. A negative effect of such efforts could be a balkanized Internet, split along national borders with censorship and thought control – and intensified surveillance by national authorities. The pursuits of liberty and prosperity, free discussion and inclusion, closely linked to the information society, are at stake. There is a need for a broad coalition to defend these values.


JAN MALINOWSKI, COUNCIL OF EUROPE, HEAD OF INFORMATION SOCIETY

GEORG APENES, FORMER NORWEGIAN DATA INSPECTORATE

PETRA SITTE, MEMBER OF THE GERMAN BUNDESTAG (THE LEFT)

Responses Government and Parliament

RESPONSES GOVERNMENT & PARLIAMENT


Big data: a challenge to privacy, a threat to society, an opportunity. Should we trust businesses with our privacy online or look to the state for protection?

JAN MALINOWSKI¹, COUNCIL OF EUROPE, HEAD OF INFORMATION SOCIETY

¹ Disclaimer: the views expressed here are only those of the author and should in no way be regarded as representing those of the Council of Europe or any of its organs, bodies or services.


Discussions on big data mostly focus on the challenges that it poses to privacy, the opportunities it offers business and the security establishment, and the benefits it offers users: new products and services, improved user experience, the promise of better healthcare, new forms of participation in decision making and democracy. These are serious, interrelated issues where privacy is sometimes minimised by obscuring the relevance of the individual or citizen, presenting persons as mere data subjects.

Youth representatives at the Stockholm European Dialogue on Internet Governance (EuroDIG) in June 2012 described their approach to privacy online as "click yes and hope for the best". Much is justified by saying that users have to pay for pretended free services with their personal data. This is mostly about commercial use of data, targeted advertising, data sharing among commercial partners or other transactions involving personal data, or even metadata aggregation and research.

While most people saw at most the tip of this iceberg, Edward Snowden jolted everyone by revealing the obscure space occupied by largely unaccountable spooks and their high-profit contractors, with an insatiable appetite for personal data and metadata.

As the discussion matures with the help of Snowden and others, the message of certain commentators evolves. For example, from advocating the exploitation of data gold mines in the hands of governments to demanding more respect for privacy; or from requiring overbroad data retention to becoming the liberators of users from pervasive data kettling; or in one stroke reversing the safe harbour discourse to unsafe.

Neither risks nor opportunities stop there. Compare it to climate change. It can be described in dispassionate terms as a (moderate) temperature rise over time, which can reasonably be expected to have some effect on the environment or even on the availability of energy resources. For some, the environmental concern lies even in the sustainability of the economy and development. But more compelling still, climate change can (apparently) also be described as killing 300,000 people every year; or the disappearance of Arctic ice by 2030; or 2.5 billion people dying because of it by 2054.

Can we describe big data by reference to its impact in equally alarming terms? Can one say, for example: the exploitation of big data serves today to shape your consumption; it can reveal your whereabouts at all times, your intimate conduct, preferences, feelings or even your thoughts; tomorrow it will determine your health decisions; and in the longer term it will serve to shape your political choices and, by aggregation, those of your community? Many fear that we already live in a version of George Orwell's 1984 world.

Big data is not just there; it does not grow on trees, flow in streams, nor is it found in natural reservoirs, and metadata is not the air around all that. Bruce Schneier described data as the pollution problem of the information age: all computer processes produce it and it stays around. Arguably, personal data is a natural occurrence, a by-product of any human activity, for example, when someone sees and remembers another person in a particular location. But there has been a quantum leap in the latency of data, including metadata, in the ability to record, store and aggregate it, and in the capacity to process it. And different data sets can now be combined and exploited together.

It is worth keeping Melvin Kranzberg's first law of technology in mind: "Technology is neither good nor bad; nor is it neutral." Increased technical capacity comes with greater responsibility. This principle of increased responsibility was clearly established by the European Court of Human Rights in S. and Marper v. the United Kingdom (concerning the overbroad storage of DNA samples by law enforcement agencies; technically possible but a disproportionate intrusion into people's privacy). Few saw in Marper (December 2008) an announcement of the invalidation by the Court of Justice of the European Union of the data retention directive (April 2014).

Will we be able to tame big data, data ubiquity, the syndication of data sets, dragnets, big-data-enabled mass surveillance, uncontrolled or unaccountable access and predictive analyses, and prevent the misuse of big data to shape public opinion and steer political processes? Increasing control in society with the objective of minimising risk, of criminal activity for instance, has very tangible opportunity costs, especially in terms of fundamental freedoms or civil liberties. A lowest common denominator approach would play into the hands of new age despots, whether government or corporate, leading sheepish masses into an electronic Gulag. Equally unacceptable, relativism could bring to an end the universality, integrity and openness of the Internet.

Recognition is owed to a number of individuals for their legacy to the world, such as Tim Berners-Lee, Jon Postel, Vint Cerf, Stuart Parkin (a precursor of big data), Sergey Brin and Larry Page. They preceded corporate activity that later sought to obtain maximum profit from common spaces and tools. Nonetheless, businesses must be given considerable credit for many positive developments. But can the question of big data, a matter so important for humanity as a whole, be entrusted to corporate self-regulation or even to a variety of well-intentioned stakeholders with bitty roles and uncertain accountability, or be left in the hands of market forces? The answer has to be no, at least not without qualification.

States are the duty bearers of human rights; they are under negative obligations (e.g. not to interfere with the right to privacy) and also positive obligations (to protect and promote respect for privacy, and to provide an enabling environment for its exercise). These obligations are all the greater given that the rights to privacy and to the protection of personal data are instrumental to the exercise of other fundamental rights, such as freedom of expression and the right to assembly and association, and all are intimately linked to participation in a democratic society.

With all their faults and lacunae, in their more advanced forms state accountability and good governance mechanisms are unparalleled in the corporate world. More often than not, corporate social responsibility follows rather than anticipates policy and legal constraints, sometimes after the failure of intense lobby activity. Companies also embrace social responsibility in response to scrutiny from fourth estate public watchdogs, namely media and nowadays also civil society organisations.

At least in Europe, states can effectively be held to account for their performance against certain of their international law obligations before supranational courts. Their obligations are sometimes scoped out in the case law of those courts and also by their other specific commitments under international law, such as the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (often referred to as Convention 108). Another example is the Convention on Cybercrime (also known as the Budapest Convention) with its procedural safeguards and human rights requirements; it might usefully be recalled that the Budapest Convention seeks the criminalisation of illegal access to and interception of data, and system interference. Both instruments, unique in their kind, are open to worldwide accession in the interest of harmonised standards and effectiveness in the protection of rights.

[Image: The Opte Project, Wikimedia Commons | CC BY 2.5 | creativecommons.org/licenses/by/2.5/deed.en]

Where hard international law obligations and accountability do not reach, states' obligations and commitments are sometimes explained in softer legal instruments. Examples can be found in Council of Europe adopted standards on the protection of individuals with regard to the automatic processing of personal data in the context of profiling, and on the protection of human rights with regard to search engines or social networks. There are also common standards, anchored solidly in Convention 108, on personal data processing and the police or employment, and attention has also been paid to the protection of sensitive data, such as financial, medical or other very personal or intimate information.

Melvin Kranzberg's fourth law of technology comes in handy: "Although technology might be a prime element in many public issues, nontechnical factors take precedence in technology–policy decisions." That is to say, it doesn't have to be accepted just because it is technically possible; public policy considerations and public interest imperatives may well limit, condition or shape the deployment or use of technology. But public interest must be understood in its true dimension, not in an opportunistic manner. That is why human rights must be an overriding consideration, underpinned by the rule of law (or due process requirements), all against a backdrop of democracy and of good governance with its fundamental multistakeholder dimension.

The declared purpose of the now invalidated data retention directive was to enable the prevention, investigation, detection and prosecution of serious crime, in particular organised crime and terrorism. It commanded the building of data stockpiles in Europe. While its invalidation may not signal the beginning of the end of big data, a dispassionate reading of the judgment of the Court of Justice of the European Union (Joined Cases C-293/12 and C-594/12) foretells a significant impact on data processing, which in data protection terms encompasses the collection, preservation and exploitation of data – whatever the purpose. Paragraphs 26 to 28 of the judgment read:

„… the data which providers of publicly available electronic communications services or of public communications networks must retain, pursuant to Articles 3 and 5 of Directive 2006/24, include data necessary to trace and identify the source of a communication and its destination, to identify the date, time, duration and type of a communication, to identify users' communication equipment, and to identify the location of mobile communication equipment, data which consist, inter alia, of the name and address of the subscriber or registered user, the calling telephone number, the number called and an IP address for Internet services. Those data make it possible, in particular, to know the identity of the person with whom a subscriber or registered user has communicated and by what means, and to identify the time of the communication as well as the place from which that communication took place. They also make it possible to know the frequency of the communications of the subscriber or registered user with certain persons during a given period.“

„Those data, taken as a whole, may allow very precise conclusions to be drawn concerning the private lives of the persons whose data has been retained, such as the habits of everyday life, permanent or temporary places of residence, daily or other movements, the activities carried out, the social relationships of those persons and the social environments frequented by them.“

„… even though, as is apparent from Article 1(2) and Article 5(2) of Directive 2006/24, the directive does not permit the retention of the content of the communication or of information consulted using an electronic communications network, it is not inconceivable that the retention of the data in question might have an effect on the use, by subscribers or registered users, of the means of communication covered by that directive and, consequently, on their exercise of the freedom of expression guaranteed by Article 11 of the Charter.“

And paragraphs 34 and 59 add:

„… the obligation imposed by Articles 3 and 6 of Directive 2006/24 on providers of publicly available electronic communications services or of public communications networks to retain, for a certain period, data relating to a person's private life and to his communications, such as those referred to in Article 5 of the directive, constitutes in itself an interference with the rights guaranteed by Article 7 of the Charter.“

„… whilst seeking to contribute to the fight against serious crime, Directive 2006/24 does not require any relationship between the data whose retention is provided for and a threat to public security and, in particular, it is not restricted to a retention in relation (i) to data pertaining to a particular time period and/or a particular geographical zone and/or to a circle of particular persons likely to be involved, in one way or another, in a serious crime, or


(ii) to persons who could, for other reasons, contribute, by the retention of their data, to the prevention, detection or prosecution of serious offences.“

The judgment explains that, in addition, the directive failed to put in place adequate safeguards as regards "access of the competent national authorities to the data and to their subsequent use" or against "the risk of unlawful access to that data" and for "the protection and security of the data in question in a clear and strict manner in order to ensure their full integrity and confidentiality."

As regards national security, the Council of Europe ministers responsible for media and information society stated in November 2013 that:

“Any data collection or surveillance for the purpose of protection of national security must be done in compliance with existing human rights and rule of law requirements, including Article 8 of the European Convention on Human Rights. Given the growing technological capabilities for electronic mass surveillance and the resulting concerns, we emphasise that there must be adequate and effective guarantees against abuse which may undermine or even destroy democracy.”

If big and indiscriminate data retention is not permitted for law enforcement or crime prevention and “a system of secret surveillance for the protection of national security may undermine or even destroy democracy under the cloak of defending it” (Weber and Saravia v. Germany, 2006), where does it leave big data for commercial or other exploitation? Other cases pending before the European Court of Human Rights – Big Brother Watch et al. v. the United Kingdom (GCHQ) and Centrum för Rättvisa v. Sweden (FRA) – may offer additional guidance.

It may be unsafe to rely solely on the argument of optimisation of user experience, or even the consent of users, given the dominant position of the entities seeking consent and the relative lack of choice of those giving it. Equally risky would be to argue the anonymisation of personal data incorporated into big data sets, which experts say is impossible given the ability to correlate data to form a precise picture of the individual behind the different elements.

Where does that leave the possibility of taking advantage of big data for positive use in the common interest? The UN itself is using or exploring the use of big data for development and for humanitarian action, to combat hunger, poverty or disease, to provide relief where it is most needed and to allocate scarce resources in the most efficient way. Big, reliable data, produced in an unbiased and often unnoticed manner, as opposed to outdated statistics, can help in many ways. The OECD also sees in big data a source of growth and innovation, or knowledge-based capital, with considerable potential for scientific and medical research or even for public governance.

Are there pitfalls, or do we trust that by giving a positive label to the exploitation of big data, for example for public governance or for providing relief to people in a post-conflict situation, there is a guarantee that it will not be open to misuse, that data will be secure, that anonymity will be preserved? In the absence of certainty, a precautionary approach is justified, taking every measure possible to avoid unintended consequences: the burden falls to those taking the action.

Are all answers to these questions case- or circumstance-neutral or, for example, are the identities of the data controller and of those employed by the data controller relevant? The controller's arrangements for storing, securing and processing personal data are most relevant, as underlined in the abovementioned judgment of the Court of Justice of the European Union. It is easy to imagine and depict situations that go wrong, and be accused of scaremongering. After all, the data controller vows to do no evil, so what do we have to fear? And the national security services tell us that they are out to catch the terrorists, not to engage in industrial espionage or to capture private images of people who are not suspected of any wrongdoing.

The basic rules remain valid and many countries have undertaken to respect them and to enforce them within their jurisdiction: necessity and proportionality, data minimisation, retention only when needed, deletion of data or removal from data sets upon request or as soon as no longer needed. Convention 108, with its modernisation process under way, is a common standard that can serve the Internet community as a whole. Effectiveness will require more accessions and strong follow-up arrangements, possibly extending to monitoring compliance not only by state but also non-state actors, and including the transborder dimension of data processing: data flows, storage and security, access, exploitation. There will be a need for further careful examination of human rights, rule of law and democracy considerations to strike the right balance in respect of big data for years to come.


Switching Off the Age of Enlightenment?

GEORG APENES, FORMER NORWEGIAN DATA INSPECTORATE

[Image: Thomas Hawk | https://flic.kr/p/7FVN3j | CC BY-NC 2.0 | https://creativecommons.org/licenses/by-nc/2.0/deed.de]

Norway is celebrating 200 years of having a written constitution. During the spring of 1814, a handful of presumably representative members of the general public gathered. After the US and France, Norway was historically one of the first nations in which individual citizens enjoyed full freedom and the sovereign power was regulated by law.

But to be honest, most people were not particularly concerned with human rights 200 years ago. They went about their everyday tasks in the fjords, in the barns and in the fields. Even though small libraries popped up regionally in Norway in the late 17th century, we have no indications that Norwegian peasants asked for literature on human rights in general or on the safeguarding of the individual right of self-determination by means of a constitution. Statistics are very rare in this field, but as far as I know ‘An Essay Concerning Human Understanding’ by John Locke, published in 1690, or the ‘Critique of Practical Reason’ by Immanuel Kant, published about one hundred years later, were not general reading. Still, they laid the foundation for the age of European thought known as the Age of Reason or the Enlightenment.

Giving philosophical impetus to the development of political institutions in Europe?

Yes.

Generating enthusiasm amongst the general public – the individuals who, little by little, should be given political power and become ‘voters’?

No.

In 2013, Edward Snowden ‘happened’. Half the IT community was shocked. The other half commented that this was nothing more than to be expected. As they said: sooner or later our globe will be completely transparent. Once again, we will try to think up systems or methods that may reserve a niche of privacy for a nation, a region or an alliance of nations sharing common values in their legislation or having signed the same treaty documents, but we very much doubt it is possible – after the Internet, multiple surveillance devices are omnipresent.

It was my own impression that these recommendations were delivered rather half-heartedly. The critics plainly pointed to the fact that the world before the Snowden incident certainly didn't lack internationally binding legal documents. In short, US president Barack Obama knew what he was doing, being familiar with European legislation in this area. However, he decided not to respect it. That was what made Angela Merkel so mad! And at the present stage of technological development, it is next to impossible to detect who is listening in, and where, and when. One day this may, however, be possible, and we may expect enormous resources to be invested in efforts to make eavesdropping virtually impossible… And this in itself will provoke still more efforts – to re-conquer lost territories of accessibility!

Meanwhile, we may consider what will happen when we as a nation discover that information collected from one individual may be of legitimate interest to other individuals, groups, researchers or governmental agencies.

To illustrate my point, I refer to what has happened in medicine in the last 20–30 years, where it is common to argue that giving up privacy or the protection of even very sensitive personal data may help others. It may also be argued that, for instance, insurance companies and social planners in health, schools and housing will find personal information interesting in their work.

Thus it is my humble opinion that we do not have the tools – at least not today – to protect individual privacy while at the same time giving personal information to projects and plans that are generally accepted as being constructed for the common good. Defence being one example, medicine being another.

In the US, there is an old saying that goes: “You can’t have your cake and eat it”. This is why nobody, as far as I can see, has yet come up with a scheme that allows both.

The situation in 2014 is, I think, that we may have to choose: if it is possible to develop systems and technologies that may promote 'classical' privacy – which I myself very much doubt – let us still give it a try. If it is not possible, however, then let us note that George Orwell might well have been right in 1948 when he wrote in his novel 1984: "Who controls the past controls the future".





In his book Digital Disconnect, Robert W. McChesney sees an increasingly symbiotic relationship between Big Data and Big Government, describing it as "a marriage made in heaven, with dire implications for liberty and democracy" (McChesney, p. 21). He explains how a military–digital complex has been developed, made up of the government, the military and the secret services on the one hand and the Internet giants from the digital economy on the other. According to McChesney, they have a complementary and mutually beneficial relationship: the government benefits from being given access to technologies and data by the firms; the firms benefit not only from receiving large services contracts from the government, but also because they can rest assured that the government will not restrict their activities through anti-trust, taxation or regulatory measures and will represent their interests across the world.

The grim picture painted in 2013 by McChesney, who is Professor of Communication at the University of Illinois, was published prior to the far-reaching revelations made by Edward Snowden. Since then, acronyms and codes such as PRISM, Tempora and XKeyScore have become synonymous with the covert mass surveillance of the population on a previously unknown scale.

Though only a fraction of the Snowden documents has so far been published and we currently have to assume that Internet firms are generally forced by national laws to collaborate with the security apparatus in the states concerned, the existence of a convergence of interests cannot be dismissed completely. This convergence of interests stems from the automated processing of large amounts of data for the purposes of analysis, prediction and accessing of information on individual behaviour.

Peter Schaar aptly defines the underlying concept as follows in his proposition: "big data represents a concept based on the idea of collecting as much data as possible – the more data that is collected, the better the concept works".

In some areas of science this may even have positive effects. In medicine, for example, Big Data analytics is already bringing practical benefits in fighting infectious epidemics. Big Data makes improved prevention, earlier protective measures and more targeted treatment possible. Yet the knowledge gained through Big Data appears relatively superficial in comparison with scientists' traditional quest for knowledge. In a talk given in Berlin recently, Viktor Mayer-Schönberger from the Oxford Internet Institute pointed out that Big Data allows correlations to be identified, but is rarely able to provide information on causality (see Sievers in Neues Deutschland, 22 April 2014). Yet finding answers to the question "why" remains the duty of critical science, and it is also the very essence of enlightenment, emancipation, innovation and progressive policymaking. Thus, the limits of Big Data also highlight a trend towards a totality of existing facts, and make clear that the use of Big Data must be well thought through and must be subject to strong human oversight, i.e. oversight by society and policymakers. It is particularly important in this context to ensure a sensitive technology impact assessment, leading to legal provisions focusing on individual civil liberties and human rights.

Big Data and Big Government necessitate a paradigm shift
PETRA SITTE, MEMBER OF THE GERMAN BUNDESTAG (THE LEFT)

Credit: TechCrunch | https://flic.kr/p/8EMbwv | CC BY 2.0 | https://creativecommons.org/licenses/by/2.0/deed.de

Against this background, the words of Eric Schmidt, now Google's Executive Chairman, give pause for thought. Speaking as Google CEO at a technology conference in August 2010, he suggested that the challenges of modern technology could only be tackled through "much greater transparency and no anonymity". And he added that, in a world of asymmetric threats, "true anonymity is too dangerous" (see Fried in CNET, 4 August 2010).

Schmidt's words described a vision of society in which the elimination of all kinds of anonymity on the Internet is viewed as an article of faith. What he omitted to say was that in a world of mobile communication, with robots, drones, intelligent household appliances and many other technologies, information about people's current location is also generated. He also did not mention that the technologies needed for biometric body scans and analysis of humans have already existed for some time – whether they are used for voice or face recognition, fingerprints or handprints. It is already possible for the algorithm-based analysis of large amounts of data gained through smart devices to be combined with the logging of physical characteristics such as somebody's way of walking and moving, or their speech, heartbeat or breathing patterns.

The data processing industry dreams of complete transparency in order to allow the profitable marketing of individual patterns of behaviour through advertising, insurance services, transport guidance systems and other process management systems. At the same time, this transparency presents opportunities on an even greater scale for the security state, whose existence has become particularly evident since the Snowden leaks. The security state would gain opportunities to predict (and thus observe and sanction) human behaviour with, at the very least, Orwellian dimensions.

Thus, not only are a large number of good detailed regulations needed, as proposed by Peter Schaar, but also no less than a paradigm shift in data protection policy. We need a European area of data protection which is worthy of its name and which sets limits on the dreams of generating profits through Big Data at the expense of personal autonomy and identity, as well as on Big Government's false promises of security.





Yet a European data protection area must not lead to re-territorialisation or balkanisation of the Internet, as Peter Schaar writes. A European Schengen routing system, for example, would mean the end of the fundamental principle of global connectivity. It would also make it difficult to reject calls – like those already made in the framework of the European Commission's Schengen deliberations during the Hungarian Council presidency – for checks on content entering the area at external borders. Instead, we must ensure an equally high level of protection both within and outside the European Union. And in the context of amendments to European data protection law, we must ensure legal security concerning data processing by international companies, and that the right to informational self-determination is taken into account.

One important step in this direction is to make the principles of "privacy by design" and "privacy by default" binding in Europe. Schaar only mentions this briefly in his article, yet it is worthwhile explaining it by means of concrete examples. The privacy by design principle would make certain functionalities obligatory, such as the default encryption of data, data deletion after the performance of a function and technical measures ensuring purpose limitation. The privacy by default principle would mean that the strictest possible data protection settings would apply as soon as people began to use electronic services and applications. Thus web services, smartphones, tablets and apps would not be able to pass on data on usage, contacts and location and accumulate it on server farms without users giving their consent.
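What "privacy by default" could look like in software can be sketched as follows; the class and setting names are hypothetical, not drawn from any real service. Every sharing flag starts at its most protective value, and loosening one requires an explicit user action.

```python
from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    # Privacy by default: every flag starts at its most protective value.
    share_location: bool = False
    share_contacts: bool = False
    share_usage_data: bool = False
    profile_public: bool = False

@dataclass
class UserAccount:
    name: str
    settings: PrivacySettings = field(default_factory=PrivacySettings)

    def enable_sharing(self, flag: str) -> None:
        """Sharing is opt-in only: it must be an explicit user action."""
        if not hasattr(self.settings, flag):
            raise ValueError(f"unknown setting: {flag}")
        setattr(self.settings, flag, True)

account = UserAccount("alice")
assert not account.settings.share_location   # strictest defaults apply on first use
account.enable_sharing("share_location")     # data flows only after explicit consent
assert account.settings.share_location
```

The design choice is simply that no constructor path exists in which sharing is on before the user has acted.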

In addition, the establishment of a European area of data protection will undoubtedly require the exertion of political and economic pressure. The means to achieve this are there. One important element is the renegotiation and, if necessary, termination of the Swift and Passenger Name Record agreements, along with the Safe Harbour Agreement, which has proved ineffective in practice. In addition, an initiative is needed to create a European open-source infrastructure developed by a large number of small and medium-sized companies with public financial support and with standards developed openly, publicly and transparently. This is the only way to create a trustworthy European counterweight to the dominance of large American Internet firms.

Finally, I should point out that the mass surveillance of Internet communications is not only being carried out by the US National Security Agency (NSA) and the British Government Communications Headquarters (GCHQ). The French Direction Générale de la Sécurité Extérieure (DGSE), the German Federal Intelligence Service (BND) and other Western external intelligence services are also involved in this total surveillance. They are all responsible for surveillance of communications outside their own countries or between their own countries and others. Surveillance of communication within a service's own country is largely taboo, and the legal restrictions in place are similar in all cases.

These services cooperate via a system of information swaps. They receive information on communications within their own country in exchange for information they have gathered on communications in another country. As surveillance of communications abroad is not subject to any restrictions and thus not to any oversight, the system always conforms with the national legal framework – at least from the perspective of the secret services and the governments which back them. Were it possible to end this system of mutual favours, at least in Europe, by ensuring that all EU citizens were treated as domestic residents, one of the first important elements would be removed from the worldwide system of mass surveillance. This would lend the EU a much stronger position in negotiations with the US Administration on dismantling the surveillance system. Unless this happens, the impacts of Big Data and Big Government will indeed be fatal.

LIST OF REFERENCES:

Fried, Ina. "Google's Schmidt: Society not ready for technology." CNET. 4 August 2010. http://www.cnet.com/news/googles-schmidt-society-not-ready-for-technology/

McChesney, Robert W. Digital Disconnect. New York: The New Press, 2013.

Sievers, Uwe. "Verfolgt vom eigenen Datenschatten." Neues Deutschland. 22 April 2014. http://www.neues-deutschland.de/artikel/930683.verfolgt-vom-eigenen-datenschatten.html


RESPONSES PRIVATE SECTOR

NICK ASHTON-HART, COMPUTER & COMMUNICATIONS INDUSTRY ASSOCIATION, GENEVA

SUSANNE DEHMEL, BITKOM GERMANY

GEORGE SALAMA, SAMENA TELECOMMUNICATIONS COUNCIL, DUBAI

Responses Private Sector


Credit: Le.Sanchez | https://flic.kr/p/asAHVk | CC BY-SA 2.0 | https://creativecommons.org/licenses/by-sa/2.0/



The Internet is not incompatible with data protection, but the debate we currently have about privacy largely is
NICK ASHTON-HART, COMPUTER & COMMUNICATIONS INDUSTRY ASSOCIATION, GENEVA

SUMMARY

Peter Schaar's article "The Internet and Big Data – Incompatible with Data Protection?" is an excellent tour d'horizon of the debate we currently have, especially in Europe, about data protection. It also shows how incomplete that debate is, and the false assumptions at its core, in three substantial ways:

1. The use by governments of personally identifiable information (PII), especially for national security uses but also much more broadly, is fundamentally different from the private sector's use of that information. Right now, there is very little discussion of how fundamental privacy protections, even in the EU, do not apply to large swathes of governments' activities – governments who are often the first to complain about economic actors' uses of the very same information. More profoundly, the use (and really abuse) of the Internet as a tool for data acquisition and analysis for national security is neither a data protection problem nor an Internet problem: countries see foreigners as 'fair game' for data collection, with no legal inhibitions at all. Mutual legal assistance treaty (MLAT)1 reform (amongst other activities) is the venue for solving the real problem here, not data protection laws.

1 For an understanding of MLATs and why reform is important, see Access' MLAT website at www.mlat.info.

2. All economic use of data is not the same. So-called data brokers2 (and more generally business-to-business (B2B) business models) have a strong economic interest in weak privacy protections, as they need to aggregate the maximum amount of personal information with maximum freedom to exploit it. The opposite is true of business-to-consumer (B2C) services, which need users to trust them lest those users switch to a competing service with better policies. The current debate largely lumps everyone together – which, ironically, serves the interests of the unscrupulous more than it does the responsible, by demonising everyone.

2 There are many good sources of information about the activities of data brokers. A good snapshot, if rather US-centric, of the many issues with them can be found on EPIC's website at https://epic.org/privacy/choicepoint/.

3. Competition law – itself based upon 20th-century, or even 19th-century, concepts – is very poorly suited to solving 21st-century data protection issues by creating consumer choice through its application. This is a fundamental misunderstanding of competition law in the Internet age, and of the conditions that create innovation in technology-based services. Even a cursory look at how competition law has been applied in tech sector services, especially in Europe, shows it has done little more than create a billing bonanza for specialist lawyers on the one hand, and an ability for government competition authorities to 'look busy' on the other.

THE DIFFERENCE BETWEEN GOVERNMENTAL AND ECONOMIC USE OF PERSONAL DATA: NIGHT AND DAY

Put bluntly, when it comes to national security, governments can lock people up and throw away the key – including indefinite detention without trial. Companies cannot. Companies can monetize personal data and, in the case of consumer-facing free-to-the-user services, target advertising. Abuses can and have resulted in lost jobs and in the release of very personal information – sexual orientation, pregnancy, and many other situations – and these deserve real focus. But these abuses should not be allowed to obscure the truth that there is a vast power disparity between companies and governments, and correspondingly different consequences when each abuses personal information.

To make matters worse, while the public debate about reform of data protection laws focuses almost exclusively on reforms to protect users from the actions of the private sector, little to no debate is taking place on the very large carve-outs in existing data protection legislation that allow governments to use personal information for all sorts of purposes with little to no disclosure, and without any obligation to ask citizens for their consent. This is particularly striking given that there have been spectacular abuses of PII by governments, and not just in national security realms: large amounts of PII have been made public through inadequate systemic security, plain carelessness and even greed3. As recently as this past February, the UK's National Health Service sold the records of 47 million UK citizens4 to the insurance industry.

Many governments are deliberately fostering the conflation of these two entirely different realities. There is a deliberate campaign by Western countries, especially those who are part of the Five Eyes surveillance treaty system, to conflate them in various multilateral fora in Geneva. Representatives of these countries routinely object to any attempt to differentiate between the two. I have personally witnessed this: when I have attempted to highlight the differences, these countries then intervene to object5. It could be argued that US President Obama's speech on surveillance on 17th January 2014 in part implied a conflation of these two concepts6.

3 For just a few examples, and in just one country, this Wikipedia article is salutary: https://en.wikipedia.org/wiki/List_of_UK_government_data_losses. Even more egregious is the case of the UK's National Health Service selling the data of UK citizens.

4 For mainstream reporting of this episode and its aftermath, see http://www.telegraph.co.uk/health/nhs/10659147/Patient-records-should-not-have-been-sold-NHS-admits.html.

To say that efforts such as this plumb the depths of politically convenient expediency is an understatement – particularly for countries that tout themselves as paragons of freedom and democracy. We deserve better from national leaders and governments than cynical attempts at deflection of responsibility, given the clear history of their abusive behaviour – and these same governments' sustained, and successful, campaigns to give themselves much more freedom to use our PII than they give to anyone else. It is disappointing that some of the most active advocates for privacy protections in civil society fail to call governments on this asymmetry, and instead simply demonize the private sector and let governments largely off the hook.

Every stakeholder that comes into contact with PII should be held responsible for their use of it, and the accountability expected of each should relate to their ability to do harm when they abuse it. Right now, the opposite is largely true.

ALL ECONOMIC USE OF DATA IS NOT THE SAME

Schaar's article dwells at some length on the responsibility of economic actors to protect PII – without once making clear that one size does not fit all. Moreover, his article does not address a fundamental question which should be asked if we want to understand the economic use of PII: what motivates different sectors to acquire PII, and what motivations do they have in using it?

5 This happened most recently on 14–17 April at a meeting of the Multistakeholder Preparatory Process for the ITU's High Level Event (to be held this June), during a discussion of how or whether to address mass surveillance in the documents being negotiated.

6 The speech text in full may be found at http://www.whitehouse.gov/the-press-office/2014/01/17/remarks-president-review-signals-intelligence. The paragraph that relates to this point reads in part: "Corporations of all shapes and sizes track what you buy, store and analyze our data, and use it for commercial purposes; that's how those targeted ads pop up on your computer and your smartphone periodically." The announcement in the same speech of a high-level panel to look into personal data could easily be seen as an attempt, in part, to further conflate the two. The report, released on 1st May, largely ignores government use of personal information.

Unsurprisingly, different parts of industry that possess PII have starkly different motivations – in some cases, diametrically opposite ones. A case in point is the so-called data brokers: companies whose whole business model is to aggregate as much information about each person as they can find and sell it on to third parties as many times as possible7. Contrast that with the other end of the scale: companies who provide services free at the point of use and make money through advertising. Here, the motivation is to foster consumer trust, lest those consumers leave for other services.

COMPLAINTS ABOUT PRIVACY POLICIES MISS THE BIG PICTURE

Schaar dismisses privacy policies with the statement: "Extensive privacy policies which are as long as a novel, and which nobody reads, provide pretended transparency only". This is facile and shallow. Why?

First, it dismisses the fact that privacy policies exist to provide the individual with guarantees from the company they have provided their data to, and in return to limit the liability of the company hosting the service if it follows the rules it has promised to keep. The more detailed the privacy policies are, the more terms there are to enforce when companies violate them – and the more data protection authorities have to work with in cases of breach.

Second, while some companies may be motivated to hide abuse through obfuscation (the 'data broker phenomenon'), long policies are a consequence of the fact that services are global and privacy laws national. Creating user contracts that are capable of multinational application guarantees complexity.

7 For an overview of data broker practices in the USA, a 15-minute segment on popular news magazine show 60 Min-utes entitled “The Data Brokers: Selling Your Personal Informa-tion” is worth watching. Find it here: http://www.cbsnews.com/videos/the-data-brokers-selling-your-personal-information/.

His points on the inadequate penalties available to data protection officials are well made. Given the different motivations that exist, bad actors do need to be deterred by remedies that will actually be commercially painful, and the caps on fines in most jurisdictions are unlikely to be painful except, perhaps, to the SME sector.

COMPETITION LAW IS NOT A PANACEA

When the pace of technological change is as rapid as it is, the idea that competition law is the solution to online problems – with its traditional multi-year timelines and vast costs to all parties – is hard to credit. Here are a few issues with Schaar’s arguments:

Schaar suggests that social media services do not allow portability of data between services, when the most popular do, and have for years.

He writes of "…monopolistic structures restrict[ing] the user's autonomy in Web 2.0 services…", when there are dozens of search engines, social media platforms and the like, all free at the point of use – and all the competition cases we have seen so far have related not to restrictions on users, but to the relative treatment of competing services. The argument seems to suggest we penalise popular services for being popular.

He suggests that "in order to prevent discrimination, a 'right to connectivity' should be considered as well". It is unclear what services he is addressing here, given that a business model that seeks to prevent users from accessing services would be hard to sustain.

On the other hand, competition between firms on privacy policies is another thing entirely, and it is unfor-tunate that Schaar does not mention this: when you have multiple services in the same space (social media, search, etc.) then users can actually look at privacy policies to see which services have policies they like the best.

RESPONDING TO SURVEILLANCE

Schaar rightly identifies that the current situation with respect to data gathering by governments for national security purposes is damaging to economic development and corrosive of the foundations of democratic values. Unfortunately, his proposed fixes, with one exception, are likely to make things worse rather than better.

Firstly, the idea that a global agreement on data protection standards is desirable is extremely problematic: privacy protection legislation is under review in many countries worldwide, and these regimes are likely to remain in flux for some time. Creating binding international norms in such a situation is at best a very slow and politically fraught undertaking. Secondly, the suggestion that "secure routing of data packets" is desirable is far from true. Internet technologies are designed to allow each data packet to find the most efficient route between two points, to maximise performance and optimise efficient utilization of the network. Deciding that routes are 'secure' means instructing the network to route packets based not upon efficiency, but upon perceptions of security. If many countries were to do this, it might well impact the stability of the network as a whole, and would almost certainly result in a more expensive and slower network for everyone. If that were not bad enough, there is no 'secure' route vis-à-vis surveillance; security services globally are gathering traffic from far beyond their borders. The idea posited in public by many countries that 'routing around' certain countries protects their citizens is badly misinformed.
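The trade-off described above can be sketched with a toy shortest-path search (the topology, node names and link costs are invented for illustration): once nodes deemed 'insecure' are barred, the network must fall back on a costlier route, or finds no route at all.

```python
import heapq

def shortest_path_cost(graph, start, goal, excluded=frozenset()):
    """Dijkstra over link costs; `excluded` models transit nodes barred for 'security'."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        cost, node = heapq.heappop(heap)
        if node == goal:
            return cost
        if cost > dist.get(node, float("inf")):
            continue  # stale queue entry
        for neighbour, weight in graph.get(node, []):
            if neighbour in excluded:
                continue
            new_cost = cost + weight
            if new_cost < dist.get(neighbour, float("inf")):
                dist[neighbour] = new_cost
                heapq.heappush(heap, (new_cost, neighbour))
    return None  # no route once too many nodes are barred

# Toy topology: transit node B is the efficient route from A to D.
graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("D", 1)],
    "C": [("D", 4)],
}

print(shortest_path_cost(graph, "A", "D"))                  # 2 (efficient path via B)
print(shortest_path_cost(graph, "A", "D", excluded={"B"}))  # 8 (forced detour via C)
```

Real Internet routing (BGP) is policy-based rather than a single Dijkstra run, but the cost penalty of constraining routes by territory holds in the same way.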

Schaar is correct that encryption will frustrate mass surveillance systems' ability to directly access communications 'on the wire' without going through due process – and it can, depending upon the type of communication, frustrate access to the contents of communications demanded of online services through due process. All of that is treating the symptom rather than the core problem.
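The 'on the wire' point can be made concrete with a toy sketch (illustrative only, not a real protocol): an eavesdropper who records the ciphertext but lacks the key learns nothing readable, so bulk capture loses its value.

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Toy one-time pad: XOR each byte with a key byte (key at least as long as the message)."""
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

message = b"meet at 21:00"
key = secrets.token_bytes(len(message))  # shared only by the two endpoints

on_the_wire = otp_encrypt(message, key)  # what a bulk-surveillance tap captures

# Only a party holding the key can invert the XOR and recover the plaintext:
assert otp_encrypt(on_the_wire, key) == message
```

Production systems use authenticated ciphers such as AES-GCM rather than a one-time pad, but the asymmetry is the same: access shifts from whoever taps the wire to whoever holds the key.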

If you want to really resolve issues with data protection in a national security context, then reform of MLATs may provide the best option at a practical level, since that could allow a balancing of the interests of security, law enforcement, human rights and transparency that would also restore trust. There is something of value in reform for those responsible for national security, too: if everything is encrypted, accessing the tiny proportion of electronic communications that really is necessary for them to do their job will become very difficult. Truly interoperable MLATs could allow socially acceptable access across multiple borders to the communications of real terrorists and criminals, more quickly and efficiently than is presently possible. The present approach of the security services seems to lead to covering the earth with secret data centers, in a vain attempt by each country willing to spend the fabulous sums involved to store and then sift through all electronic communications by everyone – an Orwellian surveillance arms race. This is not sustainable from any perspective.


Modernizing data protection along with data processing technologies
SUSANNE DEHMEL, BITKOM GERMANY

In his article, Peter Schaar raises a great many questions that keep coming up with regard to the use of the Internet and big data, and generally in relation to the development of our digital society. The digitalization of our lives leads both to new dimensions in the amount of data processed and to new dimensions in our ability to process and access data. We need new methods of processing in order to cope with the existing and ever growing amounts of data we produce. This efficient way of processing data doesn't only help us to keep our familiar analysis processes going but, fascinatingly, also opens up completely new forms of analysis that enable us to implement formerly unthinkable scientific and economic applications. This means we have to re-balance interests. We therefore struggle with the question of whether we are able to incorporate new technologies into existing categories of data protection law without losing too much of the scientific and economic potential of these technologies on the one hand and legal clarity on the other – or whether we need to rethink some data protection laws and principles in order to preserve, or carry over into the future, the effective protection of our private sphere and our freedom of action.

Credit: r2hox | https://flic.kr/p/gdMrKi | CC BY-SA 2.0 | https://creativecommons.org/licenses/by-sa/2.0/

Not only does Schaar name a number of questions that arise; he also outlines some of the answers to them. A very important one is the setting of incentives for pro-cessing data in anonymized and pseudonymized form in order to keep the intensity of interference with basic rights and the risk of misuse as low as possible. The option of anonymized data processing is also signifi-cant for big data applications, as consent and purpose limitation can constitute barriers for applications that were not foreseeable at the moment of data collection or where there is no possibility of getting consent from the data subject. The note on the German Telemedia act (Telemediengesetz) and its definitions of “anonymous” and “pseudonymous” is helpful with a view to the ongo-ing consultations on the EU data protection regulation. It seems that there is no common understanding of these notions between member states; yet it would be very helpful to agree on the definition of these terms, also with regard to third countries. But if we ask for the increased use of anonymized and pseudonymized data as a means of risk limitation, we should be aware of the fact that the time and effort needed to achieve pseud-onymization and anonymization must be feasible and adequate in relation to the risks. At the same time, it is also quite clear that with growing amounts of data about us available, it becomes significantly more difficult to anonymize data in a way that cannot be reversed by anyone anywhere, now and in the future. Incentives need to be offered for companies to set up safe surroundings – advanced anonymization technologies together with organizational measures. These incentives must be strong enough to motivate companies to undertake this effort. It must be possible to get out of the limitations of data protection law when any links that related information to a person were removed from the information. 
It should also be possible to handle pseudonymized data flexibly when there is no indication of undue negative effects on the data subjects’ interests. Therefore, we definitely need an international concept with a relative definition of anonymity, as we already have in German law, and it should be enhanced with provisions for the privileged handling of pseudonymized data.
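The distinction drawn here between pseudonymized and anonymized data can be illustrated with a minimal sketch. This is not a prescription from the text, only an assumed illustration: it uses a keyed hash (HMAC-SHA-256) as the pseudonymization step, with key destruction as one organizational measure on the path toward anonymization. The key name and identifiers are hypothetical.

```python
import hmac
import hashlib

# Hypothetical secret key; in practice it would be managed and stored
# separately from the pseudonymized records.
SECRET_KEY = b"rotate-and-store-me-separately"

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a keyed hash.

    Whoever holds SECRET_KEY can still link records belonging to the
    same person (pseudonymization); without the key, reversing the
    token alone is computationally infeasible.
    """
    return hmac.new(SECRET_KEY, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

# The same person always maps to the same token, so analytics on
# pseudonymized records (counting, correlating) still work:
token_a = pseudonymize("alice@example.com")
token_b = pseudonymize("alice@example.com")
assert token_a == token_b

# Destroying SECRET_KEY removes the controller's own ability to
# re-link tokens to people - a necessary but not sufficient step
# toward anonymization, since auxiliary data may still allow
# re-identification, as the text above warns.
```

The sketch also shows why the essay treats anonymization as the harder problem: deleting the key only closes one re-identification path, while the remaining attributes may still single a person out.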

Schaar also names a procedural instrument that should help to balance the risks and interests of data processing within companies or other responsible organizations: privacy impact assessments (PIAs). From an industry perspective, this could become useful, as we know that similar instruments function well in other areas, such as compliance departments. In order to introduce, or rather extend, impact assessments concerning privacy, a legal obligation to use them might be helpful. But the law should not be too detailed on how these PIAs are to be conducted. There needs to be flexibility with regard to the detail and depth of a risk assessment, depending on the nature and context of the data processing operation or product and the privacy risks that can be expected. Best practices should be collected. Industry could work together with stakeholders who represent the interests of data subjects to develop standards for characteristic procedures and contexts in a process of self-regulation or co-regulation.

Such risk assessments can help to realize the concept of “privacy by design” in all privacy-critical processes and products and combinations of them. This concept is important for an effective realization of data protection in an economic way. The goal is to keep data protection in mind from the start, including while developing new products or services. Thus it is possible to implement measures that keep privacy risks to a minimum and/or give consumers and companies the opportunity to consciously decide upon the use of their data with regard to a certain service or use of a product. The concept of privacy by design is, in my view, more important and more helpful than the concept of privacy by default, as the latter only concentrates on the stage of delivering the product or service and automatically puts data protection at the top of the consumer’s priority list, thus tending to patronize him. Quite often, consumers have to decide between optimum user friendliness/convenience and optimum data protection, as both might not be achievable at the same time. Using social networks as an example, some users want to be found by as many matching contacts as possible because their business might depend on these contacts. But others only want to use the network in order to share information with people they already know. Both groups of users expect to be able to do so in a convenient way without much bother beforehand, but their preferred privacy settings will be different. Privacy by design should mean combining optimum convenience with optimum privacy, if possible – but if this is not possible, consumers should be given the choice of different options with different advantages and disadvantages. Plus, providers must explain to the customer what these advantages and disadvantages are.

Page 40: MIND #7: Privacy and Internet Governance


RESPONSES PRIVATE SECTOR


The instrument of “data portability” that is also mentioned only makes sense for services where users administrate and/or display a lot of their own data and where the inability to extract this data in a fairly convenient way would have a prohibitive effect on changing service providers. It is not a data protection issue but rather a competition issue and should be treated as such. An undue extension or application of a so-called right to data portability to other services, such as online shops, or to data controllers in general could unduly burden many of these data controllers and might lead to other competition problems while not bringing any advantages in terms of data protection.

Transparency is the keyword to which companies (and governments) should feel obliged and bound, as it is the basis for a fair relationship with customers and any other person whose data will be processed. Schaar is right in asking for policies that can be consumed in a reasonable time and manner. But asking for this is still easier than complying with legal provisions while also reducing the information to the bits relevant for the consumer (who generally cares about data protection, but in many cases still cannot be bothered to go into the technical details of the services he is using). It is a complex task to set up global processes and products to ensure that they comply with all the diverging provisions in different states and with different purposes (security, fraud prevention legislation, civil law, data protection laws, etc.) and at the same time to have unified and transparent processes that can be easily explained to each customer. Nevertheless, companies need to earn and retain the trust of their customers, no matter whether these are other companies or consumers. Trust is built up if you feel that your partner and his work are reliable and his actions are foreseeable. Therefore, companies need to work on their transparency in the form of fair communication to and with customers on the basic ways of handling their data. A predictable way of handling customer data might become one quality aspect of the products or services companies want to sell.

But all of industry’s efforts to act as trustworthy data controllers might be thwarted if member states of the EU and third countries do not play according to the same rules that they impose on companies. The rules for intelligence services and other authorities that might want to access user data for some kind of security reason should also be as clear and as transparent as possible. This is a difficult task for each government internally and in relation to foreign partners, as it might mean limiting its own power and information advantage. But a balance between security interests and the individual’s interest in freedom of action has to be found. Otherwise, governments might not only face democratic problems but also economic deficits in the long run. Calls on the legislator to draw the exact boundaries between the acceptable and unacceptable use of personal data are valid for the actions of private individuals as well as for state authorities. Just as companies need clear rules within which they can act, state authorities and intelligence services need such rules. In both cases, the existence of clear and transparent rules also allows effective enforcement through adequate sanctions.

Despite the challenges we are facing, I think Peter Schaar is saying yes, the Internet and big data are compatible with data protection. And again I agree with him. Combining both is feasible if governments and industry make an effort – in defining transparent and fair rules for companies and authorities on the one hand and for data subjects on the other, in developing new technical and organizational measures that fit new data processing techniques, and in keeping up a factual social discourse on how we want to live in our digital world.


Big Data: An Opportunity Combined With Privacy Concerns. A Regulatory Perspective
GEORGE SALAMA, SAMENA TELECOMMUNICATIONS COUNCIL, DUBAI

Azrieli Center, Ksenia Smirnova | https://flic.kr/p/bcGW4x | CC BY 2.0 | https://creativecommons.org/licenses/by/2.0/


“Big Data” is a new term that is currently trending within the ICT industry and coming into sharper focus, especially due to privacy and security concerns. With the incredible growth of data produced and consumed by Internet users through the different types of social networks and mobile apps, the need for a strong but flexible legal and regulatory framework is increasing and becoming mandatory. Such regulatory measures should be developed in a smart manner that on the one hand gives incentives to Internet players and telecom operators to explore the new revenue streams lying behind Big Data, and at the same time grants the basic Internet user a secure, private cyberspace.

The first aspect of Big Data regulation, which should be taken into account while formulating a Big Data policy framework, is not to hinder investments: operators and Internet service providers need the ability to easily explore and tap into new revenue streams, such as the ones resulting from Big Data. In the Arab region, operators are under pressure resulting from strong regulatory interventions, such as high licensing fees, new spectrum costs, hidden taxes, and royalty fees. This massive pressure in terms of CAPEX is accompanied by strict regulatory obligations to provide high-quality services at affordable rates. International roaming flat-rate regulations are a clear example of the extent to which telecom operators are facing political and regulatory pressures, which is negatively affecting their revenue growth. Therefore, the Big Data concept, which is described in the article as the “new oil”, should be regulated while bearing in mind that it is a new revenue generation opportunity for telecom operators and service providers. If Big Data is the “new oil”, then broadband, which is provided by operators, is the vehicle through which this new oil is consumed. For operators to be able to provide such high-speed, reliable broadband, the question of their sustainability should be placed first. The role of governments and regulators is crucial in formulating and putting in place a clear set of industry-friendly policies; such policies are crucial in identifying the methodology and level of data utilization, data quality, access, and preservation.

Secondly, and most importantly, there is the question of privacy, security, and data protection when it comes to Big Data regulation. The point raised in the article that social networks’ privacy settings are switched off by default reflects a key concern that needs to be revisited when setting policies and regulations for Web 2.0 services. Privacy settings should be developed and displayed in a very simple way that enables the basic user to adjust his privacy and security preferences in a straightforward manner. Also, the traditional way of displaying the terms and conditions of any new Internet service subscription is another concern that requires simplification and redesign: instead of long pages full of text in a tiny font, a video tutorial could be made available in different languages, for example. Another interesting example is the management of the user’s online assets, personal data, email accounts, and social media profiles after death. Google has recently introduced tools that apply to all Google-run accounts, including Gmail, Google+, YouTube, Picasa and others. Users have the option to delete data after a certain period of time or pass their data on to specified people.

No one can deny that privacy is a cornerstone of any Big Data policy framework, but it is also important not to let such concerns hinder innovation. The opportunity resulting from data mining and analysis across different sectors is creating a deep positive impact on the overall national economy. Government has an important role to play in encouraging Big Data use in fields including health care, education, road safety, weather forecasting, financial reporting, mapping, and macroeconomic forecasting. “Collected once, used many times” is the most efficient methodology in the adoption of the Big Data concept. This saves time, processing power, and cost; therefore, government and the private sector need to be in alignment and synchronization when it comes to sharing data and information, while maintaining a certain level of privacy and transparency. At the same time, there should be a clear distinction between data collected and processed by government agencies or private sector entities on a national level and data collected as a result of international bulk data transfer. It is very well said in the article that: “It is unacceptable that governments and intelligence agencies are abusing the increasing international data transfer for bulk access to the transmitted data”. The question of “trust” when using a new Internet service is under major threat, and for this trust to be rebuilt, both technical and policy solutions need to be implemented. Data encryption, secure routing, and IPv6 adoption are among the technical solutions being considered. International agreements, regional cooperation, dispute resolution mechanisms, and commercial settlement processes are examples of public policy considerations.




STEPHANIE PERRIN, NON-COMMERCIAL STAKEHOLDERS GROUP AT ICANN

RAFIK DAMMAK, MEMBER OF THE STEERING COMMITTEE, INTERNET RIGHTS & PRINCIPLES COALITION, TOKYO

LORENA JAUME-PALASÍ, INTERNET & SOCIETY COLLABORATORY, BERLIN

RESPONSES CIVIL SOCIETY


The Internet and big data – incompatible with data protection? We don’t think so! A civil society perspective
STEPHANIE PERRIN, MEMBER OF THE NON-COMMERCIAL STAKEHOLDERS GROUP AT ICANN



Credit: BobMical | https://flic.kr/p/jgVkon | CC BY 2.0 | https://creativecommons.org/licenses/by/2.0/

The subject of “big data” can be a depressing one for privacy and human rights advocates, for many reasons. In the first place, it is not well understood by the public or by civil society. Secondly, the prominence of data analytics in the business plans of many of the biggest global IT players and their customers makes it difficult for civil society to fight these elements, since analytics are promoted as driving the Internet economy. The value of big data should not be accepted as a given; analytics still have to prove their societal value. Google Flu Trends, the original big data poster child, was later shown to be a failure.

It appears, though, that risk assessment and market sorting using big data are highly addictive. The ability to predict future behavior has entranced humanity since our earliest days, and a shiny new tool that promises to reveal more about risk will be a difficult bauble to part with, for both the private and the public sector. We must, however, pay attention to the consequences. Oscar Gandy wrote about this in 1993, predicting that “the panoptic sort is an antidemocratic system of control that cannot be transformed because it can serve no purpose other than that for which it was designed – the rationalization and control of human existence” (1993, p. 227). Peter Schaar delivers a similar message, grounded in his struggles as a data commissioner to deal with these new technologies using old legislation and feeble powers. His vision of a “balkanized Internet” that enables nation states to ignore the promise of freedom that the Internet brings, and to engage in state control and censorship, is a sobering one.

Schaar gives us many insights in this article as to how big data challenges traditional data protection techniques and principles. He focuses on the role of purpose in data protection, discussing how purpose is less applicable in data analytics, which looks for correlations in data that are unrelated by purpose. Purpose of collection is, nevertheless, still valid as a data protection principle – why would anyone fighting for human rights abandon specific, legitimate purposes for data collection and use? He argues that the privacy community must figure out how to move from the micro perspective, focusing on each individual data element, to a macro perspective, governing the conduct of those who have masses of data and want to punch it around and see what it tells them. In the past, some jurisdictions have relied on the consent of the individual to address these issues. Does it still work? If so, why do these different actors have so much data? Did we consent to all that collection? The answer could be yes. But was that collection proportionate? Was the purpose relevant? Was secondary use transparent to the data subject? If we don’t ask these questions, then we face the American Wild West, where anything goes.

Admittedly, much successful data mining is done with relatively non-sensitive data. That’s better, but it doesn’t eliminate all concerns. After all, if grocery purchases or Internet surfing can predict race, health conditions, or criminal tendencies, we could have the worst of both worlds. Gandy spent much of his career looking at racial discrimination through the lens of information practices. As he points out (2009), discrimination against a group can be accomplished without gathering identifiable information. Dixon and Gellman recently published a report (2014) on consumer scoring with a detailed analysis of what can be done with identifiable and non-identifiable data already held by data brokers and others. The United States government released a report on big data on April 30, 2014, which states: “A significant finding of this report is that big data analytics have the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace. Americans’ relationship with data should expand, not diminish, their opportunities and potential” (iii, 51-53, 64-65).

This is one of the problems with the new paradigm of “big data”. Anonymity in the cloud of data does not stop the identification of factors useful for individual profiling, and it can mask invidious discrimination. The privacy of groups is something not well covered by current data protection regimes, and several innocuous data elements can now place an individual in a group that has an accurate predictive profile. This goes against the principle of informational self-determination that is at the core of privacy protection, and it may defeat anti-discrimination laws because it can select for race (or other factors) without ever specifying those factors overtly.

Gandy returned somewhat more optimistically to this issue in 2009, in Coming to terms with chance: Engaging rational discrimination and cumulative disadvantage. Like Schaar, Gandy is convinced that the time to unite against surveillance and discrimination is now. He has little faith in data protection as a solution; perhaps one can forgive an American scholar for reaching that conclusion. One of his proposals is to set up a body similar to the Environmental Protection Agency to investigate harm that comes to groups or individuals from data mining and risk profiling. It’s a thought. Civil society as a rule, however, is not going to accept a model of data protection that abandons human rights in exchange for proving harm. Most consumers have no idea about the collection and use of their data. Asking them to prove how the data was used, let alone whether it produced harm, is not a practical or fair response.

What to do? Firstly, let me say that civil society in North America and indeed the world still looks to Europe and its data commissioners for leadership in privacy protection. We depend on the Article 29 Working Party to comment on developments and technology, and we look to Germany to defend the right of informational self-determination. Never give up! Civil society in Europe has been thoroughly engaged in the debate about the new European data protection regulation and will continue to press for strong rules. It may be, however, that the traditional approach is not enough. If big data is something new (and maybe it isn’t really new), then we need new privacy protections for it. Perhaps civil society can help think outside the box and come up with new concepts of how to control this technology.

Those of us engaged in the privacy struggle over many years often reflect ruefully that we have been unsuccessful in stopping tracking via cookies and web beacons, limiting video surveillance, promoting privacy enhancing technologies (remember PETs?), and making sure that legislation comes with teeth, resources, and fearless data protection commissioners. At this inflection point in Internet governance debates, we have a real opportunity to push for stronger protections for human rights and privacy.

If we care to take up the challenge that Schaar has issued – to join together to protect the core values of western democracies – here are a few possible initiatives:

1. Recognize that European civil society has much to discuss with its global partners, including technology activists such as the Electronic Frontier Foundation, the Tor Project, and the Australian Privacy Foundation. We all need to redefine goals for 21st century data protection legislation. Workshops or webinars that get down to the details of what to look for – new approaches to the old principles – would be a start. Funding to talk about meeting the challenge of big data would be welcome.



2. Those engaged in the Internet governance debates need to demand that organizations such as ICANN, the IETF, IANA, and the ancillary registries meet strong levels of data protection, possibly through binding corporate rules. Technological infrastructures must facilitate privacy protection and anonymity and must not allow the Internet to be the source of unregulated information for undefined big data activities.

3. Support for the existing data protection law and for the commissioners who struggle to enforce it remains fundamental. Civil society and the commissioners are logical partners and could find more ways to work together without compromising the integrity and independence that each must maintain.

4. Education. Citizens need to understand the technologies of surveillance and how to use different technological tools to protect themselves. Civil society can and does help with this, but it is time to step up the effort.

5. Maybe Gandy has a good idea, and maybe a Big Data Protection Agency needs to be part of the global Internet governance structure? The Internet is global and we want to keep it that way. This is a challenge for data protection, particularly if the collection instruments for big data are distributed across applications and platforms. We must start thinking about how human rights could be enforced in data analytics, and abuses remedied.

The promise of the Internet as a tool for individual self-determination, for growth and development, must not be snuffed out. Privacy is a fundamental human right, and protecting it in the face of technological advances requires optimism, determination, and teamwork. Big data can be corralled, and it will be when more people understand it and unite behind the effort. Peter Schaar’s article points to some of the places where we can make a start.

REFERENCES

Dixon, P. & Gellman, R. (2014). The scoring of America: How secret consumer scores threaten your privacy and your future. San Diego, CA: World Privacy Forum. Available at http://www.worldprivacyforum.org/2014/04/wpf-report-the-scoring-of-america-how-secret-consumer-scores-threaten-your-privacy-and-your-future/.

Executive Office of the President. (2014). Big data: Seizing opportunities, preserving values. Washington, D.C.: Executive Office of the President. Available at http://www.whitehouse.gov/the-press-office/2014/05/01/fact-sheet-big-data-and-privacy-working-group-review.

Gandy, O. H. (2009). Coming to terms with chance: Engaging rational discrimination and cumulative disadvantage. Farnham, U.K.: Ashgate.

Gandy, O. H. (1993). The panoptic sort: A political economy of personal information. Boulder, CO: Westview.

Credit: _Bunn_ | https://flic.kr/p/9i7ZK2 | CC BY 2.0 | https://creativecommons.org/licenses/by/2.0/


The need for versatility in data protection
RAFIK DAMMAK, MEMBER OF THE STEERING COMMITTEE, INTERNET RIGHTS & PRINCIPLES COALITION, TOKYO

Big data is an evolution bringing new opportunities for businesses, but without a clear benefit for users. It is an example of the evolution of the technological threat against privacy. However, it is still difficult to apprehend it and separate the hype from the reality of this trend. At the same time, we have to preserve and protect the right to privacy, regardless of any changes. There is a duty to do so even when things are still fuzzy and changing quickly. But the hype may distract or prevent us from enforcing effective regulation. We definitely need a different approach to respond to the challenges of big data, or of endless technological progress, while guaranteeing privacy.

Firstly, recognizing the limitations of the tools we have now, as outlined by the author, is a prerequisite to finding the appropriate answers. Secondly, I would like to suggest a different perspective as a member of civil society and, perhaps more relevant here, as a software engineer concerned with building and designing sustainable systems. Therefore, I would like to use software engineering metaphors and principles.

Big data is enabled by the progress of technologies such as the Internet of Things and cloud computing, in addition to the current collection of data by different IT systems, regardless of their primary purpose or usage. However, existing data protection regulations or frameworks were designed to cope with “legacy” and traditional IT systems, processing personal data for specific purposes. Big data made a breakthrough in terms of using massive data from different sources, new technologies and platforms, and adding more advanced algorithms and statistical and mathematical models.

Let’s make a comparison between the regulation and the issue to be regulated. Big data or cloud computing, for example, are about scalability and planning for the continuing expansion of storage, computing, networking resources and data, while regulation tends to respond case by case, in an ad hoc manner, to anything new (and usually when it is already too late). It is more a reactive than a proactive approach. What we can conclude is: existing laws fail to “scale out”, to be interpreted or adapted to cover new use cases, new technologies and applications adequately. An effective and enforceable regulation needs to keep pace and stay aligned with what is regulated.

So, can data protection and privacy borrow the scalability principle in order to be able to handle the next technological threat to privacy? Can data protection regulations be conceived and designed to be effective many years ahead? Can data protection laws be built iteratively and be evolutionary? Can the data protection framework cope with evolving threats to privacy? Yes, it is possible: if we “design” laws to include new cases, just as a system can add a new component. It is possible if we make regulation iterative. A new law is already a “legacy” product when it starts to be applied. “Upgrading” data protection laws is a continuous effort; principles remain but responses change.

The author advocates having a strong and flexible data protection legal framework, but that is far from enough. Data protection authorities need to be able to predict changes and to plan for responses when needed. Using the software metaphor again, data protection must be updated regularly with new features to cope with new realities, and the designer – or the legislator in this case – needs to do so often.

On the other hand, even if the laws, regulations and legal frameworks are in place, the question is about the capacity and readiness of data protection authorities. What about the skills needed in the human resources who are supposed to comprehend, understand and enforce the rules on diverse applications or instances of big data? Even businesses are having a hard time finding data scientists and experts on big data for building and using those oceans of data. Without adequate expertise, data protection agencies or commissioners will be unable to detect new infractions and irregularities.

The author advocates privacy by design and giving control to users. Again, it is not enough. It is not just a problem of awareness or knowledge about privacy rules and practices among developers – it is about forgetting who matters: users. In agile software development, everything is about users. But when it comes to data usage and processing, companies or start-ups tend to focus on their business models and ignore the fact that they need to build user-centric and friendly systems first. Being user-centric must be systematic. When the focus is shifted to the right spot, privacy by design practices will be effective.

Finally, this raises a missing question that we need to answer: how much innovation is permitted without undermining or threatening privacy? We must avoid framing privacy as antagonistic to innovation, or lowering the standards of data protection to match the innovation. In fact, innovation should be thought of as a way to improve data protection, to strengthen privacy as a right on the Internet. Any innovation failing to do so is only a regression of rights and a backward move. So, can data protection regulations make big data an innovation beneficial to users and citizens?

Page 52: MIND #7: Privacy and Internet Governance

53

Is data protection becoming incompatible with communication?
LORENA JAUME-PALASÍ, INTERNET & SOCIETY COLLABORATORY, BERLIN

RESPONSES CIVIL SOCIETY

Big Data is a product of human thinking and its natural avidity for knowledge. The principle of data “abundance” is nothing new to mankind. Since the very beginnings of civilization, humans have collected and conserved culture, traditions or wisdom in libraries and archives. The impetus to store operated with different criteria and categories so as to classify the data. It would try to collect things which seemed, at first sight, unstorable and to figure out ways to make them collectable: songs would be stored by means of words and notes in manuscripts, historical moments would be collected in words and pictures, etc. No information was irrelevant: from objects used in daily life, to discussions held in the agora, to sacral books, archives would welcome all pieces. Access to these collections was in ancient times restricted to the political elite: the library and state archives of the Epang Palace to the Chinese emperor, or the Library of Alexandria to the Pharaoh. In the course of time, human acquisitiveness towards information did not change, but access to it did – and this brought both political turmoil and changes on the path towards more democratic structures, even in those times in history when no right to privacy existed.

The better and more automated the storage techniques, like the printing press, the more information stored. The logic of the Internet follows exactly this impetus and, as all new technologies that have come along throughout history, it gave this impetus to store a boost. Peter Schaar falls into the trap of technological mythologization when he characterizes Big Data as the “new oil”, its novelty lying in storing for the sake of storing. Information was always power – this is why, in earlier, non-democratic times, access to it was restricted to elites. And information storing has always been innate to humankind – for the sake of knowledge. Storage was the first step to experiment; next came data correlations, while possible explanations were a third step. This process has been one of the scientific methods by which polymaths and humanists have understood nature for centuries. Collect first, correlate afterwards, and subsequently search for possible explanations based on the correlation.

Peter Schaar also stops short of bringing the history of the juridification of data protection up to the present day. Hence, I will continue with the description of the historical development of data protection law and then proceed with the problems that Peter Schaar’s demands would pose.

With the evolution of the Internet, automated data processing became a practice not only for governments and big companies, but for everyone. With the use of smartphones, social media, emails, apps, etc., individuals also process data automatically every day, both deliberately and unknowingly. Every conversation or statement on Facebook is data. Every Foursquare check-in, with or without mentioning friends, is data. Pictures, videos, Likes, the sharing of a status or photograph – all of it is data. What nowadays is understood as communication is also categorized as data. This does not only mean that the original purpose of data protection has changed – to protect citizens from the asymmetric power of data-gathering governments and big companies – but also that the concept of automated data processing itself has changed. Automated data processing is now a constitutive part of modern communication. Moreover, in the countries where data protection laws were drafted, there already existed laws regulating freedom of expression – and thus communication. So, suddenly, two different corpora of law now regulate the same thing.

The concept of data protection was, from a philosophy-of-law point of view, semantically misleading and technically problematic from the very start. The object of juridical protection was not clear. If the right to privacy was already contemplated as a distinct right, what then is the object of juridical protection in data protection? Data? To what end? What are the values that had to be strengthened and the harms that needed to be averted? Data (in Latin, “something given”) is per se a neutral product of human action and, hence, in principle neither good nor bad. I agree with Peter Schaar when he states that Big Data constitutes a big challenge for data protection: these new technological developments prove how inadequate those principles were and continue to be.

Law should not regulate on the basis of technology models or novelties, but on the basis of principles. However, technologies may be used as a litmus test to expose the (in)adequacy of a regulatory principle. This is precisely what new technologies like the Cloud, the Internet of Things or Big Data are doing: Peter Schaar is stuck in the old paradigm of automated data processing. Thus, he concentrates on constricting data preventively by demanding data minimization – instead of concentrating on the values resulting from data that are in need of juridical protection. Considering the history of human knowledge and information production as well as the innate impetus to store, data minimization demands a change in the nature of interaction and communication. This is neither how the Internet functions, nor how human beings are. Data minimization generates silos of information and privilege by selecting some information and discarding other information, creating contextual gaps that might entail higher risks of manipulation and misuse.

Data minimization inverts the dynamics proper to the nature of information and knowledge that gradually led to democratic development.

The same applies to Peter Schaar’s appeal for informational self-determination and consent as the primary instruments for protection. They sound good at first, but have detrimental consequences for individuals upon closer inspection.

Informational self-determination in data protection implies ownership of the data produced by an individual. The owner of the data is the one who determines what is done with it. However, ownership of data has always been difficult to determine, even outside the digital dimension: if I go to the bakery in front of my home every day, early in the morning, who owns this data? Me, the baker, the neighbors also present at the bakery, all of us? Should it be forbidden for the neighbors to tell someone else about this data? Much “sensitive” personal data cannot be hidden or kept behind closed doors: a pregnancy after a few months, a person with a broken leg wearing crutches, the religion of a woman wearing a headscarf. Whose data is the knowledge of my neighbor having a broken leg? His data, because it is his leg? My data, because I saw it with my eyes? Am I allowed to talk about the broken leg with the baker or with a stranger? As explained above, “data” comes etymologically from the Latin for “something given”; there is a semantic connotation of detachment immanent in the word. The concept of ownership provokes more conflicts than orientation and clarity. It may even open the door to censorship. And, ultimately, it raises once more the question of the adequacy of this principle and the purpose – that is to say, the object of juridical protection – of this corpus of law in general.

Regarding Peter Schaar’s appeal for consent, it should be noted that consent – under circumstances of fairness – requires full knowledge and understanding. An individual cannot freely sell his or her personal freedom – even with consent – for no individual can surrender this fundamental right. Why should it be different with other fundamental rights? Consent in data protection reassigns the responsibility from the companies and states running the data manipulation and gathering algorithms – which they usually design with the utmost opacity – to the individuals. Individuals are thus expected to know as much as algorithm specialists and programmers do: once the code is made transparent and the terms and conditions have been explained clearly in a one-pager, citizens should be able to decide by themselves. Peter Schaar seems to ignore that Big Data and data protection are both complex issues. They cannot be explained in one page. A short explanation is a shortage of information. Furthermore, algorithm transparency remains opaque to non-technicians. Users know less about the algorithms of a company than the company itself.

Credit: Lorena Jaume-Palasi https://flic.kr/p/n6TcLi | CC BY 2.0 https://creativecommons.org/licenses/by/2.0/

Recapitulating, information shortages and knowledge asymmetries are not the basis for fair consent. Turning the responsibility that governments and corporations should have over to users does not protect these individuals: it merely perpetuates power asymmetries between individuals and states or companies while giving the illusion of autonomy.

Big Data is not incompatible with privacy or anonymity protection, but it is incompatible with old paradigms of automated data processing and an outdated understanding of the Internet.

Peter Schaar does identify one of the most relevant challenges and risks of Big Data, however: predictive algorithms and analytics restrict individuals’ freedom of choice, since they preselect in advance a reduced number of options that come to erode autonomy in the long term. Predictive analysis filters the environment of the user; it tells the individual what he or she is and embeds him or her in anticipatory obedience to a setting where the individual remains what he or she is supposed to be. Consequently, predictive algorithms could perpetuate social inequality. Consent or informational self-determination would be of no help in this case, for they apply after, and not before, the pre-selection. Only rules on freedom from discrimination would prevent misuse.

Big Data entails more risks than a mere erosion of privacy and, hence, data protection laws need to consider them in their entirety. Data protection must ascertain, first, which values it wants to safeguard, which principles may be harmed and are, thus, in most need of protection, so that it may subsequently draft not data- but principle-oriented regulations. Freedom from discrimination, the right to self-development, and freedom of choice are principles and values; data is not.


JONNE SOININEN, INTERNET ENGINEERING TASK FORCE

MIKHAIL M. KOMAROV, NATIONAL RESEARCH UNIVERSITY HIGHER SCHOOL OF ECONOMICS, MOSCOW

RICHARD HILL, AUTHOR, FORMER ICT MANAGER

Responses Technical & Academic Community


The recent revelations of pervasive monitoring by security agencies, including the NSA and GCHQ, sent shockwaves through the Internet technical community. Though it was hardly surprising that organizations whose main purpose is to monitor and analyze communication were actually performing that task, the scale and tactics of those activities did surprise the technical community. The revelations served as a wake-up call to the technical Internet community to devote more time and work to Internet security.

As a reaction to the revelations of pervasive Internet monitoring, the Internet Engineering Task Force (IETF) held a session about pervasive monitoring in the technical plenary in Vancouver in November 2013. There was clear consensus among the participants that more could, and had to, be done in the Internet protocols to increase security. However, Internet security is by no means a new topic in the Internet technical community. Though it is often stated that the Internet was not originally designed with security in mind, the IETF has had a significant focus on security and privacy for decades. A testament to this is RFC 1984, published as early as 1996, in which the Internet Architecture Board and the Internet Engineering Steering Group state that the IETF will work on securing its technologies with encryption regardless of government restrictions on encryption technologies. Over the years, the IETF has specified technologies, such as IPsec and Transport Layer Security (TLS), to secure communications over the Internet. These technologies are widely deployed and used routinely on the Internet. In addition, the IETF and the IAB increased the focus on privacy in Internet protocols even before the Snowden revelations: the IAB published guidance on privacy considerations for Internet protocols in RFC 6973 in July 2013. Hence, the IETF did not start to work on Internet security and privacy in the aftermath of the Snowden revelations. However, the focus on security was further increased.

An old proverb says you can lead a horse to water, but you cannot make it drink. The same is true of Internet security. Though extensive tools for Internet security have been available for a long time, many people have not been using them. Sometimes the privacy and security aspects have not been considered important enough in the tradeoff, for instance, between increased security and increased computing-power needs. However, over recent years major Internet service providers have started to use technologies such as TLS by default to secure their services. This development is very encouraging, and there is hope that others will increasingly follow this trend as well. These available security mechanisms do effectively secure communication over the Internet between a service and its users.
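In concrete terms, “TLS by default” means that a client and a server refuse to exchange data except over an authenticated, encrypted channel. The sketch below illustrates this with Python's standard ssl module; the helper function and the example host are illustrative assumptions, not taken from the text:

```python
import socket
import ssl

# A default client context enables certificate verification and
# host-name checking, and disables known-weak protocol versions.
context = ssl.create_default_context()

def negotiated_tls_version(host: str, port: int = 443) -> str:
    """Open a TLS-protected connection and return the negotiated protocol."""
    with socket.create_connection((host, port), timeout=10) as sock:
        # wrap_socket performs the TLS handshake and verifies the server
        # certificate against the system trust store before any data flows.
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()

# Example (requires network access):
# print(negotiated_tls_version("www.ietf.org"))
```

The point of the sketch is the default posture: a verification failure raises an exception rather than silently falling back to an unprotected connection.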

The Current State of Internet Security From A Technical Perspective
JONNE SOININEN, INTERNET ENGINEERING TASK FORCE



In light of the Snowden revelations and the Heartbleed bug, it might seem counterintuitive to state that the Internet is more secure than ever, and continues to become more secure as new technologies are developed and those already developed are deployed. However, looking purely from an Internet communications angle, this statement is true. In addition, the newly increased focus on security on the Internet will only strengthen this development.

In addition, in the discussion about pervasive surveillance, questions have arisen about the security of Internet routing and about what traffic flows through countries that perform pervasive monitoring of Internet traffic. There has been a general call for enhancing routing security, which Peter Schaar mentions in his article. There has even been a call from certain European political leaders for a European Internet, envisioned to offer greater security. Although in the early days of the commercial Internet much of the traffic did actually go via the US, today local traffic does stay local. The introduction of local Internet Exchange Points in countries and peering agreements between local operators have assured this in Europe for over a decade already. The same trend is seen all over the world, including more recently in developing countries. Today, there is no technical reason why local Internet traffic from any European country should or would pass through any other country.

As these statements about the technical state of Internet security and routing are both positive about the current situation and hopeful about the future, the question may arise as to how it was possible for the NSA, for instance, to perform extensive surveillance even on foreign citizens. The answer is in the popular services we use. These services are provided mainly by US-based companies. These organizations fall under local legislation and have had to hand over information to local agencies. The Internet does not inherently leak this information; the information is obtained directly from the service provider. Hence, the issue is that we the users provide the information to these organizations by using their services. In his article, Peter Schaar also raises the issue of the significant market power of some of these Internet players. Commercially, these players are very strong and they may have significant market power, at least in the western hemisphere. However, the reasons these companies have become significant are not rooted in Internet technology. As a matter of fact, Internet technology enables a level playing field for competition, and local alternatives exist and are widely used in many regions. Therefore, users can choose between services and service providers. We have to ask why users, regardless of privacy implications, continue to use the currently popular services, and why there is a lack of viable alternatives in Europe.

The Internet is international in nature. Packets pass over national borders as easily as they do within a country or a region. This also causes legal friction between countries and regions when services are provided outside a jurisdiction. Despite these issues, the international nature of the Internet is inherently a good thing. It is one of the key reasons the Internet has become the universal data network. As Peter Schaar states in his text, the reactions to pervasive surveillance may try to restrict the international nature of the Internet. In addition, there is a risk that the example of pervasive surveillance increases the interest of other nations in starting similar programs themselves. These are real risks to the Internet.

The increase in computer processing power and storage capacity has revolutionized data processing over the last decade. Currently, there seems to be little technical restriction on storing data in an always-available format over extended periods and processing it almost in real time for different purposes. The data stored for a certain purpose can be used for a completely different purpose than originally intended, as business models or political climates change. As Peter Schaar points out in his text, the right to privacy is an extremely important human right. This includes the right to privacy on the Internet. Taking current technical capabilities into account, the right to privacy is perhaps more important today than ever before. Therefore, it is absolutely vital for the trustworthiness of the Internet that progress in developing new security technologies, and in deploying those already specified, continues and even accelerates. In addition, users need to be aware of the implications of their actions for their privacy online. The Internet technical community continues to develop the technologies for increased security and improved privacy. In addition, the public focus on the Snowden revelations has created more awareness of privacy on the Internet. This technical progress and increased awareness can lead the horse to the water. We can only hope it will also drink.


Credit: http://www.nsa.gov/about/photo_gallery/gallery.shtml http://goo.gl/70KGE2


Big Data leads to new international data processing policies
MIKHAIL M. KOMAROV, NATIONAL RESEARCH UNIVERSITY HIGHER SCHOOL OF ECONOMICS, MOSCOW

I would like to agree wholeheartedly with Peter Schaar’s paper “The Internet and Big Data – Incompatible with Data Protection?” and particularly with his proposal of developing new data processing policies. We live in a world where technology develops fast. Unfortunately, we usually face long delays between the introduction of new technologies to the mass market and people around the world and the evaluation of those technologies’ influence, whether on ecology or on the lives of human beings, including in terms of privacy or personal data protection. As a representative of the academic community, I find myself in a similar situation when a new technology is developed but is then followed by a considerable delay in developing a new educational course which teaches others how to use that technology properly.

Over the last twenty years, technological development has overtaken the policy-making process, and usually we face a problem first and then try to solve it. I think we encountered such a problem when Web 2.0 was developed and implemented quickly in our lives. We realized it only after the Snowden leaks, and now we are trying to propose mechanisms to solve it. We understood that the information we were sharing with our friends and relatives via many Internet services was available not only to its target audience. We should also understand and accept the fact that information is now one of the most critical resources, on the same level as oil and gas. It is also necessary to remind ourselves of the difference between the terms “information” and “data”. Data is the source of information, and what kind of information and how much information we can get from the same amount of data depends on the algorithms, i.e. the mathematical methods we use, as well as on the skills of the people processing that data. That is why, when we are talking about privacy, we usually mean information about us. But in terms of the policy-making process, we are focused on processing data which contains some of our personal information. I think it is quite important for our understanding that Mr. Schaar mentioned that “most of the current data protection rules and regulations focus on the individual procedure used for data processing” and that “data protection regulations consider data processing from a micro perspective: single pieces of data, an individual algorithm, a specific purpose. Today, companies and public bodies see data processing more and more from a macro perspective”. This is where the line is drawn: on one side, we should process the data in order to make efficient prognoses, improve processes and improve our quality of life; on the other side, while processing the data, we should be responsible for an individual’s personal information contained in that data (or someone’s personal data). That is why data processing and data analysis on the macro level should be standardized and regulated much more strongly than on the micro level (the level of individuals). Governments already collect lots of data about individuals for different purposes, and this is where we have to start improving personal data processing regulations. We do not have the individual at the heart of the system of data processing, but business or governmental interests, and most of the regulations are comfortable enough for them; however, the situation should be changed, and I agree wholeheartedly with Mr. Schaar – parliaments should be introducing those changes.
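The distinction drawn above between data and information can be made concrete with a toy sketch: the same handful of data points yields different information depending on the algorithm applied to them. The visit data below is invented purely for illustration:

```python
from collections import Counter

# Invented "data": one person's bakery visits as (weekday, hour) pairs.
visits = [("Mon", 7), ("Tue", 7), ("Wed", 7), ("Thu", 7), ("Fri", 7), ("Sat", 9)]

# Algorithm 1 extracts only volume: how often does this person shop?
visit_count = len(visits)

# Algorithm 2 extracts a pattern: the same data now reveals a routine,
# i.e. personal information that no single data point contains by itself.
hour_profile = Counter(hour for _, hour in visits)
usual_hour, frequency = hour_profile.most_common(1)[0]

print(visit_count)            # 6
print(usual_hour, frequency)  # 7 5
```

Six neutral data points become the information “this person is at the bakery at 7 a.m. on weekdays”, which is exactly why regulation aimed at individual pieces of data misses what macro-level processing actually produces.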

I would also like to go further with technological development – from Web 2.0 to Web 3.0. Today, there are plenty of ways to prevent the use of particular websites: by including them in blocklist databases, by requesting a password, or by only allowing access to certain websites (such as with parental controls). But there are ten times more ways to bypass all these safeguards. The important thing about Web 3.0 is that the resulting information may be counterfeit or misleading, since it depends on its popularity in society, which is no guarantee of correctness. It has been noted that, from a data protection perspective, one of the main aims of the Semantic Web and Web 3.0 is to make data easier to process and re-use. However, this raises the question of what becomes of the protection of personal data in such an open, universally accessible web of interlinked data. This is particularly important because Web 3.0 applications are likely to be far more effective than even traditional search engines at piecing together personal data, thus increasing the risk of identity theft. This leads to special requirements for safeguards to protect user data, as well as policies to ensure people understand how their information will be used. It must be said that we are almost unprotected on the Internet from inappropriate information; at best there is a “one-look rule” – you see something once and can only use that knowledge to avoid it in the future – because we have no dedicated governmental or international policies against placing such information on the Internet. In terms of Web 3.0, when our things use information from the Internet, or generate information and send it to the Internet, we should specify a policy and a special agreement for connecting things to the Internet; there should be a clear identification field pointing to the particular person to whom a thing belongs, otherwise we will have lots of uncontrolled information generators – bots – influencing the dissemination of information.

We should think about “privacy by design” issues and probably special certification for systems dealing with personal data. I, too, would like to support the initiative of “open interfaces to enable communications between members and non-members” and I think there is a good example explaining how it works with regard to terms and conditions and our privacy: the movie “Terms and Conditions May Apply” by Cullen Hoback.

We should not fear the development of the Big Data concept or the implementation of new technologies in our lives, but we should allow individuals to be excluded from all analytical and statistical processes at any time. Due to the fast growth of the technological field and of the amount and type of data on the Internet, the reaction from the legal side has been slow, resulting in a lack of laws and policies to protect our privacy. It is the goal of the international community to jointly update the current laws regulating data and information dissemination policy (including on the Internet). How long will it take to arrange joint international action?


Schaar is both prophetic and mainstream
RICHARD HILL, AUTHOR, FORMER ICT MANAGER

Peter Schaar’s excellent and well-thought-out paper is at once prophetic and mainstream. It is mainstream because it reminds us of fundamental values that were clearly enunciated during the Age of Enlightenment, and it is prophetic because those very same values have recently been reaffirmed by political leaders in United Nations Resolutions and in a judgment of the European Court of Justice.

Schaar reminds us why so much data is being collected and analysed for private purposes: “… mass data is seen as an asset, the ‘new oil’ of the information society.” Indeed, consumers obtain “free services” if they agree to allow the service provider to use their data. The data are used to target advertising. But the revenue derived from the advertising is far greater than the cost of providing the “free” service, so in fact users are paying for the service (albeit not with money) and are receiving in return far less than the value of the information they provide.1 To some extent, this situation is a consequence of the funding model for Internet traffic flows, where the receiver pays and there are no pervasive “pay-per-use” mechanisms. So providers of services other than access have developed advertising as their main revenue stream.

As Schaar correctly notes, network effects and economies of scale lead to concentration, so there is often a dominant provider of a particular service (e.g. a social network) and users have no choice but to accept the terms and conditions of that dominant provider. I join Schaar in calling for such service providers to be brought under effective competition control, for example by enforcing a “right to portability” and by envisaging measures to avoid the abusive use of data collected by smartphone apps.

1 A summary discussion of this situation, with references to more detailed work, is found in section 6 of Hill, Richard, 2014. “The Future of Internet Governance: Dystopia, Utopia, or Realpolitik?”, in Pupillo, Lorenzo (ed.), The Global Internet Governance in Transition, Springer (forthcoming)

As Schaar correctly points out, mandatory minimum privacy standards are, in this context, analogous to mandatory safety standards that we take for granted.

Schaar reminds us that informed consent is a condition for the processing of data. The service providers referred to above do obtain the consent of their users, but this is done through a contract of adhesion whose terms and conditions are often very long and are often not read in detail by users. It seems legitimate to wonder whether there really has been informed consent for the use of the consumer’s data.

Schaar reminds us that data can be used in unexpected ways, and I would add that no database is entirely immune from theft: an insider can copy a large amount of data and sell it, thus violating the terms and conditions under which the user provided the data.

Schaar rightly notes that “it is unacceptable that governments and intelligence agencies are abusing the increasing international data transfer for bulk access to the transmitted data”.2 He calls for greater use of encryption, but unfortunately certain types of strong encryption are restricted by laws or regulations. I would thus call for the elimination of restrictions on encryption.

Indeed, the very design of the Internet was based on the assumption that there would be end-to-end security3, so its pervasive implementation would seem to be a necessity.

As Schaar notes, “there is a need to establish binding international data protection standards guaranteeing the protection of private life, as laid out in Art. 12 of the United Nations Charter of Human Rights”. I agree with Schaar that a binding instrument is needed; that is, a treaty. Of course, a treaty can only be agreed by member states, and those who favor discussing Internet-related issues in the less formal and more open process usually referred to as the “multi-stakeholder” model4 should accept that those discussions can precede, but cannot replace, formal intergovernmental processes. For example, an attempt was made recently, at the April 2014 NETmundial meeting, to tackle the issue of mass surveillance, but all that was agreed was to restate text that had been previously agreed at the United Nations. This meeting had been convened largely to discuss the matter of mass surveillance, so its failure to propose steps to curtail mass surveillance was considered disappointing.5 However, that is not an entirely fair assessment; while the UN Resolution was agreed only by states, the NETmundial text was agreed by a broad coalition of governments, civil society, the private sector, academia, and technical experts. So it has broad support and should influence more formal discussions.

2 For a more general critique of certain current Internet practices, see Hill, Richard, 2013. “Internet Governance: The Last Gasp of Colonialism, or Imperialism by Other Means?”, in Weber, R. H., Radu, R., and Chenou, J.-M. (eds), The Evolution of Global Internet Policy: New Principles and Forms of Governance in the Making?, Schulthess/Springer

3 See section 4.2.5.2 of Hill, Richard, 2014. “The Internet, its governance, and the multi-stakeholder model”, Info, Vol. 16, No. 2

4 See section 5 of Hill (2014)
5 O’Brien, Danny, 2014. “Human Rights Are Not Negotiable: Looking Back At Brazil’s NETmundial”, Electronic Frontier Foundation, 25 April 2014 <https://www.eff.org/deeplinks/2014/04/netmundial> accessed 26 April 2014; GIP team, “Why NETmundial mattered and what was achieved”, Geneva Internet Platform, 24 April 2014 <http://giplatform.org/resources/why-netmundial-mattered-and-what-was-achieved> accessed 26 April 2014; and <http://bestbits.net/netmundial-response/>

Following up on NETmundial, I would propose that the matter be taken up in the ITU, whose Constitution has always had a provision on secrecy of telecommunications (Article 37). That provision is too weak, but it can be strengthened as follows:

1. Member States agree to take all possible measures, compatible with the system of telecommunication used, with a view to ensuring the secrecy of international correspondence.

2. Nevertheless, they reserve the right to communicate such correspondence to the competent authorities in order to ensure the application of their national laws or the execution of international conventions to which they are parties. However, any such communication shall take place only if it is held to be necessary and proportionate by an independent and impartial judge.

3. Member States shall respect the secrecy of telecommunications in accordance with both their own laws and the laws of the state of the originator of such correspondence.

As Schaar so rightly says, “the pursuits of liberty and prosperity, free discussion and inclusion, closely linked to the information society, are at stake. There is a need for a broad coalition to defend these values.”

Indeed, I find it surprising that we seem to have forgotten fundamental principles that were formalized more than 200 years ago and repeatedly reaffirmed since then. The Fourth Amendment of the US Constitution, drafted in 1789 and approved in 1791, states:

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

Article 12 of the Universal Declaration of Human Rights (UDHR) and Article 17 of the International Covenant on Civil and Political Rights state (in pertinent part):

No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence; …

In accordance with Article 29 UDHR:

In the exercise of his rights and freedoms, everyone shall be subject only to such limitations as are determined by law solely for the purpose of securing due recognition and respect for the rights and freedoms of others and of meeting the just requirements of morality, public order and the general welfare in a democratic society.

Dilma Rousseff, President of Brazil, in her 24 September 2013 speech at the United Nations, stated:

In the absence of the right to privacy, there can be no true freedom of expression and opinion, and therefore no effective democracy.

United Nations General Assembly Resolution A/RES/68/167 of 18 December 2013 on the right to privacy in the digital age reaffirms the right to privacy; emphasizes that unlawful or arbitrary surveillance violates that right; expresses concern about mass surveillance; and calls on states to respect privacy and to review their procedures, practices and legislation, including with respect to mass surveillance.

On 8 April 2014, the European Court of Justice struck down the European Data Retention Directive on the grounds that the data retention was not limited to what is strictly necessary and that it exceeded the limits imposed by compliance with the principle of proportionality.6

6 http://curia.europa.eu/juris/document/document.jsf?doclang=EN&text=&pageIndex=1&part=1&mode=req&docid=…&occ=first&dir=&cid=…

On 10 April 2014, the European Union Article 29 Working Party adopted Opinion 04/2014 on surveillance of electronic communications for intelligence and national security purposes.7 That Opinion states:

From its analysis, the Working Party concludes that secret, massive and indiscriminate surveillance programs are incompatible with our fundamental laws and cannot be justified by the fight against terrorism or other important threats to national security. Restrictions to the fundamental rights of all citizens could only be accepted if the measure is strictly necessary and proportionate in a democratic society.

So Schaar’s call for action is not isolated. As citizens, we must insist that our parliaments take action, both to stop mass surveillance by governments and to curtail the power of dominant service providers to obtain data from customers and use it as they see fit to generate large profits. And we must insist that the Necessary and Proportionate principles,8 supported by a large number of organizations and scholars, be implemented, despite the resistance shown at NETmundial by the US and its allies such as Sweden to calls for the implementation of those principles.

Paraphrasing what was said by former Spanish judge Baltasar Garzón in a speech in Geneva recently, it is time to stop debating the legality of what is manifestly illegal.

7 https://www.huntonprivacyblog.com/files/…/wp…_en.pdf

8 https://en.necessaryandproportionate.org/text

Photo: Defense Advanced Research Projects Agency (DARPA) | http://goo.gl/OwF8fq


Authors

1. GEORG APENES
Georg Apenes is a Norwegian jurist and was appointed director of the Norwegian Data Inspectorate in 1989. There he made a mark in the political debate as a defender of privacy. Commenting on Internet privacy, Apenes deplored the indifference with which people disseminate personally identifiable information. He stepped down in April 2010. Georg Apenes has authored several books, with topics spanning from monographs on political themes and analyses of political parties to festschrifts and amateur history. In addition, he has written columns in the newspapers Fredriksstad Blad, Stavanger Aftenblad, Dagens Næringsliv and A-Magasinet. He served three terms (elected 1977, 1981 and 1985) in the Parliament of Norway. In 2010 Apenes was appointed Knight, First Class of the Order of St. Olav for his work.

2. NICK ASHTON-HART
Nick Ashton-Hart is the senior permanent representative of the for-profit technology sector to the UN, its member states, and the international organisations resident in Geneva. He has been an active part of multilateral policy development starting with the sustainable development agenda for the world’s cities (HABITAT II) in 1992, has been an active part of the Geneva community for 14 years and a resident of it for the past eight. He worked in the music industry as a manager to artists such as James Brown. In the tech sector he went from a Systems Administrator post to CIO/CTO in five years and has broad hands-on technology experience, from running a small local area network to designing multi-country wide area networks. He is currently Executive Director of the Internet & Digital Ecosystem Alliance (IDEA). Prior to that he was Geneva Representative of the Computer & Communications Industry Association (CCIA), Director for At-Large and Senior Director for Participation and Engagement with the Internet Corporation for Assigned Names and Numbers, Inc. (ICANN), and Executive Director of the International Music Managers Forum (IMMF).

3. RAFIK DAMMAK
Rafik Dammak is an engineer working and living in Japan. He is a member of the steering committee of the Dynamic Coalition on Internet Rights and Principles as well as of 1net, representing civil society. He has been involved in the ICANN community as an individual user member of the NCUC (Non-Commercial Users Constituency). He is also a former elected GNSO Councillor for the Non-Commercial Stakeholder Group and a member of ICANN’s Nominating Committee (NomCom), which is responsible for selecting eight members of the Board of Directors and other key positions within ICANN’s structure. In addition he participated in several ICANN working groups, such as the one on new generic top-level domain (gTLD) applicant support. He was elected chair of the Non-Commercial Stakeholders Group (NCSG), which represents, through its elected representatives and its Constituencies, the interests and concerns of non-commercial registrants and non-commercial Internet users of generic top-level domains (gTLDs). He is working on improving awareness of Internet governance in Tunisia and the MENA region in general.

4. SUSANNE DEHMEL
Susanne Dehmel has been Head of the Data Protection Department at the Federal Association for Information Technology, Telecommunications and New Media (BITKOM) since 2010. She is a lawyer and studied in Passau, Freiburg and Cardiff. Before taking over the Data Protection Department she was responsible for copyright law and intellectual property issues from 2002 to 2009. Encouraging the development of a modern and practicable data protection law for the information society is an important part of her current work.

5. RICHARD HILL
Dr Richard Hill is an independent consultant and was the Secretary for the various ITU groups that discussed the revision of the International Telecommunication Regulations (ITRs). He was the head of the secretariat team dealing with the substantive issues at the World Conference on International Telecommunications (WCIT). He was part of the secretariat team for the World Summit on the Information Society (WSIS) and has been involved in Internet governance matters since the mid-1990s. Prior to joining ITU in 2001, Richard was Department Head, IT Infrastructure Delivery and Support, at Orange Communications (a GSM operator). He also worked at Hewlett-Packard’s European Headquarters in Geneva, Switzerland. Richard holds a Ph.D. in Statistics from Harvard University and a B.S. in Mathematics from M.I.T. Prior to his studies in the U.S.A., he obtained the Maturità from the Liceo Scientifico A. Righi in Rome, Italy. He has published papers on Internet governance, mediation, arbitration, and computer-related legal and intellectual property issues, as well as the standard reference book on X.435.

6. LORENA JAUME-PALASÍ
Lorena Jaume-Palasí is a lecturer and PhD candidate at the Department for Practical Philosophy at Ludwig-Maximilians-Universität Munich. Her main focus is on moral conflicts in international relations and new technologies in governance structures, as well as on strategies of collective actors and collective rationality. She has co-organized the German Youth IGF, the New Media Summer School and the 2014 German IGF. She has been coordinating global Internet governance working groups at Berlin’s Internet & Society Collaboratory, where she has also led projects and partnerships since June 2014. She occasionally writes for online magazines such as irights.info.


7. WOLFGANG KLEINWÄCHTER
Wolfgang Kleinwächter has been involved in Internet governance for decades and has participated, in various capacities, in ICANN, the Internet Governance Forum (IGF) and the WSIS (World Summit on the Information Society). In the WSIS process he was a member of the Civil Society Bureau, co-chaired the Internet Governance Caucus (IGC) and was appointed by UN Secretary-General Kofi Annan to the UN Working Group on Internet Governance (WGIG). He is a co-founder of the European Dialogue on Internet Governance (EURODIG), the Global Internet Governance Academic Network (GIGANET) and the Summer School on Internet Governance (SSIG), and chair of the ICANN Studienkreis. Kleinwächter is also Professor for International Communication Policy and Regulation at the Department for Media and Information Sciences of the University of Aarhus in Denmark, where he has taught “Internet Policy and Regulation” since 1998. He is a founding member of the Collaboratory, and was appointed to the ICANN Board of Directors in 2013.

8. MICHAEL M. KOMAROV
Michael M. Komarov is Associate Professor at the National Research University Higher School of Economics in Moscow and Deputy Dean for international relations at the Faculty of Business Informatics. He completed a PhD at the Moscow State Institute of Electronics and Mathematics in 2012. Komarov is also a specialist in wireless sensor networks and ICT and has been awarded grants and medals for the best scientific projects and for IT integration into the educational system. He is a member of the IEEE Technical Committee on Business Informatics and Systems and was a speaker at a workshop of the Internet Governance Forum in 2012/2013. He is also Head of the Interuniversity Laboratory for innovative projects “Wireless Interactive SystEms and NETworks” (www.wisenetlab.ru) and co-founder of the All-Russian Public Organization “Young Innovative Russia” (www.i-innomir.ru).

9. JAN MALINOWSKI
Jan Malinowski is a lawyer, qualified in Spain and in England. Following eight years of professional practice in Barcelona and London, he joined the Council of Europe, where he worked for eleven years with the anti-torture watchdog. Since 2005, he has been responsible for Council of Europe work on media policy, freedom of expression and Internet governance. This work has resulted in the adoption by the Organisation’s 47 member states of a number of ground-breaking human rights-based normative texts, including a new notion of media, a commitment to do “no harm” to the Internet and the acknowledgement of the states’ shared responsibility for preserving the integrity and ongoing functioning of the Internet. As Head of the Information Society Department, he is now also responsible for work related to two unique Council of Europe conventions, on data protection and on cybercrime.

10. STEPHANIE PERRIN
Stephanie Perrin spent 30 years in the Canadian federal government, working on information policy and privacy issues. She was the Director of Privacy Policy responsible for developing private-sector privacy legislation (PIPEDA), leaving in 2000 to work for Zero Knowledge Systems to promote technology for anonymity on the Internet. She is a PhD candidate at the University of Toronto Faculty of Information, with research interests focusing on why privacy is not implemented in Internet standards and functions. She is a member of the Expert Working Group at ICANN, tasked with revamping the Whois directory, and her research examines why privacy has developed into such an intractable problem at ICANN. This research examines concepts of identity online and the inadequacy of current privacy norms.


11. GEORGE SALAMA
George Salama is Senior Manager for Public Policy at the SAMENA Telecommunications Council and is responsible for setting up and executing the Council’s public policy plan, which includes broadband development, ICT policy, spectrum management, digitization, and Internet governance. Salama spent over six years in the International Technical Coordination Department of Egypt’s National Telecom Regulatory Authority (NTRA) and was in charge of Internet public policy issues at the national, regional and international levels. He was also part of the Egyptian government delegation to the Internet Governance Forum from 2007 to 2010. After completing his Bachelor of Science with a major in Computer Science and a minor in Electronics from the American University in Cairo (AUC), Salama completed his Master of Science degree in Business Information Technology at Middlesex University, UK, in 2008. Salama is currently a part-time PhD researcher with Tampere University, Finland, and is pursuing his thesis on “Internet Governance – Intergovernmental Model vs. Multi-Stakeholder Approach.”

12. PETER SCHAAR
Peter Schaar is Chairman of the European Academy for Freedom of Information and Data Protection (Berlin) and former German Federal Commissioner for Data Protection and Freedom of Information. Holding a diploma in economics, he worked from 1980 to 1983 with the Hamburg Senate’s office for administrative services. From 1986 to 1994 Schaar worked as a head of section with the Hamburg Data Protection Commissioner, where he served as deputy from 1994 to 2002. In 2001 and 2002 he was a dedicated member of the Commission set up to accompany the modernization of the Data Protection Law. On November 1, 2002 he founded a consulting company for data protection. His further engagements include the Gesellschaft für Informatik (Society for Informatics), the International Working Group on Data Protection in Telecommunications (IWGDPT), the Hamburger Datenschutzgesellschaft (HDG, Hamburg Society for Data Protection) and the Humanistische Union (Humanistic Union). Peter Schaar is the laureate of the eco Internet AWARD 2008.

13. PETRA SITTE
Petra Sitte has been Chief Whip of the parliamentary group DIE LINKE in the German Bundestag since 2013. She has been a Member of the Bundestag since 2005 and was a member of the Bundestag’s Commission of Inquiry on Internet and Digital Society (2010-2013). For more than two decades her areas of public policy have been science, technology and innovation. She holds a doctorate in political economy.

14. JONNE SOININEN
Jonne Soininen is Associate Technical Director at Broadcom, based in Helsinki, Finland. Prior to Broadcom, he worked in different positions with Nokia, Nokia Siemens Networks and Renesas Mobile, being active in the technical community for over 15 years. During these years Jonne has been active in both technical and policy organizations, including the Internet Engineering Task Force (IETF), the Internet Corporation for Assigned Names and Numbers (ICANN), the Internet Society (ISOC) and the Internet Governance Forum (IGF). Currently, Jonne Soininen is serving as the non-voting technical liaison to the ICANN board appointed by the IETF.

* All author pictures reprinted with permission, except Peter Schaar: Alexander Klink | http://commons.wikimedia.org/wiki/File:Peter_Schaar_%282013%29.jpg#filelinks | https://creativecommons.org/licenses/by/3.0/de/ | CC-BY-3.0


The Collaboratory is a non-partisan laboratory for the digital society. As a multistakeholder platform, we facilitate projects and debates about the challenges of digitization for our society. Following an open and transparent approach, the Collaboratory seeks answers in the areas of governance and regulation, the transformation of privacy and publicness, copyright and innovation, the transformation of work and industry, cultural heritage and education, and globalization and security.

As a community of practice, the Collaboratory is open to anyone wishing to contribute constructively to the public discourse. Over 350 experts from various sectors are active in the Collaboratory’s network. We constantly develop new formats and projects with various partners to enrich the debate and provide solutions to the community.

The Collaboratory has been a registered not-for-profit organization based in Berlin since 2012. A small team heads the ongoing projects under the auspices of a steering group and an advisory board. Membership is not required for participation in our activities. The platform is funded through donations and partnerships. Under our far-reaching transparency policy, you can find all information on our financial resources, people, supporters and results on our website. The Collaboratory has its roots in the project initiated in 2010 by Google Germany, who remain a key supporter of our platform.

As an open platform we welcome the participation of experts from all areas and the support of businesses, associations, foundations and academic institutions for our work. We are constantly open to new partnerships - and we actively seek funding. Contact us if you wish to support the Collaboratory.

Your direct contact: Sebastian Haselbeck, managing director [email protected]


Photo: Tobias Schwarz. CC BY

About the Internet & Society Collaboratory


This Discussion Paper Series is one of the Collaboratory’s most successful projects and is internationally renowned. MIND reaches the entire internet governance community and has established itself as a valued contribution to the discourse. It was an experiment when we first released it. Today, its editor is an ICANN director, and stakeholders from around the world read, contribute to, and ask to participate in this interdisciplinary platform.

MIND and the Collaboratory need your support. To make upcoming issues a reality, financial support for our non-profit organization is essential. Talk to us if your company or organization would like to support this publication. Donations are tax-deductible; sponsorships or ad placements are also an option.

The time for action is now. The future of the internet is at stake, and few other platforms bring together experts of such caliber to make the debate about its governance accessible to the wider community. It is the Collaboratory’s stated mission to provide a platform for constructive ideas on how our societies can best cope with the digitization of our world. Support this open platform, an essential, non-partisan component of the German internet policy ecosystem.

There are many ways to contribute to the realization of this project:

- become a cooperation partner or official supporter of the Collaboratory
- support MIND with a donation or by covering some of its costs
- support MIND through sponsorship or by buying ad space
- become a distribution partner
- host the magazine’s launch or press event

Talk to us if your organization wants to contribute responsibly to the discourse. We are looking forward to discussing new partnerships.

Provided sufficient funding, MIND #8 is scheduled for this September’s IGF in Istanbul.

Internet & Gesellschaft Collaboratory e.V.
Bank: GLS Bank Berlin
IBAN: DE79 4306 0967 1141 7119 00
BIC: GENODEM1GLS

Questions? Contact Lorena Jaume-Palasí at [email protected]


MIND needs your support


Previous Issues and Authors of MIND

MIND #4 – HUMAN RIGHTS AND INTERNET GOVERNANCE
Berlin – Baku – November 2012

This volume focuses on the struggle for freedom of speech and human rights on the internet, an area which is – most recently since the Arab Spring – at the foundation of today’s discourse. More and more actors are seizing the opportunity to shape the global network according to their respective interests and value systems.

Proposition: Shirin Ebadi - 2003 Nobel Peace Prize, Iran

Government & Parliament: Carl Bildt - Minister for Foreign Affairs, Sweden; Marietje Schaake - European Parliament; Alice Munyua - Government of Kenya

Private Sector: Jermyn Brooks - Global Network Initiative; Ronald Koven - World Press Freedom Committee; Zahid Jamil - ICC Pakistan

Civil Society: Joy Liddicoat - APC New Zealand; Jeremy Malcolm - Consumers International, Malaysia; Graciela Selaimen - NUPEF, Brazil

Technical & Academic Community: Raúl Echeberría - LACNIC, Uruguay; Markus Kummer & Nicolas Seidler - ISOC; Cees J. Hamelink - University of Amsterdam

MIND #5 – INTERNET UND DEMOKRATIE
Berlin – Juni 2013

The internet has evolved from the technical playground of a few scientists to the operating system of our global society. It is now the platform on which major value creation takes place. This raises several questions: Does the internet better our democracy? Is access to the internet a basic right? What does that mean for politics? MIND #5 deals with these questions.

Proposition: Julian Nida-Rümelin - Staatsminister a.D., Ludwig-Maximilians-Universität München

Government & Parliament: Sabine Leutheusser-Schnarrenberger - Federal Ministry of Justice, Germany; Thomas Schneider - Federal Office of Communications (OFCOM), Switzerland

Private Sector: Wolf Osthaus - Unitymedia KabelBW; Philipp Grabensee - Afilias

Civil Society: Klaus-Dieter Stoll - ICANN Not-for-Profit Organizations Constituency (NPOC); Karola Wille - Central German Broadcasting (MDR); Andreas Krisch - Working Group on Data Retention (AK Vorrat), Vienna

Technical & Academic Community: Dirk Krischenowski - Internet Society (ISOC) Germany; Erich Schweighofer - University of Vienna; Erika Mann - Internet Corporation for Assigned Names and Numbers (ICANN), Facebook

MIND #6 – INTERNET AND SECURITY
Berlin – Bali – October 2013

Cybersecurity is as important as the openness and freedom of the Internet. An insecure cyberspace undermines individual human rights, blocks online business and hinders the free exchange of information. But there is still no globally accepted definition of what Internet security – or, more broadly, cybersecurity – means in detail. Different stakeholders have different ideas. This paper makes a contribution to this topic.

Proposition: Toomas Hendrik Ilves - President of the Republic of Estonia; Bruce Schneier - Author of Liars and Outliers: Enabling the Trust Society Needs to Thrive

Government & Parliament: Thorbjørn Jagland - Secretary General, Council of Europe; Olga Cavalli - GAC, Foreign Ministry of Argentina

Private Sector: Ram Mohan - CTO, Afilias; Rajesh Chharia - Internet Service Provider Association of India

Civil Society: Avri Doria - Non-Commercial Users Constituency, ICANN NCUC; Carlos Afonso - NUPEF, Brazil

Technical & Academic Community: Alexander Klimburg - Austrian Institute for Foreign Affairs; Leonid Todorov - Coordination Center for .RU; Xu Peixi - Communication University of China


MIND stands for Multistakeholder Internet Dialogue. This discussion paper series is a platform for modern polemics in the field of internet governance. Each issue is structured around a central argument in the form of a proposition by a well-known author, which is then commented on, in the form of replies, by stakeholders from academia and the technical community, the private sector, civil society and government. Each volume is licensed under Creative Commons BY (attribution) and freely available to anyone.

# 1 Grundrecht Internetfreiheit

BERLIN, IM MAI 2011

MINDMULTISTAKEHOLDER INTERNET DIALOG

CO:LLABORATORY DISCUSSION PAPER SERIES NO. 1

Eine Publikation des Internet & Gesellschaft Co:llaboratory.Herausgeber · Wolfgang Kleinwächter

Bernd Holznagel + Pascal Schumacher

Wilhelms-Universität Münster

PARLAMENT UND REGIERUNG

Angela Kolb, Justizministerin Sachsen-AnhaltTh omas Jarzombek, Mitglied desDeutschen Bundestages

ZIVILGE

SE

LLSC

HA

FT

Alvar Freude, Internet-Enquete desDeutschen Bundestages und AK Zensur

Sandra Hoferichter, ICANN/ALAC

AKADEMISCHE UND TECHNISCHE COMMUNITY

Hans Peter Dittler, ISOC Germany Karl-Franzens-Universität GrazWolfgang Benedek,

PRIV

ATW

IRTS

CH

AFT

Christ

ian St

öcke

r, Spie

gel O

nline

Michael

Rote

rt, Ve

rband

der d

eutsch

enInt

ernetw

irtsch

aft (e

co)

Internet Policy Making

BERLIN · NAIROBI · SEPTEMBER 2011

MINDMULTISTAKEHOLDER INTERNET DIALOG

CO:LLABORATORY DISCUSSION PAPER SERIES NO. 1

A publication by the Internet & Society Co:llaboratory Editor · Wolfgang Kleinwächter

Bertrandde La ChapelleProgram Director,

International Diplomatic Academy, Paris

# 2

GOVERNMENT & PARLIAMENT

Everton Lucero, Ministry of Foreign Aff airs, BrazilFiona Alexander, U. S. Department of CommerceCatherine Trautmann, Member of the European Parliament

CIV

IL SO

CIE

TY

Olivier M. J. Crépin-Leblond, ICANN ALAC

Anriette Esterhuysen, Association forProgressive Communications

Annette Mühlberg, ver.di

TECHNICAL & ACADEMIC COMMUNITYWilliam Drake, University of ZurichISOC India – Chennai

Sivasubramanian Muthusamy, Vint Cerf, Google

PR

IVAT

E S

EC

TOR

Th eresa

Swine

hart,

Veriz

on C

ommu

nicati

onsPete

r Hell

mond

s, Nok

ia Sie

mens

Netw

orks

Waudo

Siga

nga,

Comp

uter S

ociety

of Ke

nya

Grenzen der Internetfreiheit

),9305���4(0�����

MINDMULTISTAKEHOLDER INTERNET DIALOG

CO:LLABORATORY DISCUSSION PAPER SERIES NO. 1

Eine Publikation des Internet & Gesellschaft Co:llaboratoryHerausgeber - Wolfgang Kleinwächter

Rolf H. Weber,Universität Zürich

# 3

suppor ted by:

PARLAMENT UND REGIERUNG

Sabine Verheyen, Mitglied des Europäischen ParlamentsMatthias Traimer, Bundeskanzleramt Österreich

Jim

my Schulz, Mitglied des Deutschen Bundestags

ZIVILGE

SE

LLSC

HA

FT

Kenneth Roth, Human Rights Watch

Wolf Ludwig, ICANN EURALOChristian Bahls, M

OGIS

AKADEMISCH-TECHNISCHE COMMUNITYMatthias C. Kettemann, Universität GrazTobias Mahler, University of Oslo

Ingolf Pernice, Humboldt-Universität zu Berli

n

PR

IVAT

SE

KTO

R

Christ

oph S

teck,

Telef

Ónica

Euro

peOl

iver S

üme,

eco

MIND #3 –GRENZEN DER INTERNETFREIHEIT

MIND #2 – INTERNET POLICY MAKING

MIND #1 – GRUNDRECHT INTERNETFREIHEIT

PropositionRolf H. Weber, University of Zurich

Government & ParliamentMatthias Traimer - Federal Chancel-lery of the Republic of Austria, ViennaSabine Verheyen - Member of the European ParliamentJimmy Schulz - Federal Minister, Germany

Private SectorChristoph Steck - Telefónica EuropeOliver Süme - eco e.V.

Civil SocietyWolf Ludwig - ICANNChristian Bahls - MOGIS e.V.Kenneth Roth - Human Rights Watch

Technical & Academic CommunityIngolf Pernice - Humboldt Uni-versity of BerlinTobias Mahler - University of OsloMatthias C. Kettemann - University of Graz

PropositionBertrand de La Chapelle - Program Direc-tor, International Diplomatic Academy, Paris

Government & ParliamentFiona Alexander - U.S. Departe-ment of CommerceCatherine Trautmann - Mem-ber of the European ParliamentEverton Lucero - Ministry of For-eign Affairs, Brazil

Private SectorTheresa Swinehart - Verizon CommunicationsPeter Hellmonds - Nokia Siemens NetworksWaudo Siganga - Computer Society of Kenya

Civil SocietyAnriette Esterhuysen - Association for Progressive CommunicationsOlivier M. J. Crépin-Leb-lond - ICANN / ALACAnnette Mühlberg - ver.di

Technical & Academic CommunityWilliam Drake - University of ZurichVint Cerf - GoogleSivasubramanian Muthusamy - ISOC India, Chennai

Proposition: Bernd Holznagel & Pascal Schumacher, University of Münster

Government & Parliament: Angela Kolb, Ministry of Justice, Saxony-Anhalt; Thomas Jarzombek, Member of the German Bundestag

Private Sector: Christian Stöcker, Spiegel Online; Michael Rotert, Association of the German Internet Industry

Civil Society: Alvar Freude, Internet Enquete Commission of the German Bundestag, working group on censorship; Sandra Hoferichter, ICANN / ALAC

Technical & Academic Community: Hans Peter Dittler, ISOC Germany; Wolfgang Benedek, University of Graz

Who decides what information harms national security and what doesn’t? In which ways do expressions of opinion disturb public order? How can individual communication result in a catastrophe for the system that protects intellectual property? Who is in charge of judging this? A government, a company, a party, a lobby group or internet users? This issue remains controversial, especially from the perspective of different states.

With this edition, we want to jump-start the wider debate on multistakeholder governance. This seemingly technical issue has important ramifications for the future of our societies and our planet. Only when we find modes of governance that allow us to address the technical and philosophical challenges of our complex and interdependent online and offline lives will we be able to secure the future of humanity.

The internet is a technology of freedom. It is a liberating medium. Never in human history have individuals been able to move as freely as they can on the internet, where time and space disappear. How are these liberties guaranteed when individuals and companies demand universal human rights?


Editor: Wolfgang Kleinwächter

Production: Janina Gera, Sebastian Haselbeck

Editorial Board:
Prof. Wolfgang Kleinwächter, Department for Media and Information Studies, University of Aarhus (Chair)
Prof. Wolfgang Benedek, Institute for International Law and International Relations, Karl-Franzens-Universität Graz
Prof. Rafael Capurro, International Center for Information Ethics (ICIE), Karlsruhe
Dr. William J. Drake, Institute of Mass Communication and Media Research, University of Zurich
Prof. Dr. Jeanette Hofmann, Social Science Research Center (WZB) / Alexander von Humboldt Institute for Internet and Society (HIIG), Berlin
Prof. Bernd Holznagel, Institute for Telecommunication and Media Law, University of Münster
Prof. Divina Meigs, Université Sorbonne Nouvelle, Paris
Prof. Milton Mueller, Institute for International Studies, University of Syracuse, N.Y.
Dr. Philipp S. Müller, Center for Public Management and Governance, SMBS, Paris-Lodron University Salzburg
Prof. Michael Rotert, Institute for Informatics, Karlsruhe University of Applied Sciences
Prof. Rolf Weber, Law Faculty, University of Zurich

Layout & Design: Jan Illmann
Original design concept of the series: Jessica Louis & Sabine Grosser, www.louisgrosser.com

Printed by: Oktoberdruck, Berlin

Cover Image: teachandlearn | https://flic.kr/p/5qK9PG | CC BY-NC-SA 2.0 | https://creativecommons.org/licenses/by-nc-sa/2.0/

Contact the Collaboratory or its board: Dr. Michael Littger, Martin G. Löhe, Lena-Sophie Müller, Dr. Philipp S. Müller, Dr. Marianne [email protected]

More information about the organization, the people, projects, current partners and financial structure is available on our website. Our platform relies on third-party funding; please consider supporting the Collaboratory with a donation. We would love to talk to you about a possible cooperation. Visit us at www.collaboratory.de


Unless stated otherwise, all texts are published under a Creative Commons Attribution 4.0 International (CC BY 4.0) license. You are free to share, copy and redistribute the material in any medium or format, and to adapt, remix, transform, and build upon the material for any purpose, even commercially, under the condition of attribution. More information: creativecommons.org/licenses/by/4.0/legalcode

Kleinwächter, Wolfgang (ed.). “Privacy and Internet Governance”. MIND Multistakeholder Internet Dialog #7, Collaboratory Discussion Paper Series No. 1. Internet & Society Collaboratory, www.collaboratory.de. Berlin: June 2014.

ISBN 978-3-00-046186-6

Imprint: MIND #7 - Privacy and Internet Governance


MIND is a multistakeholder debate magazine on interdisciplinary challenges of internet governance. It is edited by Wolfgang Kleinwächter and published twice a year, coinciding with international or regional Internet Governance Forum meetings.

Future issues depend on your support! We are looking for funding and distribution partners to make the next issue a reality. Position your company or organization as an enabler of this essential discourse, reach international decision makers, and support the Discussion Paper Series with a donation or sponsorship.

The Internet & Society Collaboratory is an open think tank and internet policy deliberation platform dedicated to enabling interdisciplinary work by specialists from civil society, academia, and the public and private sectors on solutions to tomorrow’s socio-political opportunities and challenges posed by the digital transformation at the intersection of the internet and society.

Contact us via [email protected] if you would like more information, if you require additional print copies of this or past issues, or if you are interested in supporting or participating in our projects.

The Collaboratory was initiated in 2010 by Google Germany; since 2012 it has been an independent, non-profit organization based in Berlin. For more information on the Collaboratory, our projects and activities, our funding, and participating experts, please visit collaboratory.de


Internet & Society Collaboratory

Visit the Internet & Society Collaboratory at: http://en.collaboratory.de