"Some stakeholders assert that the application of some data protection principles and obligations under EU law should be substantially reviewed to enable promising forthcoming developments in
big data operations to take place. The principles of purpose limitation and data minimisation are presented as core concerns in this respect, as they require that data controllers
collect personal data only for specified, explicit and legitimate purposes, and do not further process such data in a way incompatible with those purposes. They also require that personal data must be
adequate, relevant and not excessive in relation to the purposes for which they are collected and/or further processed. In this regard, some voices argue that the focus should be only on the use of
personal data, linking this to the level of risk of harm to individuals. The Working Party acknowledges that the challenges of big data might require innovative thinking on how some of these and other key
data protection principles are applied in practice. However, at this stage, it has no reason to believe that the EU data protection principles, as they are currently enshrined in Directive 95/46/EC, are
no longer valid and appropriate for the development of big data, subject to further improvements to make them more effective in practice. It also needs to be clear that the rules and principles are
applicable to all processing operations, starting with collection in order to ensure a high level of data protection. In fact, the Working Party strongly believes that complying with this framework is a
key element in creating and keeping the trust which any stakeholder needs in order to develop a stable business model that is based on the processing of such data."
"The first U.S. privacy law, the Fair Credit Reporting Act of 1970, nicely demonstrates the pitfalls of use regulation. It places almost no limit on how data are collected and instead
regulates how data are used in employment, credit, and tenancy. We have had 40 years of experience with the regime, and it governs our daily lives in important ways. Yet, in the various articles and
whitepapers on the subject, the FCRA goes unmentioned. At enactment, the FCRA was seen as a huge giveaway to industry. It created large, unaccountable bureaucracies that are notoriously unresponsive to
consumers. Consumer reporting agencies regulated by the FCRA’s use-based approach used messy data and fuzzy logic in ways that produced error that was costly, but diffuse. CRAs were given broad immunity
from defamation and invasion of privacy lawsuits in exchange for promises to treat individuals responsibly, but they have failed in this bargain. The use-regulation proposals just ignore this history."
"When we look to the digital future there is one anxiety from which all others derive: What kind of home will it be? Will we be masters in a community of masters, or something else - guests, fugitives, or perhaps unwitting slaves subdued by interests beyond our influence or understanding? If the digital future is to be our home, then it is we who must make it so."
"New information technologies have dramatically increased sellers’ ability to engage in price discrimination in retail consumer markets. Debates over using personal information for price discrimination frequently treat it as a single concern, and are not sufficiently sensitive to the variety of price discrimination practices, the different kinds of information they require in order to succeed, and the different concerns they raise. This paper explores the ethical aspects of the debate over regulating price discrimination facilitated by personal information. By drawing distinctions between various pricing practices and the motivations behind them, this paper seeks to clarify the ethical principles that should guide legal and regulatory efforts to control the use of personal information for pricing."
"Big Data entails a challenge to key privacy principles. Some claim that it will be impossible to enforce these principles in an age characterised by Big Data. According to this view, the
protection of privacy must primarily be safeguarded through enterprises providing clear and comprehensive information on how personal data is handled. The Working Group is of the opinion, however, that
the protection of privacy is more important than ever at a time when increasing amounts of information are collected about individuals. The privacy principles constitute our guarantee that we will not be
subjected to extensive profiling in an ever increasing array of new contexts. A watering down of key privacy principles, in combination with more extensive use of Big Data, may have adverse consequences
for the protection of privacy and other important values in society such as freedom of expression and the conditions for exchange of ideas."
"The data profiling that drives customer management will increasingly be replicated among employees as screening and monitoring move to a new level. Sensors check their location, performance
and health. The monitoring may even stretch into their private lives in an extension of today’s drug tests. Periodic health screening gives way to real-time monitoring of health, with proactive health
guidance and treatment to enable staff to perform more efficiently, reduce sick leave and work for more years before needing to retire. […] The ‘contract’ with employees is defined by the handing over
of data (e.g. health, performance, possibly even private life) in return for job security. More than 30% of the participants in our global survey would be happy for their employers to have access to their
personal data. Younger people tend to be more open to this than older generations, so this kind of monitoring could become routine in the years to come."
"Social media companies depend on selling information about their users’ clicks and purchases to data brokers who match ads to the most receptive individuals. But the Federal Trade Commission
and the White House have called for legislation that would inform consumers about the data collected and sold to companies, warning of analytics that have ‘the potential to eclipse longstanding civil
rights protections.’ Does the collection of data by companies threaten consumers’ civil rights?"
"This paper is intended to give an overview of the issues as we see them and contribute to the debate on big data and privacy. This is an area in which the capabilities of the technology and
the range of potential applications are evolving rapidly and there is ongoing discussion of the implications of big data. Our aim is to ensure that the different privacy risks of big data are considered
along with the benefits of big data - to organisations, to individuals and to society as a whole. It is our belief that the emerging benefits of big data will be sustained by upholding key data protection
principles and safeguards. The benefits cannot simply be traded with privacy rights."
"Paul Ohm’s 2009 article ‘Broken Promises of Privacy’ spurred a debate in legal and policy circles on the appropriate response to computer science research on re-identification. In this
debate, the empirical research has often been misunderstood or misrepresented. A new report by Ann Cavoukian and Daniel Castro is full of such inaccuracies, despite its claims of ‘setting the record
straight.’ We point out eight of our most serious points of disagreement with Cavoukian and Castro. The thrust of our arguments is that (i) there is no evidence that de-identification works either in
theory or in practice and (ii) attempts to quantify its efficacy are unscientific and promote a false sense of security by assuming unrealistic, artificially constrained models of what an adversary might do."
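To make the re-identification concern concrete, here is a minimal sketch of the classic linkage attack the literature describes: a "de-identified" dataset is joined with a public dataset on shared quasi-identifiers. All records, names, and field values below are invented for illustration; this is not the method or data from any study cited here.

```python
# Illustrative linkage attack: a "de-identified" medical record is re-identified
# by joining it with a public voter roll on quasi-identifiers (ZIP code,
# birth year, sex) that both datasets happen to share.
# All data is hypothetical.

deidentified_medical = [
    {"zip": "02138", "birth_year": 1955, "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birth_year": 1972, "sex": "M", "diagnosis": "asthma"},
]

public_voter_roll = [
    {"name": "Alice Example", "zip": "02138", "birth_year": 1955, "sex": "F"},
    {"name": "Bob Example",   "zip": "02144", "birth_year": 1980, "sex": "M"},
]

def link(medical, voters):
    """Match medical records to named voters on the shared quasi-identifiers."""
    matches = []
    for m in medical:
        candidates = [v for v in voters
                      if (v["zip"], v["birth_year"], v["sex"])
                      == (m["zip"], m["birth_year"], m["sex"])]
        if len(candidates) == 1:  # a unique match re-identifies the record
            matches.append((candidates[0]["name"], m["diagnosis"]))
    return matches

print(link(deidentified_medical, public_voter_roll))
```

The point of the sketch is that no name ever appears in the medical table; the unique combination of quasi-identifiers alone is enough to attach a diagnosis to a named individual.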
"Google thinks I’m interested in parenting, superhero movies, and shooter games. The data broker Acxiom thinks I like driving trucks. My data doppelgänger is made up of my browsing history, my
status updates, my GPS locations, my responses to marketing mail, my credit card transactions, and my public records. Still, it constantly gets me wrong, often to hilarious effect. I take some comfort that
the system doesn’t know me too well, yet it is unnerving when something is misdirected at me. Why do I take it so personally when personalization gets it wrong?"
"In this paper, we will discuss a select group of academic articles often referenced in support of the myth that de-identification is an ineffective tool to protect the privacy of individuals.
While these articles raise important issues concerning the use of proper de-identification techniques, reported findings do not suggest that de-identification is impossible or that de-identified data
should be classified as personally identifiable information. We then provide a concrete example of how data may be effectively de-identified — the case of the U.S. Heritage Health Prize. This example
shows that in some cases, de-identification can maximize both privacy and data quality, thereby enabling a shift from zero-sum to positive-sum thinking — a key principle of Privacy by Design."
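For readers unfamiliar with how "effective de-identification" is operationalised, the following is a minimal sketch of one common technique, k-anonymity-style generalisation. It is not the specific method used for the Heritage Health Prize; the coarsening rules and data are assumptions chosen for illustration.

```python
# Minimal k-anonymity sketch: quasi-identifiers are coarsened (ZIP truncated,
# age bucketed into decades) until every record shares its generalised
# combination with at least k-1 other records.
# Records and generalisation rules are hypothetical.

from collections import Counter

def generalize(record):
    """Coarsen quasi-identifiers: truncate ZIP to 3 digits, bucket age by decade."""
    decade = record["age"] // 10 * 10
    return (record["zip"][:3] + "**", f"{decade}-{decade + 9}", record["sex"])

def is_k_anonymous(records, k):
    """True if every generalised quasi-identifier combination occurs >= k times."""
    counts = Counter(generalize(r) for r in records)
    return all(n >= k for n in counts.values())

records = [
    {"zip": "02138", "age": 54, "sex": "F"},
    {"zip": "02139", "age": 57, "sex": "F"},
    {"zip": "02144", "age": 31, "sex": "M"},
    {"zip": "02145", "age": 38, "sex": "M"},
]

print(is_k_anonymous(records, k=2))  # True: each generalised class has 2 records
```

The design trade-off the quoted debate turns on is visible even here: larger k gives stronger protection against the linkage attacks discussed above, but coarser generalisation also degrades the analytic value of the data.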
"Since 1967, when it decided Katz v. United States, the Supreme Court has tied the right to be free of unwanted government scrutiny to the concept of reasonable expectations of privacy. An
evaluation of reasonable expectations depends, among other factors, upon an assessment of the intrusiveness of government action. When making such assessment historically the Court considered police
conduct with clear temporal, geographic, or substantive limits. However, in an era where new technologies permit the storage and compilation of vast amounts of personal data, things are becoming more
complicated. A school of thought known as ‘mosaic theory’ has stepped into the void, ringing the alarm that our old tools for assessing the intrusiveness of government conduct potentially undervalue
privacy rights. Mosaic theorists advocate a cumulative approach to the evaluation of data collection. Under the theory, searches are ‘analyzed as a collective sequence of steps rather than as individual
steps.’ The approach is based on the observation that comprehensive aggregation of even seemingly innocuous data reveals greater insight than consideration of each piece of information in isolation. Over
time, discrete units of surveillance data can be processed to create a mosaic of habits, relationships, and much more. Consequently, a Fourth Amendment analysis that focuses only on the government’s
collection of discrete units of data fails to appreciate the true harm of long-term surveillance—the composite."
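The mosaic theorists' empirical claim, that aggregating individually innocuous data points reveals sensitive facts, can be illustrated with a toy example. The inference rule (most frequent night-time location as a proxy for home) and all pings below are invented for illustration, not drawn from any case or study quoted here.

```python
# Illustrative "mosaic" aggregation: no single location ping is revealing,
# but aggregating them infers a sensitive fact -- a likely home location,
# taken here as the most frequent cell observed at night (22:00-06:00).
# All pings are hypothetical.

from collections import Counter

pings = [
    {"hour": 23, "cell": "tower-A"},
    {"hour": 2,  "cell": "tower-A"},
    {"hour": 3,  "cell": "tower-A"},
    {"hour": 14, "cell": "tower-B"},
    {"hour": 15, "cell": "tower-B"},
]

def likely_home(pings):
    """Return the most frequent location among night-time observations."""
    night = [p["cell"] for p in pings if p["hour"] >= 22 or p["hour"] < 6]
    return Counter(night).most_common(1)[0][0] if night else None

print(likely_home(pings))  # tower-A: a pattern no individual ping discloses
```

This is the "composite" harm the quoted passage describes: the analysis step, not any single collection step, is where the sensitive inference emerges.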
"Data brokers collect and store a vast amount of data on almost every U.S. household and commercial transaction. Of the nine data brokers, one data broker’s database has information on 1.4 billion consumer transactions and over 700 billion aggregated data elements; another data broker’s database covers one trillion dollars in consumer transactions; and yet another data broker adds three billion new records each month to its databases. Most importantly, data brokers hold a vast array of information on individual consumers. For example, one of the nine data brokers has 3000 data segments for nearly every U.S. consumer."
"The Health and Social Care Act 2012 (HSCA) gave NHS England the power to direct the Health and Social Care Information Centre (formerly the NHS Information Centre) to collect electronic
patient records from GP practices. This was to be the first part of the ‘care.data’ initiative, the stated purpose of which is that using ‘information about the care you have received, enables those
involved in providing care and health services to improve the quality of care and health services for all’. […] It is undeniable that the rising cost of health and social care provision is a huge
societal problem, and it is also undeniable that health and social care services possess enormous quantities of hugely valuable patient data (whose value lies both in its potential benefits for future
service provision, and in potential commercial benefits to the private sector) but I am by no means certain that the people whose data is involved understand what is proposed, or what the potential
implications are. The suspicion – fair or not – that care.data is merely a front for the monetization of that valuable patient data, the suspicion that attempts were being made to implement it ‘under the
radar’ (remember that, initially, no national publicity campaign, or opt-out procedure was proposed) and the apparent reluctance of its proponents to engage with the complex questions of what
‘anonymisation’ and ‘pseudonymisation’ mean in our increasingly technical world, lead me to doubt that care.data is, currently, proportionate to the problem it seeks to address."
“‘This is really, really a privacy nightmare,’ says Deborah Peel, the executive director of Patient Privacy Rights, who claims that the vast majority, if not all, of the health data collected
by these types of apps have effectively ‘zero’ protections, but is increasingly prized by online data mining and advertising firms. Both the Food and Drug Administration and the FTC regulate some aspects
of the fitness tracking device and app market, but not everyone thinks the government has kept pace with the rapidly changing fitness tracking market. ‘The FTC and even the FDA have not done enough,’ says
Jeffrey Chester, the executive director of the Center for Digital Democracy, who says the lack of concrete safeguards to protect data in this new space leaves consumers at risk. ‘Health information is
sensitive information and it should be tightly regulated.’”