"New information technologies have dramatically increased sellers’ ability to engage in price discrimination in retail consumer markets. Debates over using personal information for price discrimination frequently treat it as a single concern, and are not sufficiently sensitive to the variety of price discrimination practices, the different kinds of information they require in order to succeed, and the different concerns they raise. This paper explores the ethical aspects of the debate over regulating price discrimination facilitated by personal information. By drawing distinctions between various pricing practices and the motivations behind them, this paper seeks to clarify the ethical principles that should guide legal and regulatory efforts to control the use of personal information for pricing."
"Big Data entails a challenge to key privacy principles. Some claim that it will be impossible to enforce these principles in an age characterised by Big Data. According to this view, the
protection of privacy must primarily be safeguarded through enterprises providing clear and comprehensive information on how personal data is handled. The Working Group is of the opinion, however, that
the protection of privacy is more important than ever at a time when increasing amounts of information are collected about individuals. The privacy principles constitute our guarantee that we will not be
subjected to extensive profiling in an ever increasing array of new contexts. A watering down of key privacy principles, in combination with more extensive use of Big Data, may have adverse consequences
for the protection of privacy and other important values in society such as freedom of expression and the conditions for exchange of ideas."
"The data profiling that drives customer management will increasingly be replicated among employees as screening and monitoring move to a new level. Sensors check their location, performance
and health. The monitoring may even stretch into their private lives in an extension of today’s drug tests. Periodic health screening gives way to real-time monitoring of health, with proactive health
guidance and treatment to enable staff to perform more efficiently, reduce sick leave and work for more years before needing to retire. […] The ‘contract’ with employees is defined by the handing over
of data (e.g. health, performance, possibly even private life) in return for job security. More than 30% of the participants in our global survey would be happy for their employers to have access to their
personal data. Younger people tend to be more open to this than older generations, so this kind of monitoring could become routine in the years to come."
"Social media companies depend on selling information about their users’ clicks and purchases to data brokers who match ads to the most receptive individuals. But the Federal Trade Commission
and the White House have called for legislation that would inform consumers about the data collected and sold to companies, warning of analytics that have ‘the potential to eclipse longstanding civil
rights protections.’ Does the collection of data by companies threaten consumers’ civil rights?"
"This paper is intended to give an overview of the issues as we see them and contribute to the debate on big data and privacy. This is an area in which the capabilities of the technology and
the range of potential applications are evolving rapidly and there is ongoing discussion of the implications of big data. Our aim is to ensure that the different privacy risks of big data are considered
along with the benefits of big data - to organisations, to individuals and to society as a whole. It is our belief that the emerging benefits of big data will be sustained by upholding key data protection
principles and safeguards. The benefits cannot simply be traded with privacy rights."
"Paul Ohm’s 2009 article ‘Broken Promises of Privacy’ spurred a debate in legal and policy circles on the appropriate response to computer science research on re-identification. In this
debate, the empirical research has often been misunderstood or misrepresented. A new report by Ann Cavoukian and Daniel Castro is full of such inaccuracies, despite its claims of ‘setting the record
straight.’ We point out eight of our most serious points of disagreement with Cavoukian and Castro. The thrust of our arguments is that (i) there is no evidence that de-identification works either in
theory or in practice and (ii) attempts to quantify its efficacy are unscientific and promote a false sense of security by assuming unrealistic, artificially constrained models of what an adversary might do."
"Google thinks I’m interested in parenting, superhero movies, and shooter games. The data broker Acxiom thinks I like driving trucks. My data doppelgänger is made up of my browsing history, my
status updates, my GPS locations, my responses to marketing mail, my credit card transactions, and my public records. Still, it constantly gets me wrong, often to hilarious effect. I take some comfort that
the system doesn’t know me too well, yet it is unnerving when something is misdirected at me. Why do I take it so personally when personalization gets it wrong?"
"In this paper, we will discuss a select group of academic articles often referenced in support of the myth that de-identification is an ineffective tool to protect the privacy of individuals.
While these articles raise important issues concerning the use of proper de-identification techniques, reported findings do not suggest that de-identification is impossible or that de-identified data
should be classified as personally identifiable information. We then provide a concrete example of how data may be effectively de-identified — the case of the U.S. Heritage Health Prize. This example
shows that in some cases, de-identification can maximize both privacy and data quality, thereby enabling a shift from zero-sum to positive-sum thinking — a key principle of Privacy by Design."
"Since 1967, when it decided Katz v. United States, the Supreme Court has tied the right to be free of unwanted government scrutiny to the concept of reasonable expectations of privacy. An
evaluation of reasonable expectations depends, among other factors, upon an assessment of the intrusiveness of government action. When making such assessment historically the Court considered police
conduct with clear temporal, geographic, or substantive limits. However, in an era where new technologies permit the storage and compilation of vast amounts of personal data, things are becoming more
complicated. A school of thought known as ‘mosaic theory’ has stepped into the void, ringing the alarm that our old tools for assessing the intrusiveness of government conduct potentially undervalue
privacy rights. Mosaic theorists advocate a cumulative approach to the evaluation of data collection. Under the theory, searches are ‘analyzed as a collective sequence of steps rather than as individual
steps.’ The approach is based on the observation that comprehensive aggregation of even seemingly innocuous data reveals greater insight than consideration of each piece of information in isolation. Over
time, discrete units of surveillance data can be processed to create a mosaic of habits, relationships, and much more. Consequently, a Fourth Amendment analysis that focuses only on the government’s
collection of discrete units of data fails to appreciate the true harm of long-term surveillance—the composite."
"Data brokers collect and store a vast amount of data on almost every U.S. household and commercial transaction. Of the nine data brokers, one data broker’s database has information on 1.4 billion consumer transactions and over 700 billion aggregated data elements; another data broker’s database covers one trillion dollars in consumer transactions; and yet another data broker adds three billion new records each month to its databases. Most importantly, data brokers hold a vast array of information on individual consumers. For example, one of the nine data brokers has 3000 data segments for nearly every U.S. consumer."
"The Health and Social Care Act 2012 (HSCA) gave NHS England the power to direct the Health and Social Care Information Centre (formerly the NHS Information Centre) to collect electronic
patient records from GP practices. This was to be the first part of the ‘care.data’ initiative, the stated purpose of which is that using ‘information about the care you have received, enables those
involved in providing care and health services to improve the quality of care and health services for all’. […] It is undeniable that the rising cost of health and social care provision is a huge
societal problem, and it is also undeniable that health and social care services possess enormous quantities of hugely valuable patient data (whose value lies both in its potential benefits for future
service provision, and in potential commercial benefits to the private sector) but I am by no means certain that the people whose data is involved understand what is proposed, or what the potential
implications are. The suspicion – fair or not – that care.data is merely a front for the monetization of that valuable patient data, the suspicion that attempts were being made to implement it ‘under the
radar’ (remember that, initially, no national publicity campaign, or opt-out procedure was proposed) and the apparent reluctance of its proponents to engage with the complex questions of what
‘anonymisation’ and ‘pseudonymisation’ mean in our increasingly technical world, lead me to doubt that care.data is, currently, proportionate to the problem it seeks to address."
“‘This is really, really a privacy nightmare,’ says Deborah Peel, the executive director of Patient Privacy Rights, who claims that the vast majority, if not all, of the health data collected
by these types of apps have effectively ‘zero’ protections but are increasingly prized by online data mining and advertising firms. Both the Food and Drug Administration and the FTC regulate some aspects
of the fitness tracking device and app market, but not everyone thinks the government has kept pace with the rapidly changing fitness tracking market. ‘The FTC and even the FDA have not done enough,’ says
Jeffrey Chester, the executive director of the Center for Digital Democracy, who says the lack of concrete safeguards to protect data in this new space leaves consumers at risk. ‘Health information is
sensitive information and it should be tightly regulated.’”
"When a rare ice storm threatened New Orleans in January, some residents heard from a city official who had gained access to their private medical information. Kidney dialysis patients were
advised to seek early treatment because clinics would be closing. Others who rely on breathing machines at home were told how to find help if the power went out.
Those warnings resulted from vast volumes of government data. For the first time, federal officials scoured Medicare health insurance claims to identify potentially vulnerable people and share their names
with local public health authorities for outreach during emergencies and disaster drills."
"Big data drives big benefits, from innovative businesses to new ways to treat diseases. The challenges to privacy arise because technologies collect so much data (e.g., from sensors in everything from phones to parking lots) and analyze them so efficiently (e.g., through data mining and other kinds of analytics) that it is possible to learn far more than most people had anticipated or can anticipate given continuing progress. These challenges are compounded by limitations on traditional technologies used to protect privacy (such as de-identification). PCAST concludes that technology alone cannot protect privacy, and policy intended to protect privacy needs to reflect what is (and is not) technologically feasible. In light of the continuing proliferation of ways to collect and use information about people, PCAST recommends that policy focus primarily on whether specific uses of information about people affect privacy adversely. It also recommends that policy focus on outcomes, on the ‘what’ rather than the ‘how,’ to avoid becoming obsolete as technology advances."
"Big data technologies will be transformative in every sphere of life. The knowledge discovery they make possible raises considerable questions about how our framework for privacy protection
applies in a big data ecosystem. Big data also raises other concerns. A significant finding of this report is that big data analytics have the potential to eclipse longstanding civil rights protections in
how personal information is used in housing, credit, employment, health, education, and the marketplace. Americans’ relationship with data should expand, not diminish, their opportunities and […]"