"The war on crime exemplifies how the deprivation of privacy makes one vulnerable to oppressive state social control. Scholars have severely criticized the war on crime’s subordinating effects
on poor urban people of color. The role that privacy deprivation plays in this subordination, however, has been under-theorized. This Article takes an initial step in addressing this gap in the
literature. It argues that one important reason the war on crime is so abusive is that it oppressively invades individuals’ privacy; poor people of color have limited opportunities in the creation
of their life plans, participation in mainstream political discourse, and access to social capital in part because they have limited privacy. These privacy invasions also have an expressive aspect because
they send the message that the state does not trust these individuals to engage in valued activities in legitimate ways; therefore, they must constantly be watched. As a result, the deprivation of privacy
also results in serious dignitary harms."
"Some stakeholders assert that the application of some data protection principles and obligations under EU law should be substantially reviewed to enable promising forthcoming developments in
big data operations to take place. The application of the principles of purpose limitation and data minimisation is presented as a core concern in this respect, as these principles require that data controllers
collect personal data only for specified, explicit and legitimate purposes, and do not further process such data in a way incompatible with those purposes. They also require that personal data must be
adequate, relevant and not excessive in relation to the purposes for which they are collected and/or further processed. In this regard, some voices argue that the focus should be only on the use of
personal data, linking this to the level of risk of harm to individuals. The Working Party acknowledges that the challenges of big data might require innovative thinking on how some of these and other key
data protection principles are applied in practice. However, at this stage, it has no reason to believe that the EU data protection principles, as they are currently enshrined in Directive 95/46/EC, are
no longer valid and appropriate for the development of big data, subject to further improvements to make them more effective in practice. It also needs to be clear that the rules and principles are
applicable to all processing operations, starting with collection in order to ensure a high level of data protection. In fact, the Working Party strongly believes that complying with this framework is a
key element in creating and keeping the trust which any stakeholder needs in order to develop a stable business model that is based on the processing of such data."
"The first U.S. privacy law, the Fair Credit Reporting Act of 1970, nicely demonstrates the pitfalls of use regulation. It places almost no limit on how data are collected and instead
regulates how data are used in employment, credit, and tenancy. We have had 40 years of experience with the regime, and it governs our daily lives in important ways. Yet, in the various articles and
whitepapers on the subject, the FCRA goes unmentioned. At enactment, the FCRA was seen as a huge giveaway to industry. It created large, unaccountable bureaucracies that are notoriously unresponsive to
consumers. Consumer reporting agencies regulated by the FCRA’s use-based approach used messy data and fuzzy logic in ways that produced error that was costly, but diffuse. CRAs were given broad immunity
from defamation and invasion of privacy lawsuits in exchange for promises to treat individuals responsibly, but they have failed in this bargain. The use-regulation proposals just ignore this […]"
"[…] this opinion identifies the main data protection risks that lie within the ecosystem of the IoT before providing guidance on how the EU legal framework should be applied in this
context. The Working Party supports the incorporation of the highest possible guarantees for individual users at the heart of the projects by relevant stakeholders. In particular, users must remain in
complete control of their personal data throughout the product lifecycle, and when organisations rely on consent as a basis for processing, the consent should be fully informed, freely given and specific.
To this end, the Working Party has designed a comprehensive set of practical recommendations addressed to the different stakeholders concerned (device manufacturers, application developers,
social platforms, further data recipients, data platforms and standardisation bodies) to help them implement privacy and data protection in their products and services."
"I’ve spent much of the past week trying to better understand Apple’s security architecture, and the method they formerly used to provide law enforcement with access to user data. What I’ve read, and learned from talking with actual crypto experts, has affirmed my confidence in two core points. First: Apple is not just inexplicably thumbing its corporate nose at law enforcement. They are fixing a poor security design choice that previously left specific types of sensitive data on phones inadequately protected. Second: Apple, with its closed ecosystem, might actually be unusually well situated to satisfy the FBI’s demand for backdoors, but the idea is in profound conflict with more open computing models."
"When Europe’s highest court ruled in May that individuals had a ‘right to be forgotten’ - i.e., they have the right to request that outdated or ‘irrelevant’ information about them be removed
from search results - the shockwaves were heard around the world. Given the First Amendment and the traditionally strong emphasis on the public’s right to know in American culture, it may be difficult to
imagine such a ruling happening stateside. But American culture also has a strong tradition of protecting privacy - and in fact, in January 2015, a variant of such legislation, applicable only to minors, will become
law in California. What if U.S. citizens start demanding the right to be forgotten, too? We at Software Advice were intrigued by the possibility, so we surveyed 500 adults in the U.S. to find out how they
felt about the right to be forgotten and the problems the law seeks to address. We then quizzed a panel of experts for their opinions on this complex issue."
"If the government howls of protest at the idea that people will be using encryption sound familiar, it’s because regulating and controlling consumer use of encryption was a monstrous proposal officially declared dead in 2001 after threatening Americans’ privacy, free speech rights, and innovation for nearly a decade. But like a zombie, it’s now rising from the grave, bringing the same disastrous flaws with it. For those who weren’t following digital civil liberties issues in 1995, or for those who have forgotten, here’s a refresher list of why forcing companies to break their own privacy and security measures by installing a back door was a bad idea 15 years ago: […]"
"A number of technological developments such as cloud computing and big data analysis have affected the way in which personal data are processed. These developments are coupled with the
currently prevalent business model of free online services financed through advertisements and the analysis of user data. Together, these developments appear to have exposed deficits in the
current approach to data protection in the European Union. One solution discussed in this debate is to create market structures in which users can sell personal data to businesses, thereby
gaining control over the ways in which their data are used. Such an approach would constitute an alternative route to privacy protection, distinct from the current form of data protection. To
better assess the validity of claims about the effectiveness of such an alternative approach, it is important to know the possible effects that data markets would have on the privacy of online
service users. This study investigates this question by means of an ethical evaluation."
"Last week, Nigerian President Goodluck Jonathan was one of the first citizens to receive a National eID card, a biometric identification card that will be rolled out to 13 million Nigerians
in the near future. Although a handful of countries already use biometric identification systems, Nigeria’s will be unique as its pilot program will be branded with MasterCard logos. The program will
eventually be expanded to encompass the rest of the country’s adult population, and the BBC says that all Nigerians will be required to have such a card by 2019 if they wish to vote in the country’s elections."
"On 13 May 2014, the Court of Justice of the European Union acknowledged that under existing European data protection legislation, EU citizens have the right to request internet search engines
such as Google, to remove search results directly related to them. This landmark ruling has sparked a lively and timely debate on the rights and wrongs of the so-called right to be forgotten. It is
important to make sure the discussion is based on facts. A sober reading of the judgment shows that the concerns that have emerged in this debate are exaggerated or simply unfounded."
"Edward Snowden’s leaks laid bare the scope and breadth of the electronic surveillance that the U.S. National Security Agency and its foreign counterparts conduct. Suddenly, foreign
surveillance is understood as personal and pervasive, capturing the communications not only of foreign leaders but also of private citizens. Yet to the chagrin of many state leaders, academics, and
foreign citizens, international law has had little to say about foreign surveillance. Until recently, no court, treaty body, or government had suggested that international law, including basic privacy
protections in human rights treaties, applied to purely foreign intelligence collection. This is now changing: several U.N. bodies, judicial tribunals, U.S. corporations, and victims of foreign
surveillance are pressuring states to bring that surveillance under tighter legal control. This article tackles three key, interrelated puzzles associated with this sudden transformation. First, it
explores why international law has had so little to say about how, when, and where governments may spy on other states’ nationals. Second, it draws on international relations theory to argue that the
development of new international norms regarding surveillance is both likely and essential. Third, it identifies six process-driven norms that states can and should adopt to ensure meaningful privacy
restrictions on international surveillance without unduly harming their legitimate national security interests. These norms, which include limits on the use of collected data, periodic reviews of
surveillance authorizations, and active oversight by neutral bodies, will increase the transparency, accountability, and legitimacy of foreign surveillance."
"New information technologies have dramatically increased sellers’ ability to engage in price discrimination in retail consumer markets. Debates over using personal information for price discrimination frequently treat it as a single concern, and are not sufficiently sensitive to the variety of price discrimination practices, the different kinds of information they require in order to succeed, and the different concerns they raise. This paper explores the ethical aspects of the debate over regulating price discrimination facilitated by personal information. By drawing distinctions between various pricing practices and the motivations behind them, this paper seeks to clarify the ethical principles that should guide legal and regulatory efforts to control the use of personal information for pricing."
"This paper highlights some of the opportunities presented by the rise of the so-called ‘Internet of Things’ and wearable technology in particular, and encourages policymakers to allow these technologies to develop in a relatively unabated fashion. As with other new and highly disruptive digital technologies, however, the Internet of Things and wearable tech will challenge existing social, economic, and legal norms. In particular, these technologies raise a variety of privacy and safety concerns. […] The better alternative to top-down regulation is to deal with these concerns creatively as they develop using a combination of educational efforts, technological empowerment tools, social norms, public and watchdog pressure, industry best practices and self-regulation, transparency, and targeted enforcement of existing legal standards (especially torts) as needed."
"This guide explains how the Data Protection Act (DPA) applies to journalism, advises on good practice, and clarifies the role of the Information Commissioner’s Office (ICO). It does not have
any formal legal status and cannot set any new rules, but it will help those working in the media understand and comply with existing law in this area."
"Their argument: Since the tech industry is populated by meritocratic rationalists, it would be impossible for a talented female engineer not to rise to the top. Therefore, if few women are in
the industry, the problem is not sexism but the absence of some innate capacity or interest on the part of (most) women. In other words, the dearth of women in tech is only natural. […] The proportion
of programmers in India who are women is at least 30 percent. In the US it’s 21 percent. And this despite the fact that by most indexes - economic opportunity, educational attainment, health - women in
India have access to a narrower set of opportunities than women in the United States. So unless nature is working contrarily in South Asia, something about the culture of the Indian educational system and
tech industry is more hospitable to women than the American one. If we can figure out what that difference is, we can begin to change things for the better in the US."