"In this paper, we examine the application of Privacy by Design to the design and architecture of MLA systems through the work of Toronto-based MLA company Aislelabs. […] This paper has four sections in total. It begins with a background discussion of MLA and how it works technologically (section 2). Next, the paper discusses the unique privacy risks associated with MLA (section 3). Finally,
it introduces Privacy by Design, discusses Aislelabs’ MLA implementation, and shows how it designs in privacy from the outset (section 4)."
"This report is part of an effort by the Pew Research Center’s Internet Project in association with Elon University’s Imagining the Internet Center to look at the future of the Internet, the
Web, and other digital activities. This is the first of eight reports based on a canvassing of hundreds of experts about the future of such things as privacy, cybersecurity, the “Internet of things,” and
net neutrality. In this case we asked experts to make their own predictions about the state of digital life by the year 2025. We will also explore some of the economic change driven by the spectacular
progress that made digital tools faster and cheaper. And we will report on whether Americans feel the explosion of digital information coursing through their lives has helped them be better informed and
make better decisions."
"This is, at base, a factual dispute. Is it easy to draw sensitive inferences from phone metadata? How often do people conduct sensitive matters by phone, in a manner reflected by metadata? We
used crowdsourced data to arrive at empirical answers. Since November, we have been conducting a study of phone metadata privacy. Participants run the MetaPhone app on their Android smartphone; it submits
device logs and social network information for analysis. In previous posts, we have used the MetaPhone dataset to spot relationships, understand call graph interconnectivity, and estimate the
identifiability of phone numbers. At the outset of this study, we shared the same hypothesis as our computer science colleagues—we thought phone metadata could be very sensitive. We did not anticipate
finding much evidence one way or the other, however, since the MetaPhone participant population is small and participants only provide a few months of phone activity on average. We were wrong. We found
that phone metadata is unambiguously sensitive, even in a small population and over a short time window. We were able to infer medical conditions, firearm ownership, and more, using solely phone metadata."
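The inference step the authors describe, matching dialed numbers against known sensitive organizations, can be illustrated with a minimal sketch. All phone numbers and category labels below are hypothetical, and this is not the study's actual analysis pipeline:

```python
# Illustrative sketch only: infers coarse categories from call metadata by
# matching dialed numbers against a small lookup table of organizations.
# All numbers and categories are invented for this example.

SENSITIVE_NUMBERS = {
    "+15550100": "medical (cardiology clinic)",
    "+15550101": "firearm retailer",
    "+15550102": "financial (bankruptcy counsel)",
}

def infer_categories(call_log):
    """Return the set of sensitive categories implied by a list of
    (dialed_number, duration_seconds) metadata records."""
    inferred = set()
    for number, duration in call_log:
        category = SENSITIVE_NUMBERS.get(number)
        if category and duration > 0:  # count completed calls only
            inferred.add(category)
    return inferred

calls = [("+15550100", 240), ("+15550999", 30), ("+15550101", 95)]
print(infer_categories(calls))
```

The point of the sketch is that no call content is needed: the callee's identity alone carries the sensitive inference.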
"In Fraley [v. Facebook], the defendant Facebook had used the images of Facebook users (including minor children) to advertise products. A group of parents filed a class action lawsuit against
Facebook to vindicate the rights of children who had been subject to this advertising scheme. As a result of the lawsuit, Facebook and the parents agreed to a settlement, wherein Facebook would pay money
to organizations that advocate for children’s privacy. But the settlement agreement did not prevent Facebook from continuing to use children’s images in advertisements, and the organizations selected to
receive funds were not the groups that have objected to Facebook’s use of images in advertising since the scheme began. The settlement agreement was so bad that one of the groups who had been selected to
receive funds chose to turn the money down. The settlement agreement, said the group, left the class members worse off than they would have been without any settlement at all. If the settlement agreement
was that bad (and, personally, I think it was), is it possible that none of the plaintiffs’ rights were vindicated as a result of the lawsuit? Is there an argument to be made that the settlement agreement
both allowed Facebook to continue its injurious behavior and also prevented the plaintiffs from ever challenging that behavior again? Are the organizations whose interests actually do align with those of
the class members (for example, the group who refused the funds) barred from litigating the same issue? Or did the deficient settlement agreement reach back in time and opt everyone out of a class that
would not reap the benefits of a settlement agreement?"
"Revelations of large scale electronic surveillance and data mining by governments and corporations have fueled increased adoption of HTTPS. We present a traffic analysis attack against over
6000 webpages spanning the HTTPS deployments of 10 widely used, industry-leading websites in areas such as healthcare, finance, legal services and streaming video. Our attack identifies individual pages
in the same website with 89% accuracy, exposing personal details including medical conditions, financial and legal affairs and sexual orientation. We examine evaluation methodology and reveal accuracy
variations as large as 18% caused by assumptions affecting caching and cookies. We present a novel defense reducing attack accuracy to 27% with a 9% traffic increase, and demonstrate significantly
increased effectiveness of prior defenses in our evaluation context, inclusive of enabled caching, user-specific cookies and pages within the same website."
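The core idea of such an attack, telling pages apart by their encrypted traffic signatures, can be sketched as a toy closed-world classifier. The feature set and nearest-neighbor matching here are illustrative assumptions, not the paper's actual attack:

```python
# Minimal sketch of a closed-world traffic-analysis classifier, assuming the
# attacker has pre-recorded packet-length traces for each candidate page.
# This is a toy nearest-neighbor model, not the paper's method.

def trace_features(packet_lengths):
    # Summarize a trace: total bytes, packet count, mean packet size.
    total = sum(packet_lengths)
    count = len(packet_lengths)
    return (total, count, total / count)

def classify(observed, labeled_traces):
    """Guess which page produced `observed` by Euclidean distance in
    feature space. `labeled_traces` maps page URL -> packet lengths."""
    obs = trace_features(observed)
    def dist(page):
        feats = trace_features(labeled_traces[page])
        return sum((a - b) ** 2 for a, b in zip(obs, feats))
    return min(labeled_traces, key=dist)

training = {
    "https://example.org/condition-a": [1500, 1500, 620, 90],
    "https://example.org/condition-b": [400, 380, 90],
}
print(classify([1500, 1480, 600, 100], training))
```

Even this crude model shows why HTTPS alone does not hide which page of a known site a user visited: padding-free encrypted traffic preserves size and count patterns.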
"The security of computer systems often relies upon decisions and actions of end users. In this paper, we set out to investigate user-centered security by concentrating on the most fundamental
component governing user behavior – the human brain. We introduce a novel neuroscience-based study methodology to inform the design of user-centered security systems. Specifically, we report on an fMRI
study measuring users’ security performance and the underlying neural activity with respect to two critical security tasks: (1) distinguishing between a legitimate and a phishing website, and (2) heeding
security (malware) warnings. At a higher level, we identify neural markers that might be controlling users’ performance in these tasks, and establish relationships between brain activity and behavioral
performance as well as between users’ personality traits and security behavior."
"Protecting associational freedom is a core, independent yet unappreciated part of the Fourth Amendment. New surveillance techniques threaten that freedom. Surveillance is no longer forward-looking. Law enforcement can obtain the same, if not more, information about all of us by looking backward. Forward-looking surveillance has limits. Some limits are practical, such as the cost to place a
person in a car to follow a suspect. There are also procedural limits, such as the requirement that surveillance relate to criminal activity. In addition, surveillance such as wiretapping and using a GPS
tracker often requires a warrant. Warrants involve review by a neutral magistrate. The warrant sets limits on what information may be collected, how it is collected, and how it can be used. The
surveillance is also time limited and requires continual justification to a judge, or the surveillance will be shut down. With backward-looking surveillance all of these protections are gone. Law
enforcement can now use low-cost technology to track us or need only ask a business for the record of where we went, whom we called, what we read, and more. Revelation of the NSA’s vast PRISM surveillance
project is but the most recent example of law enforcement engaging in this sort of over-reaching surveillance. The FBI has previously deployed similar programs to read mail, obtain lists of books read,
demand member lists, and generate watch lists of people to round up in case of national emergency. The efforts vary; the harm is the same. Law enforcement has a perfect picture of our activities and
associations regardless of whether they are criminal. With digital records these harms are more acute. Once the data about our activities is gathered, law enforcement may keep that data indefinitely. They
have a data hoard. That hoard grows with each new data request. Once created, the hoard can be continually rifled to investigate us but without any oversight."
"In the main report, contained in chapter III, the Special Rapporteur examines the use of remotely piloted aircraft, or drones, in extraterritorial lethal counter-terrorism operations, including in the context of asymmetrical armed conflict, and allegations that their increasing use has caused disproportionate civilian casualties, and makes recommendations to States."
"Two decades of analysis have produced a rich set of insights as to how the law should apply to the Internet’s peculiar characteristics. But, in the meantime, technology has not stood still.
The same public and private institutions that developed the Internet, from the armed forces to search engines, have initiated a significant shift toward robotics and artificial intelligence. This article
is the first to examine what the introduction of a new, equally transformative technology means for cyberlaw (and law in general). Robotics has a different set of essential qualities than the Internet
and, accordingly, will raise distinct issues of law and policy. Robotics combines, for the first time, the promiscuity of data with the capacity to do physical harm; robotic systems accomplish tasks in
ways that cannot be anticipated in advance; and robots increasingly blur the line between person and instrument."
"This policy brief sketches the outline of a common European position, rooted in the idea that outside zones of conventional hostilities, the deliberate taking of human life must be justified
on an individual basis according to the imperative necessity of acting in order to prevent either the loss of other lives or serious harm to the life of the nation. It argues that such a position would
now offer a basis for renewed engagement with the Obama administration, which has endorsed a similar standard as a matter of policy, even if its interpretation of many key terms remains unclear and its
underlying legal arguments remain different. Finally, it suggests that European states will need to clarify their own understanding and reach agreement among themselves on some parts of the relevant legal
framework as they refine their position and pursue discussions with the United States. None of these efforts will necessarily be easy. But unless the EU defines a position on remotely piloted aircraft and
targeted killing, it risks neglecting its own interests and missing an opportunity to help shape global standards in an area that is vital to international peace and security."
"Limited privacy protections for metadata may have made sense decades ago when technology to collect and analyze data was virtually nonexistent. But in today’s ‘big data’ world, non-content
does not mean non-sensitive. In fact, new technology is demonstrating just how sensitive metadata can be: how friend lists can reveal a person’s sexual orientation, purchase histories can identify a
pregnancy before any visible signs appear, and location information can expose individuals to harassment for unpopular political views or even theft and physical harm. Two separate committees assembled by
the executive branch — the President’s Review Group on Intelligence and Communications Technology and the Privacy and Civil Liberties Oversight Board —have joined lawmakers, academics, and judges in
calling for a reevaluation of the distinction between content and metadata. This paper examines how new technologies and outdated laws have combined to make metadata more important and more vulnerable
than ever, and proposes a way forward to ensure that all of our sensitive information gets the privacy protection it deserves."
"Modeling mass surveillance disclosure regulations on an updated form of environmental impact statement will help protect everyone’s privacy: Mandating disclosure and impact analysis by those
proposing to watch us in and through public spaces will enable an informed conversation about privacy in public. Additionally, the need to build consideration of the consequences of surveillance into
project planning, as well as the danger of bad publicity arising from excessive surveillance proposals, will act as a counterweight to the adoption of mass data collection projects, just as it did in the
environmental context. In the long run, well-crafted disclosure and analysis rules could pave the way for more systematic protection for privacy — as it did in the environmental context. Effective US
regulation of mass surveillance will require that we know a great deal about who and what is being recorded and about the costs and benefits of personal information acquisition and uses. At present we
know relatively little about how to measure these; a privacy equivalent of environmental impact statements will not only provide case studies, but occasions to grow expertise."
"This handbook on European data protection law is jointly prepared by the European Union Agency for Fundamental Rights and the Council of Europe together with the Registry of the European Court of Human Rights. It is the third in a series of legal handbooks jointly prepared by the EU Agency for Fundamental Rights and the Council of Europe. […] The aim of this handbook is to raise awareness and improve knowledge of data protection rules in European Union and Council of Europe member states by serving as the main point of reference to which readers can turn. It is designed for non-specialist legal professionals, judges, national data protection authorities and other persons working in the field of data protection."
"The United States healthcare system is marching diligently toward a more connected system of care through the use of electronic health record systems (EHRs) and electronic exchange of patient
information between organizations and with patients and caregivers. The Patient Identification and Matching Initiative, sponsored by the Office of the National Coordinator for Health Information Technology (ONC), focused on identifying incremental steps to help ensure the accuracy of every patient’s identity, and the availability of their information wherever and whenever care is needed.
Matching records to the correct person becomes increasingly complicated as organizations share records electronically using different systems, and in a mobile society where patients seek care in many
healthcare settings. Many healthcare organizations use multiple systems for clinical, administrative, and specialty services, which leads to an increased chance of identity errors when matching patient
records. Additionally, many regions experience a high number of individuals who share the exact same name and birthdate, leading to the need for additional identifying attributes to be used when matching
patient records. […] Driven by concerns for patient safety in the event of mismatched or unmatched records and the national imperative to improve population health and lower costs through care
coordination, this initiative studied both technical and human processes, seeking improvements to patient identification and matching that could be quickly implemented and lead to near-term improvements
in matching rates."
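The tie-breaking role of additional identifying attributes can be sketched as a simple weighted-agreement matcher. The field weights and threshold below are illustrative assumptions, not the initiative's recommended algorithm:

```python
# Hedged sketch of attribute-based patient matching: score candidate record
# pairs on name, date of birth, and an extra attribute (phone) to break ties
# among patients who share the same name and birthdate. Weights are invented.

def match_score(a, b):
    """Weighted agreement score between two patient records (dicts)."""
    weights = {"name": 0.3, "dob": 0.3, "phone": 0.4}
    return sum(w for field, w in weights.items()
               if a.get(field) and a.get(field) == b.get(field))

def best_match(incoming, registry, threshold=0.6):
    """Return the registry record that best matches `incoming`,
    or None if no candidate clears the threshold."""
    scored = [(match_score(incoming, rec), rec) for rec in registry]
    score, record = max(scored, key=lambda pair: pair[0])
    return record if score >= threshold else None

registry = [
    {"name": "Ana Silva", "dob": "1980-02-14", "phone": "555-0100"},
    {"name": "Ana Silva", "dob": "1980-02-14", "phone": "555-0177"},
]
incoming = {"name": "Ana Silva", "dob": "1980-02-14", "phone": "555-0177"}
print(best_match(incoming, registry)["phone"])  # the tie-breaking attribute
```

With name and birthdate alone the two registry records are indistinguishable; the extra attribute is what resolves the match, which is the report's point about needing additional identifying attributes.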
"Democratic and Republican senators have been busy drafting legislation that would establish national requirements for data security and breach notice. The following bills have been
introduced over the last year: Data Security and Breach Notification Act, Toomey (R-PA); Personal Data Privacy and Security Act, Leahy (D-VT); Data Security Act, Carper (D-DE) and Blunt (R-MO); Data
Security and Breach Notification Act, Rockefeller (D-WV); and Personal Data Protection and Breach Accountability Act, Blumenthal (D-CT). This post provides a side-by-side comparison of these five data-
breach bills, which would impose varying standards and penalties. The comparison focuses on the breach-notification requirements of each bill; it does not discuss the standards that some bills would
establish for internal security protocols to safeguard stored data."