"[…] smartphones are actually spy phones. But they don’t need to be. If we had enough open wireless networks available, we could change that. Startup companies—and open source projects—could
make devices that used the open networks without reporting your location and communications to phone companies. Devices that skip smoothly from one open wireless network to another don’t provide the kind
of granular information about your intimate activities that the current single-carrier systems do. We have two choices: let mobile privacy stay dead forever, or build an alternative open wireless […]"
"We have built PlayDrone, a system that uses various hacking techniques to circumvent Google security to successfully crawl Google Play. […] We further show that […] Android applications
contain thousands of leaked secret authentication keys which can be used by malicious users to gain unauthorized access to server resources through Amazon Web Services and compromise user accounts on
Facebook. We worked with service providers, including Amazon, Facebook, and Google, to identify and notify customers at risk, and make the Google Play store a safer place."
"We study Facebook Connect’s permissions system using crawling, experimentation, and surveys and determine that it works differently than both users and developers expect in several ways. We
show that more permissions can be granted than the developer intended. In particular, permissions that allow a site to post to the user’s profile are granted on an all-or-nothing basis. We evaluate how
the requested permissions are presented to the user and find that, while users generally understand what data sites can read from their profile, they generally do not understand the many different things
the sites can post. In the case of write permissions, we show that user expectations are influenced by the identity of the requesting site, which in reality has no impact on what is enforced. We also find
that users generally do not understand the way Facebook Connect permissions interact with Facebook’s privacy settings. Our results suggest that users understand detailed, granular messages better than
those that are broad and vague."
"75. […] It is irrelevant that Mr. Schrems cannot show that his own personal data was accessed in this fashion by the NSA, since what matters is the essential inviolability of the personal
data itself. The essence of that right would be compromised if the data subject had reason to believe that it could be routinely accessed by security authorities on a mass and undifferentiated basis. 76.
Third, the evidence suggests that personal data of data subjects is routinely accessed on a mass and undifferentiated basis by the US security authorities."
From ‘1) What is the case about and what did the Court rule?’:
"In 2010 a Spanish citizen lodged a complaint against a Spanish newspaper with the national Data Protection Agency and against Google Spain and Google Inc. The man complained that an auction
notice of his repossessed home on Google’s search results infringed his privacy rights because the proceedings concerning him had been fully resolved for a number of years and hence the reference to these
was entirely irrelevant. He requested, first, that the newspaper be required either to remove or alter the pages in question so that the personal data relating to him no longer appeared; and second, that
Google Spain or Google Inc. be required to remove the personal data relating to him, so that it no longer appeared in the search results. The Spanish court referred the case to the Court of Justice of the
European Union asking: (a) whether the EU’s 1995 Data Protection Directive applied to search engines such as Google; (b) whether EU law (the Directive) applied to Google Spain, given that the company’s
data processing server was in the United States; (c) whether an individual has the right to request that his or her personal data be removed from accessibility via a search engine (the ‘right to be
forgotten’)."
"EFF recently kicked off our second Tor Challenge, an initiative to strengthen the Tor network for online anonymity and improve one of the best free privacy tools in existence. The campaign -
which we’ve launched with partners at the Freedom of the Press Foundation, the Tor Project, and the Free Software Foundation - is already off to a great start. In just the first few days, we’ve seen over
600 new or expanded Tor nodes—more than during the entire first Tor Challenge. This is great news, but how does it affect you? To understand that, we have to dig into what Tor actually is, and what people
can do to support it. Support can come in many forms, too. Even just using Tor is one of the best and easiest things a person can do to preserve privacy and anonymity on the Internet."
"We examine the cost for an attacker to pay users to execute arbitrary code—potentially malware. We asked users at home to download and run an executable we wrote without being told what it
did and without any way of knowing it was harmless. Each week, we increased the payment amount. Our goal was to examine whether users would ignore common security advice — not to run untrusted executables
— if there was a direct incentive, and how much this incentive would need to be."
"On a bright April morning in Menlo Park, California, I became an Internet spy. This was easier than it sounds because I had a willing target. I had partnered with National Public Radio (NPR)
tech correspondent Steve Henn for an experiment in Internet surveillance. For one week, while Henn researched a story, he allowed himself to be watched—acting as a stand-in, in effect, for everyone who
uses Internet-connected devices. How much of our lives do we really reveal simply by going online?"
"A significant number of corporations have responded to the disclosures by introducing a range of accountability and security measures (transparency reports, end-to-end encryption, etc.).
Nonetheless, while acknowledging that these reforms are ‘a promising start’, nearly sixty percent of legal and IT professionals surveyed for this report believe that they do not go far enough, with more
than a third of respondents reporting that they felt the measures were ‘little more than window dressing’ or are of ‘little value’ outside the US. Civil society and the tech community have not adequately
adapted to the challenges raised by the Snowden revelations. For example, the interface and the communications between policy reform (e.g. efforts to create greater accountability measures, privacy
regulations) and technical privacy solutions (e.g. designing stronger embedded security) are worryingly inconsistent and patchy. Few channels of communication and information exchange exist between these […]"
"This is our inaugural Law Enforcement Disclosure Report. We are also one of the first communications operators in the world to provide a country-by-country analysis of law enforcement demands
received based on data gathered from local licensed communications operators. We will update the information disclosed in this report annually. We also expect the contents and focus to evolve over time
and would welcome stakeholders’ suggestions as to how they should do so."
"Fourteen mobile industry companies and 10 in-car navigation providers that GAO examined in its 2012 and 2013 reports—including mobile carriers and auto manufacturers with the largest market share and popular application developers—collect location data and use or share them to provide consumers with location-based services and improve consumer services. For example, mobile carriers and application developers use location data to provide social networking services that are linked to consumers’ locations. In-car navigation services use location data to provide services such as turn-by-turn directions or roadside assistance. Location data can also be used and shared to enhance the functionality of other services, such as search engines, to make search results more relevant by, for example, returning results of nearby businesses. While consumers can benefit from location-based services, their privacy may be at risk when companies collect and share location data. For example, in both reports, GAO found that when consumers are unaware their location data are shared and for what purpose data might be shared, they may be unable to judge whether location data are shared with trustworthy third parties. Furthermore, when location data are amassed over time, they can create a detailed profile of individual behavior, including habits, preferences, and routes traveled—private information that could be exploited. Additionally, consumers could be at higher risk of identity theft or threats to personal safety when companies retain location data for long periods or in a way that links the data to individual consumers. Companies can anonymize location data that they use or share, in part, by removing personally identifying information; however, in its 2013 report, GAO found that in-car navigation providers that GAO examined use different de-identification methods that may lead to varying levels of protection for consumers."
"Homo economicus reliably makes an appearance in regulatory debates concerning information privacy. Under the still-dominant U.S. ‘notice and choice’ approach to consumer information privacy,
the rational consumer is expected to negotiate for privacy protection by reading privacy policies and selecting services consistent with her preferences. A longstanding model for predicting these
preferences is Professor Alan Westin’s well-known segmentation of consumers into ‘privacy pragmatists,’ ‘privacy fundamentalists,’ and ‘privacy unconcerned.’ […] This Article contributes to the ongoing
debate about notice and choice in two main ways. First, we consider the legacy of Westin’s privacy segmentation model itself, which has greatly influenced the development of the notice-and-choice regime.
Second, we report on original survey research, collected over four years, exploring Americans’ knowledge, preferences, and attitudes about a wide variety of data practices in online and mobile markets.
Using these methods, we engage in considered textual analysis, empirical testing, and critique of Westin’s segmentation model."
"We sought to re-examine the conclusions of the classic paper Why Johnny Can’t Encrypt, which portrayed a usability crisis in security software by documenting the inability of average users to
correctly send secure email through Pretty Good Privacy (PGP). While the paper’s authors primarily focused on user-interface concerns, we turned our attention to the terminology underlying the protocol.
We developed a new set of metaphors with the goal of representing cryptographic actions (sign, encrypt, etc.) rather than primitives (public and private keys). Our objects were chosen such that their
real-world analogs would correctly represent the security properties of PGP. Since these metaphors now corresponded to physical actions, we also introduced new forms of documentation that explored
narrative techniques for explaining secure email to non-technical users. In quiz-based testing, we found that, while our new metaphors did not dramatically outperform traditional PGP, we were able to
convey equivalent levels of understanding with far shorter documentation. Subsequent lab testing confirmed that metaphors with physical analogs and the accompanying briefer instructions greatly eased the
process of using secure email. Our results indicate that crafting new metaphors to facilitate these alternative forms of documentation is a fruitful avenue for explaining otherwise challenging security
concepts to nontechnical users."
"Since 1967, when it decided Katz v. United States, the Supreme Court has tied the right to be free of unwanted government scrutiny to the concept of reasonable expectations of privacy. An
evaluation of reasonable expectations depends, among other factors, upon an assessment of the intrusiveness of government action. When making such assessments historically, the Court considered police
conduct with clear temporal, geographic, or substantive limits. However, in an era where new technologies permit the storage and compilation of vast amounts of personal data, things are becoming more
complicated. A school of thought known as ‘mosaic theory’ has stepped into the void, ringing the alarm that our old tools for assessing the intrusiveness of government conduct potentially undervalue
privacy rights. Mosaic theorists advocate a cumulative approach to the evaluation of data collection. Under the theory, searches are ‘analyzed as a collective sequence of steps rather than as individual
steps.’ The approach is based on the observation that comprehensive aggregation of even seemingly innocuous data reveals greater insight than consideration of each piece of information in isolation. Over
time, discrete units of surveillance data can be processed to create a mosaic of habits, relationships, and much more. Consequently, a Fourth Amendment analysis that focuses only on the government’s
collection of discrete units of data fails to appreciate the true harm of long-term surveillance—the composite."
"Over the past year, as the Snowden revelations have rolled out, the government and its apologists have developed a set of talking points about mass spying that the public has now heard over
and over again. From the President to Hillary Clinton to Rep. Mike Rogers, Sen. Dianne Feinstein, and many others, the arguments are often eerily similar. But as we approach the one year anniversary, it’s
time to call out the key claims that have been thoroughly debunked and insist that the NSA apologists retire them."