Visual Privacy? Visual Anonymity? Toward principles of visual privacy

Session Type:
Working group
Session Leader:
Helen Nissenbaum (NYU), Sam Gregory (WITNESS)


Nathan Freitas, The Guardian Project

James Grimmelmann, Associate Professor, New York Law School

Rich Jones, Director, The Open Watch Project

Jillian C. York, Director for International Freedom of Expression, Electronic Frontier Foundation

Day: Saturday
Time: 10:30am – 12:00pm
Room: W300

Session Notes

Summary: With cameras now so widespread, and image-sharing so routine, it is surprising how little public discussion there is about visual privacy and anonymity. Everyone is discussing and designing for the privacy of personal data, but almost no one is considering the right to control personal images in a socially networked age, or to be anonymous in a video-mediated world. To what extent should we facilitate this, and how can we do it better? Who has a right to visual anonymity or privacy, and who doesn’t?

Agenda: Following a 360-degree opening perspective, the session will centre on focused discussions of five scenarios for visual privacy and visual anonymity, each led by one of the participants. In each scenario we will look at the visual privacy and anonymity challenges, pitfalls and potential solutions. The scenarios to be discussed include a non-activist Facebook user in the USA, an LGBT-identified person in the Middle East, a human rights activist in the current protests in Syria, a citizen documenting police activities or misconduct, and citizens using social media to identify people alleged to have committed crimes (for example, as in the recent Vancouver and London riots). We will look for areas of overlap and shared concern, and for the right mix of technical, legal, norm-based and market-based solutions to address them.

Detailed Description: Imagine a landscape where companies are able to commercially harvest, identify and trade images of a person’s face as easily as they share emails and phone numbers. While it is technically illegal in some jurisdictions (such as the EU) to hold databases of such images and data, it is highly likely that without proactive policymaking, legislative loopholes will be exploited where they exist. So far, policy discussions around visual privacy have largely centred on public concerns about surveillance cameras and individual liberties. And now, with automatic face-detection and recognition software being incorporated into consumer cameras, mobile apps and social media platforms, the potential for identification of activists and others – including victims, witnesses and survivors of human rights abuses – is growing.

Services increasingly store users’ personal and other data in the digital cloud. Cloud data is processed and handled across multiple jurisdictions, creating potential inconsistencies and conflicts in how users and their data are protected. More worryingly, cloud storage leaves data vulnerable to attack and theft by any number of malicious actors. Hostile governments, in particular, can use photo and video data – particularly when linked with social networking data – to identify, track and target activists within their countries.

In an increasingly video-mediated world, the right to visual anonymity – grounded in an understanding that freedom of speech is often facilitated by the ability to be anonymous – has not yet been fully articulated or developed.

Outcome: This session will work towards a greater understanding of what we might mean by visual privacy and visual anonymity, and how this relates to other conceptions of privacy and anonymity. It will address the opportunities identified by commercial providers who are beginning to use facial recognition and identification within social network products, and create a dialogue around key issues. It will also consider what tools and technologies could be incorporated into core web and mobile functionality to enable both human rights activists and general users to exercise greater control over visual privacy and anonymity. This includes building “visual privacy” checks – such as masking the metadata encoded into images (location, time, type of camera, etc.) – as well as standard privacy checks into product design, development and marketing workflows, drawing on risk scenarios outlined through human rights impact assessments.
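As an illustration of the kind of “visual privacy” check described above, stripping the metadata encoded into images is straightforward to build into an upload pipeline. The sketch below is a minimal, standard-library-only Python example (production tools such as the Guardian Project’s ObscuraCam or exiftool are far more thorough, handling thumbnails, XMP and other formats): it drops the JPEG segments that carry EXIF metadata – GPS location, capture time, camera model – and embedded comments, while leaving the image pixels untouched.

```python
def strip_jpeg_metadata(data: bytes) -> bytes:
    """Remove APP1 (EXIF/XMP) and COM segments from a JPEG byte string.

    Minimal sketch: walks the marker segments between SOI and SOS,
    copies everything except metadata-bearing segments, then copies
    the compressed image data verbatim. Not a full JPEG parser.
    """
    assert data[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(data) - 1:
        if data[i] != 0xFF:
            break  # malformed header; stop rather than guess
        marker = data[i + 1]
        if marker == 0xDA:           # Start of Scan: pixel data follows,
            out += data[i:]          # copy the remainder unchanged
            break
        length = int.from_bytes(data[i + 2:i + 4], "big")
        segment = data[i:i + 2 + length]
        # Drop APP1 (EXIF: GPS coordinates, timestamp, camera model)
        # and COM (free-text comment) segments; keep everything else.
        if marker not in (0xE1, 0xFE):
            out += segment
        i += 2 + length
    return bytes(out)
```

In practice a check like this would run server-side on upload, or on-device before sharing, so that location and device traces never leave the user’s control; the hard design question the session raises is making such behaviour a default rather than an expert option.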