As with raising kids, there is no handbook that tells you how to make the thousands of decisions and judgment calls that shape what a conference grows into. Seven years into organizing the Theorizing the Web conference, we’re still learning and adapting. In years past, we’ve responded to feedback from our community, making significant changes to our review process (e.g., diversifying our committee and creating a better system to keep the process blind) as well as adopting and enforcing an anti-harassment policy.

This year, we’ve been thinking a lot about what we can do to ensure that presentations respect the privacy of the populations they are analyzing and respect the contextual integrity of the text, images, video, and other media that presenters include in their presentations. I want to offer my take—and, hopefully, spark a conversation—on this important notion of “contextual integrity” in presenting research.

In “Privacy as Contextual Integrity,” Helen Nissenbaum observes that each of the various roles and situations that comprise our lives has “a distinct set of norms, which governs its various aspects such as roles, expectations, actions, and practices” and that “appropriating information from one situation and inserting it in another can constitute a violation.” It’s often social scientists’ job to take some things out of context and bring understanding to a broader audience. But, how we do that matters.

The ethical challenge for social scientists who use methods that remove information from its context (such as observation or content analysis) is figuring out how to still respect the norms of that context as well as the dignity of the people who are a part of it. We have not always done this well. Early anthropologists and sociologists were complicit in racism and colonialism. In my field of sex work research, previous generations of social scientists distorted or ignored sex workers’ own narratives to such a degree that the sex work community, as a whole, remains skeptical of researchers.

Let’s consider a concrete example of how taking information out of context for research purposes can prove problematic: Laud Humphreys’ “Tearoom Trade” study. Humphreys described the sex habits of a community of gay men who met in certain public bathrooms. In the process of observing the men, he took down their license plate numbers and later visited their homes to conduct a health survey (which he presented as unrelated). Most of the criticism aimed at the study concerned how Humphreys used deception and risked outing individual men by collecting and storing identifiable data. This latter issue is often framed as a violation of privacy. However, I think this only gets at some of what was wrong with the Humphreys study, and I’d like to suggest that this case actually points to the limits of privacy as a foundation for ethical decision-making in research.

Notably, the most sensitive information was obtained in public spaces, while the primary risk was exposure to the private (i.e., home/family) sphere. Already, this troubles conventional privacy discourses that tend to frame exposure as a uni-directional flow of information from private to public. Rather than seeing privacy and publicity as a simple dichotomy in which only that which is private is at risk of being exposed to that which is public, we might take Nissenbaum’s suggestion and frame our lives as consisting of numerous, sometimes overlapping social spheres with different norms of disclosure. From this perspective, the problem with the Humphreys study is that it collapsed these contexts in potentially harmful ways. For example, though he altered his appearance, Humphreys’ subjects would likely have been distressed to have him in their homes had they recognized him from the tearooms. Worse yet, by taking information and observations out of the specific context of the tearooms (where he was assumed to be just another participant), Humphreys’ research posed an existential threat to the community, making the men more susceptible to public moralizing and police actions.

It’s a staple of qualitative methods courses to discuss how observation changes behavior. Part of the reason is that research, itself, is a context in which norms of disclosure may differ from those of other social situations. In the Humphreys study, subjects’ behavior and responses may have been completely different had they known they were participating in a research study—in fact, many men likely would have opted out altogether. This is something we should ask ourselves when presenting any data: “If the research subjects knew the manner in which I am presenting their information, would it change what they share?” To share data in a way that ignores the implicit norms and expectations of the context in which the information was shared is, at best, negligent and, at worst, exploitative (i.e., using someone else in pursuit of one’s own goals).

Data collection via the Web further highlights why the concept of contextual integrity is a desirable alternative to the conventional public/private dichotomy. Researchers are sometimes tempted to believe that, because something is public (i.e., searchable on the Web), it is fair game for them to use as they wish (regardless of a site’s norms or the user’s original intent); but such thinking is often rooted in a slippage between what information can be collected and held as a matter of property rights and what is usable as a matter of ethics. Part of the problem is that discourse around public/private information has been incorporated into the market logic of copyright law. According to this logic, anything done in public (legally defined as that which lacks “a reasonable expectation of privacy”) can be captured and become the property of whoever recorded it.* However, establishing ownership of data does not intrinsically imply that it’s ethical to share that data. In fact, IRBs regularly compel researchers to destroy identifiable data that they rightfully own. Simply saying “well, it was public” and, therefore, legally obtained, in no way excuses harm done by placing information in another context.

To think about what contextual integrity means for Web-related research, it may be useful to consider a second case (one with parallels to the tearooms Humphreys observed): namely, hookup/dating sites like Grindr, Tinder, FetLife, SwingLifeStyle, and Craigslist. These sites are publicly accessible, and it is extraordinarily easy to capture screenshots from them (or even to systematically scrape data). Such research activities may violate terms of service, but they certainly aren’t violations of criminal law. So, assuming for a moment that a scenario exists where it is both legal and ethical to obtain information about a hookup/dating site’s users, the question is then: How do we determine what aspects of this “public” information can ethically be shared by researchers?

Nissenbaum’s theory of contextual integrity suggests that we should look to the norms of disclosure on sites and try to remain consistent with them. Specifically, we might infer that users only intended for the personal information on their profiles to be seen by potential dates. These profiles may contain information about their sex life or relationship status (e.g., non-monogamy) that they would not want to share with family or co-workers. In fact, some may obscure their faces or certain other personal details as an additional precaution against their information leaking into another context.

The obvious conclusion in this case, then, is that sharing any potentially identifiable information (images, location, unique stories, etc.) would fail to respect the implicit assumptions users made in posting their data. But, even if personal information can’t easily be linked back to the user, it may still be unsettling to see intimate things taken out of context. Moreover, we shouldn’t assume that de-identified, aggregate-level data is intrinsically benign; it can still, potentially, violate contextual integrity (as the “Tearoom Trade” study demonstrated). Increasing general attention to a site can have negative consequences, outing communities writ large. We saw this just last week with FetLife, as increased attention (much of it resulting from the Fifty Shades of Grey craze) led to the banning of many sorts of content from the site after credit card companies threatened to stop processing payments unless things they objected to were removed.

This isn’t to say that all research into hookup/dating sites is ethically dubious, just that, in such sensitive cases, no disclosure should go without careful consideration and scrutiny. It’s the researcher’s job to anticipate the consequences of bringing information into another context and to mitigate whatever risks this transfer entails.

Finally, we need to pay special attention to the most sensitive cases: namely, those where the subject matter involves victims (e.g., research into police violence, sexual assault, revenge porn, etc.). When the context in which the information originates is an instance of violation, humiliation, and/or violence, circulation of certain pieces of this information (e.g., names, images, specific acts, etc.) may amplify this harm, re-victimizing the target. If there is any reason to believe that a research subject (or subjects) might be embarrassed to have a piece of information shared in a conference setting, and the connection to the subject cannot be anonymized, pseudonymized, or otherwise obscured, then I think that obtaining explicit consent is the way to go. This is doubly important for vulnerable populations.

In reflecting on the “Tearoom Trade” study and considering how the lessons learned from it might apply to current research on hookup/dating sites, I’ve suggested that both privacy and ownership are weak ethical frameworks for information-sharing practices; much harm could be avoided by, instead, centering ethical consideration on contextual integrity and consent. In particular, I think it’s important to recognize research as its own context and that the basic purpose of methods such as observation and content analysis is to pull information out of its original context. While one-size-fits-all rules are difficult to establish (given the wide variety of contexts explored by social scientific research), I’ve suggested that sharing sensitive information disclosed in other contexts (including images and audio) merits careful consideration and usually requires protections (such as de-identification) and/or explicit consent.

*Criminal laws regulating public recording vary by state and local municipality.

PJ Patella-Rey (@pjrey) is a sociology PhD candidate writing about the experiences of sex cam models.