How is harmful content reported in end-to-end encrypted systems?
BY GINA MANTICA
Messaging apps like WhatsApp and Signal use encryption to allow people to chat privately, but as a result they lack an easy way for users to report harmful content securely. The end-to-end encryption these and other messaging apps use prevents third parties from accessing data exchanged between individuals. A team of researchers at Boston University created a new moderation scheme that enables users to report abuse, including content from forwarded messages, while otherwise preserving the security guarantees of end-to-end encryption.
The system, developed in part by Mayank Varia, Co-Director of the Center for Reliable Information Systems and Cyber Security and Associate Professor in the Faculty of Computing & Data Sciences, and Computer Science PhD students Nicolas Alhaddad and Rawane Issa, protects both the reporter’s identity and the content of unreported messages. The paper was accepted for publication at USENIX Security 2022.
WhatsApp and Signal achieve a property called deniability: once a conversation is over, neither party holds a provable record of what was said. Deniability can be useful. Imagine, for example, a whistleblower who sends critical information to a journalist but doesn't want the authorities to be able to produce evidence of the exchange afterward. But this level of deniability also prevents individuals from proving to a content moderator that they have received abusive content.
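To see why, here is a minimal sketch in Python (much simplified, and not Signal's actual protocol) of how shared-key authentication yields deniability: because both chat partners hold the same key, either of them could have produced any given authentication tag, so a saved transcript proves nothing to an outsider.

```python
# Minimal, illustrative sketch only: shared-key (symmetric) authentication
# gives deniability because both parties can compute identical tags.
import hmac, hashlib, secrets

shared_key = secrets.token_bytes(32)   # key established between the two chatters

def tag(message: bytes) -> str:
    """Authenticate a message under the shared key."""
    return hmac.new(shared_key, message, hashlib.sha256).hexdigest()

# The sender authenticates a message...
t = tag(b"meet at noon")

# ...but the receiver, holding the same key, could have forged the exact
# same tag, so the tag is worthless as evidence against the sender.
assert t == tag(b"meet at noon")
```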
Varia and colleagues' content moderation scheme enables reporting while maintaining a meaningful level of deniability. The team's key insight is to have the content moderator itself produce the tokens that pinpoint a sender's identity; the sender attaches one of these tokens to each message. When an individual reports a message, the moderator (and only the moderator) can use the token to learn who wrote it. But because the moderator created the token in the first place, the token cannot serve as evidence beyond the scope of the report.
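The Python sketch below is illustrative only, not the construction from the paper: a hypothetical mint_token routine run by the moderator hides the sender's identity under keys that only the moderator holds, the sender attaches the resulting token to each message, and a hypothetical open_report routine lets the moderator, and no one else, recover the sender's identity when a message is reported.

```python
# Much-simplified illustration of the core idea (not Hecate's actual scheme):
# the moderator mints tokens ahead of time, each hiding a sender's identity
# under moderator-only keys; senders attach a token to every message; on a
# report, only the moderator can open the token and recover the sender.
import hmac, hashlib, secrets

MOD_ENC_KEY = secrets.token_bytes(32)   # held only by the moderator
MOD_MAC_KEY = secrets.token_bytes(32)   # held only by the moderator

def _keystream(nonce: bytes, length: int) -> bytes:
    """Derive a short keystream from the moderator's encryption key."""
    return hmac.new(MOD_ENC_KEY, nonce, hashlib.sha256).digest()[:length]

def mint_token(sender_id: str) -> dict:
    """Moderator: pre-generate a token that hides sender_id."""
    sid = sender_id.encode().ljust(32)[:32]          # pad identity to 32 bytes
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(sid, _keystream(nonce, 32)))
    mac = hmac.new(MOD_MAC_KEY, nonce + ct, hashlib.sha256).hexdigest()
    return {"nonce": nonce, "ct": ct, "mac": mac}

def open_report(token: dict) -> str:
    """Moderator: on a user report, verify the token and recover the sender."""
    expected = hmac.new(MOD_MAC_KEY, token["nonce"] + token["ct"],
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["mac"]):
        raise ValueError("forged or tampered token")
    sid = bytes(a ^ b for a, b in zip(token["ct"],
                                      _keystream(token["nonce"], 32)))
    return sid.rstrip().decode()

# The sender fetches a token in advance and attaches it to the (encrypted)
# message; the moderator never sees the message unless it is reported.
token = mint_token("alice")
print(open_report(token))   # -> "alice", recoverable only by the moderator
```

The real construction additionally binds each token to the reported message content and supports anonymous and forwarded messages, which this sketch omits.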
The system, called Hecate, could be effective in addressing bullying and cybercrime in secure messaging systems. Hecate is compatible with end-to-end encrypted systems that provide sender anonymity, like the Signal app, and it supports message forwarding. The team's proof-of-concept implementation shows that Hecate is also computationally faster and simpler than prior systems. "We built it in a way that uses fast, standard crypto primitives and stands up in anonymous networks with integrity," says Varia. While whether to deploy abuse reporting is a policy decision that must be made in collaboration with experts from other fields, this technology gives policymakers new possibilities to consider.