How BU’s Cryptographers Made Equity Unhackable

When the City of Boston asked researchers to find a way for companies to share payroll data without compromising employees’ privacy, the request posed a tough question: how can we measure fairness when the numbers themselves feel too risky to share?

Photo: Associate Professor Mayank Varia, BU Faculty of Computing & Data Sciences

Boston University researchers turned to cryptography. The result is a privacy-preserving platform, featured in the Massachusetts Workforce Data Report, that supports the statewide wage-equity analysis called for by the Frances Perkins Workplace Equity Act, all without revealing individual employers’ payrolls. The project responds to a law designed to promote salary transparency while also providing aggregate demographic snapshots of workplaces across the Commonwealth. “The purpose is to understand employment across the Commonwealth by variables like gender and race and ethnicity, like how many people are working in urban areas versus rural areas,” says Professor Mayank Varia.

Varia, an associate professor in the Faculty of Computing & Data Sciences, led the cryptographic side of the effort. He views privacy not as an obstacle, but as the foundation of trust. As Varia explains, the team “can do computations over this encoding without knowing what the individual underlying data says.” In other words, researchers “can calculate averages, … can calculate counts, [and] can make histograms and manipulate the encoded data” without ever seeing raw payroll numbers.
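The article doesn’t spell out the cryptographic machinery, but computing over encoded data without learning the underlying values is the hallmark of secure multiparty computation. Below is a minimal sketch of one standard building block, additive secret sharing; the party count, modulus, and payroll figures are all hypothetical, not details of the BU system.

```python
import secrets

MODULUS = 2**64  # all arithmetic happens modulo a fixed power of two

def share(value: int, n_parties: int = 3) -> list[int]:
    """Split a value into n random shares that sum to it mod MODULUS.
    Any subset of fewer than n shares is statistically independent of
    the value, so no single party learns anything from its share."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recombine shares; done only on the final aggregate."""
    return sum(shares) % MODULUS

# Hypothetical payrolls (in cents) submitted by three employers.
payrolls = [6_500_000, 7_200_000, 5_800_000]

# Each employer splits its payroll; computing party i holds one share
# of every input but never sees any raw number.
all_shares = [share(p) for p in payrolls]

# Each party locally adds the shares it holds.
party_sums = [sum(column) % MODULUS for column in zip(*all_shares)]

# Reconstruction reveals only the aggregate, from which an average or
# a histogram bucket count can be published.
total = reconstruct(party_sums)
assert total == sum(payrolls)
print(f"aggregate payroll: {total}")
```

Because addition distributes over the shares, sums, counts, and averages fall out directly, matching the operations Varia lists above.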

In practice, the privacy protection has more than one layer. The platform keeps inputs encrypted during computation and, as part of the calculation, intentionally adds a small amount of noise to the data. That deliberate distortion protects against re-identification later in the process, yet even with those safeguards, Varia emphasizes, “the data is very accurate.”
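The article doesn’t name the noise mechanism, but adding calibrated random noise to released statistics so that no individual can be re-identified is the core idea of differential privacy. Here is a minimal sketch of the Laplace mechanism applied to a count query, with a hypothetical demographic count and epsilon values; the real system’s parameters aren’t stated in the piece.

```python
import random

def laplace_sample(scale: float) -> float:
    """Draw from Laplace(0, scale): the difference of two independent
    exponential samples with mean `scale` is Laplace-distributed."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def dp_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy. Adding or
    removing one person changes a count by at most 1 (sensitivity 1),
    so Laplace noise with scale 1/epsilon suffices."""
    return true_count + laplace_sample(1.0 / epsilon)

# Hypothetical: employees in one demographic cell of the report.
true_count = 412
for epsilon in (0.1, 1.0):
    print(f"epsilon={epsilon}: noisy count = {dp_count(true_count, epsilon):.1f}")
```

Smaller epsilon means more noise and stronger privacy; larger epsilon keeps the released figure closer to the truth. Tuning that dial is exactly the privacy–accuracy trade-off Varia returns to below.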

The Boston Women’s Workforce Council (BWWC) provided an early proving ground, with more than 200 local employers joining the BWWC’s compact to share anonymized data. Their participation showed that strong privacy protections can encourage collaboration rather than discourage it.

This math-centric approach worked especially well because it was paired with social science insight. Collaborating with Neha Gondal, associate professor of Sociology and CDS, ensured the analysis would be meaningful to policymakers, researchers, and advocates alike. “They’re the domain experts who know the best way to represent the data in ways that are meaningful to stakeholders,” Varia says. Determining how to make data interpretable without being misleading is where their expertise proved essential.

Scaling the system statewide came with tight constraints. “I think the hardest challenge we had in this project was just sheer time—or lack of it,” Varia recalls. Companies submitted data through February, and the team completed final reports and dashboards by the end of April. Still, participation was strong for the inaugural run: “somewhere around 30 to 40%,” he estimates.

Looking ahead, the team is already thinking beyond a single report cycle. “We were initially limited by time,” Varia says, “but now time is a little bit more on our side to provide stronger privacy–accuracy trade-offs; to try to reduce that noise and distortion even more using more sophisticated mathematical techniques.”

The platform is more than an academic success: it’s civic infrastructure that could scale beyond Massachusetts. Varia believes privacy-preserving computation is key to ensuring that people feel safe sharing the very information needed to understand and address systemic inequities. That trust is what turns raw data into policy, and policy into change.
