Who Should Police the Internet?

BU LAW Professor Danielle Citron testifying Wednesday before a House Committee on Energy and Commerce subcommittee. Photo by David Scavone
LAW Professor Danielle Citron testifies before Congress on ways to make the internet safer, healthier
Over the weekend, a violent animated spoof video showing President Trump on a shooting rampage inside a church, blasting away at journalists and political foes, exploded across social media, sparking widespread anger and debate.
Days later, on Wednesday, October 16, privacy and constitutional law scholar Danielle Citron, a School of Law professor of law, joined a panel of industry and thought leaders invited to speak before a subcommittee of Congress’ House Committee on Energy and Commerce in Washington. The hearing, Fostering a Healthier Internet to Protect Consumers, was aimed at finding ways to create a safer internet, where free speech is embraced, but where hate speech, harassment, and illegal activities, such as human trafficking and the selling of drugs, are aggressively sought out, banned, and suppressed.
One committee member called the problem an epidemic and said people are dying as a result of it, and the packed room inside the Rayburn House Office Building reflected the urgency of the subject.
Citron, who last month was awarded a 2019 MacArthur Fellowship for her work countering hate crimes, revenge porn, deepfake videos, and cyber abuse, said Congress cannot rely solely on the largest platforms—Google, Facebook, Reddit, YouTube, and others—to solve the internet’s most worrisome problems, because those companies depend so heavily on online advertising. “We must have reasonable content moderation practices,” Citron testified. “We’ve got to do something, because doing nothing has costs.”
It was a busy day for Boston University on Capitol Hill. At the same time and in the same building where Citron was testifying, Michael Siegel, a School of Public Health professor of community health sciences, was testifying before another subcommittee of the House Committee on Energy and Commerce, one of several witnesses addressing legislation to reverse the youth tobacco and e-cigarette epidemic.
Central to the issue at Citron’s hearing is whether the largest social media and internet platforms are capable of policing themselves and their millions of users by removing inappropriate content, under Section 230 of the Communications Decency Act, or whether stricter oversight and regulation are needed. Section 230 largely shields platforms from civil liability if the inappropriate content that appears on their sites was user-generated and user-posted. There is debate over whether Section 230 should be left alone, amended, or eliminated altogether.
At the hearing, committee members seemed unanimous that Section 230 is not working and needs to be tightened up—quickly. But the question of whether the courts, the Federal Trade Commission, or the internet’s users should ultimately police inappropriate content prompted strong disagreement.
How widespread is the problem of moderating bad content? Katherine Oyama, Google’s global head of intellectual property policy, told subcommittee members the company removed 35,000 videos last quarter from YouTube (which it owns) and suppresses 19 billion links as spam every day, and she noted that 2 billion illegal ads a year are stricken before ever appearing. She said the company has 10,000 people who work on content moderation. Another witness, Reddit cofounder and CEO Steve Huffman, told the subcommittee that his company devotes about 20 percent of its 500-strong workforce to content moderation.
In her remarks, Citron focused on online harassment, which she has studied for years. “The costs are significant to women and minorities,” she testified. “When a Google search of their name contains threats, a nude photo without their consent, a home address, it’s hard to get a job, let alone to keep a job.” She said women in particular are often the ones terrorized, feeling forced to change their names or even move. “It’s not necessarily a win for free speech,” she said. She said dating services that don’t ban impersonator profiles are extremely troublesome, and cited a specific case involving the dating app Grindr.
Reddit’s Huffman, in his submitted remarks, described how the company works: “The way Reddit handles content moderation today is unique in the industry. We use a governance model akin to our own democracy—where everyone follows a set of rules, has the ability to vote and self-organize, and ultimately shares some responsibility for how the platform works.”
At least one committee member found that sort of approach far too weak.
“You better get serious about self-regulating,” Congressman Bill Johnson (R-Ohio) said to the panelists, “or you’re gonna force Congress to do something that you might not want to have done.”
In his opening remarks, Congressman Mike Doyle (D-Pa.), chairman of the subcommittee, revisited the deadly 2018 Tree of Life synagogue shooting in Pittsburgh. The shooter, he said, “had posted anti-Semitic attacks on a fringe website first before going in.” He also referred to the 2016 presidential election, saying: “Foreign adversaries used the power of these platforms against us. Clearly we all need to do better.”
Doyle said that Section 230 enables comments, honest and open reviews, and free and open discussions, and affords marginalized communities a voice they would otherwise not have. “That cannot be overstated,” he said.
In one surprising, and shocking, moment during the hearing, Gretchen Peters, executive director of the Alliance to Counter Crime Online, asked the committee members, “When was the last time anybody here saw a dick pic on Facebook?” She paused, saw no hands go up, and continued. “If they can keep genitalia off of these platforms, they can keep drugs off of these platforms, they can keep child sexual abuse off these platforms. The technology exists. These are policy issues.”
Citron agreed. She said it was vital that companies “not only have policies, but are clear about them and accountable.” She said Section 230 is valuable, but that it needs to be modified because it provides bad Samaritans with a legal shield. “They could be more transparent about processes they use when they make decisions. More accountability.”