• Doug Most

    Assistant VP, Executive Editor, Editorial Department

    Doug Most is a lifelong journalist and author whose career has spanned newspapers and magazines up and down the East Coast, with stops in Washington, D.C., South Carolina, New Jersey, and Boston. He was named Journalist of the Year while at The Record in Bergen County, N.J., for his coverage of a tragic story about two teens charged with killing their newborn. After a stint at Boston Magazine, he worked for more than a decade at the Boston Globe in various roles, including magazine editor and deputy managing editor/special projects. His 2014 nonfiction book, The Race Underground, tells the story of the birth of subways in America and was made into a PBS/American Experience documentary. He has a BA in political communication from George Washington University.

Comments & Discussion

Boston University moderates comments to facilitate an informed, substantive, civil conversation. Abusive, profane, self-promotional, misleading, incoherent, or off-topic comments will be rejected. Moderation is staffed during regular business hours (EST), and only comments written in English can be accepted. Statistics or facts must include a citation or a link to the citation.

There is 1 comment on "Who Should Police the Internet?"

  1. I haven’t watched the whole testimony, but I’m curious: what solutions were proposed? I see a lot of suggestions that “more” needs to be done, but what does that look like to Ms. Citron?

    I think it’s fair to say that there are costs to doing “nothing,” as she says, but there are also costs to doing “something.” Indeed, there are costs for every action and inaction. If restrictions are placed on online expression, presumably by a government entity, the cost would come in the form of censorship, a slide toward orthodoxy, a chipping away of free speech. If you think that a governing body can police what is said on the internet without infringing on free speech rights, answer these questions: Who would make the decisions about what is acceptable and not acceptable? Is it a panel of people? Who decides who gets to be on the panel? Is it a machine algorithm? Who programs the algorithm? What if you don’t agree with the ideology of the people making these decisions?

    The way I understand Section 230 is that it was created in the early days of the internet to protect internet service providers from being held accountable for content posted by individuals on their platforms. As the internet evolved, this protection was extended to social media. This is how we have arrived at the “platform vs. publisher” dilemma. If online platforms remain agnostic and allow all content on their sites, with the noted exceptions of content that violates an existing law or is not protected by the First Amendment, then they will enjoy the legal protections as outlined in Section 230. If, however, they begin to curate, censor, or filter the content that is posted, then they will cease to be an impartial platform and instead be considered a “publisher.” As a publisher they are allowed to make editorial decisions about what appears on their site, but many would argue that they should then lose their legal protection and could rightfully be sued for libelous or slanderous content. Right now, online platforms are acting like publishers but enjoying a government-granted immunity.

    I think this debate is long overdue. Social media companies are engaging in censorship, and it extends beyond content that violates current law; they restrict content based on ideological positions. This is fine, since they are private companies, but they should lose their protections under Section 230.

    As for a government-led internet policing approach, I can think of no more appalling and damaging idea. The benefits are dwarfed by the “costs” of a slide toward tyranny.
