• Molly Callahan

    Senior Writer

    Photo: Headshot of Molly Callahan. A white woman with short, curly brown hair, wearing glasses and a blue sweater, smiles and poses in front of a dark grey backdrop.

    Molly Callahan began her career at a small, family-owned newspaper where the newsroom housed computers that used floppy disks. Since then, her work has been picked up by the Associated Press and recognized by the Connecticut chapter of the Society of Professional Journalists. In 2016, she moved into a communications role at Northeastern University as part of its News@Northeastern reporting team. When she's not writing, Molly can be found rock climbing, biking around the city, or hanging out with her fiancée, Morgan, and their cat, Junie B. Jones.

Comments & Discussion

Boston University moderates comments to facilitate an informed, substantive, civil conversation. Abusive, profane, self-promotional, misleading, incoherent or off-topic comments will be rejected. Moderators are staffed during regular business hours (EST) and can only accept comments written in English. Statistics or facts must include a citation or a link to the citation.

There is 1 comment on Can the Bias in Algorithms Help Us See Our Own?

  1. By framing algorithms as potential mirrors that reflect societal and structural biases, the article approaches the topic of algorithms and human bias in a novel way. It engages me to consider how we should adjust ourselves when looking through the lens of algorithmic bias. I believe that, since algorithms are written by humans, an algorithm's level of bias depends on the person who wrote it. Personally, I was inaccurately banned by LinkedIn's algorithms twice, which created many problems.
    The study also underscores the persistent challenge that algorithms trained on historical data might perpetuate existing biases if not carefully designed and monitored. This also corresponds with LinkedIn's algorithms: as previous data is fed back into them, they become more and more biased.

Post a comment.

Your email address will not be published. Required fields are marked *