BU Spark! Demo Day: Students’ App Helping Identify Racial Bias Named Judges’ Choice Winner
B Scanner, designed for use by high schoolers, scans for biased words
Finding unbiased resources on the internet when working on papers and research projects can be a problem, but a new app designed by four BU students promises to make the job easier. Called B Scanner, short for Bias Scanner, the tool was developed this past semester as part of the BU Spark! Innovation Fellowship Program. Unlike many competing tools, it looks for racial bias and the intent of the text, not just possible misinformation. The app took home the Judges’ Choice award at BU Spark!’s Fall Demo Day on December 11.
Team leader Shateva Long (CAS’23) came up with the idea after a conversation with a BU student who was confused when reading about the Black Lives Matter movement over the summer.
“I thought that because there is so much information on the internet, they should be able to get an unbiased and impartial view of what the movement is, and then make their own judgment off of that,” Long says. “But they couldn’t even make it past the step of understanding what was going on. The first thing I thought was, how can I fix that? There should be no reason why someone shouldn’t be able to understand the movement, or any movement.”
Long found contradictory, often confusing, information as she looked at various articles about Black Lives Matter, in part because authors were writing from their own perspectives—and biases. She teamed up with classmates Juan Almanzar (CAS’23), Lucy Baik (CFA’21), and Ike Okoye (CAS’22) to create an app to help people struggling with the same problem.
B Scanner works both as a detection tool and as an educational tool. Users copy and paste text they’ve found on the internet into the app, which checks it against a dictionary, compiled by the development team, of words that potentially show racial bias. The app highlights any matches and explains why each word may be biased. At the moment, the tool addresses only racially biased words, but the team plans to expand the database of words and add machine learning to detect other types of bias. They also hope to add a web scraper so that all a user has to do is provide a link to the source; the scraper will then retrieve the source’s text for them.
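For readers curious about the mechanics, here is a minimal sketch of how a dictionary-based scan like the one described above might work. The dictionary entries, explanations, and function names are hypothetical illustrations, not the team’s actual code or data; the sample sentence is the example Long gives later in this article.

```python
import re

# Hypothetical dictionary entries for illustration only: each flagged term
# maps to a short explanation of why it may signal bias.
BIAS_DICTIONARY = {
    "male nurse": "Marks gender only for men, implying female nurses are the norm.",
    "inner-city": "Often used as racially coded shorthand in news writing.",
}

def scan(text: str) -> list[dict]:
    """Return every dictionary term found in the text, with its position
    and explanation, so a user interface can highlight it."""
    results = []
    for term, explanation in BIAS_DICTIONARY.items():
        # Case-insensitive search for each dictionary term.
        for match in re.finditer(re.escape(term), text, flags=re.IGNORECASE):
            results.append({
                "term": match.group(0),
                "start": match.start(),
                "end": match.end(),
                "why": explanation,
            })
    return results

if __name__ == "__main__":
    sample = "There are ten nurses at a hospital, and five male nurses."
    for hit in scan(sample):
        print(f"{hit['term']!r} at {hit['start']}-{hit['end']}: {hit['why']}")
```

A real implementation would also need to handle word boundaries, inflected forms, and surrounding context, which is presumably where the team’s planned machine learning component would come in.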
Long, originally a computer science major, read books and articles about bias in language and used the knowledge she gained to build the dictionary of biased words and phrases. As a result of working on the app, she decided to double-major in computer science and linguistics.
“This project is really what brought on the added major,” she says.
Okoye and Almanzar, both computer science majors, wrote the app’s code with Long. Both say they enjoyed the collaborative aspect of the project, since their traditional computer science courses offer few opportunities for group work; the group met every day via Zoom and spent 5 to 10 hours a week working on the app.
Baik designed the app, which is not yet available to the public. She’d had no experience working on user experience (UX) or user interface (UI) design, and credits Spark! creative director James Grady, a College of Fine Arts assistant professor of art and graphic design, with helping her hone the necessary skills to create a successful UX/UI design. Additional meetings with other Spark! designers and mentors from Red Hat Software helped her refine the design even more.
Her responsibilities included designing the display and working with the rest of the team to refine it so that it works as efficiently as possible for users. She says her biggest takeaways from the project were learning how to communicate with people unfamiliar with graphic design and how to think like an app user rather than a creator.
“Spark! really allowed me to see things that I would have never learned otherwise,” Baik says. “It teaches a lot of things that would happen in the field if you go into this job.”
The B Scanner team intends the app to be a tool for learning about bias and reflecting on one’s own: people can copy and paste their own writing into B Scanner to find their own biases. And, they say, it’s designed especially for high school students, who are just starting to conduct research and want to find unbiased, credible sources.
Long gives this sentence as an example of biased language: “There are ten nurses at a hospital, and five male nurses.” The text is biased because it lacks parallel language: by marking gender only for the male nurses, it implies that the writer sees female nurses as the norm.
Team members say they have been extremely careful in crafting the wording of the app’s results. “We try to be very careful in how we talk about [bias],” Baik says. “Because we’re not saying that if someone does that [uses certain language], then they are biased or misogynist or something. It’s more along the lines of: we all have biases and we don’t even notice them. We aren’t telling users not to use a source or that a source is inherently bad because of its bias, but we are sharing the bias that the source may have. Users should make the appropriate decisions in how they use the information from the source.”