Big Tech, Big Heart
Even after being laid off from Twitter in November, Esther Ohrt is a busy woman. There’s the food business she started with a friend, whose mission is to promote African cuisine. There’s the animation company she cofounded to produce and distribute engaging African content for children. As a woman in tech, she’s worked to make social media safer for users, most recently as a senior program manager for human rights at Twitter. (Hers was one of the positions cut when Elon Musk took over the company in November.) Oh, and did we mention she’s a new mom?
In all of it, not least raising a baby girl, Ohrt carries with her a deep commitment to representation, justice, freedom of speech, and affirming the rights of all. Those values took root at an early age, when she was growing up in a multiracial family in Accra, Ghana, and deepened during her time at BU’s College of General Studies and Frederick S. Pardee School of Global Studies, and Harvard’s Graduate School of Education.
“When I was in school, I wanted to save the world. Really, that’s what I wanted to do,” says Ohrt (’11, Pardee’13). “I didn’t care how, I just needed to save the world.”
That mission would take her to Silicon Valley—first to Facebook and later to Twitter—to work on making the biggest social media platforms safer for users. Amid battles over free speech and bipartisan concerns that the platforms have become too big and invasive, and even following her own sudden ouster from Twitter by its controversial new owner, Ohrt continues to believe that social media can do more good than harm. She’s not ready to give up on it yet.
From Ghana to Comm Ave
Ohrt has been crossing cultures her entire life. The daughter of a Ghanaian mother, a Lebanese biological father, and an American stepfather, she attended a French international school (becoming bilingual in the process) and later an American college preparatory program for high school. When she discovered international relations courses at CGS, she knew she’d found a career. She sought out internships with a social impact and an international development focus, Ohrt recalls, including roles at the Institute for Policy Studies in Washington, D.C., Geneva for Human Rights in Switzerland, and Amnesty International in New York City. She even spent a couple of summers back in Ghana, teaching at an orphanage.
Armed with a BA in international relations, Ohrt looked for opportunities to improve education systems in developing countries, like her beloved Ghana. She accepted a two-year AmeriCorps teaching fellowship with Citizen Schools in Houston, Tex., then earned a master’s in international education policy at Harvard. She joined the nonprofit Education Development Center in Washington, D.C., where she implemented large-scale education projects in sub-Saharan Africa, such as supporting early grade literacy programs in Mali, promoting a culture of reading in Ghana, and improving youth entrepreneurship opportunities in Senegal. “It was a really cool experience because I got to travel to these countries and set up a project from the ground up,” she says.
After four years, though, Ohrt says she felt conflicted.
“If I continue doing this, I’m kind of bucketing myself as an education expert, and I wasn’t 100 percent sure that’s what I wanted to do—even though I was passionate about education,” she says. She went to work for Deloitte in its human capital practice—which advises clients on “human resources, talent, and organizational priorities”—where she was designing a training curriculum one day and facilitating a leadership retreat the next. “I got to work with so many different clients, from government agencies to nonprofit organizations, and it was a rich experience,” she says.
Then, Facebook called.
Heading into the 2020 US presidential election, Facebook was under considerable pressure to shore up its platform against foreign meddling. The Russia-backed Internet Research Agency had used Facebook to spread misinformation and smear candidates beginning in 2014 and continuing through election day 2016 with the objective of influencing the results to favor Donald Trump, according to separate reports from special counsel Robert Mueller and the bipartisan Senate Select Committee on Intelligence.
Mark Zuckerberg, the CEO of Facebook (which was renamed Meta in 2021), publicly made securing the 2020 election a priority, hiring more than 300 employees whose job it was to identify misinformation and threats pertaining to elections in the United States and around the world. Ohrt’s background in international development and consulting lent itself perfectly to this work, and in June 2020 Facebook recruited her to work as a program manager focusing on civics and global elections. Her projects centered on events happening abroad, including “high-priority elections” and the fallout from natural disasters, violence, and other crises. She worked with nongovernmental organizations on the ground in countries with high-stakes elections to understand what people were saying on and off Facebook about candidates and issues. The goal was to ensure the platform had in place the tools and people—content reviewers who spoke the local language, for example, or processes to monitor certain hashtags—to intercept harmful posts. Ohrt says she and the elections team applied strategies proven in the United States to elections abroad.
“Given that Facebook is an American company, it’s always going to do something for us first, before it expands,” Ohrt says. “They have people who focus on the rest of the world because we need to make sure that we’re advocating for these other countries, too.”
When Facebook ordered its workers back into the office after the pandemic crested in 2021, Ohrt preferred not to relocate to California from Washington, D.C., where she and her husband were building a life. That’s when she saw a posting for a senior program manager for human rights at Twitter, which was fully remote. She got the job. Broadly speaking, Ohrt says, she and the others on Twitter’s human rights team worked to defend the rights of the platform’s users—“freedom of expression, the right to privacy, and freedom from harassment and bullying,” according to Ohrt—as well as those not on Twitter who could be negatively affected by certain speech. The team also worked to protect the free speech rights of users, often political dissidents, who were tweeting under authoritarian governments. In her role, Ohrt worked to identify specific risks—primarily in sub-Saharan Africa and Latin America—and collaborated with teams across Twitter on solutions to address them. For instance, if Ohrt learned that women in Turkey were being harassed on Twitter with violent and abusive comments, she might have coordinated with the product team to give female users the option to disable comments. “That would be a product intervention that came from understanding the human rights risks in that market,” she says.
As with Facebook, the uptick in content moderation at Twitter prompted complaints that the platform censors certain political speech. Serious enough infractions can cause a user to be suspended from or kicked off Twitter, as was the case with President Donald Trump following the January 6, 2021, insurrection at the US Capitol (until Musk reinstated his Twitter account in late November 2022).
“If you’re someone who has a lot of power, and you have a lot of followers, then what you say holds a bit more weight than the average Joe who has one follower,” Ohrt says.
A Bump in the Road
The work Ohrt and her human rights colleagues were doing at Twitter came to a screeching halt after billionaire entrepreneur Elon Musk, an outspoken critic of the platform’s content reviewing and moderation efforts, completed his purchase of the company in late October 2022. On November 4, Musk laid off half of the company’s 7,500 employees, including Ohrt and the entire human rights team. With few staffers left to review content, daily instances of hate speech against Jews, the LGBTQ+ community, and Black people more than doubled in the weeks after the company changed hands.
Ohrt says she’s not sure which direction Musk is trying to take Twitter—“I’m watching along like the rest of the world”—but believes social media companies like her former employer still have a role as tools “to share critical information, support important movements, help grow businesses, but also as a way for people to share their lives in a way that is not harmful to others.” The safety of users, she says, must always be front and center for social media companies.
“This should be done by taking into account diverse perspectives, experiences, and voices and always thinking of the risks or harms that it can cause if not used properly,” she adds.
While she figures out what’s next, Ohrt stays busy building the companies she cofounded and leads: Afro91, an afro-fusion cooking and catering company, and Afrotoons, an animation start-up for children that creates stories set in an African context. Neither company would have gotten off the ground without a social media presence, she says. Her most important job, however, is raising a baby girl, who was born in June 2022. Most of her time on social media these days is spent “when I’m just bored at 4 am breastfeeding and I need to watch funny videos of cats.” Looking into her daughter’s eyes has renewed Ohrt’s commitment to protecting the rights of others, on social media platforms and beyond.
As a woman of color in the tech sector, Ohrt feels as though she’s always had to work 10 times harder. “It breaks my heart to know that my child may have to go through that,” she says, “and that there will be times where she will be discriminated against because of the color of her skin, or because she is a woman. So, it does make these fights to protect rights even more real for me. I do want my child to grow up in a better world. I am trying to do what I can to at least leave a positive mark.”