I created ‘aunty’ chatbot to help abused women after family tragedy
2025-11-03 00:09:08
Megha Mohan, BBC World Gender and Identity Correspondent
A horrific murder in her family inspired South African Leonora Tema to create a digital platform where people, mostly women, can talk about and track abuse.
Leonora’s relative was just 19 years old and nine months pregnant when she was killed, her body dumped on the side of a highway near Cape Town in 2020.
“I work in the development sector, so I have seen violence,” Leonora says. “But what caught my attention was that the violent death of a member of my family was seen as very normal in South African society.
“Her death was not reported in any news outlet because the sheer volume of these cases in our country cannot be considered news.”
The killer was never caught, and what Leonora saw as silent acceptance of a woman’s violent death became the catalyst for her app, Gender Rights in Technology (Grit), which features a chatbot called Zuzi.
This is one of the first free AI tools made by African creators to address gender-based violence.
“This is an African solution designed in partnership with African communities,” says Leonora.
The goal is to provide support and help collect evidence that can later be used in legal cases against abusers.
The initiative is gaining interest among international women’s rights activists, although some warn against using chatbots to replace human support, emphasizing that survivors need empathy, understanding and emotional connection that can only be provided by trained professionals.
Leonora and her small team visited communities in the townships surrounding her home in Cape Town, speaking to residents about their experiences of abuse and the ways technology fits into their lives.
They asked more than 800 people how they use their phones and social media to talk about violence, and what prevents them from seeking help.
Leonora found that people wanted to talk about their abuse, but “were wary of traditional methods like the police.”
She says: “Some women post about their abuse on Facebook, even tagging their attacker, only to be served with defamation papers.”
She felt that existing systems failed victims twice, first when they failed to prevent the violence itself, and again when victims tried to raise their voices.
With financial and technical support from Mozilla, the Gates Foundation, and the Patrick McGovern Foundation, Leonora and her team began developing Grit, a mobile app that can help people record, report, and get a response to abuse as it happens.
The app is free to use, although it requires mobile data to download. Leonora’s team says it has 13,000 users and received about 10,000 requests for help in September.
At its core, Grit is built around three key features.
On the home screen there is a large circular Help button. When pressed, it automatically starts recording 20 seconds of audio, capturing what’s happening around the user. At the same time, it triggers an alert to a special rapid response call center – professional response companies are common in South Africa – where a trained operator contacts the user.
If the caller needs immediate assistance, the response team either sends someone to the scene themselves or contacts a local victim organization that can go to their aid.
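The Help button flow described above can be sketched in a few lines of code. This is a simplified illustration only: the names (Alert, CallCentre, press_help), the in-memory queue, and the fake recorder are all assumptions for the sketch, not Grit's actual implementation, which is not public.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

RECORDING_SECONDS = 20  # fixed clip length described in the article


@dataclass
class Alert:
    """One press of the Help button: who, what was heard, and when."""
    user_id: str
    audio_clip: bytes
    created_at: datetime


@dataclass
class CallCentre:
    """Stand-in for the rapid-response call centre; real dispatch logic is unknown."""
    queue: list = field(default_factory=list)

    def receive(self, alert: Alert) -> None:
        # In the real service a trained operator would now contact the user.
        self.queue.append(alert)


def press_help(user_id: str, record_audio, centre: CallCentre) -> Alert:
    """Record a short clip and notify the call centre in a single step."""
    clip = record_audio(RECORDING_SECONDS)  # the real app uses the phone's audio API
    alert = Alert(user_id, clip, datetime.now(timezone.utc))
    centre.receive(alert)
    return alert


# Usage with a fake recorder that returns one byte per second of "audio"
centre = CallCentre()
alert = press_help("user-1", lambda secs: b"\x00" * secs, centre)
```

The key design point the article describes is that recording and alerting happen together on one press, so the user does not have to take a second action while in danger.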
The app was created with the needs of abuse survivors at its core, says Leonora: “We need to gain people’s trust. These communities are often overlooked. We’re asking a lot of people when it comes to sharing data.”
When asked if the Help feature had been abused, she admitted there had been a few strange presses — people testing to see if it actually worked — but nothing she could call abuse of the system.
“People are cautious,” she says. “They are testing us as much as we are testing the technology.”
The second element of Grit is the “vault,” which Leonora says is a secure digital space where users can store evidence of abuse, dated and encrypted, for later use in legal proceedings.
Images, screenshots and audio recordings can be uploaded and saved privately, protecting important evidence from deletion or tampering.
“Sometimes women take pictures of injuries or save threatening messages, but these messages can be lost or deleted,” says Leonora. “The vault means that evidence isn’t just sitting on the phone that can be taken or destroyed.”
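A vault of the kind Leonora describes needs two properties: each item carries a trusted timestamp, and later tampering is detectable. The toy sketch below shows one common way to get tamper-evidence, a hash chain over the stored entries; the class name, field layout, and use of SHA-256 are illustrative assumptions, and Grit's actual storage and encryption scheme is not public.

```python
import hashlib
from datetime import datetime, timezone


class Vault:
    """Toy evidence vault: entries are timestamped and hash-chained, so
    altering any stored item breaks verification. Illustrative only; a real
    vault would also encrypt the data at rest."""

    def __init__(self):
        self.entries = []

    def add(self, kind: str, data: bytes) -> dict:
        # Chain each entry to the previous one's digest, so reordering or
        # editing earlier evidence invalidates everything after it.
        prev = self.entries[-1]["digest"] if self.entries else ""
        entry = {
            "kind": kind,
            "stored_at": datetime.now(timezone.utc).isoformat(),
            "data": data.hex(),
            "digest": hashlib.sha256(prev.encode() + data).hexdigest(),
        }
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the chain; any mismatch means the vault was tampered with."""
        prev = ""
        for e in self.entries:
            expected = hashlib.sha256(prev.encode() + bytes.fromhex(e["data"])).hexdigest()
            if expected != e["digest"]:
                return False
            prev = e["digest"]
        return True


vault = Vault()
vault.add("photo", b"photo of injuries (raw bytes)")
vault.add("screenshot", b"threatening message screenshot")
```

This is what makes the stored copy more durable than files sitting on the phone: even if an abuser gains access, silently altering or deleting an item leaves detectable traces.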
This month, Grit will expand again with the launch of its third feature – Zuzi, an AI-powered chatbot designed to listen, advise and guide users towards local sources of support.
“We asked people: ‘Should it be a woman? Should it be a man? Should it be a robot? Should it look like a lawyer, a social worker, a journalist, or some other authority figure?'” explains Leonora.
People told them they wanted Zuzi to be an “aunty figure” – someone warm and trustworthy, to whom they could speak without fear of judgement.
Although it was designed primarily for women experiencing abuse, during the testing phase Zuzi was also used by men seeking help.
“Some of the conversations are from perpetrators, with men asking Zuzi to teach them how to get help with anger issues, which they often direct at their partners,” explains Leonora. “There are also men who have been victims of violence and have used Zuzi to talk more openly about their experiences.
“People like to talk to AI because they don’t feel like they are being judged by it,” she adds. “It’s not human.”
UN Women reports that South Africa experiences some of the highest levels of gender-based violence in the world, with a femicide rate five times the global average. Between 2015 and 2020, an average of seven women were murdered every day, according to South African police.
Many, including Lisa Viten, a specialist in gender-based violence in South Africa, agree that technology is bound to play a role in tackling this problem.
But she also urges caution about using AI in trauma-focused care.
“I call them large language models, not artificial intelligence, because they perform linguistic analysis and prediction – nothing more,” she says.
She can see how AI systems might be able to help, but she knows of examples where other AI chatbots have given women incorrect advice.
“I worry when they give women overly confident answers to their legal problems,” she says. “Chatbots can provide useful information but are unable to deal with complex, multi-faceted difficulties. Most importantly, they are no substitute for human counsel. People who have been harmed need help to trust and feel safe with other humans.”
Grit’s approach has attracted international attention.
In October, Leonora and her team presented their app at the Feminist Foreign Policy Conference hosted by the French government in Paris, where world leaders met to discuss how to use technology and politics to build a more gender-equal world. At the conference, 31 countries signed a pledge to make tackling gender-based violence a major political priority.
Conversations are buzzing about the use of AI, says Lyric Thompson, founder and president of the Feminist Foreign Policy Collaborative, “but the moment you try to bring gender into the conversation, to bring up the dangers of racial and gender bias and xenophobia, the eyes glaze over and the conversation moves elsewhere, most likely to a room where there are no troublesome women to raise the topic.”
Heather Hurlburt, an associate fellow at Chatham House who specializes in AI, agrees that AI “has huge potential to either help identify and address gender discrimination and gender-based violence, or to entrench misogyny and inequality,” but adds that the path we take “is very much up to us.”
Leonora explains that the success of AI in tackling gender-based violence depends not only on the engineering, but on who gets to design the technology in the first place.
A 2018 World Economic Forum report found that only 22% of AI professionals globally are women, a statistic that is still often cited.
“AI as we know it now was built using historical data centered on the voices of men, and white men in particular,” Leonora says.
“The answer is not just more female creators. We also need more female creators of color, more from the Global South, and more from less-advantaged socioeconomic backgrounds.”
Only then, Leonora Tema concludes, can technology begin to represent the reality of those who use it.