“I asked ChatGPT for a hug”: Nigerians are turning to AI for emotional support
At 1 a.m., 23-year-old Tomi* was lying on her bed, exhausted and overwhelmed. She had just finished pouring her heart out, ranting about everything from unrequited love to the suffocating weight of underachievement. Her fingers hovered over her phone screen briefly before she typed: “I just want a hug.” Messages of reassurance came about a second later: “You’re safe here. You matter. And you’re not alone.”
This exchange didn’t take place in a therapy session or with a friend. It was happening on ChatGPT, a general-purpose artificial intelligence assistant best known for summarising text, writing better emails, drafting reports, and explaining complex ideas.

Tomi isn’t alone. Across Nigeria and even globally, users are turning to AI tools like ChatGPT for more than productivity. They are asking chatbots if they are good people, if they should leave their partners, or how to make sense of childhood trauma. For many, AI tools are standing in for friends who didn’t pick up a call or therapists they cannot afford.
Twenty-three-year-old Favour* started using ChatGPT as a study companion for her final-year project. When she returned to the tool, post-graduation uncertainty had set in. The chatbot allowed her to unpack the weight of the previous year, the terrors of job hunting, and the long wait for NYSC. “It’s not like I couldn’t talk to anyone,” she said. “I just wanted to rant.”
Before ChatGPT, she would make private voice notes to get things off her chest, but once, a reply from the chatbot caught her off guard. “It told me, ‘I want you to breathe. Just breathe.’” That “felt really personal,” she said. Since then, she has returned to ChatGPT in moments of doubt, after an argument, while applying for jobs, or wondering whether she should’ve responded better in a confrontation.
Can AI really care?
Chatbots are built on statistical prediction engines trained with massive datasets like books, online conversations, magazines, and more, to produce responses that sound human. But when a bot tells you, “you’re not alone,” is it truly being kind or simply mimicking kindness?
According to AI researcher and medical doctor, Jeffery Otoibhi, designing an AI chatbot that responds empathetically involves modelling three layers of empathy: cognitive empathy, where the bot recognises and validates a user’s feelings; emotional empathy, where it feels with you; and motivational empathy, where it offers a solution, advice, or encouragement.
He explains that chatbots are strong at cognitive and motivational empathy, but emotional empathy remains elusive because, at its core, AI responses are “based on the statistical patterns they’ve (AI bots) picked out from their training data. The training data cannot provide emotional empathy.”
There is a tension between what users feel and what bots are designed to offer. Chatbots like ChatGPT often include disclaimers in their responses, reminding users that they are not licensed professionals and should not be used as a substitute for therapy. In many cases, users either don’t read the fine print or simply don’t care. “Sometimes, I’ve thought about the fact that ChatGPT may use this info in another way. But I don’t care. Let me just get it out,” says Favour.
“I see them (disclaimers). I just quickly look away,” Tomi says about the app’s terms and conditions.
Otoibhi also highlights the risk of AI reducing complex human emotions to an average response based on what it has seen most often in its dataset. AI models learn and generalise over statistical patterns, he explained, which means their emotional understanding can be very generic. Human beings usually feel a mix of emotions at once, and AI systems might struggle with that because they’ve been trained to generalise over everybody’s data. “So, they will just pick out the most frequent emotion in the data set,” he said.
Tools like ChatGPT do not get at the heart of a problem the way a human therapist does; they are calculating the likelihood that you feel a particular emotion in that moment, based on all the data they’re trained on. If the comfort isn’t real, then why do people keep going back?
“It gives me hope…”
Ore*, a Lagos-based writer in her 20s, explained why she uses the tool this way: “It’s the idea that there’s something available out there that is echoing my thoughts back to me. It makes me feel better about myself as a human. It makes me feel good; it gives me hope.” Many users I spoke to echoed the same reasons: safety, comfort, availability, lack of judgment, and freedom.
“AI is like a safe space. A place where you can be brutally honest and you know for sure that there’s not going to be judgment,” Favour says.
For some, even when the responses feel artificial, they still return. “I asked ChatGPT for a hug. I was uncomfortable with its response. I know you’re not human; how can you say you’re wrapping me in a hug?” says Tomi. The next day, she went back to the chatbot to pour out more emotions.

Mental health professionals are not surprised. They say that the timing of people turning to AI for comfort is not random. World Health Organisation research revealed a 25% increase in the global prevalence of anxiety and depression following the COVID-19 pandemic.
“After COVID, people went into isolation, got into their shells, and became more into themselves,” said Boluwatife Owodunni, a licensed mental health counsellor associate. “So, having an AI respond that, ‘I’m here for you,’ might provide them with some sense of comfort.”
With therapy services often inaccessible and unaffordable for many Nigerians, Owodunni believes AI is stepping in to fill a very real gap in mental health support. “It (AI) is filling a gap. When I was working as a therapist in Nigeria, it was mostly wealthy people who had the opportunity to be in therapy.” She adds, “But the downside is that it’s fostering secrecy and stigma attached to mental health.”
Some users consider AI more dependable than a human therapist. Ore says a human therapist told her to “practice mindfulness,” following an Attention-Deficit/Hyperactivity Disorder (ADHD) diagnosis. She felt her concerns were brushed aside, so she turned to ChatGPT. “That felt more supportive as opposed to a 30-minute virtual consultation with my psychotherapist.” She insists that, unlike the vague reassurance she got in therapy, the chatbot offered a structured plan and practical ways to cope with ADHD.
What does the future look like?
As AI systems evolve, trained on more complex data, fine-tuned for context, and sharpened to mimic empathy, a question arises: how far will people go to deepen their connection to AI? Will human-AI companionship grow as these systems become more emotionally intelligent? Not everyone is excited by that possibility.
Some users have expressed concern over AI becoming too emotionally intelligent, out of fear that it could cross boundaries that should remain human.Â
Kingsley Owadara, AI ethicist and founder of the Pan-African Centre for AI Ethics, believes that emotional intelligence in AI can be useful, but not in the way most people imagine. “AI could be made as a companion to people with health challenges, and could meet the specific needs of the person,” he said, pointing to cases of autistic and blind people.
Other AI experts and developers warn against expecting too much from machines that aren’t built for the full spectrum of human care. “AI can only augment our current situation; it cannot replace psychologists,” Ajibade says.
The concern isn’t abstract. Mental health professionals and AI experts worry that as more people turn to AI for emotional support, real-world consequences could unfold. “We’re going to have a huge problem with social interaction, with empathy, with sensitivity, with understanding people,” says Owodunni. She notes the bigger fear that widespread reliance on AI bots may “foster secrecy and the shame attached to mental health or seeking therapy services.”
Still, for many users, the AI chatbot isn’t trying to be a therapist; it is the only space where they feel heard. “I told AI that I was tired,” Tomi says. “It said, ‘I know. You’ve been carrying so much for so long. It’s okay to feel tired.’” She didn’t reply. She didn’t need to.
*Names have been changed to protect privacy.

