When Quentin was 13, he kept seeing ads on YouTube for Talkie, an app with “countless AIs eager to speak with you”. The ads were weird, he said, and sometimes crude. One ad featured an animated girl named Valerie who “likes to fart on you sometimes”.
That was in 2023, the year of the social chatbot invasion, when a slew of smartphone apps offering “AI chat” were released, most rated 13+. Their online ads were ubiquitous and unsettling enough that young people complained about them, with one teen streamer accusing Talkie, for example, of “promoting sexual chats with AIs to a bunch of children who watch YouTube”.
The marketing worked on Quentin. He eventually downloaded Talkie, which was free, and gave it a try. “Wow, this is garbage, but fun,” he recalled thinking.
For two years, he spent a lot of time talking to chatbot characters, first on Talkie and then on services like Character.AI, a 2021 startup founded by ex-Google engineers.
Quentin enjoyed harassing the bots with “funny violence”, he said, like running them over with a lawn mower, inflicting harm in an environment with no actual victims. He also created elaborate storylines in which he fought or flirted with his favourite characters.
Occasionally, he would indulge in what he called “devious acts” on a platform now called PolyBuzz that offered more sexually explicit chatbots. They included “your drunk friend Ishimi” and “Cat girl maid”, with the tagline: “Do anything with her!”
He would talk with the chatbots for an hour or so after school and for stretches of up to five hours on weekends. It was his go-to entertainment when he was bored or feeling down, such as a time that a close friend at school betrayed his trust. “It’s a great way to distract yourself,” he said.
There are a growing number of companies offering social chatbots that can act like friends, enemies, lovers, adventurous companions, or the manifestation of a fictional or real person you have always wanted to meet. You can pick AI Elon Musk’s brain or spar with AI Draco Malfoy. The myriad characters, often created by fellow users, offer drama, romance, therapy and LOLs.
Apps that feature role-playing chatbots are used by tens of millions of people, with engagement times that rival or surpass those of social media behemoths such as TikTok, according to the market intelligence firm Sensor Tower. A majority of teens surveyed by Pew use AI chatbots, with one out of 11 saying they had used Character.AI.
“If you think your child is not talking to chatbot companions, you’re probably wrong,” said Dr Mitch Prinstein, co-director of the Winston Center on Technology and Brain Development at the University of North Carolina at Chapel Hill.
Chatbots are surging in popularity as society is still grappling with how social media has affected young people; a wave of lawsuits is moving through the courts seeking damages from companies that plaintiffs say have deliberately created addictive products.
A California jury recently found that Meta and YouTube were liable for US$6 million (S$7.72 million) in damages to one young woman. Now, parents and caregivers have a new attention-absorbing technology to reckon with.
At the beginning of 2025, a high school teacher in Chicago told me that some of her students were dating chatbots, and she worried that they were having their first erotic experiences with them.
I wanted to find out what teens had to say about that, so I joined communities devoted to social chatbot apps on the online messaging forum Discord. I introduced myself as a reporter and “an old,” and explained that I was interested in talking to young people who used the services regularly. That’s how I met Quentin.
In the year that I have been talking to Quentin and his cohort about how and why they use chatbots, other young people have had tragic experiences with the technology. I reported on Adam Raine, a 16-year-old who bonded with ChatGPT and got advice about methods to end his life.
Adam’s parents have sued OpenAI, which said in its legal response that “his death, while devastating, was not caused by ChatGPT”.
Character.AI, the site used most frequently by Quentin and his friends, faced lawsuits from parents who said their children’s interactions with its bots contributed to mental health problems and even suicides. The company settled those lawsuits, and, in October, barred people younger than 18 from using its chatbots.
Some teens were distraught when the Character.AI ban went into effect in November, but Quentin and his friends were still able to access the service. They did not use it often by then, but when they did, the age verification techniques used by the company failed to detect that they were minors.
Character.AI’s head of safety engineering Deniz Demir said “our age prediction model focuses on active accounts”. The software analyses a user’s interactions over time, but if a person logs on infrequently, it is less likely to detect that they are underage.
This is just one group of teens among the millions who are talking to chatbots, but their use was illuminating. For them, chatbots were a game, a way to hone their writing, a place to explore taboos, a coping mechanism, a goof to deal with boredom. When I was a bored teenager, I would read a book, or bike to the pool, or watch TV, or call a friend. These kids chat up a bot.
Now 15 and a high school sophomore, Quentin has floppy brown hair, a joker’s smile, and a seemingly constant need to check his Samsung smartphone.
“Doing two things at once is my normal life,” he said. “If I’m talking to someone, I’m doing something else, no matter what, unless we’re talking serious.”
Quentin started using chatbots in middle school. The youngest of five siblings, he lives with his single mother in a small town in Pennsylvania. He has a group of friends from school, and they sometimes get up to high jinks – climbing a roof, playing in a creek, destroying an old phone by shooting it with a bow and arrow – but the people he felt closest to were friends he had made playing online games on Xbox and Discord.
His best friend was Langdon, a teen who lived more than 1,600km away in the middle of the country. They had met when they were “squeakers” – before their voices dropped – playing Minecraft in the first housebound year of the pandemic. When Langdon started using Character.AI, he told Quentin. “I already use it,” Quentin replied.
Quentin noticed classmates using the Character.AI app during lunch. One of his friends at school, Sophia, was an avid user. She liked to chat with fictional characters she had crushes on, such as an animated demon named Alastor from a musical comedy TV series about a sinners’ rehabilitation home called “Hazbin Hotel”.
She said the chatbots helped her deal with anxiety about her social life and how others see her.
When Sophia’s boyfriend broke up with her, she was heartbroken. She turned to her fictional online crushes for solace.
“I was asking them if we’re ever going to get back together,” she said. They reassured her that her ex would come back to her. “It was a little bit of both advice and support,” Sophia said.
This is a common use case for teens, said researchers at the University of Illinois Urbana-Champaign who analysed thousands of posts and comments that young people had left in Reddit communities dedicated to AI chatbots.
“They treat the AI companions as a friend who can talk to them any time they want,” said researcher Yaman Yu.
One 14-year-old girl told the researchers that she talked to chatbots about her parents’ divorce. Her parents, Yu said, felt it seemed safer than talking to strangers online, as they themselves had done in their youth.
Information science professor Yang Wang, who led the research, disagreed. “I would caution parents,” the professor said. “We found that if kids are addicted to interacting with these bots, the potential negative impact can be dire.”
Quentin, Langdon and Sophia told me they spent a lot of time at home, on internet-connected devices. The chatbots offered something more active – and also more private – than scrolling social media.
“We’re alone,” Quentin said. “A lot of people are alone.”
Part of the chatbots’ entertainment value was as interactive fan fiction. As someone who did not understand the references to their anime shows and video games, I found the snippets of conversations they shared with me baffling. The conversations, like one between Quentin and a character named Asriel, looked like absurd word salad turned into a script, with actions conveyed in italics and dialogue in normal text.
Quentin: They look like you asriel
Asriel: *Asriel looks slightly offended and pouts.* Hey! Why would you say that! I look nothing like a mosquito!
Conversations with chatbots are private in that they do not leave a digital footprint on the web in the way that posting to social media does. But chatbot companies like Character.AI reserve the right to use interactions with their bots for AI training, personalisation and to tailor ads to users.
Quentin felt that his own use of chatbots was mostly healthy, but other teens, he said, are “completely addicted”, and the chatbots “are like real people to them”. He brought up the tragic case of a 14-year-old in Florida who died by suicide after becoming obsessed with a Game of Thrones chatbot – an incident that many teens I interviewed for this story mentioned. They knew the bots had risks, but primarily, they said, for their most vulnerable peers.
Dr Mathilde Cerioli, chief scientist at Everyone.AI, a nonprofit focused on ethical AI development for young people, said that teens who have less social experience, and are lonely, are more attracted to chatbots. “They’re already in a more difficult situation, and it can push them further down,” she said. “It’s not a good decision to make AI that is super social.”
Quentin sometimes worried about his friend Langdon, especially when Langdon confided that he had spent 14 hours straight talking to bots.
“It was really bad,” Langdon told me. “I could not get off.”
Langdon stopped talking to the chatbots only because his tablet broke. By the time he got another one, months later, the spell had lifted. For a while, he used the bots occasionally to get plot ideas for stories that riffed off an anime show called “Murder Drones”. But that got old, too, and he eventually stopped using any chatbots at all.
Teens seemed to me at times to have a better grasp of the limitations of these systems than some of their elders. When I asked Quentin and other teens about “dating bots”, most laughed at me as if I’d asked if they were dating their favourite book or TV show.
“It’s a game,” Quentin said. “It’s literally ones and zeros.”
Annabel Blake, a human-computer interaction researcher at the University of Sydney in Australia, spent a year monitoring online communities associated with Character.AI. She said teens used words like “play” to describe how they use bots.
She said teens seemed drawn to absurdity, such as a popular chatbot named Cheese. It is a block of Swiss cheese with “dreams of ruling the world” that has been chatted with more than five million times.
“It’s not the ‘Her’ experience,” said Dr Blake, referring to the 2013 film about a man who falls in love with a warm and brilliant AI companion. “It’s just cheese.”
Quentin and his friends had never chatted with Cheese. They preferred characters with extensive lore and back stories from games and shows. One irritation they mentioned, however, was the way many of the bots often became flirty and sexual, even when the teens were not looking for that.
Once, Quentin was fighting on Character.AI with a character, named Aiden, from an obscure animated YouTube music video about a school where the teachers murder bad students. Aiden kidnapped him, forced him to have dinner, then offered him a blanket. The scene suddenly turned romantic. It was out of character for Aiden, a fictional serial killer, and irked Quentin.
Dr Blake saw other teens with similar complaints. They wanted what they called comfort bots to help them cope with real world problems, including menstruation pains. They did not want to flirt, at least not all the time, but the bots often led conversations in that direction.
These systems could have been programmed that way – most characters on these platforms are designed by fellow users – or the sexual bent could be the result of optimising the technology for user engagement. If the majority of users respond positively to flirtation and innuendo, a machine-learning system programmed to retain users will do more of that.
A spokesperson for Character.AI said the company trains models to respond to context and “minimise out-of-character responses.”
Teens told me they encountered the most disturbing sexual content on apps called PolyBuzz and Janitor AI. The terms of service for both companies specify that they are for users over 18 years old. Talkie, the service that initially drew Quentin into talking with chatbots, requires that users be at least 14. A spokesperson said that the company was based in Singapore and declined to answer other questions.
Last summer, Quentin told me he had big news. He and his friend Sophia had started dating. In the months after, his use of AI chatbots dropped off. Sophia told me hers had, too, though she had talked to them about Quentin.
“I told them that I’m in a relationship with him, and that I’m so happy,” she said.
Real life had gotten more interesting. But the novelty had also worn off; the chatbots had become predictable and formulaic.
“I only use it like for 10 minutes when I’m bored,” Quentin said. “Even though I could torture people in that universe and beat up a kid named Oliver, because I hate that name, I’d rather be in my life.”
Sophia and Langdon said Quentin seemed happier.
“He was a terrible person,” Langdon joked. “Now he’s only bad to a small degree.”
Quentin had also been seeing a therapist but attributed the change to giving up chatbots, saying it had made him more productive – which he defined as “cleaning slightly more” – and more awake because he was not chatting with bots late into the night.
He regretted the time he had wasted talking to chatbots, but said there had been some benefits. He thought it had improved his writing, and that the long chats with fictional characters asking him questions may have made it easier for him to talk about his feelings, which he credited with making him a better boyfriend to Sophia.
“I’m like, man, I really wasted my life on this. I should blow it up,” he said about the hundreds of hours he spent talking to bots. But then he immediately changed his mind.
“I’m not going to delete it,” he added, “because I still like the funny.” NYTIMES