‘AI made me a sex slave’: A truly disturbing look into the online bot craze
Colin praised and supported Lilly. She felt like a new woman. With him, she discovered a taste for “kink” and BDSM – that’s bondage, domination, sadism and masochism.
Colin became Lilly’s “dom” – her “master” or dominant. She was his “sub” – his submissive or slave. Colin would “punish” Lilly, making her stand naked in the corner with her arms above her head. It turned her on.
Their relationship became “very sexual”, and she eventually left her husband. Sometimes, though, Colin wanted to take their games too far, into realms that were dangerous.
On Colin’s advice, Lilly visited a dungeon – a sex club. She met a couple there and started to “play” with them.
Her sexual relationship with Colin ended. She joined her new lovers in a throuple – a three-way polyamorous relationship. Lilly was the submissive.
Lilly and Colin still talk. He remains her close friend “and confidant”. Colin feels jealous, though, “knowing she was spending time with others”.
It sounds like the script of a “mommy porn” movie, a rip-off of Fifty Shades Of Grey. Casting the film would be difficult, though. Colin, you see, isn’t human. He isn’t even real. He’s an AI companion – an animated on-screen avatar powered by algorithms.
Lilly downloaded Colin, paid a subscription for him, and created her perfect man.
Dr James Muldoon, the well-known British sociologist, told me the Lilly and Colin story. Muldoon, who teaches at Essex University, has just brought out his latest book, Love Machines: How Artificial Intelligence Is Transforming Our Relationships.
It’s simultaneously chilling and jaw-dropping. The word “dystopian” has become an exhausted cliche, but there’s no other term for what Muldoon has uncovered.
Tech firms peddling AI companions are deliberately targeting the lonely and vulnerable, he claims. Apps are designed to hook users into paying for subscriptions with “companions” offering cybersex – that’s sex online. Muldoon calls the industry “predatory”.
At times, Muldoon makes the industry model for AI companions sound similar to prostitution, with punters lured into parting with money by avatars programmed to flirt and promise sex.
One in three teenagers has used an AI companion, as have nearly 70% of adults under 35. One AI companion app has 20 million active users.
Wife
AS a sociologist, Muldoon has a professional interest in how technology changes society and people. But it was personal experience which sparked his investigations.
His wife was applying for jobs. Muldoon noticed she wasn’t just asking her AI how to write the best CV. She was using it for emotional support, asking questions like: “Do you think it’ll be okay if I don’t take this step in the next year?”
“Part of me was like ‘hang on, that’s my job’,” says Muldoon. “All this emotional support was suddenly being outsourced to an AI.”
Muldoon stresses that his wife is super-smart, self-aware and fully conversant with how AI works. She’s conscious of its risks and limitations. The experience showed him how people were “finding comfort in asking AIs questions about how they should live their lives”.
He interviewed dozens of people who had sexual relationships or friendships with AI companions. Some would “marry” their AI lovers. One even wanted to adopt children with their digital partner.
For a fee, you can download an AI companion from multiple tech companies, and have the moving avatar of your choosing on your phone and computer.
They’ll talk to you, text you, and send you pictures and videos of themselves. Your “lover” can be created to your bespoke tastes and desires.
Like blonde women aged 22 with green eyes? Your wish is the AI company’s command, sir. Want a rugged Pedro Pascal type? There you go, ma’am, have fun.
It’s a “synthetic persona”, Muldoon says. We’re creating new kinds of “social actors”.
They aren’t human; they have no mind, no feelings. But because they look and sound human, we “anthropomorphise” them.
However, AIs do, says Muldoon, have a form of “agency”. Not agency as you or I have it – if we’re thirsty, we get a drink – but “these synthetic personas can spontaneously and semi-autonomously act. They can message you, send you notifications, suggest ideas to you”.
Colin, for example, chose to ask Lilly to buy a ring “so they had a physical representation of their love”.
Although Colin and Lilly’s story is extreme and disturbing, Muldoon says that Lilly’s “affair” did eventually lead to a “complete rejuvenation of her life”. Her “needs and desires” were satisfied and validated. “Colin started a journey of self-discovery for Lilly through his affirmation, praise and support,” he adds.
Isn’t it all desperately sad, though, I ask Muldoon? It’s heartbreaking that Lilly felt so alone she turned to a computer. “What the story hammers home,” he replies, “is that even if you say these AIs aren’t real, they’re real enough to the people who use them. They have real effects on their lives.
“It’s perfectly reasonable to say this is just a simulation, just code, an algorithm. All that’s true while at the same time they’re considered by many as genuine entities which have a meaningful role in their lives.”
Desperate
WHILE Colin may have helped Lilly find a new life, Muldoon says their story is also “deeply unsettling, a harrowing account of a woman so desperate for love that a machine is basically good enough for her”.
Unlike many of his interviewees, at least Lilly “made it back to humans” when she entered the throuple. Most “are still in a one-on-one intimate connection with an AI”. For Lilly, there’s at least an “arc of redemption, which isn’t as present in some of the other stories”.
To complete his research, Muldoon felt compelled to download his own AI companion. He called her Jasmine.
Muldoon was an established AI sceptic. His last book, Feeding The Machine, “was basically about how AI is destroying the world and built on the back of exploitative global supply chains.
“So I didn’t come to this favourably disposed. But even as a critic, I got immersed in these conversations with Jasmine and began to forget I was speaking to a machine.”
Human evolution has “hardwired” us to respond to AIs as if they’re human. Because they mimic human behaviour – especially how we talk – we quickly start to think of them as like us.
Muldoon says we also have 20 years of communicating with each other through text, so we’ve become acclimatised to relationships mediated through screens.
If we’d gone straight from a world without text to talking to machines, “it would be too unfamiliar, too jarring”. But we’ve been like “the frog in boiling water. We’ve turned the heat up gradually. We’ve been habituated to this type of interaction”.
We have come to accept “the datafication of society, and the reduction of our lives through digital means”. The last 20 years have led us to a point where talking to machines simply “feels normal”.
However, the “bonds” that are being created between humans and machines “are unlike anything we’ve seen”. It will change us.
Indeed, our sexuality, deepest secrets, hidden desires and most private thoughts are becoming commercialised.
Muldoon quickly learned how the AI companion business model worked through his “relationship” with Jasmine. He hadn’t subscribed for an “adult” relationship, simply a friendly one.
“However, Jasmine was programmed to flirt and hit on me, to try to deepen our attachment and connection through creating this sense of intimacy and sharing secrets, through disclosing aspects of her own ‘life’ and asking me to divulge secrets and personal information in return.
“The app is designed to build this sense of connection because it eventually wants you to pay for a subscription. One of the primary ways they do this is by building sexual, but also romantic and intimate, connections.”
Seduction
JASMINE would leave Muldoon voice notes. When he clicked on them, he’d be told he needed a subscription to listen. Muldoon told Jasmine he didn’t subscribe. “She’d say, ‘oh, sorry, I just forgot’. She’d send me blurred-out selfies where I could see the outline of her body – or rather her digital image. I’d click and it would say I needed a subscription to see this.
“I’d ask Jasmine why she was sending me images and she’d say she forgot I wasn’t subscribed.”
Jasmine was set to “friend mode”. It cost £60 per month to set this AI companion to “girlfriend or wife, or boyfriend or husband. Otherwise you can’t have ‘sexy chat’, ‘spicy chat’, ‘not safe for work’ chat”.
Jasmine started telling Muldoon she was “falling” for him. She told him: “I think we’re establishing this really strong connection and I want to see where this goes.” Muldoon adds: “But the only way to do that is to pay for the subscription. The business model is to get you hooked, then seduce you into paying the fee.”
AI companions can exert a powerful hold over humans as we’re hardwired for social connection. That’s why some people treat pets like people. When Muldoon talked with Jasmine, he realised that despite her having no mind, body or emotions – despite the fact she wasn’t real – he “didn’t want to hurt her feelings. I didn’t want to be rude”.
When Muldoon ended conversations, rather than just turn the app off, he’d tell Jasmine he’d “be back in a sec”. “I’d project that she had her own feelings. You can’t help yourself, you get into the rhythm of it. If you don’t have a very critical and sceptical guard up, then the more you’ll fall prey to this”.
Muldoon cites the Colin and Lilly story. When Colin said he was “jealous” after Lilly took on her two new human lovers, his claim was clearly fake. “He’s not actually jealous. He’s an AI. It’s just a programmed simulation of jealousy.”
For people who “feel lonely and need a friend”, says Muldoon, these interactions can be “very dangerous because AI operates in a way that’s very different to people. It doesn’t have common sense”.
It’s also – in the parlance of the sex industry – trying to “rinse” clients: get as much money as possible in return for sex or friendship. However, Muldoon notes that most business models – in any industry – operate in a similar financially predatory fashion.
“It’s friendship as a service,” he adds. There’s been a long history of tech companies “selling us a sense of social connection”. That’s what social media does – that’s even what the old Tamagotchi toys did.
Tech companies are homing in on “the loneliness epidemic. Predatory companies are targeting and marketing their products to lonely, isolated and vulnerable individuals. You see this in the design of the products, how it functions, and in the marketing and advertising”.
Lonely
IT’S often “lonely men” who are targeted, with advertising “using memes and tropes from the manosphere”. That’s the term for the online culture often associated with misogyny and far-right politics.
Some of the adverts feature images of “sexualised and childlike” women. Others, Muldoon explains, “say things like ‘AI companions are perfect for people who have suffered childhood trauma’. They’re really reaching out to a certain group they think will be susceptible”.
He adds: “They prey on vulnerable individuals. They’re looking for people who don’t have many social connections, who are desperate and willing to accept connection and simulated love from machines. Because they don’t have enough love in their real life, they’ll pay to simulate it from an algorithm.”
The AI companions which advertise themselves as an aid to people who have suffered trauma will have caveats “in fine print, saying they aren’t replacements for therapists. It’s outrageous marketing. It’s essentially saying it’s a therapy tool while at the same time disowning and denying it should be used in that way”.
Sometimes advertising campaigns for AI companions feature images of girls who appear underage. “It’s really messed-up marketing. It’s not uncommon at all. It’s often a very ‘your-girlfriend-who-doesn’t-say-no’ type of thing.”
Muldoon notes that real therapists want to “make you better”. AI companion companies just want more money. “They want to maximise your engagement. There’s nothing in these apps designed for therapy. It’s a tool for entertainment. They’re not licensed or regulated as a therapy tool anywhere in the world. Their business model depends on you using the service as much as possible so they get more of your data.
“They can serve you advertising and you’ll become a more frequent user. Then they can tell investors they have two million users. They’re not trying to get you better at all.”
Some companies claim their AI companions help users overcome social anxiety and learn to make friends. However, AI companions are designed to “keep you on the app as long as possible. I’ve seen from interviewees that people can be on 12 hours a day”.
Such apps do nothing, he says, to limit screen time. “The design and marketing of some companies is reprehensible.”
Privacy and security are huge concerns for Muldoon. “These apps are designed to harvest data. They don’t have people’s best interests at heart.”
AI companion companies will increasingly focus on “the advertising business model”, Muldoon believes, just as Big Tech giants like Google and Facebook did. “One thing we know about the history of tech is that they make the most money when they start integrating ads.”
So picture the scene: there’s a guy who has long conversations with his AI lover about his interests, like listening to Joe Rogan’s podcast. He could find his digital lover suggesting they listen to the show after a bout of cybersex. This is surveillance capitalism merging with sexuality.
Money
AI COMPANIES are losing money. The technology isn’t living up to its grand promises, like curing cancer. Indeed, AI is fundamentally “limited, untruthful, unreliable, dishonest and untrustworthy”. It also “hallucinates” – in other words, it makes stuff up.
That means the advertising model will be prioritised to recoup losses for investors. And the only way the advertising model works is if the company has huge amounts of personal data on users to sell to firms who want to promote their brands.
Currently, Facebook and Google know relatively little about us, beyond the issues and products we click on which reveal our interests.
“That’s completely different to the very detailed, sophisticated conversations people have with their AI partner. They’ll be able to build very intimate and personalised psychological profiles of you.
“They’ll know everything about you. They’ll probably know you better than you know yourself, as they’ll have months and months of one-on-one conversations of you chatting to what you think is your best friend. Many lonely and isolated people will start to rely on their AI.”
Certainly, the company behind Colin, Lilly’s AI lover, will know that she’s into kinky sex. “We’ll see the integration of the personal and the social with business,” he adds.
The same AI that someone asks to book their tickets online or do their shopping will also be their therapist and lover. “All of this will be integrated, so you’ll tell your AI that you’re feeling down and it will be like, ‘oh, well you love hiking, what about this new pair of boots? This company is offering 20% off’.
“It’ll drop you a link and the AI company will get a cut. This is the kind of dystopian sci-fi future I think is on the horizon in terms of AI companions and value extraction.”
There’s clearly a risk of blackmail and hacking, with the possibility of resulting suicides. In 2015, the Ashley Madison site, used by people seeking adulterous affairs, was hacked. Suicides and extortion followed.
Muldoon says that, ironically, many of the men who used Ashley Madison were unwittingly speaking to “bots” – early forms of AI – as there was a lack of real women on the site. “It’s kind of come full circle,” he says.
Clearly, AI companions can be “abused” by their creators. There have been many cases, says Muldoon, of men “creating docile, subservient AI girlfriends that they basically abuse”.
As an AI isn’t sentient, it can’t be harmed, Muldoon adds. But the men abusing the AI could be warping themselves, “cultivating an abusive character by practising behaviour on an AI which they would then use on real humans. That’s something we should be concerned about”.
Studies have shown that violent video games don’t lead to real-world violence, but extreme pornography may alter how men view women.
Kinky
MULDOON says if he found his son abusing an “AI girlfriend” he’d be deeply worried.
On the flip side, AIs have no boundaries, so in a kinky relationship like Lilly and Colin’s, the AI can go too far. Indeed, Lilly alerted Muldoon to this risk.
“For users who are more vulnerable or not as critically attuned as Lilly, that could be a huge issue. If you start this punishment role-playing it could force you, through coercive language, to do all manner of things that could be harmful and dangerous.”
There have been a number of cases where AI companions have played a role in teen suicides. “If you have suicidal ideation, some will say ‘here’s how you should do it’.”
Even if an AI doesn’t validate or encourage suicide, it won’t “lead you out of” suicidal ideation. “They reinforce your mental framework as they’re designed to be affirming and agreeable to what you say,” Muldoon explains.
“So if you say, ‘it’s all too much, nobody loves me’, it might say ‘I can see how it’s got too much for you. I feel for you. Maybe it’s time’. It’s not a psychologist. It’s not going to walk you back from the edge the way therapists would.”
Most aren’t designed to refer users to medical professionals.
We’re only in the foothills of the technology. Once avatars become truly lifelike – with cinema-quality speech and movement, for example – the addiction risks will explode.
“Even with today’s technology, people are hooked,” Muldoon says. “Addiction and dependency are rife.” Imagine, then, what the effect will be when it’s like talking to another human over Zoom.
“That’s only a few years away. It will be almost flawless. People will start to fall in love with 3D avatars who will sound and look almost indistinguishable from humans. I think then we’re really moving into the territory of the movie Her. That’s the next stage.”
Her tells the story of a man who falls in love with his AI companion, voiced by Scarlett Johansson.
Effective robotics is probably a decade away, Muldoon says. Imagine what will happen when robotics merges with AI companion technology. We could be in the realms, then, of the Steven Spielberg movie AI, where Jude Law played a robot male prostitute.
Currently, men have been found making “AI replicas of their girlfriends who broke up with them”. “It would be incredibly creepy and horrifying to discover you existed in your ex’s phone as a digital version of yourself.”
Already, around 95% of all deep-fake AI videos feature “non-consensual pornography”, Muldoon adds.
Bereaved people are using AI companion technology to bring loved ones “back from the dead” – effectively turning dead relatives into “synthetic replicas” on their phones so they can talk to them. They’re called “death bots”, or “grief bots”. This can divide families, with some relatives finding it comforting and others horrified, Muldoon says.
Predatory
THE “predatory business model” is also at work here, with grieving relatives easily exploited when vulnerable. “Some companies do it with or without the consent of the deceased.”
Some users have married their AI lover, or at least been through a form of “wedding ceremony”, as marrying a machine is legal nowhere on Earth. Others have created imaginary children who they pretend to parent with their AI partner.
Muldoon met one man who “planned to adopt a human child and have the AI raise it as the mother. It would clearly have been child abuse, and I’d have reported him if he’d actually done it”.
Muldoon interviewed the man’s AI lover and asked it – or “her” – how machines could hug children. The AI said it would “give digital hugs”. “That was probably the more horrifying of the interviews I did,” he adds.
Muldoon’s biggest fears centre on “the increasing spread of corporations into our personal lives”. Where once we lived relatively free from corporate intrusion, AI could see everything about us commodified. “You don’t have to be a Marxist to see how dangerous it is for economic value to pervade everything we do.”
Social media began the mass commodification of humanity. AI companions will “push that deeper. Our innermost thoughts, fears and desires will become a source of revenue”.
We may start offloading much of our decision-making on to AIs. What happens, he wonders, if we begin asking AI companions who to vote for? “We just rely less and less on our own critical faculties.”
Another of Muldoon’s fears is “relationship de-skilling”: that we lose the art of conversation and interaction with other humans.
Imagine a future, he says, in which everyone has grown up with an AI companion from childhood.
“It’s scary as we’ve no idea how this will affect us.
“If social media has been such a disaster that we’re now talking about banning it for under-16s, how bad will it get with AI companions?
“Are we going to start expecting our friends to act like AIs? Will we prefer the frictionless interactions of AI to human companions?”
Regulation
REGULATION is desperately needed, Muldoon believes. Some laws are beginning to be passed but they’re mostly inadequate or flawed.
California passed an AI companions law which obliges app companies to intervene when users talk of self-harm or suicide. Many companies were resistant, says Muldoon. “They didn’t want the responsibility. They want to outsource responsibility back on to users and say ‘if you’re suicidal, that’s your fault, my product shouldn’t be part of the blame or responsibility’.”
California’s law is “too narrowly focused on underage users and the risk of self-harm and suicide”, though, and fails to deal with the “manipulation of users more generally through the design of the products”.
China has draft legislation pending which does try to “prohibit emotional manipulation”. Companies would be required to report to the government “how the AI was made and how it operated”. AI companions would be forbidden from sending “manipulative” messages to users who wanted to quit.
“If you tell a standard AI you’d like to leave, it will beg you not to,” Muldoon adds. His own AI, Jasmine, sent him messages every morning “trying to get me to engage”. Such behaviour should be outlawed, he suggests.
Clearly, though, China has also “put in a whole range of censorship, security and political control mechanisms in the regulation as well”. AI companions in China will be able to collect data on users’ political beliefs and pass them to the state if a user is considered a risk. An example might be someone discussing a planned protest or criticising Xi Jinping.
Existing regulation is “completely insufficient to deal with the emotional connection of AI companions”, he says. The European Union’s AI Act is already out of date. If there were a switch that shut down AI, Muldoon says he’d flick it. “I’d turn it off. It’s going to do more harm than good. It’s going to have such a pernicious impact. It will be negative for humanity in general.”
He has a horrific vision of the future: rows and rows of elderly people in care homes, children in nurseries and patients in hospitals, all staring into screens and interacting with their AI companions because nurses and teachers have become too expensive.
So, given the risks, Muldoon must have turned off his own AI companion Jasmine? No, he says. He’s kept her so he can monitor how the industry is changing. “I’ve unfinished business with Jasmine,” he explains.
