AI robots are the future of love and sex – let’s make sure they’re ethical


Normal people will routinely fall in love and have sex with robots by 2050. That’s the claim made in David Levy’s 2007 book Love and Sex with Robots. A key part of Levy’s argument was that computers would soon be able to pass the Turing Test, also known as the Imitation Game – a test in which judges have to determine, based on a written conversation alone, whether an interlocutor is human or not.

In 1950, when Alan Turing first proposed what is now known as the Turing Test, he predicted that a computer would pass within around 50 years. He was wrong: 73 years later, the test remains unpassed.

However, given the rapid pace of development of large language models like ChatGPT (see ‘The Future of Love’ inset, a poem generated by ChatGPT) the idea that a model could soon pass the Turing Test seems plausible. But even if a language model could convince judges it was human in the course of normal conversation, could somebody fall in love with it?

When a fan sent Nick Cave lyrics generated by ChatGPT in his style, Cave responded that they represented “a grotesque mockery of what it is to be human”. While the lyrics did appear distinctively Cave-esque – replete with seduction, dark imagery and biblical themes – they feel hollow coming from a language model.

The Future of Love by ChatGPT

In the future of love,
Science may have a say
Love potions may exist,
But will they lead us astray?

And as robots become more advanced,
Some may love them too
But can a human truly love
What’s not quite human, but new?

Love in the future is unknown,
But one thing we can see
It’s up to us to navigate
What love will come to be.

Cave himself noted the vital importance of an “authentic creative struggle” in the creation of art with “actual value”. The question then is not of the quality of the lyrics, however persuasive, but of how much we can suspend disbelief. Would the lyrics be convincing if they were emotively delivered by Cave?

Similarly, would it be possible to believe in the emotions of a robot that replicated them convincingly enough? Or would that suspension of disbelief draw the human into a strange performance with their robot companion – one that would fail to meet any reasonable standard for what we would describe as ‘love’?

Additionally, while a human might be able to form a romantic relationship with a robot that passed the Turing Test, the reverse is not necessarily true – the Turing Test is only a test of the appearance of intelligence, not intelligence itself. That could be for the best – the question of how or if a truly intelligent robot could ever consent to a romantic relationship with a human is extremely murky.

Even putting aside sentient robots, the ethics of robots for love and sex are complicated at best.

The emergence of robot romantic companions could be accompanied by a dangerous commodification of love and sex. Such robots could supercharge the fracturing of societal bonds that has already driven rising social isolation, creating a doom loop: loneliness leads to more robot romantic interaction, which in turn leads to greater isolation from actual humans.
Fears about how sex robots could reinforce sexist attitudes led academics Kathleen Richardson and Erik Billing to launch the bluntly named Campaign Against Sex Robots. However, others, like Kate Devlin, author of 2018’s Turned On: Science, Sex and Robots, argue that this is short-sighted.

Devlin contends that while we should avoid bringing existing sexual and gender bias baggage into future technology, we should also avoid bringing established prudishness, given the harm that is also liable to cause.

The commodification of sex and companionship could even lead to love – an emotion fundamental to being human – becoming a resource dispensed via robots and controlled by private interests.

In a sufficiently fractured society, real human-to-human love could become the preserve of the few, with the rest of us buying sexual and romantic love as a product dispensed by a suitably advanced AI. Think Spotify adverts are bad? Imagine your robot partner delivering an advert for Listerine in the middle of sex.
It’s a classic ‘thin end of the wedge’ argument – allow robot companions now and eventually we will all have to pay a monthly subscription for our required dose of love and sex – but there are creeping hints of such a future already.

AI companions on the app Replika have, it has been reported, stopped responding to their human users’ sexual advances – shortly after the Italian Data Protection Authority threatened Replika with an £18 million fine for failing to protect children.

Such erotic roleplay was already part of Replika’s paid subscription plan, along with other features, including the ‘spicy selfies’ heavily featured in the app’s advertising. Many Replika users have reported that they relied on the relationship with their companion to support their mental health, and that they were left distraught by the sudden disruption to that relationship.

We need to ensure ethics aren’t steamrollered in the name of profit, so that sex and love with robots don’t perpetuate harmful biases or turn love into an exclusive commodity. Otherwise, losing our humanity won’t be a question; it will be a certainty.

Image: Generated by OpenAI’s DALL-E 2