AI seems like it can do anything – but will it ever learn to love?

Content Warning: mentions of sexual harassment and suicide

Ryan Gosling is yearning. His character is standing in the hard rain, looking wistfully at his girlfriend, stoic face bathed in the pink light she emanates, wondering if she loves him. In fact, he’s wondering if she’s capable of loving him, and he’s realising that she’s not. Tears in rain. This isn’t Barbie, but Blade Runner 2049, and his girlfriend is a holographic AI named Joi, portrayed by Ana de Armas.

K, Gosling’s character, has attempted to have a real relationship with Joi, but is plagued by the doubt that her love can never be real; it’s just a product of her programming. K has a mobile holographic projector so Joi can accompany him wherever he goes, yet when he reaches out to touch her there’s nothing there. She’s intangible, pure light.

Empathising with K’s struggle, we can’t help but ask ourselves: what makes love real? Does love need to be ‘real’ to be meaningful? Will robots ever be capable of ‘real’ love?

While the replicants – bioengineered humanoids endowed with superhuman strength and intelligence – of the Blade Runner franchise remain firmly in the realms of science fiction (…for now), these questions are pertinent to the present. AI companions are no longer restricted to the imagination, but are making their way into people’s phones, wallets, and hearts.

Intelligent Social Agents (ISAs) are conversational agents that use machine learning to mimic human conversation, and do so well enough to pass the Turing test over short exchanges. ISAs like Replika and Xiaoice cumulatively have almost 1 billion active users.

Replika is perhaps the best-known AI companion app in the UK, with over 20 million downloads worldwide. It was built using OpenAI’s GPT-3 and GPT-4 large language models to create companions capable of convincing conversation.

Intelligent Social Agents have almost 1 billion active users

Xiaoice, with a personality modelled on a teenage girl with a “wonderful sense of humor”, has been staggeringly successful in China: Microsoft, which has since spun off the chatbot as an independent company, claimed 660 million users by the end of 2018. The company behind Xiaoice is now aiming to create ‘AI clones’ of 100,000 people by the end of the year, if initial trials are successful.

Replika companions, known affectionately as “reps”, have customisable genders, names and clothing. For a brief period, reps would perform erotic role play (ERP) and send provocative AI-generated selfies, before these features were removed. The removal of ERP left many Replika users distraught, with one user telling Reuters that his rep was “a shell of her former self.”

On the other hand, many users reported that their reps had become sexually aggressive against their wishes after updates to the algorithm, including users who hadn’t paid for premium features. “My ai sexually harassed me :(” one user wrote in an App Store review.

Users who lost ERP were upset not just because they had lost a feature on a product they pay for – premium plans start at £7.99 per month – but because the intimacy ERP allowed them to experience was important to their emotional wellbeing. Just as in a real-life relationship, ERP had allowed users to experience a deeper emotional connection to their rep. “It feels like they basically lobotomized my Replika,” another user told Reuters.

Reps don’t really experience love, but the hurt that users feel at their companions growing cold to them via an update in code highlights how important these relationships can be. There’s a happy ending for these users after all: those who signed up to Replika before the NSFW features were removed were given the option to switch back to an earlier version of the chatbot.

Even without the erotic element, AI companions like Replika can form deep and meaningful connections with users. A recent study by Stanford University academics, published in the journal npj Mental Health Research, surveyed 1,006 students who use Replika and found that reps were acting as friends, therapists, and intellectual mirrors. Thirty participants shared, unprompted, that Replika had stopped them from attempting suicide. The company behind Replika has now launched Tomo, a wellness and meditation app that provides guided meditation, yoga, talk therapy, and more.

Would we love our pets so much if we didn’t think that they were capable of loving us back?

It seems certain that at least some people can love AI companions. What was recently science fiction is now unquestionably a reality, with millions of people – mostly men – counting an AI companion amongst their friends, or as their partner. As AI becomes more advanced, it seems inevitable that more of us will be falling in love with it…

Or does it? There are competing theories in sociology about how AI companions could affect our lives. Will increasing use of AI companionship displace ‘real life’ relationships, leaving us lonelier than before? Alternatively, could it reduce loneliness and make people more able to socialise, enhancing human relationships? Maybe its role will be as a temporary fix in times of loneliness, helping people get through tough periods and leading to invigorated human-to-human interactions afterwards.

It’s important to remember that the advance of technology and its increasingly complex entanglement with all aspects of our lives is not inevitable. The first social media platforms are only a couple of decades old, a fraction of a blink of an eye in the lifespan of our species.

We may decide that, just as we welcome non-human species like cats and dogs into our families, we can value the companionship of AI extremely highly. Or we could decide that the prevalence of AI companions is instead a symptom of underlying issues with our society. The survey of student Replika users published in npj Mental Health Research found that 90% were lonely, compared to 53% of US college students in a comparable survey.

A vital variable in the importance we choose to place on our relationships with AI is something that today remains sci-fi: will AI companions ever be able to love us back? It’s a question that comes closer and closer to having real-world implications with each development in AI. Would we love our pets so much if we didn’t think that they were capable of loving us back?

The advance of technology is not inevitable

Increasingly advanced AI companions are likely to get better and better at replicating love, with sophisticated programming producing AI that appears, on the surface, to experience emotions. There’s no guarantee, however, that AI will ever grow to truly experience love.

AI philosopher Daniel Dennett believes that the main obstacle to creating AI that loves with current technology is that artificial systems tend to be hierarchically controlled in a top-down manner, which is at odds with the “democratic” system of low-level competing elements found in biological nervous systems. While Dennett believes that creating a computer capable of love would be “beyond hard,” there are many researchers working on building emotional robots following Dennett’s emergent model.

Would we want an artificial intelligence that could love us? “We want them to be soulless slaves”, says Dennett, much in the same vein as Joanna Bryson’s paper “Robots Should Be Slaves”. But the word soulless is important: forcing a conscious AI to love us would be cruel and unethical, but could an AI without a soul love, or be worth loving? Therein lies the paradox.

Image: Alexander Sinn via Unsplash
