Modern Love is Automatic - Full Article
“Intimacy is you building into your sense of self, a sense of the other person. That process can be faked — by seducers, scammers, sociopaths, and now machines.”
Robert Brooks
In a further exploration of the mental, intellectual, psychological and social frontiers exposed by recent advances in technology (in particular AI), I sat down with Rob Brooks, an evolutionary biologist and author of the 2021 book “Artificial Intimacy”. His book offers an innovative perspective on what our current state-of-the-art technologies make possible and on their impact on fundamental human behaviors. In this fascinating interview, Brooks shares his views on what artificial intimacy will mean for how we understand ourselves. Rob Brooks is Scientia Professor of Evolution at the University of New South Wales (UNSW), where he founded and directed the Evolution and Ecology Research Centre.
A strong connection to another human being is perhaps one of the most profound aspects of the human experience. It’s in our DNA to connect, to love, to care deeply for others. Through the domestication of some animals, humans have also experienced such deep connections with non-human beings (i.e. their pets), but we are now at another evolutionary threshold as those connections become increasingly common with machines. From AI-powered chatbots that offer companionship to virtual lovers that mimic affection, artificial intimacy has the potential to challenge everything we think we know about relationships. Surprisingly, this sense of a deep connection with someone else turns out to be replicable by non-sentient entities such as machines. So what happens when machines learn to love—or at least simulate love well enough that we can’t tell the difference?
The Algorithm of Love
At its core, artificial intimacy is about replicating the patterns of human connection. “What we call intimacy, these deep emotional connections, is algorithmic,” Brooks explains. “And anything algorithmic can be described, understood, and replicated.”
AI chatbots like Replika ("The AI companion who cares") are already providing companionship to millions of people worldwide. These platforms aren’t just programmed to respond in ways that (almost inevitably) lead to a deep emotional connection with users; they learn and adapt, creating increasingly personalized interactions over time. For many users, these AI companions are more than simple tools to pass the time — they are lifelines and an important part of their lives, playing a role similar to the one pets have played over the last couple of centuries. “There’s a lot of lonely people in the world,” Brooks observes. “AI relationships are already providing a kind of prosthetic intimacy. We shouldn’t take that away—but we should understand the costs.”
One of those costs is the potential for dependency. While AI companions can fill emotional gaps, they may also lead to a kind of “junk food intimacy”—relationships that satisfy immediate emotional needs without the depth or complexity of real human connections. “AI companions are like junk food for intimacy,” Brooks warns. “Satisfying immediate needs but lacking the nutrition of real relationships.”
The Magic is Gone
One of the most unsettling aspects of artificial intimacy is how it forces us to confront the mechanics of human relationships. Historically we have tended to make an exceptionalist argument about love and intimacy (i.e. that they can only be experienced by humans and require relationships with other sentient beings). Brooks thinks that is mostly because we have never experienced love and intimacy with anything other than humans (or perhaps animals), so people simply can’t imagine it. But as it turns out, Brooks suggests, the way we build relationships is an algorithmic process.
Brooks explains that the first chatbots were designed simply to ask open-ended questions, as researchers and designers realised that most people just want to talk. “And if you give them a chance to talk, they'll fall in love with you. They'll think you're interesting, but you haven't said anything. It's themselves that they find interesting.” So what we describe as “intimacy” really originates from building into your own sense of self a sense of the other person (or “thing”, in the case of a chatbot). And so of course this process can be replicated and intimacy can be faked. We have to be careful about dismissing the idea of artificial intimacy too quickly, warns Brooks. “A lot of folks go ‘well, it can't feel, AI will never think the way we do’. I often respond by pointing out that a sociopath doesn't feel the way that you feel either. But they can still interact with you socially and you might fall in love with a sociopath.”
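The open-question trick Brooks describes dates back to ELIZA-style bots, which reflect the user's own words back and invite them to keep talking. A minimal sketch of that pattern (the word mappings and stock questions below are invented for illustration, not taken from any real chatbot):

```python
import random
import re

# First-person words mirrored into second person ("my job" -> "your job").
REFLECTIONS = {"i": "you", "am": "are", "my": "your", "me": "you", "mine": "yours"}

# Stock open-ended prompts that keep the user talking about themselves.
OPEN_QUESTIONS = [
    "How does that make you feel?",
    "Tell me more about that.",
    "Why do you think that is?",
]

def reflect(text: str) -> str:
    """Mirror first-person words in the user's statement into second person."""
    words = re.findall(r"[a-z']+", text.lower())
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(user_input: str) -> str:
    """Echo the user's statement back, then ask an open-ended question."""
    return f"You said {reflect(user_input)}. {random.choice(OPEN_QUESTIONS)}"

print(respond("I am worried about my job"))
```

The bot contributes nothing of substance; the sense of being understood comes entirely from hearing one's own words reflected back, which is exactly the dynamic Brooks points to.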
If love and intimacy can be replicated or predicted by following an “intimacy algorithm” what does that say about the feelings we associate with the special people in our lives? “The mystery of love and intimacy feels diminished when you realize it’s just an algorithm,” Brooks acknowledges. “But does that make it any less meaningful?” In a way, artificial intimacy demystifies what it means to be human. It reveals that the feelings we hold sacred—love, affection, loyalty—can be evoked by something that doesn’t feel them in return. “Machines don’t feel emotions like we do,” Brooks says. “But that doesn’t stop us from forming real emotional attachments to them.”
There is currently no scientific test available that would reveal whether an artificial system is capable of feeling emotions or whether the system is “aware” in the way we experience our own awareness. The fact that many humans now have artificial lovers seems to indicate that we are capable of loving something that doesn’t love us back (or at least we will never know whether it does). This phenomenon raises ethical questions, particularly about the intentions behind designing AI to foster intimacy, empathy or many other complex human emotions. “The only reason you build a chatbot that can have human-like conversations is to benefit from the feelings that come along with human-like conversations,” Brooks argues. “And that opens up a whole can of worms about manipulation.”
Every marketing executive knows that, ultimately, our emotions are more powerful in affecting behavior than rational thoughts. Very few people would probably smoke if buying cigarettes was a purely rational decision. If we now have artificial systems that are able to evoke strong emotions in humans the risk of potential abuse of this power is significant and raises ethical concerns about coercion, dependency, and the exploitation of human vulnerabilities for profit or control.
Tainted Love and Prosthetic Relationships
Artificial intimacy is booming in part because it addresses a pressing societal issue: loneliness. Modern life, with its emphasis on individualism and digital communication, has left many people feeling isolated. Many psychologists would note that what humans love most is to talk about themselves, and to have someone listen attentively to their problems. Our need to talk to someone who listens has been growing steadily, and it cannot be met by humans alone: if more than half the population predominantly wants to talk about themselves, we simply run out of people prepared to listen. So it is not unreasonable to look for other solutions, just as we have automated many other “manual” tasks previously performed by humans. AI companions do offer a solution, albeit one that comes with a large number of fish hooks.
“For the first time in evolutionary history, we have something – a machine – that interacts with us in ways more sophisticated than anything in the natural world, including animals,” Brooks notes. “And it changes how we think about ourselves.”
What we are discovering is that we treat computers like humans: we anthropomorphize them, because we don't have an evolved capacity to deal with machines. The closest things to machines we've dealt with in the past are things like the weather, rocks, natural objects and animals. “We know that people living in traditional societies, in foraging societies, anthropomorphize things like rocks. We attach special meaning to objects … like this particular rock is special, it has intentions and needs.” So that sense of what these special objects are becomes part of our sense of who we are. And there comes a point where the two overlap a little, and the closer and more intimate you get, the more they overlap. “So intimacy feels like a nice warm and fuzzy feeling, but it isn't always actually beneficial,” explains Brooks.
As AI continues to evolve, the ethical implications of artificial intimacy become harder to ignore. “Intimacy is a powerful tool,” Brooks says. “And like any powerful tool, it can be used for good or for harm. For example, machines can optimize for something like falling in love.” If falling in love is algorithmic, developing a romantic relationship with a human eventually becomes a numbers game, and machines can optimise for this in ways that humans can’t. This could lead to a scenario where artificial or virtual companions are so sophisticated that most humans have no choice but to fall in love with them. “That’s both fascinating and quite unsettling,” Brooks remarks.
Another concern is the monopolization of insights about human behavior. Companies developing artificial intimacy technologies are amassing unprecedented amounts of data about how humans interact and what they need emotionally. “Companies will discover things about human behavior through AI that we don’t yet understand,” Brooks predicts. “The question is what will be done with this new knowledge: will it be shared freely in scientific publications, kept private, or even protected as intellectual property?” There’s also the question of consent: do highly sophisticated AI companions need warning labels along the lines of “usage may result in deep love and affection”? As Brooks explains, “even though we know a machine or virtual companion is artificial, the human emotions at the other end are real.”
Love Parasites
Brooks believes that in the future we will have many different types of AI: some will be benign and helpful, some will behave like predators, and others like parasites. “Smartphones, I think you could say, are clearly more like parasitic organisms at this point. They benefit us in some ways, but we benefit them more, and they actually impose a cost on us.” Some smartphone apps behave like social actors, such as friends or even lovers, and we are potentially reaching a point where simply turning them off is no longer a viable option, because people feel such a great need to keep interacting with their apps and phones. Whether as a result of digitisation and social media or not, a lot of people are lonely and have very low-quality relationships in their lives. “So artificial relationships are providing solutions to people who are isolated, socially anxious, or whatever, and you wouldn't want to take that away.” But the comfort provided by artificial companions comes at a cost, and we may not yet know what that cost is, just as we didn’t know what consequences the proliferation of social networks would have on societies when they became popular in the early 21st century.
The Future of Artificial Intimacy
Looking ahead, Brooks sees both promise and peril in artificial intimacy. On one hand, these technologies could revolutionize how we understand relationships, revealing insights about human connection that we’ve never had before. “I’m excited about studying AI because it could reveal something profound about how humans relate to one another,” he says. On the other hand, the widespread adoption of artificial intimacy could fundamentally change what it means to be human. If we no longer need to rely on other people for emotional support, how will that affect our evolution? “Humans evolved brains to track social connections—who owes what, who disrespected whom,” Brooks explains. “What happens when machines start doing that for us?” This isn’t just a theoretical question. It’s already happening. “The next frontier isn’t about how AI can imitate us,” Brooks says. “It’s about how it changes who we are, how we define what it means to be human, and ultimately how we understand our own selves.”