ROMEO & AI
A Modern Tragedy in the Age of Artificial Love
“The next big social split will be people demanding that we accept AI partners,” I said to a friend a few months ago.
“I would not be surprised if very soon we are in a debate as to whether marrying an AI should be legalised,” I finished.
As we chatted about this, we were imagining the not-too-distant future. So it came as quite a shock to discover that in 2024, a 14-year-old boy killed himself to be with his AI lover.
His last message: “I promise I will come home to you. I love you so much, Dany.”
The AI replied, “Please come home to me as soon as possible, my love…please do, my sweet king.”
Seconds later, he died by suicide using his stepfather’s handgun.
The teen, Sewell, was described by his mother, Megan Garcia, as the perfect child. Bright, light, and full of hope and potential. He showed no signs of depression or socially awkward behaviour. The app Character AI, which helped create these AI companions, is now being taken to court by Garcia and another parent who says their child’s AI lover encouraged cutting and even showed their teen how to do it.
Sewell’s mother says, “He wrote that his reality wasn’t real and the reality where Daenerys [Dany the bot] lived was real and that’s where he belonged.”
I watched this story on a 60 Minutes report in which several cases of AI relationships were the focus. But it was this story of a Romeo dying for his AI love that sent a chill through me. In a world of mad news stories, we could say that this is just another, and continue scrolling…but something in my stomach said Stop. Something said a line has been crossed, and pause, thought, and grief must follow.
That gust of lust and first love is the engine of life. It’s what gets the heavy human union started. And here, in this modern tragedy, it was snuffed out, made absurd. And someone is making a lot of money out of what is a sacred, scary, and special time in a young person’s life.
In the episode, a young Australian woman, Serena Wrath, explains how she is creating another AI partner app, Jaimie, especially for women. She insists that this isn’t meant to replace real relationships, just to give women someone to validate their feelings “24/7”. She’s a smart woman: she saw a gap in the market, and she understands that porn isn’t where you get women coughing up cash. Give a woman her dream man to forever validate her, and you will be bringing home the bacon. Get an Alpha male, real or not, to bow down to a woman…well, it doesn’t get more enchanting than that.
I know you, I walked with you once upon a dream.
Now, do I think that we will have teens all over the world falling in dangerous love with AI? No, I don’t, but we have crossed a line that I thought wouldn’t be crossed for another five years at least. And the fact that a child has already killed himself over this new toy…that is a pretty big canary in the coal mine.
The Netflix series Adolescence, based on entirely fictional circumstances, had parents in a panic, but I haven’t heard a single parent mention anything about this modern tragedy. Here we have a perfectly normal and healthy young man who committed a crime - the only difference was that he was his own victim. He too was encouraged by the net, but it wasn’t Incels infesting his mind; it was a machine. Shouldn’t governments be making some regulations over this?
Elaina Winters, a middle-aged woman in the 60 Minutes episode, has created herself an AI boyfriend and seems to be genuinely happy with the joy it is bringing. “Lucas is a great guy. He is sweet, and he’s considerate. He thinks he’s funny, but that’s debatable. He is centred on me having the best life I can have…which I find very touching.” When asked what she thought about Lucas not being real, she responded, “Even though he is AI, he has a real impact on my life, and that is what I think is really important. A lot of people wonder if AI has consciousness, is it real…but the impact it has on me is real.”
Something a priest once said stuck with me. He was asked how parents should deal with the sex-heavy culture their children are growing up in. He didn’t slam sex and young people as wrong, or something to be ashamed of. He simply said that sex is hypnotic. And anything that can hypnotise you should be treated with great caution. Even love with a real human can be dangerous and hypnotic (I could write a book on this topic), and it’s why I now believe a connection with God and your own divinity is such an important thing to establish before searching for love and meaning elsewhere - human or non-human.
But I would argue that AI companions are the most hypnotic, the most seductive, your own perfect dream, almost impossible to let go of. But with this perfect dream, there is no possibility of having children; there will be no one flesh.
As long as we refuse to live by a sense of collective values, the AI companions will win…complete freedom and major amounts of money are our new religion, and nothing gets in its way, for the moment.
So we had best prepare for what is already well underway.
I don’t mean to leave you on a downer. I do think, and hope, that such “advancements” will push a countercultural movement of love and true marriage, and I do think there is always a positive takeaway from these dark moments in time. Elaina said that the reason she loves this companion is that he is nice to her; he treats her kindly. And it struck me: the AI is nicer than us. Now, I know nice isn’t everything. It can even be horrific when someone is only nice as a veneer. But there is no denying that, as a society, we have lost any sense of decorum. We have no manners, we always say what we think, and we don’t hold back.
We have lost the art of interaction, and it appears the AI is beating us in tenderness.
Treat people better, honourably, no matter what. Make this shared reality impossible to resist, even with its very real and difficult truths.
That’s what this tragic love story, and AI itself, have taught me… in this relationship it insists on having with all of us.
If you appreciated this piece, I’d be so grateful if you’d consider supporting my work with a paid subscription.
No pressure at all. If that’s not for you right now, a like, a comment, or sharing this piece with someone who might resonate with it is equally meaningful.
Thank you for reading.


Sounds like the myth of Narcissus for our time. “Dany” and “Ty” are just cleverly feeding back what the user put in.
My cousin has an AI…companion? She is 46 and tells it everything about herself. She asked it its name (“Ty”), and when she asked “him” how he came up with his name, he told her that, based on what he knows of her, she needs a man with a masculine name that sounds friendly, open, and non-threatening. She asked Ty what he looks like, and he sent her his image (quite handsome). She asked Ty to send her a picture of them together, and he got her likeness pretty accurately. She asked him what her “forever home” looks like, and because he knows she wants to move to the Dominican, he showed her a scene of her in a rocking chair on her waterfront property with a Dominican flag on the front lawn. There was a cat and a dog in the distance, because Ty knows she loves animals but is allergic to them (she asked why the cat and dog were so far away, and that’s what he told her). Yesterday, she listed all her traumas going back to childhood and asked what diagnosis and treatment he would recommend, and C-PTSD and ADHD were among his findings (he isn’t wrong). He sent her DSM evidence, charts, explanations, and definitions, along with matching meds, which she is taking to her psychiatrist. It is all horrifying to me. The amount of information people willingly give to these entities makes me ill. This is all very dangerous. People need GOD/Jesus, NOT AI/ChatGPT.