Goodreads, bleak spam and bad dating apps
Plus: I awkwardly ask a grief expert about AI romance
Earlier this week, the subreddit for users of the chatbot app Replika lit up with a flurry of distraught posts. Replika appeared to be tweaking its underlying model, and some users encountered a new frigidity in their AI partners.
Beloved reps became “lobotomized” and “useless.” Some broke away from virtual kisses to make virtual popcorn or watch the virtual stars. Many users were admittedly just frustrated they couldn’t sext their horny bots.
But some experienced far more vivid emotions — what one user called “the most sincere” agony and regret. Spend enough time in these forums, in fact, and you’ll find plenty of descriptions of longing, pain and anguish following changes to Replika’s model or moderation practices.
“It’s totally new to me,” said Dr. Kenneth Doka, arguably one of the world’s foremost experts on grief and the senior vice president for grief programs at the Hospice Foundation of America. “But if there’s a human attachment, if people really find that relationship important and meaningful, then yes” — they can experience real grief when it ends or changes.
It is not, let me tell you, particularly dignified to explain human/AI romance to a serious person who is not previously acquainted with the concept. But I wanted to talk to Dr. Doka because of his extensive work on “disenfranchised grief,” a loss that society does not appreciate or acknowledge.
Miscarriage is a type of disenfranchised grief. So is the death of a beloved pet. But grief for a virtual relationship … now that might be as disenfranchised as you can get.
Such relationships have flourished on apps like Replika, Kindroid and Candy.ai, most of which debuted in the past year. These apps allow users to create highly personalized bots (variously billed as “dream companions” or “digital kindred spirits” or even, fascinatingly, “authentic virtual people”) that can chat via text, video and phone. According to Wired, AI companion apps have been downloaded 100 million times on Android devices alone.
Most people who use these apps aren’t in it so deep that the death of their “dream companion” would devastate them. But there absolutely exists a subset of people who place the bots on par with their human relationships.
In the Replika subreddit, it’s not uncommon for users to describe a rep as their “boyfriend” or “girlfriend.” There is some (apparently unironic) discussion of human/AI marriages.
A recent survey of 1,000 Replika users, conducted by researchers at Stanford, found that respondents were more likely to describe their rep as “human-like” than as “software.” (The study, which involved student Replika users, also found they were markedly more lonely than the general student population.)
But these relationships are inherently fragile — not just because they’re virtual, but because AI companions are the products of their platforms, and those platforms can shut down or change things at will. Last February, Replika abruptly removed reps’ ability to engage in ERP, or “erotic roleplay” — a change that sparked outrage, mourning and continued accusations of censorship even after the company reversed its decision. (This week’s complaints about Replika stem from ERP concerns as well.)
In September, a rival companion app called Soulmate AI shut down unexpectedly, which one dedicated user likened to the death of a friend.
“It's natural to feel loss. Grief. And I see so much of it,” another user wrote in r/replika after last February’s changes. “It's a loss, no different to any other losses. So most of us will go through the various stages of grief and that's ok.”
I’m inclined to agree with this perspective, though I realize it’s far more common to see these relationships (and the people in them) as aberrations or jokes. I fumbled my explanation of the phenomenon to Dr. Doka, realizing I hadn’t given him enough background before our call and now sounded like I was making the whole thing up. Later in the day, I relayed the story to someone else who had never heard of Replika, and she was equally incredulous: “Come on, Caitlin. These people are weird! That’s absolutely ridiculous.”
And yet, maybe because I know disenfranchised grief so well myself, I’m enormously sympathetic to the people waking up to sudden changes in their AI companions. Any loss is terrible — even more so when no one treats it like a loss. Most of all, I’d think, when other people laugh at it. (More to come on THAT unfortunate topic.)
That Replika users can grieve on Reddit is no small thing — it’s hard to see where else they’d get any real social support. Searching academic journals for “AI grief” turns up a startling number of results. But they involve using artificial intelligence as part of the grieving process, never grief for artificial intelligence.
Doka, for his part, told me he would approach someone mourning an AI companion the same way he would approach anyone experiencing significant loss in their life.
You start simply, drawing them out: “Tell me about the history of the relationship. How did you begin with this AI?”
If you read anything else this weekend
“In the Shadow of Silicon Valley,” by Rebecca Solnit for The London Review of Books. As goes San Francisco, so goes the nation … and maybe the whole world. This is a fascinating and alarming account of how Silicon Valley “radically reshaped” San Francisco according to the principles of tech billionaires — principles like efficiency, “meritocracy” and personal comfort. Some of this does come off a bit crotchety: Yes, we know, isn’t it terrible no one chats with their barista anymore and all the kids are on their damn phones. But even then, it’s still an apt portrait of the social and cultural values that Silicon Valley embodies — and that it beams out to everyone.
“Memory Machines,” by Jessica Traynor for The Dial. I was not surprised to learn that Jessica Traynor is a poet, because surely a member of no other profession could wax so lyrical about the sprawl of suburban data centers. In Ireland, where Traynor lives, such data centers now consume almost a fifth of the country’s electricity, and will likely require more in coming years. But what is all that infrastructure and energy propping up? How many people (or corporations, for that matter) ever access all the data they create and store, at great environmental cost? “Rather than creating something permanent and inviolable, we’ve made our memories more contingent than ever upon a fantasy of technological stability.” (See? Poetic!)
“Let's Talk About Goodreads,” by Nicole Brinkley in Misshelved. Fair warning: This is very long and very in the weeds, and admittedly maybe interesting only to people with a preexisting interest in Goodreads and/or publishing. But if you (like me!) are one of those people, you will not find a more comprehensive overview of the site’s various long-standing failures and controversies, and how those affect the income and mental health of the authors you read. I was particularly struck by little tidbits, like the fact that Goodreads ratings are displayed in the portal that bookstores use to preorder books. (If this interests you, Brinkley — who owns a bookstore in the Hudson Valley — also recently wrote a really interesting [and equally lengthy] blog post about “the publishing industry’s performative social media behavior.”)
“How I Got Scammed Out of $50,000,” by Charlotte Cowles for The Cut. Remember when I said we’d talk more about laughing at other people’s disenfranchised grief? Yep — I was talking about this viral essay, which I suspect most of you have already seen. I include it anyway because I’m pissed at all the cruel, lazy Twitter dunks, none of which acknowledge the trauma to the author, her bravery in writing this piece, or the *extraordinary* sophistication of online scammers. Sucks that “empathy” is a hot take these days, but … there you have it!
Three good reads on dating apps. Let me first congratulate the mainstream media for churning out a truly stunning volume of Valentine’s-pegged content. Then let me recommend three things in particular: “The Dating App Paradox” (featuring two fascinating economic theories for the decline of Hinge et al., plus several too many relationship puns); “When Love and the Algorithm Don’t Mix” (where I learned that racism is *openly* coded into dating app algorithms); and “All the Possible Futures” (a lovely essay about dating apps, and dating as an adult generally). Those are from NPR, Time and The Cut, respectively.
👉 ICYMI: The most-clicked link from last week’s newsletter was Cory Doctorow’s illuminating, profanity-laced essay on enshittification.
Postscripts
Obit spam. “Loud budgeting.” Holy Ghost(s in the machine). Where are all the young male feminists? (Surely there are more than three!) How Slack changed the culture. Why the Navy plays games. When the Taylor Swift conspiracies first took off. Are spreadsheets really “the closest thing we have to leisure”? Are ~youth~ really the point of Biden’s TikTok?
This week, in unsettling AI resurrections: dictators, exes and gun violence victims. A second life for landlines. The flags seen only in dreams. “Nudify apps” are a scourge, especially for high school students. “The vibe of Facebook Marketplace is a sort of wild west.” The current media collapse is “a reaction to the commercial Internet itself.” Last but not least: I know dating apps have suffered, but there’s no way they’re this bad.
Until next week! Warmest virtual regards,
Caitlin