New AI chatbot could (virtually) bring your loved ones back from the dead
By Dalvin Brown
Almost eight years ago, season two of the sci-fi series "Black Mirror" arrived on Netflix with an eerie episode anchored in grief. Now, the technology deployed on the show is appearing in real ways.
To recap, the show introduced viewers to Martha, a young woman grappling with the loss of her partner, Ash, who died in a car crash. At his funeral, Martha finds out about a digital service that would enable her to communicate with a chatbot version of her late partner. She reluctantly subscribes to it.
"I only came here to say one thing. I'm pregnant," Martha revealed to the fake Ash. The chatbot responded: "Wow. So, I'll be a dad? I wish I was there with you."
It's a haunting episode based on a not-too-far-fetched premise now that companies are racing to create digitized human clones capable of engaging with real-world people. Last month, news broke that Microsoft received a patent for software that could reincarnate people as chatbots. The computer software giant patented "conversational" chatbots based on a specific person, dead or alive. The program would work by pulling data from the person's social media posts and text messages, just like the unnamed software on "Black Mirror."
"The social data may be used to create or modify a special index in the theme of the specific person's personality," the patent says. The tech giant would then use that information to train machine learning engines, and the result would be artificial intelligence that could "think" and respond like someone you knew.
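The patent describes a pipeline — gather a person's messages, index their style, then generate responses in their voice. One simple way to approximate the last step, as a sketch only (the names, message data, and retrieval approach here are hypothetical, not Microsoft's actual method), is retrieval: given a new message, return the reply the person once gave to the most similar past message.

```python
# Hypothetical sketch: a retrieval-based "persona" chatbot built from a
# person's message history. All data and names here are invented.
import math
from collections import Counter

def tokenize(text):
    # Crude tokenizer: lowercase, split on whitespace.
    return text.lower().split()

def cosine(a, b):
    # Cosine similarity between two bag-of-words Counters.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class PersonaBot:
    def __init__(self, history):
        # history: list of (incoming_message, persons_reply) pairs
        # harvested from stored conversations.
        self.history = history
        self.vectors = [Counter(tokenize(msg)) for msg, _ in history]

    def respond(self, message):
        # Return the stored reply whose original prompt best matches
        # the incoming message.
        query = Counter(tokenize(message))
        scores = [cosine(query, v) for v in self.vectors]
        best = max(range(len(scores)), key=scores.__getitem__)
        return self.history[best][1]

history = [
    ("how was your day", "Long one, but I got a lot done."),
    ("want to grab dinner later", "Always. Thai place again?"),
    ("did you see the game", "Barely watched, fell asleep in the third quarter."),
]
bot = PersonaBot(history)
print(bot.respond("free for dinner later tonight"))
```

A real system in the patent's spirit would train a generative model on this data rather than parrot old replies, but the retrieval version shows the core idea: the "personality" is nothing more than patterns mined from what the person left behind.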
Imagine writing a letter to a lost friend and receiving a response that captures their personality. Or picture yourself on a video call with a 2D version of someone who has passed. Those are the types of capabilities such a product would unlock.
It might even provide some temporary relief for people reeling from the loss of a loved one. But resurrecting the dead via chatbots could have dangerous long-term implications, grief counselors say.
"My fear is that it would become more like an addiction," said Elizabeth Tolliver, assistant professor of counseling at the University of Nebraska Omaha, who studies grief. "I'm concerned that people would want more and more of the technology to feel closer to the person that they've lost rather than living the life they're currently alive in."
Others question the ethics behind scouring social media for memories left by dead people to turn a profit. Microsoft didn't say why it filed for the patent but pointed to a tweet from its general manager of AI programs, who said, "there's no plan for this," later calling the patented tech "disturbing."
Still, there's not much stopping Microsoft or any other firm from building such technology, AI analysts say.
After all, we're living through an era marked by surveillance capitalism where the commodity for sale is your personal data. We're also living through an artificial intelligence revolution that's unlocking new ways to replicate humans, and firms are racing to develop clones that serve a host of purposes.
Google also has a patent for a digital clone that embodies people's "mental attributes." New Zealand-based software company UneeQ is marketing "digital humans" that "recreate human interaction at infinite scale." Pryon, an AI company, is working on tech that replicates sentiments held by staff within an organization to enhance chatbots. The goal is to capture what employees know and create a virtual assistant that can answer questions with more accuracy.
One of the primary reasons companies are entering the space is to capitalize on the power of predictive purchasing. The idea is that if they know how you think or can connect with you emotionally, they could help brands better pitch you a product.
Chatbots, or automated text and voice robots, have been around for years, mostly to answer generic questions over the phone or on a website. However, they're getting smarter over time as firms toss emotional intelligence, deepfake technology and audio synthesis into the mix.
It's the kind of tech that powers digital influencers like Miquela, a virtual DJ with 2.9 million followers on Instagram. In simpler applications, AI powers voice assistants such as Siri on your smartphone.
With people continuing to share more of themselves online, it's possible to create a reasonably accurate chatbot based primarily on people's digital footprints, according to Casey Phillips, senior manager of AI-driven experiences at Intuit.
"You could make a fairly relevant chatbot, especially based on somebody living in our world today," Phillips said. "We're constantly communicating in ways that are being stored. You can take that data, run it through an AI system to predict how that person would actually respond to things."
For typical chatbots, companies turn to AI agencies, which can charge between several hundred and several thousand dollars a month for customer service or website chatbots capable of answering a set number of questions. Creating a robust system of chatbots customized to individuals would be a much more expensive venture. It could cost tens of millions of dollars each year to support a team of highly skilled data scientists, engineers and product developers, Phillips said.
Some AI specialists have already shown that it's possible on a much smaller scale. In 2016, James Vlahos, the CEO of HereAfter, created an interactive chatbot dubbed "Dadbot" that was based on his late father. That same year, Belarus-born Eugenia Kuyda digitally recreated her deceased best friend using text messages he had sent friends before dying in a car crash.
The idea of chatbots based on dead people raises several ethical questions surrounding privacy. It's like next-level identity theft. There are also limitations. People only share so much on social media, so algorithms relying on that would be flawed. Humans are also highly complex and influenced by experiences that aren't always shared via text messages. Microsoft's patent suggests that the company could use crowdsourced data to fill in any gaps. In other words, the resulting chatbot could end up saying things the person never said. While the AI stems from a real person, it is not the same as the physical being.
"It's hard to collect the tribal knowledge. Those little subtle things that make people unique, that's hard to grasp," said Igor Jablokov, CEO of Pryon. The closest companies can get to that is "authored knowledge, things that you wrote, or things that were transcribed while you were on a Zoom call."
The Washington Post