The Empathy Economy
What becomes of the human condition in a landscape increasingly dominated by algorithms and artificial intelligence?

In July, Fortnite player extraordinaire Tyler “Ninja” Blevins traveled to the video game conference E3. He stayed active on Twitter, won a tournament, took loads of photos with fans, and returned to his Twitch account two days later to a surprising realization: because he’d taken a break from the video platform, he’d lost 40,000 subscribers to his record-setting streams.
I think about Ninja’s example every time I step into an Uber or Lyft—a ride I’ve ordered via an app that matched me with a nearby driver and charged the trip to my credit card. The driver got an alert on his smartphone—not from a dispatcher, but from an algorithm.
Two different worlds, both beholden to artificial intelligence—as demanding a taskmaster as any boss out there, and one completely lacking empathy. What if the Uber driver has a sick family member and takes three months off to care for them? What about the Twitch and YouTube personalities who depend on those platforms for their income and need to do the same?
As artificial intelligence assumes the roles and responsibilities that once required someone with a name tag and a monthly paycheck, the revolution of convenience we’ve experienced in the digital age will continue: products selected just for you, paid for and dispatched with a click or touch of the finger. But there are additional, far-reaching effects. Entire industries—finance, medicine, farming, transportation, dating, cooking, coaching, the list goes on—are facing or undergoing disruption.
Being at the vanguard, the Uber driver and the Twitch personality offer us a unique perspective on the problem with empathy and artificial intelligence. But they also signal an opportunity.
In the future, businesses will differentiate on empathy and creativity. If more and more of us are ordering things via smart speaker, it’s not just damaging for brands (are you going to make sure to say “Duracell” batteries when you order?)—it also means less room for creativity as we currently consider it.
Moving into the future, the work of brands will be to ensure that their value stays top of mind, especially considering the money already being spent to decide which product floats up to the top of a search result in Google or Amazon. Much like Lyft and Uber’s drive to recruit passengers and drivers from an ever-shrinking pool, this already feels like a spending race to the bottom. While this behavior continues, the real creativity will be in developing a human approach to AI.
Companies are already talking about how to be more trustworthy, more human. Witness Wells Fargo’s attempts at damage control following revelations that its employees had opened millions of fake bank accounts in real customers’ names. Conferences on making AI more empathetic are in our future, and we’re already beginning to ask hard questions of the technology.
EqualAI is an organization I’ve supported, started by tech leaders like Jimmy Wales and Arianna Huffington. Its focus is on reversing gender bias in technology, a significant issue caused by the lack of women in research and leadership positions in tech. Think about it every time you hear a child bossing Alexa around, and ask whether that attitude will creep into other parts of their daily interactions with women.
But the answer won’t lie simply in making AI more empathetic. Why should we teach a machine to feel pain? That feels like the beginning of a frightening road. Besides, AI cannot feel pain, shame, guilt, or remorse. It doesn’t have the need for approval.
Does it need to be accountable for its actions? That’s where humans come in. So much military weaponry has been automated and powered by computational technology. But there is always someone making the decision on whether or not to pull the trigger.
Now, if you get a life-altering diagnosis from your doctor, you’ll likely seek a second opinion. AI will transform that process. Machines will analyze statistics and calculate probabilities, determining whether or not your doctor should operate. But the final call—based on the information presented, case histories from hundreds of hospitals around the world, your age, your DNA, and every other consideration—will still be made by a human. And if things go poorly, the doctor, a human, will have to inform the family.
“In this respect, AI is not different from electricity or steam engines,” says Mariarosaria Taddeo, deputy director of the Digital Ethics Lab at Oxford University. “It is our responsibility to steer the use of AI in such a way to foster human flourishing and well-being and mitigate the risks that this technology brings about.”
It’s not just people in positions of gravitas—the military general, the chief surgeon—who will play a role, but teachers and social workers and similar jobs that have a base element of compassion and understanding. Can you imagine the call center of the future? I think it will likely be staffed with people (not Siri-like automated voicebots) who feel your pain, and who were selected for the job because of their ability to connect. There will always need to be people whose job it is to grab the steering wheel and correct programming decisions with random acts of kindness.
I am certain that we don’t need to be fearful of either the future or of machines. The light at the end of the tunnel simply can’t be that we have regressed and become less intelligent. It also cannot be that we have created machines to do away with ourselves. Machines will progress, and as they integrate ever more fully into our lives, we will need to progress ourselves. We’ll do that not just by retraining for new careers, but by relying on one of the most elemental assets we possess: human nature.