Hi! This is a Sonder Scheme newsletter, written by me, Helen Edwards. Artificiality is about artificial intelligence in the wild; how AI is being used to make and break our world. If you haven’t signed up yet, you can do that here. If you like this newsletter, please share, especially on LinkedIn and Twitter.
This past weekend’s Super Bowl might be remembered more for Google’s tearjerker of an ad than for the game. In it, “a man reminisces about the love of his life with a little help from Google,” according to the company’s description. The ad struck a chord. It was voted “best ad” in the Super Bowl lineup. Twitter was flooded with the tears of hard-core football fans who, “damn it, don’t like to cry during games.”
What is the strategy behind this storytelling?
Picking the ad apart, it works on two levels. On one level it shows how AI can help someone remember. The assumption made by almost everyone (according to social media commentary) is that the man suffers from dementia. Without coming out and blatantly saying, “Google can help your grandfather with his dementia,” the ad shows that Google Assistant can be a great memory jogger.
Highly emotional ads are more engaging and memorable. If the comments on YouTube are anything to go by, this one has succeeded at creating a strong emotional bond with Google, a monopoly that provides a utility service and is keen to stave off regulation by building loyalty when it needs it most.
We also know that the company’s goal is to harvest data and use it to make predictions that sell more ads. You can spin the ad around…
Loretta used to hum showtunes.
How about an ad for tickets to Broadway or a DVD or a streaming service?
Loretta’s favorite flowers were tulips.
Or maybe flower delivery, or wall art of tulips, or a trip to Amsterdam. Do you feel better about ads now that Google’s ad is so “powerful”?
“Little things” are powerful. Which gets to the other idea in the ad. What is the subtext about the nature of this “help”?
Outsourcing cognitive, moral and physical processes to technology comes with existential consequences: increased passivity, decreased agency, increased detachment. Outsourcing emotional work to a machine comes with the same consequences, but with an added twist. The promise of ever-more efficient ways for tech to support us through the human journey is the promise of “cheap bliss”; we are seduced into thinking that the hard work of being human is somehow avoidable.
Our most important emotional work is also work that can never be outsourced: love, grief, despair, delight. We need to remain fluent in these most human of experiences. We have to experience them to make sense of them. We need the struggle of working with others to go from *now* to *future.* The idea that even a small component of grief can be outsourced to a machine, one whose sole purpose is to turn that information into fodder for more predictable ad clicks, is misleading advertising in the extreme.
It’s a bit depressing to see how easy it is to manipulate us with a good tearjerker. It’s a blatant move to harness emotions and a fast route to building demonstrable loyalty. In the world of monopoly businesses, it is a recognized strategy for heading off critical oversight and regulation. By leaning on vulnerable human moments and fallibility, Google showed just how easy it is to keep the power balance exactly where it wants it: almost everyone embraced the ad.
The true story behind the ad is a real human story that we can and should embrace with compassion. But the ad is the company and it manipulates our capacity for empathy. It is designed to make us complicit and passive in the harvest of increasingly personal behavioral data for others’ profit.
Tearjerker ads are an easy strategy for monopolies looking to build emotional connection with captive customers. This Telecom NZ ad from many years ago is a melancholic masterpiece. I guess this is one movie we have seen before.
Also this week:
A video from Wuhan showing drones instructing people to go back inside. I’ve watched this video many times and each time it triggers a level of cognitive dissonance: is this real or fake? What happens when coronavirus is under control enough for people to go back to “normal” life? Is this the new normal in China?
Article in the NYT about facial recognition going live in a school district, despite some people trying hard to stop it. Plus the announcement of a house committee hearing on the use of the technology by Homeland Security.
Great long read from Vice about ClassPass and the use of platform algorithms in the fitness industry. Classic “frenemy” story and “Uberification” of everything.
Article from Protocol about technology companies, ethics, and the practical barriers to change.