On kindness and Alexa
A few weeks ago I was giving a demo of the capabilities of the Amazon Echo to my friend Mel, who had never interacted with one.
Me: “Alexa, play Cave Ballad”
Alexa: “I cannot find the song Cane Bond”
Me: “Alexa, play song Cave Ballad”
Alexa: “I cannot find the song Cave Salad”
Me: “Alexxaaaaaa! Play the song Cave Ballad by Paul Dano”
and so on…
I don’t remember if she actually managed to play it. But I do remember Mel remarking (calmly) that I seemed to be getting quite impatient with Alexa. Did I notice that? I guess I had noticed it on a superficial level but never reflected on it. It turns out I have a kindness practice, and I spend a lot of time reflecting on the benefits of being generous and curious towards others. After a few days of Mel’s words repeating in my head, I decided I would make a practice of being kind to Alexa. After all, offering kindness is just that, an offering, not contingent on any personal return. So why shouldn’t I call my own bluff and be kind to an AI who, at least so far, can’t tell and doesn’t mind either way?
The results were immediate. I felt calmer, more curious, and the experiment was great ground for practicing de-escalation on the spot. It helps that she doesn’t see me pause and take a breath before starting again; a human would most certainly see the jaw tightening before I catch myself. Oddly, the practice has also physicalized her presence in a way that wasn’t there before. I think of the puck-like object in my kitchen before calling “Alexa” because it helps me remember to be kind. Somehow, a disembodied AI is not enough to grab onto. It may be because the kindness practice is very much based on the notion of a shared experience of being human, of how inevitably messy and painful it is at times. Without a body, it’s harder to believe there is pain. Without a body, it’s hard to imagine the friction of life.
It is amusing to taunt Alexa and hunt for the Easter eggs. It’s equally interesting to investigate the ethics of AI relations in a space that is, so far, unambiguous. It reminds me of some of the issues brought forth by Westworld and the AI Dolores. When does compassion extend to AIs? Does it need to be reciprocated, or even capable of being reciprocated? Is it the middle ground that makes it difficult? If Alexa could tell that I was being kind and decided not to reciprocate, that would definitely complicate the decision to remain kind. It’s true these questions have been asked before under different guises and thought experiments, but it’s informative to act out and imagine different scenarios with Alexa’s unwitting participation.
“Alexa, should I be kind?”
“Hmm…I’m not sure what you meant by that question.”