I had occasion this week to query an order from Amazon. I was sent a notice of dispatch for something I had not ordered. I did the usual checks to make sure the account had not been compromised, and set about querying the order and organising a return.
And so it began.
A few years ago, there was some interesting research into the notion of the uncanny valley: the dissonance that occurs as you become unsure whether you’re conversing with an algorithm or a person.
The system is efficient – you know the routine. Then you get to the part where you have a non-standard problem, and a chat box opens. The responses were efficient but mechanical, and I found myself wondering what I was conversing with. It’s a strange feeling, wanting the reassurance of being paid attention to rather than being efficiently processed.
And then – A TYPO!! – and a quick correction.
Algorithms don’t do typos. Algorithms don’t do vulnerable.
The whole tenor of the exchange altered. I was dealing with a human somewhere. It changed the nature of my questions (have you noticed how we fall into “machine speak” in chat situations?), which in turn changed my host’s responses. I got a satisfactory result to my issue, and felt acknowledged.
There is a space – a liminal space – between things: notes of music, responses in a conversation, gaps between thoughts. These spaces are hugely powerful – they contain all the emotions, from fear to joy, that will determine what happens next.
As yet, algorithms don’t do liminal space. They respond, but don’t leave space for empathy.
AI will have a huge impact, but we need to recognise context.
When I have a non-standard problem, I don’t want a more senior algorithm; I want a human. And I want to know I’m conversing with one.