The Emotional Supply Chain – a tale of three businesses.

Having written about emotional supply chains earlier, a good example fell into my lap this week.

I took an opportunity to spend a few days in Puglia, which involved three companies: an airline, a letting agency, and a car hire company. They turned out to be three very different experiences.

In chronological order, the letting agency. They have developed an outstanding app, which let me choose somewhere, check availability, talk directly to the owner, and book. A great combination of automated efficiency and human contact. It extended to arrival and departure, where I was looked after by a local contact. (A high five to home and away.)

The airline, a currently strike-prone national carrier, was heavy on automation, but it worked well. Very few humans, but easy check-in, a painless bag drop, and a good flight.

The car hire company, a major international chain, was neither of the above. Cumbersome booking and opaque pricing on a very average web page, but where it got interesting was on arrival. It took as long in the queue to collect the car as it took to fly from the UK. The staff were harassed, made no eye contact, and the whole thing was frustrating, given that the admin can hardly be much more complex than the airline's. In this case the humans were a real downside. I was longing for an app.

Overall, I came away with a feeling of trust and connection to the letting agency, neutral about the airline (efficient, but the prospect of unexpected strike action negated the human side), and a real dislike of the car hire company, which I am unlikely to use again. Which is a shame, because if they had got the human part of it even half right, the rest worked well.

As we increasingly automate what we can, the human aspect will come to the fore. Size and scale will only really affect the logical supply chain, and I suspect they have an inverse effect on the emotional one.

It’s happening now.

Do algorithms need psychotherapists?

I’ve become more and more curious about the power of algorithms. They are wonderful things that can take the mind-numbing hard work out of routine processes, freeing humans to do more meaningful work.

However.

Who writes the algorithm? When you think about it, whoever writes the algorithm encodes their own worldview, biases, heuristics and experience into eternal digital form. That’s a thought.

Algorithms are normally written by engineers – people with powerful, logical brains, skilled in getting from A to B via the most direct route. It is therefore not too fanciful to imagine that the algorithms they write take on something of their creator’s psyche.

Engineers are vital to our society, and we don’t have enough of them. That said, whilst all are different, software engineers, generally speaking, are not renowned for empathy.

So, if I want to create an algorithm for a customer service interface, who should write it: somebody with really good programming skills, or somebody with real empathy?

I came across this article from Psychology Today as I explored this idea, and it gave me real pause for thought.

There’s a view that up to half of routine jobs will be replaced by algorithms of various flavours by 2030. That’s (probably) an extreme estimate, but even if it is only partially correct, we will be replacing flawed, inefficient, but essentially human agents with (by definition) soulless algorithms in significant numbers.

That has potential to create emotional havoc.

There’s a huge opportunity here – to combine digital efficiency with empathy and compassion. It will, however, require a holistic approach to design, one which better represents the way we work together as humans.

As I’ve explored this area, I’m beginning to understand that just a few seconds working with an unsympathetic (but logically efficient) algorithm can very quickly screw up a potentially good experience and relationship.

The future of brand reputation relies, I suspect, on getting the right balance between the messy human and the efficient algorithm.

Degrees of digital separation

Whenever we get a new tool we tend to overuse it. We’re excited by the possibilities, enjoy the novelty, and want to explore its potential.

I think we’re at that point in our relationship with algorithms. The danger is that when we overstretch, the damage we cause to a relationship moves beyond the digital into the human. Small issues get magnified.

I had occasion today to cancel a gym membership with a large international chain whose membership management is outsourced to another large international chain, so I already have two degrees of separation.

I had a query, so chose to call rather than engage with a cumbersome web site, and got through to the usual number roulette. Unusually, all their operators were very busy, but I was assured I would have to wait no more than 2 mins 19 seconds. Not a few minutes – 2 mins 19 seconds. I was then subjected to a recording of other ways to contact them, after which I apparently had 19 seconds left to wait. Three degrees of separation.

I then got through to a human. Perfectly polite, but following a tightly defined protocol. A process, not a conversation. Four degrees of separation.

Problem resolved. Less than 10 minutes. No engagement.

UX is a huge area, but quite static. A digital (or script-based) interaction is designed at a moment in time, based on an average customer.

But time moves on, and none of us are average.

In the end, I got what I wanted from the transaction, but the opportunity to engage was entirely missed by protocols designed to maximise efficiency and minimise risk.

What could have been a conversation, one that yielded key information (why was I cancelling?) and sustained a recoverable relationship, was converted into an uninspiring exchange with a proxy for the people with whom a ten-year relationship had just ended.

Quite a price to pay for efficiency, I thought.