In the last few days I have noticed some interesting issues on the downsides of automated systems.
First, British Airways, having been unable to resolve a pay and conditions dispute with their pilots, chose to accept strike action. One of the consequences has been customers being informed of cancellations in a less than organised way.
Then there was Transport for London, which made damaging mistakes in implementing the congestion charge – essentially refusing payment and then fining people who had tried to pay.
There are doubtless many others – these were just ones that caught my eye.
It set me thinking. As we increasingly transfer routine processes from people to algorithms, business focus is principally on the efficiencies, but we have introduced a new player into any process that impacts humans.
If we’re dealing with a call centre, or a customer services representative, it is a human to human interaction. It may be constrained by controls and scripts, but we’re still dealing with a human. There will be a gap, depending on culture, communication and empathy – the “further away” we feel from whomever we are talking to, the less recognised we feel – but we’re still dealing with a human.
As that process gets transferred to automated systems, we’re dealing with an algorithm, and the nature of our interaction with it is determined by the skill – and, importantly, the empathy – of the team that writes it. When we deal with the algorithm, we’re not dealing with a human, we’re dealing with a proxy. It shows when things don’t go according to script or design.
So I think we have to be careful. It’s easy to offload messy emotions by creating algorithms to avoid the occasional messiness of human interaction, but the risk is high. As has often been said, it takes a lifetime to build a reputation, and seconds to destroy it.