  • CFO.com | US

Paying the ‘Attention Tax’ on Automation

Although companies are relying more on automated processes, businesses can’t function without the human element.

As we automate more and more processes within business and commerce, it’s important to bear in mind that, as one of my colleagues likes to say, “people are not peripherals” and can’t be treated as though they are. The systems and software we have today are much better than what we had 20 years ago, and we might be tempted to take human participation out of more and more decisions.

However, whenever we hit the limits of what automation can handle, it creates unintended consequences for the people who remain. They have to pay attention to what’s going wrong and be highly trained to deal with any malfunctions. It also guarantees that failures, driven by factors such as lapses in human alertness and slow recognition of sudden problems, will be extreme.

To start, executives need to incent employees to stay engaged with automation by letting them exercise discretion in some parts of their work, even when software could do those parts more efficiently.

These challenges have been creeping up on us for a long time. In the early 1990s, I was involved in rescuing a large and innovative project that seemed to be going in a bad direction. The objective had been to automate a set of complex processes that had previously required two closely coordinated and highly trained people, so that a single highly trained person could manage them alone. At the same time, that single person was expected to perform a much wider range of tasks than the two people being replaced could have managed — all through a more powerful and flexible set of integrated platform technologies. Half the people, twice the scope. What’s not to like?

The only problem was that the operators couldn’t do it. They kept making mistakes and crashing the system. The trainers said it was the fault of the technology. The technologists said it was the training. Both blamed the operators.

John Parkinson

It turned out that the operators were being overloaded with information by the automation, which was supposed to make their lives easier. Instead of getting just what they needed at the time they needed it, the systems were trying to tell them everything, all the time. At some point, the next critical item got lost in the flood, and missing it triggered the crash. Two decades on, it often seems that everything in business is going the same way.

Part of our fix was a filter that smoothed out the many streams of data that fluctuated widely and rapidly but whose fluctuations didn’t affect what the operator needed to know. Then, as a back-up, we gave the operators the ability to override the automation if they felt they needed to take control of the system.
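The column doesn’t describe how that filter actually worked. As a minimal sketch, assuming a simple deadband approach — suppress any reading that hasn’t moved far enough from the last value shown to the operator — it might look like this (the class name and tolerance value are hypothetical):

```python
class DeadbandFilter:
    """Suppress updates that stay within a tolerance band around
    the last value reported to the operator (hypothetical sketch)."""

    def __init__(self, tolerance):
        self.tolerance = tolerance
        self.last_reported = None

    def update(self, value):
        """Return the value if it moved enough to matter, else None."""
        if (self.last_reported is None
                or abs(value - self.last_reported) > self.tolerance):
            self.last_reported = value
            return value
        return None  # fluctuation too small to bother the operator


f = DeadbandFilter(tolerance=5.0)
readings = [100, 102, 98, 110, 111, 95]
shown = [r for r in readings if f.update(r) is not None]
# shown contains only readings that moved more than 5 units
# from the last one displayed: [100, 110, 95]
```

The point of the design is that the operator sees the big moves and none of the noise, which is exactly the distinction the original system failed to make.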

Our final tweak was to analyze the recorded data in an effort to continuously improve the range of situations the automation could handle. Over time we found lots of patterns we could exploit to provide better help to the operators, and it looked as though we would eventually get close to a fully automated system with no operators required. As it turned out, that was too optimistic. We never managed to anticipate enough of the unexpected events to allow us to remove the human component.

From the human perspective, that is probably a good thing, even though the highly skilled and expensively trained operators had much less to do and were often bored. In fact, when they were needed, it took the operators a while to get oriented, and they didn’t perform all that well when handling emergencies.

In the end, we had to find ways to keep the operators alert, engaged and ready to step in if needed. For example, operators would have to check in from time to time with the automation and verify that some aspect of the process was working correctly. They were also required to generate and comment on a status report at periodic intervals that might vary from every few minutes to once a day.
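The piece doesn’t say how those check-ins were scheduled. A minimal sketch, assuming the prompts were spread across a shift at irregular intervals so operators couldn’t simply tune them out (the function, subsystem names, and interval values are all hypothetical):

```python
import random


def build_checkin_schedule(subsystems, shift_minutes, min_gap, max_gap, seed=None):
    """Spread verification prompts across a shift at irregular
    intervals so operators can't anticipate them (hypothetical sketch)."""
    rng = random.Random(seed)
    schedule, t = [], 0
    while True:
        # next prompt arrives after a random gap, never on a fixed beat
        t += rng.randint(min_gap, max_gap)
        if t >= shift_minutes:
            break
        # ask the operator to verify one randomly chosen subsystem
        schedule.append((t, rng.choice(subsystems)))
    return schedule


plan = build_checkin_schedule(
    ["pumps", "valves", "telemetry"],
    shift_minutes=480, min_gap=20, max_gap=90, seed=42)
# plan is a list of (minute-into-shift, subsystem-to-verify) pairs
```

The irregular spacing is the design choice that matters: a prompt on a perfectly predictable schedule quickly becomes something the operator acknowledges without actually looking.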

We called this the “attention tax” because it was non-value-added activity that had to be paid to keep the operators engaged, and thus to get the overall system to function smoothly and reliably. No amount of smart automation ever eliminated it entirely.

But it was better to pay the attention tax and keep the humans alert and available for the jobs that only they can do. It’s the same in business.

John Parkinson is an affiliate partner at Waterstone Management Group in Chicago. He has been a global business and technology executive and a strategist for more than 35 years.
