https://www.forbes.com/sites/stevedenning/2021/08/01/why-computers-didnt-improve-productivity/?sh=180863ee36f2

“You can see computers everywhere but in the productivity statistics,” wrote Nobel-Prize-winning economist Robert Solow in 1987. His dictum spawned several decades of economic research aimed at solving the mystery that has become known as the ‘Solow Paradox’: massive investment in computers but no net gain in productivity.

In the 2010s, ‘digital transformation’ also turned into a meaningless buzzword: almost every big company attempted one, but most failed to realize the expected gains. The benefits of successful big data projects were likewise outweighed by the many disappointments, generating renewed interest in the Solow Paradox.

A key to solving the Solow Paradox lies in recognizing that bureaucracy and computers are a marriage made in hell: computers generate a great deal more work, but not necessarily more useful work. In a bureaucracy, there is often no net gain. Until bureaucracy is tamed with more agile management, computers often make things worse.

The Solow Paradox Is Counterintuitive

The usual first response to encountering the Solow Paradox is disbelief: “If I have to write a few emails so that I don’t have to use a carrier pigeon, sign me up!”

But in a bureaucracy, the story of those few emails usually doesn’t end there. You send your few emails, and soon you get replies and comments. Now you have to write more emails in response, which are then sent up the hierarchy and to a chain of full-time reviewers, each of whom adds a comment to show that they are useful.

So now you have to reconcile all these comments. And maybe you have to send more emails to explain to each reviewer how their particular comment has been dealt with, and you may get replies to those emails. And so on. What began as a few quick emails has turned into days of work by multiple people all over the organization. In a bureaucracy, computers help work spread like a virus.
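The arithmetic behind that fan-out is easy to sketch. The toy model below is not from the article; the branching factor, number of reply rounds, and minutes-per-email are illustrative assumptions chosen only to show how a handful of messages can compound into days of work.

```python
# Toy model of email fan-out in a bureaucracy. Every number here is an
# illustrative assumption, not data from the article.

def total_messages(initial_emails: int, replies_per_email: float, rounds: int) -> float:
    """Sum a geometric fan-out: each wave of emails triggers the next."""
    total = 0.0
    wave = float(initial_emails)
    for _ in range(rounds):
        total += wave
        wave *= replies_per_email  # reviewers, bosses, and cc'd colleagues chime in
    return total

if __name__ == "__main__":
    # Assumed: 3 original emails, each drawing ~2.5 follow-ups, over
    # 5 rounds of replies, at ~10 minutes of reading/writing per message.
    messages = total_messages(initial_emails=3, replies_per_email=2.5, rounds=5)
    hours = messages * 10 / 60
    print(f"~{messages:.0f} messages, roughly {hours:.0f} person-hours of work")
```

Under these modest assumptions, three emails balloon into roughly 190 messages and about four person-days of reading and writing.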

The First Digital Computers

Digital computing began differently, with a big success. The first digital computer, Colossus, was developed by British intelligence in the 1940s. It deciphered encrypted messages between Hitler and his generals and helped win the Second World War, as David Price explains in his new book, Geniuses at War: Bletchley Park, Colossus, and the Dawn of the Digital Age (Knopf, June 2021).

Similar one-off triumphs were apparent with NASA’s moon landing in 1969, and with IBM’s computers Deep Blue, which defeated world chess champion Garry Kasparov in 1997, and Watson, which won the game show Jeopardy! in 2011. Yet widespread economic gains from computing have been harder to find.

Mainframe Computing

When mainframe computer systems began appearing commercially in the 1950s, big firms were thrilled. For the first time, top managers could see the results of processing massive amounts of data in a single system. Firms used computers for critical applications like bulk data processing, enterprise resource planning, large-scale transaction processing, and industry and consumer statistics.

Mainframe computers were expensive and required significant expertise. The computers were physically huge and needed special cooling systems. Access to computing time was strictly rationed. The systems themselves were monolithic and could not interact with other systems. The systems were also difficult to adjust or re-program.

By the 1990s, huge sums of money were being lost in mainframe computing because the work of software development was always late, over budget, and plagued by quality problems. Some big projects were never finished at all. Software programmers were seen as culprits and were punished. They worked harder. They labored evenings and weekends. They were fired. It made no difference. The software was still late, over budget, and full of bugs. Replacements were hired, but they did no better.

Standard management practices did not seem to work with big software systems. Managers found that the more they sought to control things, the less progress they made. The more staff they added, the slower the team became. Complexity did not respond to authority. Billions of dollars were being lost. Something different had to be found. But in the course of the search to find a better way, computer departments acquired a reputation for bad management.
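One well-known piece of arithmetic behind that observation is Brooks’s Law from The Mythical Man-Month: each added person brings new communication paths faster than new capacity. The sketch below simply computes those pairwise paths; it is an illustration of the principle, not an analysis from the article.

```python
# Pairwise communication paths grow quadratically with team size
# (the arithmetic behind Brooks's Law). Illustration only.

def communication_paths(team_size: int) -> int:
    """Number of pairwise communication channels among n people."""
    return team_size * (team_size - 1) // 2

if __name__ == "__main__":
    for n in (5, 10, 20, 40):
        print(f"{n:>3} people -> {communication_paths(n):>4} communication paths")
    # 5 people share 10 paths; 40 people share 780. An 8x bigger team
    # carries 78x the coordination burden.
```

A team eight times larger carries seventy-eight times the coordination channels, which is one reason the extra hires made the projects slower rather than faster.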

Computers and Bureaucracy: A Marriage Made In Hell

Meanwhile, the advent of minicomputers in the 1960s, and then personal computers in the 1970s, democratized access to computing. Now everyone in the white-collar workforce had access to a computer. And indeed, computers enabled much more work to be done.

One or two redrafts of documents became incessant revisions. Individual reviews turned into multiple levels of reviews and further rework. Staff found themselves preparing, and then sitting through, endless PowerPoint presentations.

Spreadsheets spawned massive amounts of data analysis. Yet different computing programs often couldn’t interact with each other. Data was shipped back and forth, or up and down corporate hierarchies, or across different systems, or unbundled and re-bundled, or transformed into analog and then re-transformed back into digital.

The Spread of Unproductive Work

As anthropology professor David Graeber explained in his landmark book, Bullshit Jobs: A Theory (Simon & Schuster 2018), the workplace became riddled with useless work: “HR consultants, communications coordinators, PR researchers, financial strategists, corporate lawyers, or the sort of people … who spent their time staffing committees that discuss the problem of unnecessary committees.”

Graeber’s analyses suggested that in large organizations, as much as half of all work was being done by five categories of unproductive jobs:

· Flunkies, who serve to make their superiors feel important.

· Goons, including lobbyists, corporate lawyers, telemarketers, and public relations specialists.

· Duct tapers, who fix problems—temporarily.

· Box tickers, who create the appearance that something useful is being done.

· Taskmasters, including middle managers, who create extra work for others.

An Explosion of KPIs

As the quantity of useless work grew with the help of computers, bureaucracies developed devices to prove that their work was useful. Key performance indicators (KPIs), which supposedly defined what was important for each person or unit to accomplish, flourished like mushrooms.

In practice, KPIs were mainly used to justify the bureaucracy, rather than to help determine whether any activity was creating genuine benefit for any external customer. Masses of computerized KPIs helped managers prove—to themselves and to their bosses—that what they were doing was useful.

The Solution: Agile Management

In the (good) old days before computers, you didn’t send unproductive emails. You just did what needed to be done.

And that’s what happens now in the winning firms of the digital era that have made the transition to agile management. All work, including computing, is focused on what will add value to external customers. So you don’t send those unproductive emails: you are working in a team with a clear mandate on a customer-focused set of tasks in a sprint, i.e. a short, timeboxed cycle of work. You don’t need to continually check in with a steep hierarchy or with an army of reviewers. You just get on with it and get it done.

It’s not that computers aren’t potentially capable of great gains. But like most new technology, they need a different kind of management to realize that potential.
