Nicholas Carr's 2003 Harvard Business Review essay "IT Doesn't Matter," and the book that followed, argued that IT is shifting to a service delivery model comparable to that of electric utilities. The essay provoked debate and defensiveness among IT managers over the possibility that they were sliding into irrelevance. That debate has yet to be settled. But what is clear is that Carr has a talent for raising timely questions, and he has done so again in his latest work, The Glass Cage: Automation and Us (W.W. Norton & Co.).
This new book may once again make IT managers uncomfortable.
The Glass Cage examines the possibility that businesses are moving too quickly to automate white-collar jobs, sophisticated tasks and mental work, and are becoming too reliant on automated decision-making and predictive analytics. It warns of the potential de-skilling of the workforce, including software developers, as larger shares of work processes are turned over to machines.
This book is not a defense of Luddites. It's a well-anchored examination of the consequences of deploying systems designed to replace us. Carr's concerns are illustrated by, for instance, the Federal Aviation Administration's warnings to airlines about automation, and by evidence that electronic medical records may actually be raising costs and hurting healthcare.
In an interview, Carr talked about some of the major themes in his book. What follows are edited excerpts:
The book discusses how automation is leading to a decay of skills and new kinds of risks. It cites an erosion of skills among aircraft pilots, financial professionals and health professionals who, for instance, examine images with automation. But automation has long replaced certain skills. What is different today about the automation of knowledge or mental work that makes you concerned?

I think it comes down to the scope of what can be automated today. There have always been, from the first time human beings developed tools, and certainly through the industrial revolution, trade-offs between skill loss and skill gain. But until the development of software that can do analysis, make judgments and sense the environment, we had never had tools -- machines -- that can take over professional work in the way that we're seeing today. That doesn't necessarily mean taking it over entirely, but becoming the means through which professionals do their jobs, do analytical work, make decisions, and so forth. It's a matter of the scope of automation being so much broader today, and growing ever broader with each passing year.
Where do you think we stand right now in terms of developing this capability?

There have been some recent breakthroughs in computer technology that have greatly expanded the reach of automation. We see it on the one hand with the automation of complex psychomotor skills. A good example is the self-driving car that Google, and now other car makers, are developing. We're certainly not at the point where you can send a fully autonomous vehicle out into real-world traffic without a backup driver. But it's clear that we're now at the point where we can begin sending robots out into the world to act autonomously in a way that was just impossible even 10 years ago. We're also seeing, with new machine-learning and predictive algorithms, the ability to collect information, analyze and interpret it automatically, and pump out predictions, decisions and judgments. Really, in the last five years or so, we have opened up a new era in automation, and you have to assume the capabilities in those areas are going to continue to grow, and grow pretty rapidly.
What is the worry here? If I can get into my self-driving car in the morning, I can sit back and work on other things.

There are two worries. One is practical and the other is philosophical. What's actually facing us in the foreseeable future is not complete automation; it's not getting into your car and simply allowing the computer to take over, and it's not getting into a plane with no pilots. What we're looking at is a shared responsibility between human experts and computers. So, yes, maybe at some point in the future we will have completely autonomous vehicles able to handle traffic in cities. We're still a long way away from that. We have to figure out how best to balance the responsibilities between the human expert or professional and the computer. I think we're going down the wrong path right now. We're too quick to hand over too much responsibility to the computer, and what that ends up doing is leaving the expert or professional in a kind of passive role: looking at monitors, following templates, entering data. The problem, and we see it with pilots and doctors, is that when the computer fails -- when either the technology breaks down, or the computer comes up against some situation that it hasn't been programmed to handle -- the human being has to jump back in and take control. Too often, we have allowed the human experts' skills to get rusty and their situational awareness to fade, and so they make mistakes. At the practical level, we can be smarter and wiser about how we go about automating, and make sure that we keep the human engaged.
Then we have the philosophical side: what are human beings for? What gives meaning to our lives and fulfills us? It turns out that it is usually doing hard work in the real world: grappling with hard challenges, overcoming them, expanding our talents, engaging with difficult situations. Unfortunately, that is exactly the kind of effort that software programmers, for good reasons of their own, seek to alleviate today. There is a kind of philosophical, even existential, tension between our desire to offload hard challenges onto computers and the fact that, as human beings, we gain fulfillment, satisfaction and meaning through struggling with those challenges.
Let's talk about software developers. In the book, you write that the software profession's push to "ease the strain of thinking" is taking a toll on programmers' own skills. If software development tools are becoming more capable, are software developers becoming less capable?

I think in many cases they are. Not in all cases. This is the kind of tricky balancing act that we always have to engage in when we automate, and the question is: Is the automation pushing people up to a higher level of skill, or is it turning them into machine operators or computer operators -- people who end up de-skilled by the process and have less interesting work? I certainly think we see it in software programming itself. If you can look to integrated development environments and other automated tools to automate tasks that you have already mastered, and that have thus become routine to you, that can free up your time and your mental energy to think about harder problems. On the other hand, if we use automation simply to replace hard work, and therefore prevent you from fully mastering various levels of skills, it can actually have the opposite effect. Instead of lifting you up, it can establish a ceiling above which your mastery can't go, because you're simply not practicing the fundamental skills that are required as a baseline to jump to the next level.
What is the risk if there is a de-skilling of software development and automation takes on too much of the task of writing code?

There are very different views on this. Not everyone agrees that we are seeing a de-skilling effect in programming itself, while other people are worried that we are beginning to automate too many programming tasks. I don't have enough in-depth knowledge to know to what extent de-skilling is really happening, but I think the danger is the same as when you de-skill any expert or professional task: you cut off the unique, distinctive talents that human beings bring to these challenging tasks and that computers simply can't replicate -- creative thinking, conceptual thinking, critical thinking and the ability to evaluate the task as you do it, to be self-critical. These are still very human skills, built on common sense, a conscious understanding of the world, and intuition gained through experience -- things that computers can't do and probably won't be able to do for a long time. It's the loss of those unique human skills, I think, that gets in the way of progress.
What is the antidote to these pitfalls?

In some places, there may not be an antidote coming from the business world itself, because there is a conflict in many cases between the desire to maximize efficiency through automation and the desire to make sure that human skills and talents continue to be exercised, practiced and expanded. But I do think we're seeing at least some signs that a narrow focus on automation to gain immediate efficiency benefits may not always serve a company well in the long term. Earlier this year, Toyota Motor Corp. announced that it had decided to start replacing some of the robots in its Japanese factories with human beings, with craftspeople. Even though it has been out front as a pioneer of automation and robotics in manufacturing, it has suffered some quality problems, with lots of recalls. For Toyota, quality problems aren't just bad for business; they are bad for its culture, which is built on a sense of pride in the quality it historically has been able to maintain. Simply focusing on efficiency and automating everything can get in the way of quality in the long term, because you don't have the distinctive perspective of the human craft worker. Toyota went too far, too quickly, and lost something important.
Gartner recently came out with a prediction that in approximately 10 years, about one third of all the jobs that exist today will be replaced by some form of automation. That could be an over-the-top prediction or not. But when you think about the job market going forward, what kind of impact do you see automation having?

I think that prediction is probably overaggressive. It's very easy to come up with scenarios that show massive job losses. What we're facing is probably a more modest, but still ongoing, destruction or loss of white-collar professional jobs as computers become more capable of undertaking analyses and making judgments. A very good example is in the legal field, where you have seen, very quickly, language-processing software take over the work of evidence discovery. You used to have lots of bright people reading through documents to find evidence and figure out relationships among people; now computers can basically do all that work, so lots of paralegals and junior lawyers lose their jobs. I think we will continue to see that kind of replacement of professional labor with analytical software. The job market is very complex, so it's easy to become alarmist, but I do think the big challenge is probably less the total number of jobs in the economy than the distribution of those jobs. As soon as you are able to automate what used to be a very skilled task, you also de-skill it and, hence, you don't have to pay the people who do it as much. We will probably see continued pressure toward the polarization of the workforce and the erosion of good-quality, good-paying middle-class jobs.
What do you want people to take away from this work?

I think we're naturally very enthusiastic about technological advances, and particularly enthusiastic about the ways that engineers, programmers and other inventors can program inanimate machines and computers to do hard things that human beings used to do. That's amazing, and I think we're right to be amazed and enthusiastic about it. But often our enthusiasm leads us to make assumptions that aren't in our best interest -- assumptions that we should seek convenience, speed and efficiency without regard to the fact that our sense of satisfaction in life often comes from mastering hard challenges and hard skills. My goal is simply to warn people.
I think we have a choice about whether we do this wisely and humanistically, or whether we take the road I think we're on right now, which is to take a misanthropic view of technological progress and just say, 'Give computers everything they can possibly do and give human beings whatever is left over.' I think that's a recipe for diminishing the quality of life and ultimately short-circuiting progress.