Monday, March 07, 2011

Technology in the gaps

It must have been strange to be an ancient Greek. Storms happened because Zeus woke up in a bad mood. Seas were calm because Poseidon was off playing Xbox...or something. Basically, everything happening around you was due to the whims of mercurial, ineffable superbeings.

I feel like economists often treat technology the same way - as a capricious god who lives in all the gaps of our theories, pulling the levers and making the clockwork run.

The most famous example is the "Real Business Cycle" (RBC) model, for which Ed Prescott won the Nobel Prize in 2004. Briefly: in this model (as in many others), you split economic production into two "factors," capital and labor. Whatever's left over - that is, the part of output you can't attribute to either capital or labor - is called "total factor productivity," or TFP. The RBC model says that TFP is, basically, technology. When technology improves rapidly, the theory says, we have an economic boom, and when it improves only slowly (or deteriorates), we have a recession. Ta-da! Business cycle explained!
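
To see how little is actually measured here, it helps to write the accounting down. Here's a minimal sketch in Python of how TFP is backed out as a residual - the Cobb-Douglas form and the 0.33 capital share are standard textbook choices, and the numbers are made up for illustration:

```python
import numpy as np

# Hypothetical aggregate data (illustrative numbers, not real statistics)
Y = np.array([100.0, 103.0, 101.0, 106.0])  # output
K = np.array([300.0, 306.0, 310.0, 315.0])  # capital stock
L = np.array([150.0, 152.0, 149.0, 153.0])  # labor input
alpha = 0.33  # capital's share of income, a standard calibration

# Cobb-Douglas production: Y = A * K^alpha * L^(1 - alpha)
# TFP ("A") is defined as whatever output the measured inputs can't account for:
A = Y / (K**alpha * L**(1 - alpha))

# Growth in this residual is what RBC models treat as "technology shocks"
tfp_growth = np.diff(np.log(A))
print(tfp_growth)
```

Note that nothing in this calculation observes technology directly; A is, by construction, whatever the inputs fail to explain.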

Of course, there are many, many problems with this theory, and that's a blog post for another day. I want to focus here on one point: When you look at the wider world, you can't actually see the changes in technology that would be needed to cause the economic fluctuations we observe. RBC theory basically says that technology causes the movements of the economy, but that the only way you can see changes in aggregate technology is by...watching the movements of the economy. The storm is the proof that Zeus is angry.

Recently, many economists have been using technology to explain a different set of phenomena - income stagnation, unemployment, and inequality. One prominent example is Tyler Cowen's The Great Stagnation, which claims that a slowdown in the rate of innovation and scientific discovery is causing the long-term flatlining of U.S. median income. This idea has been somewhat endorsed by Paul Krugman.

But there's a second strain of technologist theories bouncing around out there in the econosphere. These claim that technology is finally replacing humanity, making many of us irrelevant.

One of these is the theory of skill-biased technological change - basically, the idea that information technology is so hard to use that it's creating inequality between those who are smart/educated enough to use it and those who aren't. In the 90s, this theory was put forth as a reason why education could help fight rising inequality. But these days, some economists are saying that it's too late for that - that technology is replacing educated people too. Here's Paul Krugman:
[My magazine piece postulated] that information technology would end up reducing, not increasing, the demand for highly educated workers, because a lot of what highly educated workers do could actually be replaced by sophisticated information processing — indeed, replaced more easily than a lot of manual labor. Here’s the piece: I still think it’s a fun read. 

So here’s the question: is it starting to happen?...
Computers, it turns out, can quickly analyze millions of documents, cheaply performing a task that used to require armies of lawyers and paralegals. In this case, then, technological progress is actually reducing the demand for highly educated workers...

[S]oftware has also been replacing engineers in such tasks as chip design...

The fact is that since 1990 or so the U.S. job market has been characterized not by a general rise in the demand for skill, but by “hollowing out”: both high-wage and low-wage employment have grown rapidly, but medium-wage jobs — the kinds of jobs we count on to support a strong middle class — have lagged behind...

Why is this happening?...Some years ago, however, the economists David Autor, Frank Levy and Richard Murnane argued that...[c]omputers...excel at routine tasks, “cognitive and manual tasks that can be accomplished by following explicit rules.” Therefore, any routine task — a category that includes many white-collar, nonmanual jobs — is in the firing line.
(For a skeptical response, see Brad DeLong, and Brad DeLong again. Also Ryan Avent.)

And the most extreme example of this class of theory has got to be Tyler Cowen's "Zero Marginal Product Workers" hypothesis, which holds that many unemployed workers have been completely, utterly replaced - that they have no skills that are worth even minimum-wage compensation.

Now, I can't say with any conviction that any of these "technologist" theories are wrong (except for RBC, but that's for other reasons), so I'm not going to violate DeLong's Rules of Krugman. Maybe these things really are happening! And the bald fact is, technology is hugely important to long-term economic growth, to the composition of the labor market, and to our modern wealthy existence. This is not in dispute. So for me to say "Bah, technology is always just Zeus by another name" would be very silly.

But I will say that technologist theories deeply trouble me. They imply that economics, as a science (yes, I used the S-word!), is ultimately of limited use in explaining the economy. If all of the interesting stuff is happening in research labs and tinkerers' garages, then economists are basically reduced to being futurists, speculating on the rate and type and social impact of future technological advances. Our patron saint would be not Paul Samuelson, but Alvin Toffler.

Furthermore, frequent use of technologist theories forces economists to do some pretty tricky mental gymnastics. Cowen and Krugman, for example, claim that overall technology is advancing too slowly to raise our median incomes, but that certain kinds of technology are advancing fast enough to replace a huge number of workers. Of course, it's possible that's true; we've seen a lot of innovation in computers in the last 20 years, and less innovation in kitchen appliances. But it illustrates the fact that, the more phenomena you attribute to "technology," the more hyper-specific claims you are forced to make about a process that you can't accurately observe. 

How far are we willing to go with this? Taken to their absurd extremity, technologist theories would have us explaining every single fluctuation in any economic variable in terms of invisible changes in technology...oh wait, never mind, that's called "RBC."

Therefore, I recommend that economists be very sparing in our use of technologist theories. When we see something we can't easily explain - stagnating incomes, rising inequality, shifts in job opportunities - we should try very hard to explain the phenomenon in terms of things we can understand, observe, and predict. For example, Krugman offers globalization, and the outsourcing of white-collar desk jobs, as an alternative explanation for stagnant employment in those job categories. That might turn out to be less important than technology, but it's something we should look at first, because we can understand it and we can observe it and we can even predict it to some degree.

So, in conclusion: Using technology to explain short-term or medium-term economic phenomena may be perfectly spot-on correct. It is also a kind of giving up.

Update: I do want to point out that I'm not trying to trash Krugman's NYT column. That column's main thrust is that education will not be a panacea for inequality, stagnating wages, or unemployment. I think that's an excellent point, for many reasons. And Krugman's hypothesis about technology replacing skilled workers is really meant as an alternative to the standard theory of skill-biased technological change, not as an assertion that technology drives everything about the labor market.

Update 2: Via Thoma, a paper investigates "skill-biased technological change" and finds that changes in skills were a much bigger factor than changes in technology in the inequality increase of the 1990s. Just another reminder of the danger of putting "technology in the gaps"...

5 comments:

  1. Anonymous, 11:41 AM

    is globalization somehow different from the technology that lowers transaction costs (shipping/communications/ISO) that makes globalization work?
    micromeme

  2. "Our patron saint would be not Paul Samuelson, but Alvin Toffler." -- Or Thorstein Veblen?

  3. Anonymous, 11:42 PM

    So, Noah, you're a PhD student. I have a PhD - in Computer Science. I've been watching this issue probably for longer than you've been alive.

    Yeah, economists should just give up, because as a group, they're clueless on this issue, and it's a fundamental issue.

    Take off the blinders, Noah.

    Technology devalues human labor. Period. If you need anything other than simple arithmetic to understand that, then you're in denial.

    In a manner of speaking, people have been made smarter and very much faster by technology. But in an economic sense, people don't get the credit for it - and don't deserve it on an individual level. The technology that fills in those gaps gets the credit, and takes away jobs in the aggregate.

    When a production process requires X% fewer people by virtue of the use of productivity-improving technology, then X% of the jobs required to do that process have become entirely obsolete, if the technology is more economical than the people so displaced. And guess what, it is more economical, by factors, not fractions - in some cases by orders of magnitude.

    How much faster is a computer than a person at work that both can do? Nowadays, the answer is billions of times faster.

    How much smarter is a computer than a person who would otherwise not know how to do a task that a computer can do, even if that task is simple for a programmer to implement? How simple can such tasks be, and still be able to do things that a lesser skilled person could not do for lack of knowledge? E.g., as simple as it is to compute, how many people can't compute compound interest without a machine to do it for them? And as a result, how many jobs are there now for people who can do such tasks without a computer, given how cheap and easy it has become for computers to do them without people even knowing how? In effect, such skills are now obsolete in all but the software developers who now implement them. And, surprise, even those developers become obsolete after implementing them _once_. Goodbye, former job-producing human skill.
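
    For the record, the entire skill fits in a few lines of Python (the figures here are just illustrative):

    ```python
    # Compound interest - a once-marketable human skill, now a one-off implementation.
    # Future value: A = P * (1 + r/n) ** (n * t)
    def compound_interest(principal, annual_rate, periods_per_year, years):
        """Value of principal after `years` at `annual_rate`, compounded n times per year."""
        return principal * (1 + annual_rate / periods_per_year) ** (periods_per_year * years)

    # $1,000 at 5% APR, compounded monthly, for 10 years: about $1,647.01
    print(round(compound_interest(1000.0, 0.05, 12, 10), 2))
    ```

    Write it once, and nobody ever needs to be hired to know it again.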

    Now imagine two companies. One uses productivity-improving technology, and thus lowers its costs commensurately - which includes using fewer people. Another doesn't use such technology. The latter puts itself at a competitive disadvantage. It must thus either adopt equivalent or better technology - in terms of productivity improvement - or go out of business. Think about that in macroeconomic terms.

    So you (and other economists) set the bar way too high when you imagine that there is no economic impact of technology unless artificial intelligence and robotics and automation are involved to the extent that people are directly and entirely displaced. The bar is actually very, very, VERY low. If a PC on every desk reduces staff requirements by any degree at all - and they have always done as much - then there can be people entirely displaced by the use of those PCs by other people who were fortunate enough to keep a job that uses them.

    This is really very basic, Noah. You shouldn't need a PhD to understand it. Not that I would know, I suppose, since I do have one... I really am wondering these days what intellectual rigor there is to economic curricula and research - I can't see any evidence of any at all, with all due respect to you folks. I can only wish the world would pay a fraction of the attention to computer scientists who understand this issue that it pays to economists who clearly don't know how to begin to think about it.

  4. Anonymous, 3:37 PM

    I agree with you. Economists are frequently drawn to simple explanations obtained from some basic theoretical model, instead of actually looking at the world. I mean, it is what they are supposed to do, but some people really overdo it... I recently heard someone say that the relatively low health of Americans vis-a-vis other developed countries was due to people rationally choosing to care less for themselves since they expected medical advances to solve their problems down the road. I am not saying this is not true, but I would say if there is really any relevance to this "mechanism" it probably is second or third order in importance.

  5. It is difficult to comment on this post without responding to Tyler Cowen's hypothesis at length; the world is highly complex and inter-causal, but here goes.

    Creative destruction occurs all the time. In the past, or in fast growing economies, new opportunities generally compensate for destruction causing displacement.

    Notice that technology does not always cause displacement. The car arguably did, but what about the refrigerator? Upon closer inspection, the car arguably was a net creator of jobs.

    Technological progress today is mostly process innovation; it is hollowing out middle America, flattening organizations, and making operations and marketing positions more efficient.

    ASIDE: Technology may force a re-thinking of hierarchy and executive management, along with remuneration, altogether.

    But the key thing technology is doing right now is speeding up the rate of change. That destruction occurs so fast that new opportunities are not keeping up.

    Sure, it is exacerbated by other factors: firms' and individuals' reluctance to invest capital during a recession, globalization and off-shoring, rising health care and education costs that crowd out gross wage increases and profits, and so on.

    But two other psychological factors may also help explain today's high unemployment: our tendency to over-build, and the inertia of the status quo. Let me explain.

    Investment drives new jobs. Firms must hire people to research and plan that investment. They also tend to over-build; they get too fat, and the longer the economy stays healthy, the fatter they get. They simplify only when competition or bankruptcy forces them to.

    In recessions, all firms lay off all those people at the same time. Some are never going to be hired back. In a way, delaying recessions ensures that when that simplification does come, it will be more cataclysmic.

    There is also an inertia to the status quo; barring significant events, it remains. Example: I can off-shore half my product management staff to great benefit even if it does not lower my salary cost; I avoid huge American administrative, management, liability, and regulatory costs even while I convert labor into a true variable cost of sales instead of a fairly sticky fixed cost.

    But I don't do it until a recession; I've already hired and trained people, paid for office space and other productive assets. Plus, I like those people. The recession wipes all that away. Now I will never hire them back.

    ASIDE: I struggle to see how Keynesian stimulus can have any net positive influence on these trends.

    CONCLUSION: You are correct in this post: the issue with the technologist arguments in economics is the way in which the term is being used. It is used both as a variable and a constant in the equation of economic growth; it is a 'Zeus' explanation.

    Technology is an enabler. It is more like the Protestant work ethic than it is like labor or capital. And since it is used to describe both process and product innovation, its meaning is diffused even further. That it can both increase the size of economies (iPods) and destroy jobs (automation) makes it even more meaningless.

    In the end, 'technology' as a term is arguably more confusing than helpful. I believe the same can be argued for the idea of macroeconomics, but that is a whole other discussion.
