Monday, February 20, 2012

Are macroeconomic methods politically biased?


In a recent post, Steve Williamson writes:
The tools of modern macroeconomics are no more the tools of right-wingers than of left-wingers. These are not Republican tools, Libertarian tools, Democratic tools, or whatever. These are the tools of Economic Science[.]
I've thought about this for a long time, and I'm not sure that Steve is right. I think there is a case to be made that the methodology of modern macroeconomics has the effect of biasing the field toward conservative policy recommendations.

Let me explain why.

Modern models of the business cycle generally rely on one of two techniques: 1) Dynamic stochastic general equilibrium models (DSGEs), or 2) Structural vector autoregressions (SVARs). The former is by far the more popular and well-accepted (although Chris Sims won the Nobel Prize for inventing the latter), so when I talk about "the methodology of modern macro," I'm going to talk about DSGEs.

One of the main features of DSGEs is that they are microfounded; that is, they try to explain macroeconomic phenomena in terms of individual decisions. Another feature is that they are based on optimization, which means that the individual decisions are modeled using the calculus of variations.

Explaining macro phenomena based on individual optimization is hard to do. Individuals may take many things into account when making their decisions; in math terms, this means you can easily have a large "state space." Also, the thing that people optimize (their "objective function") may be very complicated; in principle, it can include all manner of weird things like non-rational expectations, learning, dynamic inconsistency, habits, overconfidence, reference points and framing effects, etc. Finally, aggregating a whole bunch of individual decisions into one giant macroeconomic outcome is in principle a very hard thing to do; it's even harder if you include things like firms and governments.
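
To get a sense of how quickly that "state space" blows up, here's a toy back-of-the-envelope calculation (not any particular model, just an illustration of why every extra state variable - capital, habits, beliefs, heterogeneity, and so on - is so costly for a solver that works on a grid):

```python
# Toy illustration of the "curse of dimensionality" (not any particular model):
# if each state variable is discretized on a 50-point grid, the number of grid
# points a solver has to evaluate grows exponentially with the number of states.

GRID_POINTS_PER_STATE = 50

for n_states in (1, 2, 4, 8, 12):
    total_points = GRID_POINTS_PER_STATE ** n_states
    print(f"{n_states:2d} state variable(s): {total_points:.2e} grid points")
```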

So, unsurprisingly, making a DSGE model is a lot easier if you make some simplifying assumptions. Here are some simplifying assumptions that make a DSGE pretty easy to solve:

1. The assumption that the economy can be modeled with a representative agent; in other words, that the macroeconomy behaves as if there's only one person in it.

2. The assumption that government doesn't exist, or exists only to transfer income from one person to another.

3. The assumption that prices are fully flexible.

4. The assumption that firms are simple profit-maximizers and make zero profits in equilibrium.

5. The assumption that individuals have rational expectations.

6. The assumption that risk preferences can be entirely modeled using people's utility of consumption, and that this utility can be modeled using a small number of parameters that do not change over time.

7. The assumption that labor markets clear.

8. The assumption that "technology" is represented by the Solow residual, and that technology is exogenous and evolves according to a simple time-series process (for example, an AR(1)).

9. The assumption that the business cycles we observe represent small enough fluctuations that the model that describes them can be linearized around its steady state.

If you make all of these simplifying assumptions (and a few more), you end up with something like the first DSGE business-cycle model: the "Real Business Cycle" model of Edward Prescott and Finn Kydland, first published in 1982. This model, and the approach it pioneered, won a Nobel Prize for its authors.
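
For the curious, here is a minimal sketch of the flavor of such a model. It is emphatically not Kydland and Prescott's actual model; it's the textbook special case with log utility, Cobb-Douglas production, full depreciation, and fixed labor, in which the planner's saving rule has a known closed form, and the parameter values are just conventional illustrative choices:

```python
# A stripped-down "RBC-flavored" toy, NOT Kydland and Prescott's actual model:
# log utility, output y = z * k^alpha, full depreciation, fixed labor. In this
# special case the planner's optimal rule is k' = alpha * beta * y (a constant
# saving rate), and log technology follows an AR(1), as in assumption 8 above.

import numpy as np

rng = np.random.default_rng(0)

alpha, beta = 0.36, 0.99   # capital share, discount factor (illustrative)
rho, sigma = 0.95, 0.007   # persistence and volatility of log TFP (illustrative)
T = 200

log_z = np.zeros(T)
k = np.zeros(T)
y = np.zeros(T)
k[0] = (alpha * beta) ** (1.0 / (1.0 - alpha))  # non-stochastic steady state

for t in range(T - 1):
    y[t] = np.exp(log_z[t]) * k[t] ** alpha               # production
    k[t + 1] = alpha * beta * y[t]                        # optimal saving rule
    log_z[t + 1] = rho * log_z[t] + sigma * rng.normal()  # technology shock

y[T - 1] = np.exp(log_z[T - 1]) * k[T - 1] ** alpha
print(f"std. dev. of log output: {np.log(y).std():.4f}")
```

Everything interesting in this toy economy is driven by the technology shock; by construction, there is nothing useful for policy to do.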

Now, if the above assumptions seem unrealistic to you, that's because they are! And if you think that this makes the RBC model unlikely to fit the data, well, you're right. It doesn't.

(Side note: If Kydland and Prescott's model didn't fit the data, you may ask, then why was it awarded a Nobel Prize? The answer is "nobody knows the mind of the Nobel Prize committee," but it is probably because this model was the first business-cycle model to try to answer the Lucas critique. The Lucas critique says that models should only contain "deep structural" parameters - i.e., parameters that won't change when government policy changes. Kydland and Prescott's model bases everything on "tastes" and "technology," which most economists at the time - and many even now - were willing to accept as "structural." Thus, it seemed to many people at the time that Kydland and Prescott had invented a modeling approach that had a good shot at one day explaining the business cycle in a way that wouldn't change when policy changed. Many macroeconomists still believe this, as evidenced by the dominance of the DSGE modeling approach in the macro literature.)

So, the DSGE model that is easiest to make (RBC) doesn't do a great job of describing the business cycle, much less predicting it. What would be better?

Fast-forward to 2007, and have a look at the Smets-Wouters model of the business cycle. This "New Keynesian" model is currently considered the "best" DSGE model in terms of forecasting performance. Which is to say, it performs ever so slightly better than the judgment-based forecasts of well-informed individuals. Consequently, some variant of the Smets-Wouters model is used by most central banks as their DSGE model of choice (which they use as a complement to other types of models, such as SVARs, reduced-form models, and judgment-based forecasts). Of course, the fact that Smets-Wouters is the "best" DSGE model does not mean it is very "good." Its forecasts are basically useless more than one quarter into the future.
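
(For concreteness, "forecasting performance" in this literature is usually scored as the root-mean-squared error of the model's forecasts against realized data, compared with some benchmark. Here is a minimal sketch of that kind of comparison - the numbers are made-up placeholders, not anyone's actual forecasts:)

```python
# How "forecasting performance" comparisons are scored: root-mean-squared error
# (RMSE) of forecasts against realized data, model vs. benchmark. The numbers
# below are made-up placeholders, NOT actual Smets-Wouters or judgment forecasts.

import numpy as np

def rmse(forecast, realized):
    forecast, realized = np.asarray(forecast), np.asarray(realized)
    return float(np.sqrt(np.mean((forecast - realized) ** 2)))

realized_gdp_growth = [0.5, 0.7, 0.4, 0.6]   # placeholder data
model_forecast      = [0.6, 0.5, 0.5, 0.7]   # placeholder model forecasts
judgment_forecast   = [0.7, 0.4, 0.6, 0.8]   # placeholder judgment forecasts

print("model RMSE:   ", rmse(model_forecast, realized_gdp_growth))
print("judgment RMSE:", rmse(judgment_forecast, realized_gdp_growth))
```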

Of course, this slight improvement on the original Kydland-Prescott model comes at a high cost in terms of the complexity of the model. Instead of one or two "shocks" (exogenous factors that are postulated to drive the business cycle), Smets-Wouters has seven. And there is a lot of doubt that all of these shocks are "structural" in the sense of the Lucas critique - in other words, there seems to be a pretty big chance that the parameters of the Smets-Wouters model would change if policymakers changed their policies (thus raising the question of why Smets and Wouters bothered to use a microfounded DSGE modeling approach in the first place).

Now realize this: It took 25 years to go from Kydland-Prescott's RBC model to Smets-Wouters. That is comparable to the time it took physicists to develop quantum mechanics.

And yet, despite being so complex, and despite making heroic assumptions about the "structural-ness" of certain parameters, and despite being 25 years in the making, the Smets-Wouters model does not come even close to capturing all of the "frictions" that people believe are at work in the macroeconomy. It does not include the financial frictions that many people believe caused the 2008 financial crisis. It does not include behavioral effects like habit formation, hyperbolic discounting, etc. It does not include learning. It does not include limited enforcement of debt contracts. It does not include hysteresis in labor markets. It does not include income or wealth heterogeneity among households or firms. And this is not even close to an exhaustive list of the relevant things that it doesn't include. To include all those things in one model is prohibitively difficult with current technology; the state space of the model explodes, and you would need a supercomputer to solve it if it could be solved at all.

So what this illustrates is that it's really hard to make a DSGE model with even a few sort-of semi-realistic features. As a result, it's really hard to make a DSGE model in which government policy plays a useful role in stabilizing the business cycle. By contrast, it's pretty easy to make a DSGE model in which government plays no useful role, and can only mess things up. So what ends up happening? You guessed it: a macro literature where most papers have only a very limited role for government.

In other words, a macro literature whose policy advice is heavily tilted toward the political preferences of conservatives.

Is that bad? Not necessarily. If the facts had a well-known conservative bias - i.e., if the models that fit the data best were the models that implied no role for government - then that would just be too bad for liberals! Liberals would have to accept that their ideas were contradicted by the best scientific evidence available.

But I contend that in the case of DSGE models, conservative policy recommendations don't emerge because they come from the best models, but only because they come from the easiest models. Thus, the conservative slant of modern macro comes not from the weight of evidence, but from the combination of publication bias and the inherent unwieldiness of the DSGE framework.

Now here's something else that might be worth mentioning. The DSGE framework was invented in large part by Ed Prescott, a man with deeply conservative political beliefs. The insistence that microfounded models with individual optimization were the only believable "structural" models - i.e., the only models that could answer the Lucas critique - came mostly from people with deeply conservative political beliefs (including Robert Lucas himself). And the criticism of alternative modeling approaches - in particular, of SVARs - seems to be much louder from economists with deeply conservative political beliefs.

That by itself proves nothing. (Maybe they're conservative because they believe the results of their models! Maybe conservatives are more scientifically honest!) But it seems like circumstantial evidence against the alleged political neutrality of modern macro methods.

Was DSGE created as an intentional conspiracy by conservatives to force macroeconomists onto a playing field tilted toward laissez-faire policy conclusions? Almost certainly not. Have conservative-minded macroeconomists been privately pleased with the publication dominance of models that tend to vindicate their prior beliefs? Almost certainly yes. Do I have a better alternative modeling approach handy? No (I'm not brave or foolish enough to mount a spirited defense of SVARs).

The real question, though, is: Has the "conservative publication bias" of DSGE made macroeconomists more complacent than they should be about searching for alternative modeling approaches, even in light of the extremely limited usefulness of DSGE models three decades after their creation? I don't know the answer. But if the answer is "Yes," then the claim that DSGE is a politically neutral tool of economic science is not quite right...

Update: It's worth pointing out that Thomas Sargent, one of the pioneers of both DSGE and Rational Expectations, and one of the three Nobel Prize winners in the photo at the top of the post, is actually a Democrat (though it's also worth pointing out that he left the Rational Expectations paradigm and started working on learning-based models, which have proven to be a lot harder to work with!).

71 comments:

  1. Anonymous11:31 PM

    "One of the main features of DSGEs is that they are microfounded; that is, they try to explain macroeconomic phenomena in terms of individual decisions. Another feature is that they are based on optimization, which means that the aforementioned individual decisions are modeled using the calculus of variations.

    Explaining macro phenomena based on individual optimization is hard to do. Individuals may take many things into account when making their decisions; in math terms, this means you can easily have a large "state space." Also, the thing that people optimize (their "objective function") may be very complicated; in principle, it can include all manner of weird things like non-rational expectations, dynamic inconsistency, reference points and framing effects, etc. Finally, aggregating a whole bunch of individual decisions into one giant macroeconomic outcome is in principle a very hard thing to do; it's even harder if you include things like firms and governments."

    This is a pretty damn common comparison I know, but this sounds like you're comparing economics quite well to Hari Seldon's Psychohistory (from Isaac Asimov's Foundation). In other words, it's possible to calculate the actions of groups, but the smaller the group (and the more individual groups/actors) the harder it is to calculate.

    (Not a particularly useful comment I know, but interesting to me given that Asimov invented the fictional field in 1942)

  2. Anonymous11:35 PM

    Why call them assumptions? Since they are all obviously false, why not call them (very, very poor) approximations?

  3. Anonymous11:57 PM

    I suppose I would accept "counterfactual assumption".

  4. You forget a key feature of most macro models before the liquidity trap. Taxation was used to fund government spending, which simply vanished in the sea...

  5. This is a very excellent post. I think you demonstrate your claim.

    You will guess that I think you go too easy on Williamson. In particular, you don't contest the claim of "economic science". Williamson simply asserts that mainstream macro is a scientific endeavour. I don't think many people who approach the question with an open mind reach that conclusion.

    Very importantly, the profession absolutely has not moved from Kydland-Prescott to Smets-Wouters. The recent policy debate made it clear that many leading macroeconomists (including Prescott and Lucas) simply reject New Keynesian economics and reason as if the K-P model were the world.

    Notably, the fact that DSGE models can be fiddled and massaged until they forecast one quarter ahead better than people working on hunches (which does not mean better than VARs) provides no evidence that the approach will lead to a decent model... ever. The dominance of the approach absolutely is not based on evidence. It was decided that it is promising, and Williamson thinks that accepting this decision is necessary and sufficient for one to be a scientist.

    I think it is clear that he has no idea what science is. I think the first and second rules of science are that theories bow to facts. It is clear that he thinks that a hope that mathematical tools will be useful someday should bow to nothing, and that this act of blind faith is science.

  6. What is the point of macroeconomics without sticky prices?

    As someone who came of age during the Little Depression, I am really gobsmacked that anyone thought anything useful could come of that assumption. It seems criminally negligent.

  7. You may find this paper, "Towards a Political Economy of Macroeconomic Thinking", by Gilles Saint-Paul interesting: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1889987

  8. I assume that it is the "stochastic" part that causes the state space to explode. If you leave out or limit the stochastic effects then even a modest computer should have no trouble modeling the interaction of thousands or tens of thousands of representative entities.

    It is very difficult to build from micro foundations to macro behaviors - chemists certainly do not try to do chemistry based on the quantum theory of fundamental particles.

  9. So, one of the fundamental lessons of physics from the last few decades is the idea of emergence - i.e., the whole really IS more than the sum of its parts.

    In physics, macro models of reality contain effects you could never see from simple aggregates of micro effects. It surprises me deeply that economists haven't noticed similar effects. You probably can't model a firm or a government by aggregating the behaviors of the individuals within them.

  10. Also, witness, they were all massively useless.

    http://blog.rivast.com/?p=5538

  11. GlibFighter2:49 AM

    Woodford's "Interest and Prices" certainly is a rabid right-wing rag.

  12. The whole idea of an equilibrium, or a natural trend toward it, is in itself already doubtful. It can be confused with stable-looking periods in a dynamic process.

  13. foosion5:57 AM

    DSGE was largely created by conservative economists who didn't like any theory that promoted government involvement in the economy. They may not have set out on a conscious level to force macroeconomists onto a playing field tilted toward laissez-faire policy conclusions, but it appears to be a large part of their preferences.

    DSGE appears to be a way to hide a set of anti-government assumptions behind a thick fog of complex modeling.

    “In four years of reflection and rather intense involvement with this financial crisis, not a single aspect of dynamic stochastic general equilibrium has seemed worth even a passing thought,” [Larry] Summers said. http://web.mit.edu/newsoffice/2011/summers-talk-1104.html

  14. DSGE is a theoretical model. SVAR is a statistical model, with a little bit of theory on top to disentangle structural shocks from residuals. Macroeconomics needs to use (and actually does use) both. It doesn't make much sense to present them as substitutes, just like it would be wrong to say that the supply-and-demand model is a substitute for linear regression.

    Now, regarding the limitations of DSGE models, your list is true when we are speaking of 80's-style RBC literature. But it's 2012 now, and in the meantime people have studied deviations from all of those unrealistic assumptions. It's true that putting them together in one huge model is infeasible, but why would you want to do that anyway? It would be nice if you could just think of a change in policy, feed it into a black box, and precise predictions would fall out, but this is not a realistic goal for macroeconomics (not in the foreseeable future, and maybe never). Yet we can still learn a lot from building simpler models that focus on one or two deviations at a time.

    Finally, as a European I don't really feel qualified to comment on the policy debate in the US, but it's quite obvious that, outside of central banks, academic DSGE models have had limited impact on policy. Sure, there are macroeconomists who work on DSGE models and are conservatives, just like there are those who are liberal. And anyway, it doesn't really matter. When you discuss economics with conservatives / libertarians / Ron Paul fanboys (we have those over here as well), they have no idea about these things, as their understanding of macroeconomics ends around the 1930s with Keynes, the bad guy, and Hayek, the good guy (or so their favorite talking head told them).

  15. I am not certain it is not a "conspiracy". Kim Phillips-Fein's book, Invisible Hands, outlines quite well the right's attempt to subvert the New Deal and the idea that government can actually provide some good. The wealth of the right is being used to create many/most/all of these right-wing think tanks whose mission is not to inform but to promote right-wing ideology. Additionally, we have the climate deniers arguing that climate science is just an attempt to get "big" government to take over your life, and we have Santorum arguing that public education is anachronistic, to name just two that have gotten press in the past two days.

  16. I think you're broadly correct.

    It has always struck me that nothing can ever really go wrong in these economies that needs fixing - a negative technology shock or preference shock only presents a very limited sort of problem. How can these models help us make policy decisions when they cannot capture the problems we face?

    A very common explanation for recessions - not one everybody likes, but certainly worthy of investigation - is the old liquidity preference story: something happens to trigger everybody trying to accumulate money at once (a la Nick Rowe), everybody trying to adjust their balance sheets, etc. That, as far as I know, cannot really be addressed in these models.

    Similarly, some of the reasons why you might advocate fiscal policy (maybe, the government's ability to directly hire people in the face of mass unemployment) or some of the reasons why you might think monetary policy ineffective (pushing-on-a-string stuff) can't, as far as I know, really be addressed in these models.

    Happy to have my ignorance corrected.

  17. Anonymous8:39 AM

    Noah

    very very useful post

    the devil is in the details

    deciding what to put into and what to leave out of a model only reflects the bias of the economist - the result intended

  18. A shorter explanation could be:

    DSGEs contain GE. GE is a libertarian fairy dream.

    You might add:

    NOBODY believes GE is a reasonable description of the real world. GE cannot even be shown to be dynamically stable without very restrictive assumptions – not that the assumptions needed for existence aren't ridiculously restrictive to start with.

    The only reason GE is invoked is that it makes armchair economics trivially easy. As with macro, micro models seldom incorporate more than one or two imperfections – not because the world is not full of them, but because it is hard - or maybe rather because their effect becomes untraceable.

    I.e. economists like simple models. That way they can say things about stuff without actually looking into the details. If you put a lot of imperfections in the models, you will not be able to say general things about stuff that shares one or a few characteristics – you will actually have to look at the issue at hand – and who wants to do that? If the model is complicated enough, maybe, God forbid, you won't even be able to derive an analytical answer. "What!!" – the economist screams. How will I know that the derivative with respect to X is positive everywhere? Do you expect me to run numerical simulations to deal with complex issues instead of simply pretending that I have an answer? NEVER! I'd rather assume a can opener!

  19. ...although it's easy to overlook what's good about DSGE. Contra the commentator above, I don't agree that thinking about general equilibrium effects is a libertarian fantasy; even lefties need to consider how changes in one part of the economy affect other parts.

    B.t.w. I think it's worth emphasizing that these models were never set up to predict business cycles. They take a shock process as given, and then ask how to respond to those shocks.

    There is nothing, afaik, in these models that can capture anything akin to the build-up of some imbalance or similar that will eventually cause a recession. (Maybe those leverage cycle papers do?)

    Again, happy to be corrected.

  20. Ivansmi: "Yet we can still learn a lot from building simpler models that focus on one or two deviations at a time."

    Sorry for being rude – but this is retarded.

    Assignment:
    Try to make a simple model with one deviation. Now, incorporate another one.

    Question:
    Does the first deviation still have the same effect?

    Hint:
    No – it will not – the effect might even be the exact opposite of whatever it was in the beginning.

    Anonymous:"why not call them (very, very poor) approximations?"
    Because that would require that you made a statement about the real world. What’s next – are economists going to state that their results are approximations of reality to? NEVER!!!

    If I state that I could torture the data to fit this model with a R2 of 0.9, and you for some reason think that I thus has said that this is a good model – the fault is entirely yours. My conclusions clearly stated that GIVEN that this model was the correct one, it would imply this and that. I did not say that it was the correct one (so you can´t blame me). Only a idiot would think it is correct – as you point out, it clearly lacks this and this and this and this and …

    However, in a policy discussion I will mention that a lot of people is using this model and then jump to the conclusion that it thus must be a good one (and if it would turn out to be wrong, no one in particular is to blame).

  21. @Luis Enrique
    Of course you have to think about GE effects.
    I was talking about the specific Pareto-optimal (or efficient) GE.
    That is a libertarian fantasy - and it is something radically different from investigating how "changes in one part of the economy affect other parts of the economy". I would almost say that it is the opposite. Every economist knows what GE effects a change in one part of the economy has on the others, given a Pareto-efficient GE world. If you are an economist and actually have to think about the GE effects, you have already left the worldview I criticized (and called a fantasy).

    ohh - and it is the fantasy version (or something very close to it) that is used in DSGE.

  22. Oh, Noah...how disappointing.

  23. @nemi:
    Sorry for being rude – but this is retarded.

    Assignment:
    Try to make a simple model with one deviation. Now, incorporate another one.

    Question:
    Does the first deviation still have the same effect?

    Hint:
    No – it will not – the effect might even be the exact opposite of whatever it was in the beginning.


    If you can find such a thing, great - write a paper and get it published. The point is that you should build models with multiple deviations (or frictions, as economists would call them) if you have some good reason for it, i.e. you suspect that two (or more) frictions interact in a nontrivial way, and not because you want to emulate physics and derive a macroeconomic theory of everything.

  24. ah, David ... write a response!

  25. David:

    Sorry to disappoint you...

    It's certainly possible that I'm wrong about this! It wouldn't be the first time in my life that I've been wrong about something! ;)

    I'd appreciate it if you could elucidate the reasons why I'm off the mark...

  26. Anonymous11:40 AM

    Why is it not surprising that Andolfatto would be disappointed? I await the explanation with bated breath.

  27. You do not view agent-based modeling as an alternative to both DSGE and SVAR, Noah? This approach does have some advocates in the central bank modeling groups, even if those groups are still dominated by the other two.

  28. You do not view agent-based modeling as an alternative to both DSGE and SVAR, Noah?

    I view it as a potential alternative. I don't think it's been explored sufficiently yet.

    I am planning another big post on agent-based modeling, and why people who insist on microfoundations should embrace it.

  29. I think we need a generally accepted equivalent of the "Lucas critique" -- an agreement that it is methodologically unacceptable for models to (a) have a unique equilibrium and/or (b) be stable absent shocks.

    It is clear from theoretical analysis of economic models, history, empirical time series, and analysis of actual economic decision making that unique equilibria and stability are completely false claims for real economies. Assuming them today is similar to ignoring the evidence for quantization after 1920.

    If we don't have solvable models without those assumptions, well, that is why economists get paid fairly big bucks -- certainly as big as physicists. Time to get to work rather than telling more and more complicated fairy tales.

    I agree with nemi (if I interpret him/her right) that these assumptions lead directly to the libertarian fairy tale. Without them that fairy tale has no support from economics. Working with unstable models that have multiple equilibria would open up discussion of policy options in interesting ways.

  30. A related point: It is much easier to build models with empirically credible microfoundations if we toss out unique equilibria and stability. So we can embrace fans of Lucas, albeit in a way they won't particularly appreciate.

  31. Anonymous1:03 PM

    "t is clear from theoretical analysis of economic models, history, empirical time series, and analysis of actual economic decision making that unique equilibria and stability are completely false claims for real economies."

    Not only is this comment utterly bizarre, it is false. How could you possible know whether there are other equilibria in the "real world" when all we can observe is the one we're in now? Jed might be the dumbest person I've met today, and I went to the DMV earlier.

  32. Very revealing comment: "How could you possibly know whether there are other equilibria in the 'real world' when all we can observe is the one we're in now?"

    If correct, this implies that we can never (ever) know that any observed process or system has multiple equilibria. If so, this is a rather strong result in epistemology. Publish quickly!

    On the other hand, if this is just a typical case of how current practices in economics are defended, then further comment is superfluous.

  33. David Andolfatto,

    Like Noah said, please tell us why what he's saying is not true. This is an important issue, and it's important that you let us know why, or where, this is wrong -- if, in fact, it is. I'm more or less a utilitarian, and it looks like Noah is too, at least pretty much. Utilitarians are happy to adopt more libertarian policy, if it is a case where it actually increases total societal utils.

    I've long thought the same thing as Noah. The messy things we tend to take out of models to make them more tractable tend to be the very things that make it possible for a strong, smart government role to greatly increase total societal utils.

    It's done for easier tractability, and these things should be considered - added back in at the end - when making inferences about the real world and policy, but often they aren't.

    I mean, look at what basic classical models assume away: externalities, asymmetric information, natural monopoly and plain monopoly power, the inability to price discriminate, zero-marginal-cost idea products, the inability or impracticality of patenting... Things that imply that a strong government can massively increase total societal utility.

  34. Anonymous2:40 PM

    Isn't one of the well-known but little-discussed features of all economic modeling that models show results they are intended to show? I don't even mean that in a bad way (until stupid people get ahold of the results). If you build a model to study the effects of monetary policy, you'll find effects from monetary policy. The impact of fiscal policy or external shocks or whatever are not the subject being modeled, so don't show up in a big way. That's intentional. Seeing the result, the smart researcher doesn't then declare "I have found that monetary policy is superior to all other forms of economic intervention." The smart researcher says "I have learned something new about the details of monetary policy's influence, in the context of a bunch of other stuff I did my best to hold constant." Declaring victory over a bunch of stuff you tried to hold constant is really, really bad thinking.

    If you start out by modeling only the impact of external shocks because Lucas told you that treating government policy as an independent variable will lead to bad results, you don't then conclude from the results of your modeling exercise that government policy is ineffective.

    This is, of course, not a narrow problem with one kind of model. The simplifying assumptions of 101 from decades back now end up being asserted as reality. (There's Keynes and his dead economists.) Blackboard models with only price takers and markets that clear and the like, just like DSGEs, are made that way to make the model tractable. Somehow, those assumptions escaped the lab and became what we "know" about the economy. It's silly, but it's true.

  35. JohnM3:04 PM

    Be nice if you'd caption your photos. Maybe not everyone knows who these guys are or their relevance to your post.

  36. @Anonymous: "Isn't one of the well-known but little-discussed features of all economic modeling that models show results they are intended to show?"

    yes - it certainly is.
    Take the canonical model of neoclassical economics - the one about the perfectly competitive firm.

    Now - assume that the customers do not really have perfect information. Assume that they know the price of the brand they usually buy, and know that the other brands are exactly the same, but that it takes one second to look at their price tags.

    The equilibrium price now becomes equal to the monopoly price.

    How anyone can think that these kinds of completely unstable models have anything to say about reality is beyond me.

    PS: The first thing you have to (or at least should) learn in economics is the A-prime/c-prime theorem

    http://www.deirdremccloskey.com/docs/pdf/Article_287.pdf

  37. BT (London)3:38 PM

    DSGE failed to predict the crisis of 2008 and the ensuing great recession.

    Why? Because DSGE models lack credit and debt.

    Try Steve Keen, who has a model with credit and debt that can reproduce a Minsky moment.

  38. Noah -
    I'll accept that Ed Prescott is conservative, but Lucas? Check out his interview with the WSJ last fall: he voted for Obama, and he supported the stimulus package in principle as well as the quantitative easing. He's no purebred Keynesian either, and advises Obama to lower taxes on capital, but to pass Lucas off to your readers as someone with "deeply conservative beliefs" is an incorrect and unnecessary manipulation in favor of your alternative hypothesis.

    http://online.wsj.com/article/SB10001424053111904194604576583382550849232.html

  39. Anonymous5:38 PM

    Noah, one could summarize your post in one sentence: Conservatives prefer DSGE models over other approaches.

    Unfortunately, you make only loose connections between the assumptions of the DSGE models and political preferences. You should have shown how political preferences are reflected in the assumptions.

  40. Dear Noah,

    Anti-Mankiw definitely agrees: it's the politics that drives the economics.

    Keep up the great work.

    Anti-Mankiw Team

  41. Noah,

    What about the coordination failure literature - 1980s style - Diamond, Cooper, Bryant, and the sunspot literature - Cass, Shell, Azariadis, Farmer, Benhabib, Woodford. Same tools. Different conclusions. Not to mention the New Keynesian literature. Same tools. Different conclusions.

    Steve Williamson

  42. Another one:

    I like that picture - circa 1990 I think. Those guys look so happy. The look on Ed's face is classic.

    Steve

  43. Steve:

    The Cass/Shell and Cooper/John papers, in the 1980s, were not DSGE models. Much later, people modeled sunspots and coordination failures with DSGE, but to my knowledge - and I've only glanced at those later papers, not read them thoroughly - it was hard to do that in a way that got those DSGE models taken seriously.

    Which I guess is my whole point; if you think Cass & Shell and Cooper & John were onto something interesting in the 80s, and if it's true that the macro field began demanding that everything be done in DSGE around that same time, and if it's true that forcing the coordination-failure and sunspot insights into a DSGE mold weakened the insights, then I think that shows exactly the kind of thing I was talking about in my post.

    I think that New Keynesian models, in particular, are really a prime example of how DSGE forces non-RBC-type models to be less tractable and to make less believable assumptions. Now maybe that's good and maybe that's bad, but it does seem to result in a conservative publication bias.

  44. Herman12:46 AM

    @ nemi
    Thanks for the A-prime C-prime McCloskey paper. Great reading and very true.
    http://www.deirdremccloskey.com/docs/pdf/Article_287.pdf

    @Noah
    Why are you pulling your punches? Hope that hasn't now become necessary for the young professor? Oh, and a somewhat belated congrats on your appointment. Hope you can continue with the blog - along with research. It continues to be enjoyable and thought-provoking reading.

  45. The tools of economic "science" are biased from the very first moment. Because interpersonal comparisons of utility cannot be made scientifically, they are assumed not to be important. Or rather, they are not assumed to be important, which amounts to the same thing.

    Benjamin Franklin got it right when he inserted the word "self-evident" into Jefferson's draft of the Declaration. Some moral axioms are not or should not be up for debate, among which is the proposition that, other things equal, a dollar will always be worth more to a poor man than a rich one.

  46. You seem to be saying that all models that we have now are barely better than no information. I wonder what you think government policy should be in an environment of almost complete ignorance?

  47. Darf:

    See this paper that I linked to in the post.

    Anon:

    Thanks for all the links (I only knew the Krusell paper and the Sargent paper). But what is your overall point?...

  48. Anonymous1:21 PM

    The overall point is that the DSGE tools have been widely employed to find good reasons to have the government stepping in. When you consider heterogeneity, incomplete markets and so on, there can be plenty of good reasons for that. All the modern literature that considers these issues speaks the DSGE language, and the papers I have posted are just a small example of it! [Side note: there's no need to have 7 shocks - including a markup shock that is indistinguishable from a preference shock in the model, lol - to get a role for government.] The "fathers" of this literature are, by the way, all intellectual "sons" of those conservative minds that introduced the DSGE models. So my impression is that the claim by Steve Williamson is correct (DSGE is just the language of modern macroeconomics, and it's not a conservative or a democratic language) and that your argument doesn't make a lot of sense if you consider the state of macro today (see those papers, or this: http://www2.econ.iastate.edu/tesfatsi/SomeThoughtsOnStateOfMacro.Kocherlakota2009.pdf)

  49. Anon:

    My point is not that it's impossible to make DSGE models where govt. has a role. My point is that it's hard - i.e., that it's computationally costly, and often ends up requiring unrealistic simplifying assumptions. And my argument is not that no one is publishing DSGE papers supporting govt. intervention; obviously many people are. My point is that the limitations imposed on these models by the clunkiness of the DSGE framework leave those papers open to more criticism than they would otherwise face.

  50. Anon with the research papers:

    None of the papers you linked to is a business cycle macro paper. They are all, by and large, exploring partial equilibrium optimality conditions in a DSGE framework.

    For example,

    1) Krusell's paper shows that the tax on capital income would be substantially less than 100% even if the government did not care about capital accumulation. That is neither here nor there for the purpose of this debate/post.

    2) Boldrin's paper imagines a world where students don't get loans (!!) and thereby defines the role of public financing of education to be utility-increasing by construction. Then it finds the optimal rate of return on public financing/state pensions for this arrangement to be Pareto efficient.

    3) Krueger's paper finds the optimal tax rate given income and wealth inequality and the risk of unemployment/wage cuts. It ignores deadweight losses.

    Sure, there is wage rigidity, incomplete markets, overlapping generations and intertemporal optimization. But none of that is used to argue/explain/explore output, inflation or unemployment.

    These are all partial equilibrium optimization papers using DSGE techniques. They are consistent within their universe and their referenced-paper universe, producing the right mathematical result given a set of 100 standard assumptions and one new modification. None of them is calibrated to data. And none of them has anything to do with business cycle theory.

    What was your point again?

  51. Noah,

    No one from the Minnesota school, broadly defined, uses the term "DSGE" in the way you're using it. Why are you hung up on what DSGE is anyway? That's completely unimportant - semantics. You have to understand first what the Minnesota program is all about. It's just using serious theory to think about macroeconomic questions. That's it. Why worry about political bias. Go talk to Ed Prescott. Ignore what he tells you about the Tea Party and such, and get him talking about something interesting. You'll actually learn something. I always do when I talk to him. The guy just likes to think about serious economics. He's a scientist. So is Lucas. So is Sargent. I'll write you a post this week to explain it more carefully. This whole thing is not well understood by the profession I think.

  52. Thanks for attaching the link to Smets-Wouters.

  53. Why worry about political bias.

    Well, it was just a thought... ;)

    Go talk to Ed Prescott.

    You think Prescott would talk to me???

    I'll write you a post this week to explain it more carefully.

    Thanks!!

  54. Anonymous10:13 PM

    Zen Babu:

    All those papers are general equilibrium models; only two of them (one by Krusell and one by Boldrin) don't have a real quantitative exercise and are theoretical; the others do. My point, again, is that DSGE models have been (and are) widely used to analyze situations in which the free market allocation is not Pareto optimal and in which there is space for the government to step in. These are not business cycle models (well, actually Rios-Rull's paper is), because that's not what they want to study...you can always add some aggregate uncertainty and make it a business cycle model...like in...Krusell-Smith.

    Noah: I agree it's hard, but it doesn't seem to me that the fact that it's hard is pushing the profession to ignore those frictions as you claim...quite the opposite (especially far from the oceans).

    Anyway, I guess we agree to disagree?

  55. Anonymous5:48 AM

    I laugh at Stephen Williamson's comment. "Serious theory"? "[Prescott] likes to think about serious economics. He's a scientist. So is Lucas." Ha, ha.

    And, of course, Noah never said he wasn't willing to learn from Prescott.

    And the question of political bias in the post was addressing a comment from Williamson.

    Anyways, we see that one of the co-authors on a well-regarded series of papers on noise trading doesn't think Williamson does science. Clearly, economists differ among themselves on what are scientific norms or whether DSGE models, more or less, conform to them. I doubt Williamson is willing or able to clarify such differences.

  56. conesnail10:05 AM

    I find 2 things absolutely telling in this summary:

    1. Your models are incredibly simplistic given the complexity of the problem. Therefore they have no predictive power. How can this possibly be OK? In science (I'm a scientist) a model with virtually no predictive power would hold no sway whatsoever in any scientific field I know of. Clearly, you need to get a lot more complex, and God knows you have enough data to model.

    2. The idea that "a more complex model would require a supercomputer..." is basically a non-starter. Why the heck is that? Why would you guys not be using the most powerful computational tools available to model what is clearly a very complex problem? A climate scientist who said that they can't make their model more complex and therefore more predictive because it won't run on their laptop would not get very far. Why should the standards in economics be so much lower? It seems, given recent events, that the importance of the problem might warrant it.

    Put it all together, and you guys just seem like wimps. Considering how much influence economists have, it's a little scary to witness the soft intellectual underbelly of the field.

  57. Conesnail and Noah,

    As a PhD student in finance, I found that we always super-simplified our models, making them a lot less realistic, so that they could be solved in closed form (i.e. you could find the optimum, like the best portfolio strategy, with math, not a computer). Or, sometimes we would let the computer do the optimization. But there, a big problem was that I commonly saw people using local optimization algorithms to find the global optimum. And this was in situations where the state space was riddled with local optima. So your "solution" was super dependent on where you had the algorithm start looking. And even if you used a global optimization routine, they just weren't nearly good enough to do much better.
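
    Here's a tiny illustration of what I mean - a toy multimodal objective (not a real portfolio problem) where a local optimizer's answer depends entirely on the starting guess:

    ```python
    # Toy illustration: with multiple local minima, a local optimizer's answer
    # depends on where you start it. Not a real portfolio problem.

    import numpy as np
    from scipy.optimize import minimize

    def objective(x):
        # multimodal in one dimension: local minima near x = -2.6, -0.5, and 1.6
        return np.sin(3 * x[0]) + 0.1 * x[0] ** 2

    for start in (-2.0, 0.0, 2.0):
        res = minimize(objective, x0=[start], method="Nelder-Mead")
        print(f"start {start:+.1f} -> x* = {res.x[0]:+.3f}, f(x*) = {res.fun:+.3f}")
    ```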

    What I always wanted to do (but never had time) was to construct a truly realistic model. It wouldn't be solvable in closed form, or reliably with a global optimization program, but it would be fantastic as a simulator, for testing the solutions that come from the prestigious closed-form and other simplified models. You could plug in the strategies that they say are optimal, and then see how high an NPV, or GDP, or expected utility, etc. they create. Then, you could use your intuition about where those models are unrealistic to come up with a new strategy (a vector of parameter values) and see if it beats them (or you could use your new strategy as a starting point, and then run the local optimization program from there).

    In this way we could keep looking for better and better strategies, which would give us not just better strategies but also better intuition. And it gets at the complaint that we can't test well in econ like in the harder sciences. With a great, very realistic simulator, even if we can't solve it globally due to infinite and varied local optima, we can still use it as a tester.

  58. Anonymous3:57 PM

    This is getting silly. Economists use computers to solve their models, especially macroeconomists. Models used for obtaining quantitative results will almost certainly only be able to be solved using a computer. Closed forms are useful for learning about a mechanism, but will be useless quantitatively... and this is well understood. Economists who do work that requires lots of computation understand that a local optimum is not the same as a global optimum. There are plenty of strategies for dealing with this. Also, no one should be limited to having to solve their model on a laptop... there are plenty of resources out there, some even free.

  59. Definitely your post provides a great and useful resource every reader must adhere. This is truly a must read and admire. Thanks a lot for sharing!

  60. Anonymous6:24 PM

    Great post, indeed. I always try to point out to my students the fact that government spending is taken as bringing zero utility to consumers, as a case in point for this argument. However, one thing I have to add: I think that economics is laissez-faire/right-wing oriented, and rightly so. The point economics should stress at all times is the cost of any action and the problems inherent in government solutions. And I would claim that people from non-economics fields are more likely to analyze a problem than to suggest a solution that takes into consideration cost-benefit analysis, or the incentive complications associated with it. And traditionally, the right-left divide is indeed a divide over government activism.

  61. @Chops,
    Lucas could vote for Obama because they are both conservatives. Your universe of economic discourse (like the US generally) lost its left wing long ago.

    What Will said about emergence, the utter futility of micro-based macro and the astonishing insularity and backwardness of the persistence of the endeavour.

  62. Global optimization can be an incredibly difficult thing with very complicated, high-dimensional functions - or just impossible to do, that is, to find the true global solution and be able to prove you found it (or be sure with a high degree of probability that you're close, with a given amount of precision). It's not something you can just brush away with "oh, we have ways of handling that." We do have nice ways of handling it for some pretty uniform, simple functions, but for the kind of very realistic, very high-dimensional models I'm thinking of, forget it. Not if you have all of the supercomputers in the world.

    Google this; ask a math prof. who works in this area, or just look at the Traveling Salesman problem. It's nothing in complexity and non-uniformity compared to the models I'm thinking of, and yet its global optimization has been worked on by top mathematicians for decades.

    I ran into global optimization a lot as a PhD student, and read up on it and talked to professors in the finance and math departments about it. It's a huge problem. In many cases, you just do your best (and I commonly saw plain local optimization used for papers published in respected journals), but you have no good systematic way of knowing if you came even close to the true global optimum.

    So there is definitely an incentive to really make unrealistic simplifying assumptions to be able to solve, to optimize, the model. If, on the other hand, you were free from the worry of precise global optimization, and you just wanted a good simulator to test what other models say, to test strategies, and hypotheses, then you could be free to construct a really realistic and supercomplicated model. And that could be really useful.

    After all such computer simulations are used in many other sciences, as pure simulators, without trying to make them way less realistic so they are solvable. Examples include nuclear physics and aerodynamics.

  63. Dave Mac8:27 AM

    "But I contend that in the case of DSGE models, conservative policy recommendations don't emerge because they come from the best models, but only because they come from the easiest models. "

    Best thing I've read in a while right here. I am an undergrad economics student (who does a fair amount of reading ahead) who has been particularly struck by exactly what you are talking about in the quote I pulled from your post. All the models we are learning (and all the models I've read ahead about) make assumptions which allow for nice, clean mathematical explanation - as you said, the models are desirable because they are easy to work with. While I see the value in fairly simple mathematical models, it seems to me that they have become a crutch for the discipline. "Simplifying assumptions" are all well and good, but it seems to me that many important variables and outcomes are lost and forgotten; they never come back into the picture. It's likely that those who pursue grad school credentials find some of the omitted factors coming back into play, but it also seems from my reading ahead that for some they never do.

    Further, we have a glut of undergrads (the amount by which demand outstrips supply in my school's econ faculty is unreal) who are going to be going out into the world where they will spout their half-cocked ideas about how the economy works. And they will know they are "right" because from the get-go they had it shovelled down their throats that economics does positive analysis, that it is unbiased and scientific. Undergraduate economics rarely teaches students to think critically at all - what a shame.

    Anyways, I really enjoyed your post. Despite my apparent disillusion with the field of economics, I do plan to attend grad school. It is nice to know that there are those out there who think critically about the material and are interested in finding new ways to approach it. In a way economics is an exciting field, because even its foundations are far from certain. It feels like a discipline that is still very young, ripe for some pretty big paradigm shifts in the near future.

  64. Anonymous8:14 PM

    Why did anyone pay any attention to the "Lucas critique?" It's an erroneous critique.

    It is known that the structure of the economy is determined by the laws and customs of the society, and that those are maintained by government -- they are *government policy*. A model which does not include parameters for government policy and other local social decisions is clearly just bullshit.

    Lucas's critique, as you describe it, appears to have no validity at all. Why were people reacting to it as if it did?

  65. Anonymous8:16 PM

    OK, I get the idea behind the Lucas critique a little better, but it's *still* misguided. It's like asking for a model of chemistry which works for liquids, solids, and superheated plasmas.

    You figure out which elements of government policy are critical and devise *different* economic models depending on what the state of those elements is.

    Replies
    1. BINGO. That's how you do it. That's how every science has done it. Before physics was "unified", there was a theory of optics, a theory of ballistics, a theory of waves, etc.... you used whichever one seemed to fit the problem best.

      After it was "unified", you STILL figured out which phenomena were "dominant" and ignored the rest.

      The Lucas critique is asinine and unscientific. A theory which works in a limited domain is far better than a theory which *doesn't give accurate predictions at all*.

  66. Noah,

    This is a great post. I make a similar point here:

    http://unlearningeconomics.wordpress.com/2012/01/13/why-does-neoclassical-framing-matter/

    though my point is more general - that certain institutions are presented as 'intervening', which opens the door for the politics of 'not intervening' and protecting existing institutions (also known as conservatism).

  67. Richard Serlin, very interesting comments.

    Especially in your second comment, you implicitly highlight a major background assumption that causes a lot of the trouble: seeking a global optimum.

    In contrast, other scientific models -- climate models for example -- aren't seeking optima, they are just trying to accurately simulate what happens. (Climate models and many others are micro-founded to a large extent, but this is not a big deal either way.)

    My question as a non-economist is "Why is seeking global optima a major focus?" Especially since it makes models much less tractable! (As you correctly explain.)

    Obviously in general our economy doesn't achieve a global optimum. Indeed there are intractability results to show it can't, and also instability results to show there is almost certainly no stable global optimum.

    Does the economic profession have some deep commitment to this unrealistic requirement? If so, that is a big part of the problem.

    I don't know that we could find tractable models if global optima were dropped. But certainly a necessary step toward positive economics is to build models that approximate the world closely, and requiring global optimality runs completely counter to that.

  68. Anonymous2:00 PM

  68. It's interesting that you see DSGE modelling as politically biased.

    It's true that in most DSGE models the role of government is minimal; but this is rather the set-up of the models, not a result derived from such models. Please correct me if I'm wrong, but I can't recall a published paper which uses a DSGE model to show that it is best to have a small government.

    Bear in mind that DSGE models developed nowadays are mainly for central bankers to analyse the effects of alternative monetary policy rules. They are not specifically designed for the evaluation of fiscal policies. To study the effects of a certain public policy, economists use microeconometric methods, not DSGE models.

    Of course, by saying this I assume that the Central Bank is reasonably independent from the government. How (or how not) independent it is in reality is another topic of discussion.
