Friday, March 29, 2013

The swamps of DSGE despair


David Andolfatto (of the St. Louis Fed and the blog MacroMania) points me to this interesting recent working paper by Braun, Korber, and Waki. The paper bears the somewhat unwieldy title of "Some unpleasant properties of log-linearized solutions when the nominal rate is zero."

Basically, the authors of this paper take a New Keynesian model somewhat similar to the ones used by "Keynesian" macroeconomists - e.g. Paul Krugman - to help justify the use of fiscal stimulus in a depressed economy. They note that in most papers, the models actually used to measure the effect of government policy are "linearized" versions. 

For the uninitiated: A DSGE model starts with the assumption of optimization by various economic agents such as households and firms, which spits out a system of nonlinear equations representing people's optimal choices. These nonlinear equations are then "log-linearized" around a "steady state", and the linearized forms of the equations, which are very easy to work with mathematically and computationally, are used to compute the "impulse responses" that tell you what the model says the effect of government policy will be. The linearization is equivalent to the assumption that the economy undergoes only small disturbances. That might not be a good assumption when it comes to major events like the recent crisis/depression, but it does make the models a LOT easier to work with. It also generally makes the equilibria unique - in other words, if you use the full, nonlinear version of a DSGE model, you are likely to come up with a bunch of different possible paths for the economy, and which path the economy takes will be determined purely by quantitative factors (like, whether variable X is more or less than 2.076, or something like that) - not the kind of thing that DSGE models are good at getting right. Since "multiple equilibria" generally means "we really don't know what's going to happen," macroeconomists tend to stick to the linearized versions of models, so that they can say "we do know what's going to happen."
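To make the log-linearization step concrete, here is a minimal worked example using a generic CRRA consumption Euler equation (a textbook case, not the specific model from any of the papers discussed below):

```latex
% Nonlinear optimality condition (consumption Euler equation):
%   C_t^{-\sigma} = \beta (1 + r_{t+1}) C_{t+1}^{-\sigma}
% Take logs, define hatted variables as log deviations from the steady
% state, and use the steady-state condition \beta (1 + \bar{r}) = 1:
\hat{c}_t = \hat{c}_{t+1} - \frac{1}{\sigma}\,\hat{r}_{t+1},
\qquad
\hat{c}_t \equiv \log\frac{C_t}{\bar{C}},
\quad
\hat{r}_{t+1} \equiv \log\frac{1 + r_{t+1}}{1 + \bar{r}}
```

In a deterministic setting this particular equation happens to be exactly log-linear; with expectations and richer ingredients, the linearized version is only a first-order approximation that can be trusted near the steady state - which is exactly the "small disturbances" caveat above.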

Anyway, Braun et al. decide to venture into no-man's-land, and work with a non-linearized version of a New Keynesian model with a Zero Lower Bound. They find that, unsurprisingly, there are multiple equilibria. In some of these equilibria, the kind of special ZLB effects found by Eggertsson and Krugman - for example, the "paradox of toil" - are present, but small in size. In other equilibria, the effects go away entirely. 

So can we conclude that the concerns of Keynesian economists about the ZLB are overblown, and that fiscal policy isn't the answer? Not so fast. Here is another working paper, by Fernandez-Villaverde, Gordon, Guerron-Quintana, and Rubio-Ramirez, which conducts a similar exercise with a slightly different model. Fernandez-Villaverde et al.'s model is extremely hard to solve, and their results come from picking some interesting-sounding cases and then doing numerical experiments (simulations) to see what happens in those cases. In the case they consider most interesting, the ZLB ends up being pretty important, and the fiscal policy multiplier is around 1.5 or 2.

(Update: A commenter points me to this response by Christiano and Eichenbaum, two of the leading New Keynesian theorists. They show that most of the multiple equilibria found by Braun et al. are not supported by a specific model of learning. Also, here is a multiple-equilibrium DSGE paper by Mertens and Ravn showing that in some equilibria, fiscal policy actually makes recessions worse. The Mertens and Ravn result also conflicts with the learning model of Christiano and Eichenbaum.)

So what do we learn from these sorts of exercises? In my opinion, we learn relatively little about the real economy, but that's OK, since we do learn some important things about DSGE models. Namely:

1. Almost every DSGE result you see is the result of linearization. If you drop linearization, very funky stuff happens. In particular, equilibria become non-unique, and DSGE models don't give you a good idea of what will happen to the economy, even in the fictional world where the DSGE model's assumptions are largely correct! As Braun et al. write:
There is no simple characterization of when the loglinearization works well. Breakdowns can occur in regions of the parameter space that are very close to ones where the loglinear solution works. In fact, it is hard to draw any conclusions about when one can safely rely on loglinearized solutions in this setting without also solving the nonlinear model.
So even putting aside the question of whether DSGE models accurately represent reality, we see that most of the DSGE models you see don't even accurately represent themselves.

2. In order to be usable, DSGE models have to have a LOT of simplification. These nonlinear New Keynesian models go so haywire that they often have to be simulated instead of solved. Furthermore, neither of these models has capital or investment. Since investment is the component of GDP that swings most in recessions, you'd think this would be an important omission. But putting in capital would turn these already mostly intractable models into utterly, hopelessly intractable ones. (And if you don't believe me, ask Miles Kimball, who has spent considerable time and effort working on the problem of putting capital into New Keynesian models.) Never mind putting in other realistic stuff like agent heterogeneity!

Basically, every time you model a phenomenon, you face a tradeoff between realism and tractability - the more realistic stuff you include, the harder it is to actually use your model. But DSGE models face an extremely unfavorable realism/tractability tradeoff. Adding even a dash of simple realistic stuff makes them get very clunky very fast.

3. DSGE models are highly sensitive to their assumptions. Look at the difference in the results between the Braun et al. paper and the Fernandez-Villaverde et al. paper. Those are pretty similar models! And yet the small differences generate vastly different conclusions about the usefulness of fiscal policy. Now realize that every year, macroeconomists produce a vast number of different DSGE models. Which of this vast array are we to use? How are we to choose from the near-infinite menu of very similar models, when small changes in the (obviously unrealistic) assumptions of the models will probably lead to vastly different conclusions? Not to mention the fact that an honest use of the full nonlinear versions of these models (which seems only appropriate in a major economic upheaval) wouldn't even give you definite conclusions, but instead would present you with a menu of multiple possible equilibria?

Imagine a huge supermarket aisle a kilometer long, packed with a million different kinds of peanut butter. And imagine that all the peanut butter brands look very similar, with the differences relegated to the ingredients lists on the back, which are all things like "potassium benzoate". Now imagine that 85% of the peanut butter brands are actually poisonous, and that only a sophisticated understanding of the chemistry of things like potassium benzoate will allow you to tell which are good and which are poisonous.

This scenario, I think, gives a good general description of the problem facing any policymaker who wants to take DSGE models at face value and use them to inform government policy.

So what's my suggestion? First I'd suggest detailed studies of consumer behavior, detailed studies of firm behavior, lab experiments, etc. - basically, huge amounts of serious, careful empirical work - to find out which set of microfoundations is approximately true, so that we can focus only on a very narrow class of models, instead of just building dozens and dozens of highly different DSGE models and saying "Well, maybe things work this way!" Second, I'd suggest incorporating these reliable microeconomic insights into large-scale simulations (like the ones meteorologists use to forecast the weather); in fact, any DSGE model that incorporates all of the actual frictions we find is likely to be so complicated, and so full of multiple equilibria in the full nonlinear case, that it demands this kind of approach. Third, and in parallel to the weather-forecasting effort, I'd echo Bob Solow's call to use simple models when trying to explain ideas to other economists and to the public (explanation of ideas being what DSGE models are mainly used for, given their abysmal performance at actually predicting anything about the economy). Note that I don't have a ton of confidence in these alternatives; after all, it's a lot easier to find flaws in the dominant paradigm than it is to come up with a new paradigm.

But in any case, few people in the macroeconomics field seem to be particularly interested in that sort of alternative approach, or any other. And the scientific culture of macroeconomics doesn't seem to demand that we find an alternative; in fact, in the macro profession, you pretty much have to back up any empirical result or simple model with a fully specified mainstream-ish DSGE model in order to be taken seriously.

So instead of trying to find which set of models really works, everyone just makes more models and more models and more models and more models...

(Note: If you know basic math and want to learn what DSGE models are all about, start with this chapter from David Romer's Advanced Macroeconomics.)

Update: Stephen Gordon agrees, and adds his own misgivings about DSGE.

106 comments:

  1. This is the most interesting article I've read in months. Thanks very much - really useful for understanding the DSGE world.

    May I ask you a question? When I finish my PhD (it's about credit markets and behavioral finance) I was thinking about doing some macro. My problem is I have no background in these DSGE models and I don't know where to start (that's why I found this article so illuminating). My question: could you provide us a couple of papers/references for academics to start working on this type of stuff?

    Thank you very much.

    ReplyDelete
    Replies
    1. Anonymous 1:32 AM

      If DSGE is rotten from the beginning, why bother with it any more? As an outsider, I think other fields are better equipped than mainstream economics to make honest dynamic models.

      Professor Steve Keen uses models built on the dynamic math used in the sciences and engineering: "ordinary differential equations". That course is usually not required for economics majors. It is required for most science majors and all engineers. Much of the economics profession has decided to stay ignorant of it. It is how dynamics is modeled in math. It is learned in math class and applied in engineering. And engineers put things of many parts together to model useful systems. Thus they use systems of ordinary differential equations. Check out Keen's papers and an ODE book. You can practice the applications of ODEs with engineering or mechanics (physics) texts. Of course engineers get practice using that math in their other classes.

      Cultivate your engineer friends. Tell them about the equilibrium assumption and demonstrate your learned econ. Then listen to their reactions.

      {If you economic policy folk put enough of us engineers out of work, we will run circles around the econ profession and take your jobs. Fair is fair.}

      Professor Keen draws on another field, hundreds or thousands of years old, that many "econmisses" have ignored and sometimes crudely ape, thus confusing reality. You can identify it in his papers, but I won't mention it for the benefit of Smith. But I have noticed that some of the economists who are considered great know this field. Everyone who is educated should know that field.

      Professor Keen's work is not perfect, but he has gone very far beyond what I learned in economics and realized was lacking. Here is a chance to go farther.

      (It appeared Smith was belittling Professor Keen's work while demonstrating a lack of familiarity and understanding. He may not have sufficient practice with ODEs, or any familiarity at all. He should be familiar with ODEs given his claimed background.)
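      For anyone curious, here is a minimal sketch in Python of a small ODE system of the Goodwin-cycle flavor that Keen's models build on; all parameter values are made up for illustration and are not taken from Keen's (or anyone else's) papers:

      ```python
      from scipy.integrate import solve_ivp

      # Minimal sketch of a Goodwin-style employment/wage-share cycle,
      # a classic two-equation ODE system. Parameter values are made up.
      def goodwin(t, y, alpha=0.02, beta=0.01, sigma=3.0, gamma=0.5, delta=0.6):
          v, u = y  # v: employment rate, u: wage share of output
          dv = v * ((1.0 - u) / sigma - alpha - beta)  # employment rises with profits
          du = u * (-gamma + delta * v - alpha)        # wages rise when labor is scarce
          return [dv, du]

      # Integrate for 200 periods from an off-steady-state starting point.
      sol = solve_ivp(goodwin, (0.0, 200.0), [0.90, 0.80], max_step=0.1)
      print(sol.y[:, -5:])  # the system cycles endogenously instead of settling down
      ```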

      Delete
    2. Actually, Steve Keen isn't a professor any more...but I still am! ;-)

      Delete
    3. Anonymous 1:51 AM

      ODEs do not require that all reality be assumed away (common economic practice) to form the equations and get solutions. If that were the case, very few mechanical, electrical, or chemical contraptions would be able to be modeled or developed.

      By assuming equilibrium, dynamics is specifically assumed away. Engineering and science do not have that hang-up. They can do dynamic equations. It is required there, unlike in much of economics.

      Smith does not seem to be willing or able to help you much in this.

      Delete
    4. Anonymous 2:06 AM

      Sensei, he still has students, he still teaches, he still writes. He still is a professor. Hai, O'Smithsan?

      He still learns.

      By the way, congratulations on your coming around on DSGE, Sensei.

      Delete
    5. Anonymous 2:48 AM

      Professor Smith,

      You may be glad to know that the title of professor never expires. : )

      From the students' point of view that is certainly so.

      Delete
    6. 2:06 AM Anonymous, congratulations...you just made my head explode with Derp. That is not an easy feat. My hat off to you, sir. And my head too.

      Delete
    7. @Anon 1:32AM, 1:51AM

      Look for example at the Fernandez-Villaverde et al. paper that Noah links to. You don't have to read it, just skim through the math. Notice the "t" subscripts by each variable. Prepare to have your mind blown: "t" stands for time. The solution to their model is a system of stochastic difference equations (difference, not differential, because they use discrete time) that can be simulated, analyzed for unconditional or conditional moments, impulse responses, forecasts, etc. The same holds for almost any DSGE model (another mind-blowing revelation: "D" stands for dynamic).

      The idea that economists don't model dynamics because they're afraid of ODEs or whatever is just plain false (ODEs themselves are routinely used, e.g. in economic growth models) and stems from confusion about the word "equilibrium", which has a different meaning in economics than in the natural sciences. For outsiders to fall victim to such confusion is forgivable, but that it's spread by people like Keen, who should know better, is sad.
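      To make that concrete, here is a minimal sketch in Python of what "simulating" such a linear solution involves; the matrices are made up for illustration and do not come from any particular paper:

      ```python
      import numpy as np

      # A linearized DSGE-style solution takes the form of a linear stochastic
      # difference equation x_{t+1} = A x_t + B e_{t+1}. Values are made up.
      A = np.array([[0.9, 0.1],
                    [0.0, 0.7]])  # transition matrix, eigenvalues inside unit circle
      B = np.array([[1.0],
                    [0.5]])       # loadings on a single i.i.d. shock

      def impulse_response(horizon=20):
          """Path of the state after a one-time unit shock at t = 0."""
          x = B.flatten()
          path = [x.copy()]
          for _ in range(horizon):
              x = A @ x
              path.append(x.copy())
          return np.array(path)

      def simulate(T=200, seed=0):
          """Stochastic simulation, from which moments and forecasts follow."""
          rng = np.random.default_rng(seed)
          x = np.zeros(2)
          out = np.empty((T, 2))
          for t in range(T):
              x = A @ x + B @ rng.standard_normal(1)
              out[t] = x
          return out

      print(impulse_response()[:5])  # impulse responses
      print(simulate().std(axis=0))  # unconditional moments by simulation
      ```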

      Delete
    8. Anonymous 9:35 AM

      I seem to remember studying ODEs in my Masters course in the mid 1990s, both in the macro subject and the math Econ subject. This is in Australia, though not at a university Prof Keen has studied at or worked at. In fact, many of the papers we studied, of things like two-state-variable control problems, were from the 1970s. And of course Malliaris' book using SDEs is from 1982!!

      Prof Keen likes to think he is the only person with a PhD in Econ who has thought of using ODEs. But that is false and an insult to many fine Australian macroeconomists like John Pitchford. Perhaps if Prof Keen had done some undergraduate or Masters level economics he would know this.

      Delete
  2. I have some training in macro but nowhere near what's needed to be taken seriously.

    So I am free to propose a more radical solution: Dump DSGE. Indeed, we need to dump a lot of the macro stuff we think we know. I disagree that micro-economic studies need to be carried out. They'd be welcome but I do not think they're necessary.

    And, for what it's worth, the comparison with weather forecasts is, imho, a bit misleading in the sense that Macro cares about high level stuff - the climate rather than the weather, if you will.

    I DO agree and believe that we need models to be SIMPLE. IMHO, the world is not 'infinitely complex' at the higher levels. At the micro level, sure. But even if the world is not a machine, I suspect that, given that Macro focuses on only a few variables (output, output growth, unemployment, inflation), complex models are a hindrance.

    http://theredbanker.blogspot.com/2013/03/inflation-or-deflation-refreshing-macro.html

    ReplyDelete
    Replies
    1. Anonymous 11:24 AM

      Biggest difference between weather forecasts and economic forecasts (whether using DSGE or not) is that economic forecasts can and will influence future economic outcomes, i.e. alter the probability of the next economic crisis or boom. On the other hand, weather forecasts, which are equally complex given Mother Nature's hot temper, seem to have very little impact on the likelihood of the next natural disasters. Well, maybe in 1000 years, once humans take their own climate change forecasts seriously enough.
      Models have pros and cons. Bashing one or the other endlessly doesn't help economics as a whole, especially when professors disagree and show it off publicly in an uncompromising way. Linearization will give way to nonlinear solutions. Local maximization will be improved when global solutions become more readily available with new computing power. Economics will move forward with certain degrees of collaboration.
      We are in a profession where our subject of study, the economy, won't stop evolving and can't be isolated in a lab-style physics study. If anyone could tell the economic future like palm reading, that would be the end of economics.

      Delete
  3. Noah,

    Very interesting as always.

    But I always get a little bit disappointed reading about these papers without seeing any reference to my own - http://goo.gl/zoawc - that addresses these issues. In particular, I study a model that is (1) solved nonlinearly, (2) has a realistic labour market (Mortensen/Pissarides instead of an intensive-margin labour/leisure choice), (3) has a unique equilibrium, and (4) does include investment (but not productive capital; investment is instead equity that sets up new firms and creates new jobs). Apart from showing a lot of propagation even from quite small shocks, fiscal policy turns out to be extremely effective. The marginal multiplier peaks at around 3, and fiscal policy is almost always welfare-increasing (Pareto improving).

    Sorry for taking up space here advertising my own work. Feel free to delete my comment.

    Pontus Rendahl

    ReplyDelete
    Replies
    1. Just downloaded your paper.

      And, sure, it goes toward my own biases so I like its abstract. However, I have questions. Do you see any issues with large deficits and/or money printing?

      Also and somehow more fundamentally - how does government spending turn the mood from 'pessimist' to 'optimist'?

      Delete
  4. There is a rift in economics between 'analytical' models, which are indeed increasingly ad hoc, and the models used to estimate the (macro) economy. Economists actually do have macro-models which map the entire economy in a non-ad-hoc, cogent, consistent, coherent and well defined way: the flow of funds and the national accounts. These are about net streams of money, and therefore not about micro behaviour; they are well defined; they impose empirical as well as conceptual and definitional discipline upon the scientist; they show stocks and flows of financial variables as well as capital, whatever. Summarizing: scientific models. Some links:

    http://peemconference2013.worldeconomicsassociation.org/?paper=meaning-and-measurement-of-national-accounts-statistics

    http://p.feedblitz.com/t2.asp?/332386/4534930/0/www_paecon_net/PAEReview/issue63/knibbe63_pdf

    http://peemconference2013.worldeconomicsassociation.org/?paper=monies-debt-and-policy-the-concept-of-endogenous-money-as-a-basis-for-household-and-non-financial-companies-instead-of-bank-centered-monetary-statistics

    ReplyDelete
    Replies
    1. Okay. Now make predictions based on those.

      There are 3 or 4 items I think most Macro guys are interested in:

      1- Output/output growth.
      2- Unemployment
      3- Inflation
      4- Exchange Rate/Sustainability of our trade balance.

      Delete
    2. I send you back to 1992, to Wynne Godley, who used the accounting identities, flows of money and the like to predict the Euro disaster. http://www.lrb.co.uk/v14/n19/wynne-godley/maastricht-and-all-that

      Delete
  5. Jorge Bielsa 6:05 AM

    As a teacher of Advanced Macroeconomics to undergraduates, I have to tip my hat and thank you for your clear-cut post. Thank you very much, Noah; you clarified some convoluted things for me.
    By the way, the approach you propose, with the weather-forecasting example included, seems to me very similar to the one Steve Keen tries to explain and develop. Am I wrong?

    ReplyDelete
  6. Noah, what is your measure of 'success' in macroeconomics? Is it an AER publication or being an input to monetary policy? I think my views of macro used to be similar to yours. I thought log-linearized models in Miles' class were great fun but too black-boxish for my tastes. I had this notion that macroeconomists thought big general equilibrium thoughts and knew stuff. This caused me some concern when I was hired in a Macro Analysis group, since my research was all on micro foundations (preferences) and I certainly did not feel like I knew much of anything during the crisis. My views of what macro is, how much we 'know' and how it feeds into the policy process have changed a lot over the last five years. I trust results when they come from a myriad of methodological approaches (let a thousand models bloom!) and they show up in the summary statistics as well as the full-blown nonlinear bells-and-whistles computational model. I know you say models are not falsifiable in macro, but if models aren't being used by policy makers (or only used with caution) doesn't that say something? DSGE is not an end point in the discovery process, just another spoke in the wheel.

    ReplyDelete
    Replies
    1. Sorry I don't want to give the impression I am hounding you across mediums but maybe it will be easier to have a conversation here than on twitter?

      You say: "DSGE is not an end point in the discovery process just another spoke in the wheel".

      And that would be true... if it didn't become an end in and of itself for career Macro people. Who's going to take the career risk of disrupting the established dogma? Sure, every science works like that, and scientific discoveries are still made, progress still occurs.

      But the same could have been said of The Church and religion. And there dogma is a lot lot stickier than in sciences.

      I suspect that, with its difficulty in reaching falsifiable predictions, macroeconomic science might be dangerously close to religion, and thus progress that much more difficult to achieve if one paradigm dominates and entrenches itself.

      Delete
    2. Frederic, DSGE models were a part of my education as a macroeconomist, but they do not define my career as one. It's a toolkit and not the only one. Of course, a modeler should know the point at which his or her model breaks down or veers off into fantasy land. But just because you can break a model does not mean it needs to be thrown away.

      I think the whole macro dogma thing is overblown. Yes, some individuals are very excited about their pet ideas (what dedicated researcher isn't?) but taken together there's a lot of intellectual variety. Macro has its challenges, but I see no reason to doubt the potential for progress.

      Delete
  7. Anonymous7:57 AM

    Great post! I'd only add that the linearised New-Keynesian model doesn't give you determinacy either--see this John Cochrane article http://faculty.chicagobooth.edu/john.cochrane/research/papers/cochrane_taylor_rule_JPE_660817.pdf

    ReplyDelete
  8. I'm quite surprised how little support agent-based modeling has found among mainstream economists in America (Europe, surprisingly, is taking the lead with this one...)

    Larry Summers – understandably disheartened with DSGE – just proposed several other ideas that are equilibrium based (fragile equilibria etc.)

    EURACE seems promising. I would love a big Federal Reserve research project studying the extent to which ABM can help economists map shocks.

    ReplyDelete
  9. I have written before about the Braun paper, and how it overturns important results. But do not throw out the baby with the bath water. Non-linearity does not always matter. And there are plenty of non-linear DSGE models out there. Abandoning a whole strand of literature for its use of linearization, which is not essential to it, is silly.

    ReplyDelete
    Replies
    1. I'm suggesting massively changing the approach, so that we don't just keep making a million billion different DSGEs, but instead focus most of our effort on getting the microfoundations right, because accurately characterizing even a narrow class of DSGEs in their full nonlinear form is a monumental undertaking.

      Delete
  10. Christiano & Eichenbaum have a response [1], where they claim that additional equilibria found by Braun et al. are not stable under learning dynamics and "so those equilibria may perhaps be treated as mathematical curiosities." I haven't studied this in detail, so I don't really have an opinion whether they're right, just thought it might be of interest.

    [1] http://faculty.wcas.northwestern.edu/~lchrist/research/Zero_Bound/manuscript.pdf

    ReplyDelete
  11. This is what happens when macroeconomists take an awful pun ("economists do it with models") literally, and start believing that if they create enough models, one of them will animate and start having sex with them.

    ReplyDelete
  12. This post is right on.

    A next logical step might be to ask how much earlier approaches to macro avoided this kind of epistemic closure. How do we feel about Robert Gordon's proposal for a revival of the economics of 1978?

    ReplyDelete
    Replies
    1. Thanks for the link!!

      Delete
    2. That's a fascinating paper.

      However, if you drop the absolute income hypothesis, the long-run Phillips Curve, have a price-stickiness theory of the transmission from changes in nominal expenditure to changes in real variables, have a preference for monetary policy over fiscal policy, and lack a basis for liquidity traps, then why call it Keynesianism?

      And isn't all that not so different from Irving Fisher's model of the economy? Or indeed David Hume's conception?

      Here's a hypothesis: economists keep on wandering off from a basic model of how the economy as a whole works (QTM + price stickiness) and reality occasionally smacks their faces hard enough to convince them to find some updated version of that basic model. So 1978 macro was actually part of a very good yet modest tradition in economics that goes back centuries- to some extent, even to Copernicus.

      Delete
  13. I have recently worked my way, to some extent, through an old, short-run, deterministic, non-linear Keynes-like model: http://robertvienneau.blogspot.com/search/label/Kaldor%20Business%20Cycle%20Model. I was surprised at the diversity of results among parameter ranges. As I understand it, you can get such diversity in mainstream, overlapping generations models. You might find Barkley Rosser's book of interest.

    I find it curious that it takes the global financial crisis for these sorts of results to be noticed. Shouldn't these possibilities have begun to be explored decades ago if economists were serious about rigorous mathematics? And, if an equilibrium point in a non-linear model is unstable in some parameter range, it might still be of interest. There could be non-equilibrium cycles around such a point.

    ReplyDelete
  14. Makes me wonder: what would we use to build these simulations. Something like Minsky?

    ReplyDelete
    Replies
    1. From what I've seen, Minsky is just "physics envy" taken to absurd levels...

      http://noahpinionblog.blogspot.com/2013/02/is-business-cycle-cycle.html

      Keen is talking about aperiodic cycles and deterministic chaotic systems, but as a physics undergrad I saw what those systems look like, and they're not pretty...even when based on good microfoundations, which Keen's models are manifestly not.

      I think there is really no hope of making Keen-type models that have any amount of predictive power.

      Delete
    2. Why do you think Keen's models are based on bad microfoundations?

      I mean, you yourself have been talking today about founding your assumptions purely in empirical data, which is precisely what Keen is trying to do.

      I think it's quite closed-minded to say "There is no hope for Keen style deterministic chaotic models!" until a lot of people have tried to do it. I mean, how do you know that we wouldn't have had more success if modern post-1970s econ had gone down the Keen/Minsky route rather than the Taylor/Lucas/Mankiw route? At least Keen's assumptions are trying to be reality-founded, unlike DSGE, which just spews out totally unfounded (but mathematically prettier) stuff like rational expectations, log linearisation and general equilibrium.

      Delete
    3. Why do you think Keen's models are based on bad microfoundations?

      They're not based on any microfoundations, IIRC. I think Keen doesn't like the idea of microfoundations.

      I think it's quite closed-minded to say "There is no hope for Keen style deterministic chaotic models!" until a lot of people have tried to do it.

      What I mean is, on Twitter Keen was talking about deterministic aperiodic chaotic models. Chaos pretty much ensures you don't have forecasting power.

      At least Keen's assumptions trying to be reality-founded unlike DSGE

      Actually I think they're about equally as good (which is to say, not incredibly).

      Delete
    4. Here's a statement that I think holds generally true:

      It is not necessarily the case that some human, somewhere, understands a given phenomenon.

      Once one accepts this, one's mind is open to the possibility that no one has any answers yet.

      Delete
    5. bjdubbs 3:26 PM

      It seems to me what Keen or Minsky does is include finance in the model, which seems like a big omission in the traditional models (if it is indeed omitted). That seems as important as microfoundations.

      By the way, great post. I may actually have to learn some real economics, as much as I'd like to dismiss it without understanding it.

      Delete
    6. If you want to learn what DSGE is about, start here!

      http://highered.mcgraw-hill.com/sites/dl/free/0073511374/695291/Sample_Chapter.pdf

      Delete
    7. Once one accepts this, one's mind is open to the possibility that no one has any answers yet.

      I think the advancement of knowledge is a slow process. That is, we learn mostly by trying stuff out and learning from our failures. True, sometimes there will be a quantum leap fuelled by genius, but most of the time we're building things on top of others' successes and failures.

      Keen's software is another attempt to deal with modelling the macroeconomy from a different angle. I don't think that anyone (especially not him) would try to say that it was "the answer". It is probably a useful experiment that we will learn from, and move onto another useful experiment, and another one, etc, learning from our mistakes.

      So what lessons is Keen applying from DSGE? Simple — don't start off with microfoundations that can be shown empirically to be unrealistic like clearing markets and general equilibrium, rational optimisation (at least as defined by Samuelson), and methodological individualism in dealing with dynamic large-scale systems. You could say that this is a non-microfounded approach, but maybe it's microfounded via negativa (better to have no specific micro-assumptions than a load of wrong micro-assumptions)? The appropriate microfoundations for cellular biology are not generally taken to be quantum-level phenomena, for example. If the neoclassical assumptions are unrealistic, then we need to start again anyway, and do macroeconomists have time to wait for a new microeconomics to emerge? No — did Maxwell or Newton have to wait for quantum physics to give useful approximations of macro-level phenomena? So the right approach right now appears to be — get rid of dodgy micro-assumptions, observe phenomena, try to reproduce phenomena in a macro model.

      Your big problem appears to be the use of accounting identities. But accounting identities are just expressions. Isn't objecting to the use of accounting identities like a quantum physicist objecting to the use of an identity like "temperature" in dealing with macro-level (i.e. room level) phenomena? Temperature feels like a pretty useful heuristic on room level, just as Keen's definition of aggregate demand may be a pretty useful heuristic on a macroeconomic level.

      The absolute worst that can happen from Keen's approach is that we learn that his approach is inappropriate, and learn some steps toward a better macro approach.

      Delete
    8. See, I think a lot of people fall into the trap of thinking "Neoclassical economists haven't got it, and Keen spends all day dissing neoclassicals, so Keen must have it." But that's just wrong. It's a mental trap.

      Your big problem appears to be the use of accounting identities.

      No. Accounting identities are accounting identities. They're definitions. So we have to use them, unless we want to split up quantities differently and write down different, equivalent identities.

      The absolute worst that can happen from Keen's approach is that we learn that his approach is inappropriate, and learn some steps toward a better macro approach.

      Yep.

      It's still going to yield no results, though. ;-)

      Delete
    9. See, I think a lot of people fall into the trap of thinking "Neoclassical economists haven't got it, and Keen spends all day dissing neoclassicals, so Keen must have it." But that's just wrong. It's a mental trap.

      I've been more impressed so far with Keen's ability to blast holes in neoclassicalism than I have with his ability to produce a workable alternative that makes systematic testable predictions, sure. But pretty much every great contributor to economics in the last century has started off as a critic of orthodoxies.

      I'm far more attracted to trying to model my ideas using Keen's framework than I have ever been with general equilibrium. I want a model where business cycles, unclearing markets and irrational expectations can be the baseline. I don't care if I have to start off with periodic cycles. Yes, that might look silly to begin with. But I think Minsky was right that private debt is the primary driver of the true business cycle, and using that parameter and the cost of private debt service as a percentage of GDP we can derive non-time-sensitive cycles and call bubbles in housing and stocks (etc.) with a decent degree of confidence.

      Delete
    10. Chaos pretty much ensures you don't have forecasting power.

      This is simply wrong, for the same reason it's wrong when climate deniers say it. A chaotic model can't tell you when an endogenous shock will occur, but you can use such a model to say what makes them more or less likely, what they look like, what makes them shorter/longer/more or less severe, and so on. They can have considerable explanatory value.

      But the bigger problem with your attitude here is the way it constrains you scientifically. No chaos = no endogenous shocks. But the real economy may have them, and it may be worthwhile to be able to model them. Phenomena like bubbles that then pop are better explained as endogenous shocks than exogenous. We should have models demonstrating how that can happen and what the result is. But you assume these things can't happen because the math would be too hard anyway if they could. And worse is the way in which you assume it: you can't just build "no chaotic endogenous shocks" into the assumptions of your model. Instead, you come up with a set of assumptions, and if they result in chaos, you change your assumptions until that problem goes away. What you have in the end is a model where "no endogenous shocks" is the result, not the assumption - but you defined the problem as a search for a model with that property, rather than attempting to find out what properties a model based on reasonable assumptions would have.

      Delete
  15. Anonymous 11:42 AM

    It all sounds rather chaotic to me... and I mean that in a "field of study" way. To test whether DSGE models are suffering from chaotic effects of how computer systems generate particular numbers or approximate different mathematical functions during their simulation runs, try this: run the same simulation multiple times on one computer and compare the results. Then run the same simulation on a different computer, but with the same operating system - perhaps an AMD vs. Intel chipset if you are using desktop workstations. Then run on an entirely different OS. Let me know what happens. MJF, Seattle WA

    ReplyDelete
    Replies
    1. Provided IEEE floating point is properly implemented, chipset and OS should make no difference. If you want to do an ensemble run and variability comes from randomness, do the runs with a different random number seed. If the variability comes from chaos, you perturb the initial conditions slightly on each run, say by up to .1%, and compare results.
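      Here is a minimal sketch of that perturbation test in Python, using the chaotic logistic map as a stand-in system (it is not a DSGE model, just an illustration of the method):

      ```python
      import numpy as np

      # Minimal sketch of the perturbation test described above. The logistic
      # map with r = 4 is a standard chaotic toy system, used here as a stand-in.
      def logistic_path(x0, r=4.0, steps=50):
          xs = [x0]
          for _ in range(steps):
              xs.append(r * xs[-1] * (1.0 - xs[-1]))
          return np.array(xs)

      base = logistic_path(0.4)
      perturbed = logistic_path(0.4 * 1.001)  # initial condition nudged by 0.1%

      # In a chaotic regime the two paths diverge quickly; with well-behaved
      # dynamics they would stay close over the whole horizon.
      print(np.abs(base - perturbed)[::10])
      ```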

      Delete
    2. Anonymous 2:10 PM

      Danged NIST messes up another perfectly, almost, anyways, rational explanation. MJF

      Delete
  16. This comment has been removed by the author.

    ReplyDelete
    Replies
    1. @Dan:

      Give us the budget of particle physicists and unlimited dictatorial power to run macroeconomic experiments, and we'll have answers for you in no time.

      Alternatively, if you think it's easy to build economic theories with similar level of precision as those of physics, feel free to do so yourself. Should you succeed, a Nobel prize and eternal glory will await you. Good luck, and let us know how it goes.

      Delete
  17. A commenter at Macromania pointed out that the Braun paper solves the model and then applies the ZLB. Hence, it is unsurprising that it finds a different result than the log-linearized model, which applies the ZLB after linearizing but before solving for a solution. That is, the Braun paper doesn't truly have a zero lower bound.

    More generally, I think there is a bit of misrepresentation here. Take, for example, the sunspot paper by Mertens and Ravn. The result is driven by the specific parameter values, not by the fact that they use non-linear techniques (though it is true you can't solve the model for those parameter values with linear techniques). Intuitively, if the demand curve at the zero lower bound is less steep than the supply curve, then you have multiple equilibria, both of which resemble the comparative statics of a full-employment economy. If the demand curve is steeper than supply at the ZLB, then you get the standard New Keynesian result. The relative steepness depends on the persistence of the preference shock (demand curve) and the degree of price stickiness (supply curve). Linearization doesn't really have much to do with the question.

    ReplyDelete
  18. If you want a data driven approach, the POS databases of Walmart, Target, VISA etc. would be a place to try to start. Getting the owners of the data to agree to give access would probably be really difficult, but may not be impossible. With that kind of data it should be possible to track the economy in real time.

    DSGE is just one way to model the economy. If it quickly leads to models that are completely intractable then it may not be the appropriate modelling tool.

    I spent a few years working in computational mathematics. Your article tells me that economists either need (1) a general purpose solver that can computationally solve DSGE equations (and display the solutions visually) or (2) to find a new mathematical paradigm.

    A correct DSGE model and a correct Agent Based model will have some deep equivalence since they are both trying to mathematically solve models of the same underlying system.

    ReplyDelete
  19. Noah,

    as usual, and contrary to Mr. Miyagi's advice (in Karate Kid I), I take the middle path. As I have commented before on this blog and elsewhere, any policy-maker who makes policy decisions based on a single (or a single class of) DSGE model(s) is crazy! This is something we agree on. Where I disagree is the idea that we need to give up DSGE modelling and with your emphasis on empiricism. Here are the reasons:

    1) One important benefit of DSGE models is that they help us think about all the interactions that take place and may influence the effect of a policy. Of course, I do agree that once these models become so complex that it is hard to figure out what is driving the results (e.g. when analytical solutions are not obtainable), one must wonder what purpose they serve.

    2) Having a variety of multiple models with different assumptions does result in multiple estimates, but at least limits these estimates within a range. Yes, for policy-makers it makes a big difference if the government spending multiplier is 0 or 2. But at least you won't find a respectable economist willing to say that it is 5, as would be the case in a simple IS-LM model with a marginal propensity to consume equal to 80%. This is progress.

    3) Empirical studies trying to estimate behavioral parameters often also produce a wide range of estimates. For example, empirical estimates of the elasticity of intertemporal substitution range from 0 to about 2 (my estimates in a paper forthcoming in the Journal of Economics and Finance are around 1.50). Things only become worse if consumers are hyperbolic, in which case the elasticity depends on the persistence of the change in the interest rate, in which case it is really impossible to pin down an exact value.

    In conclusion, and echoing my findings in another comparative study between Latin America and the OECD I recently published, I feel that policy-making can never be a science. It will always require a guess. What we can do is try to make that guess as educated as possible. And for that we need both more theory and more empirical studies through various (but rigorous) approaches.

    ReplyDelete
    Replies
    1. once these models become so complex that it is hard to figure out what is driving the results (e.g. when analytical solutions are not obtainable)

      Auuggghhh. The real world is complex and nonlinear. Trying to impose a simple linear (first order) model on a second order world will lead you to error. (insert classical reference to Procrustes). Properly done numerical solutions of complex models will disclose what is "driving" the results.

      Delete
    2. Properly done numerical solutions of complex models will disclose what is "driving" the results.

      One would hope...

      Delete
    3. Of course, I do agree that once these models become so complex that it is hard to figure out what is driving the results (e.g. when analytical solutions are not obtainable), one must wonder what purpose they serve.

      Exactly. That's what I meant by an unfavorable tractability/realism tradeoff.

      If you want insight, you need to use a simple model like old-school RBC or a basic stripped-down NK. As soon as you get into Christiano-land or Kehoe-land, it becomes a total mess. Not to mention the people who do nonlinear versions of Christiano-type or Kehoe-type models...

      Having a variety of multiple models with different assumptions does result in multiple estimates, but at least limits these estimates within a range. Yes, for policy-makers it makes a big difference if the government spending multiplier is 0 or 2. But at least you won't find a respectable economist willing to say that it is 5, as would be the case in a simple IS-LM model with a marginal propensity to consume equal to 80%. This is progress.

      OK, but what if modelers decide (ex ante) that multipliers of 5 are unrealistic, and constrain their models to give multipliers much less than 5? Then the fact that none of the mainstream models have multipliers of 5 doesn't tell us anything.

      Empirical studies trying to estimate behavioral parameters often also produce a wide range of estimates.

      Yep.

      Things only become worse if consumers are hyperbolic, in which case the elasticity depends on the persistence of the change in the interest rate, in which case it is really impossible to pin down an exact value.

      Yep.

      In conclusion, and echoing my findings in another comparative study between Latin America and the OECD I recently published, I feel that policy-making can never be a science. It will always require a guess. What we can do is try to make that guess as educated as possible. And for that we need both more theory and more empirical studies through various (but rigorous) approaches.

      The only place I disagree is regarding theory. I think bad theories detract from policymakers' judgment. Bad empirics do too, but with empirics, if you do a million studies of the same thing, you'll weed out a lot of the noise. With theory there's no such assurance, so I think what you sometimes get is academic herding into bad theories that hobble policymakers for generations. To illustrate my point: Communism.

      Delete
    4. Properly done numerical solutions of complex models will disclose what is "driving" the results.

      Why would people think this is the case? To me this is just a numerical reiteration of Misesian praxeology. If you're not observing something in reality at both ends — both checking your assumptions against reality, and checking your results against reality — the whole thing is completely meaningless.

      Delete
    5. Aziz - I agree completely that you have to test your assumptions and conclusions etc against reality.

      What we are talking about here is understanding the sensitivity of the result to different aspects of the model. That becomes part of the test against reality. If the model starts showing sensitivity (high or low) to an aspect of the model (like the estimate of a parameter) which is at odds with how the real world should work, then even if the actual result seems reasonable, the sensitivity may show that there is a flaw at the core of the model.
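      As a minimal sketch of that kind of check (the "model" below is a toy stand-in function, not an actual DSGE model):

      ```python
      # Finite-difference sensitivity check on a toy model.
      def model(rho):
          """Toy outcome: long-run variance of an AR(1) with persistence rho."""
          return 1.0 / (1.0 - rho**2)

      def sensitivity(f, p, eps=1e-4):
          """Elasticity of the model output with respect to the parameter p."""
          base = f(p)
          bumped = f(p * (1 + eps))
          return (bumped - base) / base / eps

      for rho in (0.5, 0.9, 0.99):
          print(rho, sensitivity(model, rho))
      ```

      Here the toy output becomes hypersensitive as the persistence parameter approaches 1 - exactly the sort of pattern one would want to compare against real-world evidence.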

      Delete
    6. Ah, I see. I suppose this could be one way of discriminating between (the very high number of) models whose outputs are consistent with reality. Like a meta-test.

      On the other hand, you'd need pretty good real-world evidence of non-sensitivity in whatever parameter we're talking about for this to be useful. As a follower of Hyman Minsky, my implicit model of the economy becomes hypersensitive to changes in private debt levels once private debt has climbed to a level where debt-service-costs start eating into income levels.

      Delete
    7. my implicit model of the economy becomes hypersensitive to ..

      Leave aside for now the meaning of "hypersensitive". If you did a DSGE model and it did not show "hypersensitivity" to debt service / income then either your implicit or explicit model must be wrong.

      Delete
  20. "OK, but what if modelers decide (ex ante) that multipliers of 5 are unrealistic, and constrain their models to give multipliers much less than 5?"

    Well I am not a big believer in conspiracy theories. If economists are willing to argue so bitterly about whether the multiplier is 0 or 2, why wouldn't they argue as to whether it is 5?

    "but with empirics, if you do a million studies of the same thing, you'll weed out a lot of the noise."

    You mean, like the effect of an increase in the min. wage on employment? :-P

    "what you sometimes get is academic herding into bad theories that hobble policymakers for generations. To illustrate my point: Communism"

    Yes, but the question is whether these theories persist despite the presence of convincing evidence to the contrary. I mean, many otherwise educated people choose to believe in Astrology or New Age. What can I tell you? Don't get me wrong, you know that I believe in empirical work; I mean, that's what I do. But my work is guided by some theory and its predictions. I don't collect data and start fishing for correlations. Doing so would be bad science!

    ReplyDelete
    Replies
    1. Well I am not a big believer in conspiracy theories.

      Groupthink is different than conspiracy theories.

      You mean, like the effect of an increase in the min. wage on employment? :-P

      Sure.

      I don't collect data and start fishing for correlations. Doing so would be bad science!

      Not necessarily. That's how Kepler did it.

      Some empirical work is about testing theories...other empirical work is about generating theories. Theories (usually) don't spring fully formed out of the minds of idiosyncratic supergeniuses. They come from looking at the world (i.e. data) and trying to observe some regularities (i.e. fishing for correlations).

      Delete
    2. I don't collect data and start fishing for correlations. Doing so would be bad science!

      Not necessarily. That's how Kepler did it.

      I was taught that Tycho Brahe collected the data that Kepler used.

      Delete
    3. "Some empirical work is about testing theories...other empirical work is about generating theories"

      Hmm, well all theories are motivated by empirical observations. Documenting the facts is surely important. Many economic historians do just that. But recording and analyzing the data are two different things. Without a detailed theory there is no way of knowing whether a correlation means something or is sheer nonsense.

      "Sure"

      Sure? That's all you have to say Mr. Smith? Where is the weeding? What I saw in the recent debate is people cherry-picking the empirical studies so as to validate their priors.

      Absalon,

      I forgot to thank you for posting that link to the "financial planning" picture in a previous post. It was, well, enlightening!

      Delete
    4. Without a detailed theory there is no way of knowing whether a correlation means something or is sheer nonsense.

      But I think that the Prescott idea of "theory ahead of measurement" leads to slow progress and progress in the wrong direction (requiring large amounts of backtracking)...

      Sure? That's all you have to say Mr. Smith? Where is the weeding? What I saw in the recent debate is people cherry-picking the empirical studies so as to validate their priors.

      Where? Blogs? News articles? Those are the people who still think global warming is a myth, Costas...they're not going to be convinced of anything by any amount of evidence...

      Delete
  21. Very nice post, Noah. Thanks.

    About Bob Solow, it seems to me he's also recommending a return to the type of models used by James Tobin and others. Is there any merit there?

    ReplyDelete
  22. In other words:

    A) DSGE is not merely nonsense, it is egregious nonsense.
    B) To be taken seriously as a macroeconomist, you have to profess that DSGE modeling trumps reality. [I hope this is not too great a distortion of what you are saying near the end.]
    C) I'll leave completing the syllogism as an exercise for the interested reader.

    Cheers!
    JzB

    ReplyDelete
  23. I don't think they are as bad at forecasting future economic data as you think they are; check out this recent paper.

    http://www.newyorkfed.org/research/staff_reports/sr554.html

    ReplyDelete
    Replies
    1. I've read that, and also this:

      http://www.federalreserve.gov/pubs/feds/2011/201111/201111pap.pdf

      The problem is, blue chip forecasts also have very poor forecasting power.

      Basically, I'm pretty sure most or all of the Smets-Wouters model forecasting power comes from the AR(1) structure of the shock processes...basically, the (correct) assumption that most things in the economy have a little bit of momentum for two or three quarters.
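      For example, a minimal sketch of that point (rho = 0.8 is a made-up illustrative value, not an estimate from Smets-Wouters):

      ```python
      # If a series follows x_{t+1} = rho * x_t + e_{t+1}, the best h-step
      # forecast is just rho**h * x_t, so almost all of the forecastability
      # dies out within a few quarters.
      rho = 0.8
      x_today = 1.0  # current deviation from trend

      for h in range(1, 9):  # horizon in quarters
          print(f"quarter {h}: forecast = {rho**h * x_today:.3f}")
      ```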

      Also, realize that there's a HUGE model selection problem here...out of the hundreds of available DSGEs, one beats blue chip forecasts?

      Delete
  24. Anonymous 12:53 AM

    "swamps of DSGE"

    So, what took you so long?

    It shouldn't have taken you more than a month, given what you said your applied math background was, which should be superior to the average economics course.

    I said before that the name gives it away. It's at least a triple oxymoron.

    Now, you shouldn't have tried to give Professor Keen a hard time.

    I could help you understand things better by appealing to your background, but I have decided not to because of your treatment of others.

    ReplyDelete
  25. Anonymous 8:11 AM

    How are we to choose from the near-infinite menu of very similar models, when small changes in the (obviously unrealistic) assumptions of the models will probably lead to vastly different conclusions?

    I would advise that you follow your own assumptions, and choose the model that maximises your personal utility, that is to say, your paycheck, prestige, and influence. Just compute the results of all models (not difficult, since you have assumed omniscient agents and you are an agent), and choose the one that yields the results that will please your employer the most (so that you get a bonus and maybe a promotion).
    After all, you have already assumed the scientific method away with utilitarianism, so why stop there?

    ReplyDelete
  26. Noah,

    Professor Waki (one of the coauthors of the first paper you mentioned) runs most of the Advanced Macro stuff at my uni. I asked him about the log-linearised solution a few weeks ago, and he was hesitant to say much about the result. He thought that the fiscal multipliers estimated by some of the NK models were overestimated, but only for that particular brand of peanut butter, if you get my gist. To be fair, our conversation was brief.

    I'll ask him to take a look at this post when I see him next, hopefully this week. He is a really nice bloke.

    As for your more general comments on DSGE... I take some boring middle road, although I was intrigued by Stephen Gordon's post at Worthwhile. I may try to dive into his paper sometime.

    Enjoying your blogging!

    Cheers.

    ReplyDelete
    Replies
    1. Thanks, Ben!

      If you have a chance, tell Dr. Waki that I appreciate his work. :-)

      Delete
  27. Anonymous 2:57 PM

    What textbook is the link to? There aren't any headers other than the chapter heading.

    ReplyDelete
  28. So, look at Gallegati et al.-type models or Hommes-type models. These generally have behavioral microfoundations for ABMs that can involve Minsky-type dynamics. Looks better to me than either DSGE or some of the other alternatives.

    ReplyDelete
  29. Anonymous 5:14 PM

    Noah, what is the Chapter 7 you pointed us at, Chapter 7 OF?

    ReplyDelete
    Replies
    1. Advanced Macroeconomics, by David Romer.

      http://www.amazon.com/Advanced-Macroeconomics-Mcgraw-Hill-Economics-David/dp/0073511374

      Delete
  30. Noah,

    As you point out, the level of complexity involved here is close to that of weather-type systems. However, reading the Braun et al. paper there is very little reference to how one would solve the numerical system. I find this the most shocking part of dealing with the economics literature. When solving complicated dynamic systems one cannot simply invoke solve in Matlab or hope that an off-the-shelf optimizer will work at all.

    I strongly believe the path DSGE needs to follow is as you specified: computational and empirical. However, if this is to be the case, economists need to take the numerics seriously. To me this means that if you're presenting simulation-based results then you should include a link to the source code you use and an appendix on why you use it.

    Rant over.
    Brett

    ReplyDelete
    Replies
    1. As you point out, the level of complexity involved here is close to that of weather-type systems. However, reading the Braun et al. paper there is very little reference to how one would solve the numerical system.

      The reason for this, as I see it, is precisely that there are way too many models. No one is particularly interested in intensive solutions of a single model, since hundreds of these get put out every year.

      Delete
    2. "No one is particularly interested in intensive solutions of a single model,"

      So what the economics profession needs is a numerical analyst to write a general purpose solver for a broad class of DSGE models.
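
      Something like this sketch, maybe (everything in it is invented for illustration; the test model, with log utility, Cobb-Douglas production and full depreciation, is chosen only because its policy function is known in closed form, so the numerics can be checked):

      import numpy as np
      from scipy.optimize import brentq

      alpha, beta = 0.33, 0.96
      f = lambda k: k**alpha                    # production, full depreciation
      fp = lambda k: alpha * k**(alpha - 1.0)   # f'(k)
      up = lambda c: 1.0 / c                    # u'(c) for log utility
      kgrid = np.linspace(0.05, 0.5, 60)        # capital grid (made up)

      def time_iterate(c_policy, tol=1e-8, max_iter=1000):
          # iterate on the Euler equation u'(c) = beta * u'(c'(k')) * f'(k'),
          # where k' = f(k) - c, until the policy function stops moving
          for _ in range(max_iter):
              interp = lambda k, cp=c_policy: np.interp(k, kgrid, cp)
              c_new = np.empty_like(c_policy)
              for i, k in enumerate(kgrid):
                  euler = lambda c: up(c) - beta * up(interp(f(k) - c)) * fp(f(k) - c)
                  c_new[i] = brentq(euler, 1e-10, f(k) - 1e-10)
              if np.max(np.abs(c_new - c_policy)) < tol:
                  return c_new
              c_policy = c_new
          raise RuntimeError("time iteration did not converge")

      c_sol = time_iterate(0.5 * f(kgrid))            # crude initial guess
      c_true = (1.0 - alpha * beta) * kgrid**alpha    # known closed form
      print("max abs error:", np.max(np.abs(c_sol - c_true)))

      The user would supply only the Euler residual; the solver would own the grid, the bracketing, and the convergence test, and would be documented and tested like any other numerical library.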

      Delete
  31. Anonymous6:51 AM

    Dear Noah,
    I think that most of this hype against log-linearized DSGE is completely misplaced. These people don't understand the basic principle of Occam's Razor, which means model parsimony. What will we learn by looking at a nonlinear model which has multiple equilibria unless we solve it in a tractable form? Each model is a parable which provides some insights about the real world, and there is nothing wrong with having an infinite number of models, just like seeing an infinite number of paperback fictions in a bookstore. Why not let the reader decide which parable is the best?

    Parantap Basu

    Professor of Economics
    Durham University
    UK

    ReplyDelete
    Replies
    1. What will we learn by looking at a nonlinear model which has multiple equilibria ...

      First and foremost you will learn that your model and assumptions lead to multiple equilibria - an important insight which you want to throw away so that there is only one solution.

      Your approach is like the joke about the guy looking for his car keys under the street lamp because the light is better there ...

      If you want a linear problem - formulate a linear model. If you can't formulate a linear model then don't expect anything sensible to come from a perturbation analysis of a nonlinear model.
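
      To see how much the linearization assumes away, here is a toy example in the spirit of the Benhabib-Schmitt-Grohe-Uribe "perils of Taylor rules" papers (not the model from either paper above, and the parameter values are invented): a Taylor rule truncated at the zero lower bound, combined with the steady-state Fisher relation, has TWO steady states, and log-linearizing around the intended one discards the liquidity-trap one by construction.

      from scipy.optimize import brentq

      beta, pi_star, phi = 0.99, 1.02, 1.5    # invented parameter values

      # gross nominal rate: Taylor rule truncated at the zero lower bound
      taylor = lambda pi: max(1.0, (pi_star / beta) * (pi / pi_star)**phi)
      fisher = lambda pi: pi / beta           # steady-state Fisher relation
      gap = lambda pi: taylor(pi) - fisher(pi)

      ss_target = brentq(gap, 1.005, 1.10)    # intended steady state, pi = pi_star
      ss_trap = brentq(gap, 0.90, 1.005)      # deflationary ZLB steady state, pi = beta
      print(ss_target, ss_trap)               # approx. 1.02 and 0.99

      Both are perfectly good rest points of the same nonlinear system; the linearized model knows about exactly one of them.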

      Delete
    2. Anonymous2:37 PM

      Hi, I am not married to linearity. Write a model as you wish, as long as it gives a tractable analytical solution on a piece of paper, not on a computer. I think that the challenge is to understand nonlinearity in terms of an analytical solution. It is likely that such a model will have poor quantitative predictions but powerful qualitative policy implications (an example is OLG models, which can beautifully characterize multiple equilibria). I do not think that a macroeconomist should try to forecast economic outcomes. Even a doctor cannot predict a person's health. Why should economists venture into such a thing?

      Delete
    3. Parantap,

      Thanks for dropping by!

      Actually, you have anticipated another DSGE-related post that I intend to write, which will basically say "the use of DSGE models is to translate vague intuitions about how the world works into policy recommendations".

      However, I still don't think that justifies ignoring nonlinearities. If you write down a parable that embeds your intuition about how the world works, intellectual rigor requires extrapolating the full consequences of that parable, not just the ones that are easiest to extrapolate; otherwise we lose internal consistency.

      Best,
      Noah

      Delete
    4. Write a model as you wish, as long as it gives a tractable analytical solution on a piece of paper, not on a computer.

      You can keep looking under the street light for the keys if you like, but it is unlikely that you are going to find them ...

      Even "analytical" solutions are generally expressed in terms of standard transcendental functions and we need to go to tables or computers to evaluate them.

      Delete
    5. Anonymous5:46 PM

      I welcome you to write transcendental equations as long as they are properly microfounded without violating the discipline. All I am saying is not to lose economics for the sake of computation. Don't lose the moon for the numbers.

      Delete
    6. Yet why must economics be able to fit on a piece of paper?

      Delete
    7. "I do not think that a macroeconomist should try to forecast the economic outcomes. Even a doctor cannot predict human health condition. Why should economists venture into such a thing?"

      A good doctor can diagnose problems with a patient and give good probabilities of survival and of pain suffered under various courses of action. A doctor who cannot do that is fairly worthless. A doctor who is more often wrong is harmful.

      If macroeconomists cannot diagnose and give the likelihood of future problems with an economy, or of the pain suffered if various policies are implemented (or if they get the likelihoods of success and suffering wrong), then of what use is macroeconomics? This isn't theology, after all; macroeconomics _should_ be an applied discipline.

      Delete
    8. Anonymous12:49 PM

      I agree, Dohsan, but that is diagnostics, not forecasting. Can you suggest a doctor who will predict the NECESSARY AND SUFFICIENT conditions for a heart attack? I doubt that such a doctor exists. If none does, how can we expect a macroeconomist to predict a financial crisis?
      A macroeconomist should be doing policy evaluation, NOT forecasting. At least that is what Lucas's "Econometric Policy Evaluation" paper says.

      Delete
    9. A macroeconomist should be doing policy evaluation, NOT forecasting.

      You cannot do policy evaluation without forecasting the consequences (even just in terms of probabilities) of alternative policy choices.

      Delete
    10. Can you suggest a doctor who will predict the NECESSARY AND SUFFICIENT conditions for a heart attack?

      Of course not, because there are no necessary conditions, nor sufficient ones. But they can still tell you that you should exercise more and lower the cholesterol in your diet to reduce your risk of a heart attack. Predicting when exactly a financial crisis will occur is impossible, but there are still risk factors and economics should at least strive to identify them.

      Delete
    11. Agree with Eric.

      Anyway, what's with the hangup on "NECESSARY AND SUFFICIENT"? Physics envy? Math envy? Maybe it's because I was an options trader, but my take is that almost everything nontrivial in economics (and in life, the universe, etc.) is a distribution of probabilities (and correlations, and higher-order terms). Virtually nothing is either necessary OR sufficient.

      Delete
  32. Anonymous9:24 AM

    An alternative school of thought you might be interested in, one stressing nonlinear dynamic systems, can be found here:
    http://www.amazon.com/Financial-Assets-Debt-Liquidity-Crises/dp/1107004934/ref=sr_1_1?s=books&ie=UTF8&qid=1364736174&sr=1-1&keywords=financial+assets%2C+debt+and

    ReplyDelete
  33. I wonder how agent-based models will perform in this regard. I know some sceptics think there would be an infinity of possible models there too, with no clear criteria for selection.

    ReplyDelete
    Replies
    1. Actually, DSGE models are agent-based (that's what it means to have "microfoundations"); they are simply calibrated/estimated instead of simulated. Most people, when they say "agent-based models", mean "simulation".

      And as for whether or not it'll work better, I think the success of either simulation OR estimation/calibration will depend on whether the microfoundations (agent behavior rules) are right.

      Delete
  34. Right, yes, I know DSGE employs representative agents, but that's really quite a different thing from the agents in ABM, and I don't think "simulation" quite captures it either. There are plenty of models that one could call simulations without being ABM. As I understand it, the defining feature is that an ABM consists of autonomous agents for whom you define some behavioural rules, particularly how they encounter and interact with other agents, and you don't know what kind of behaviour is going to emerge in the aggregate until you hit go and see what happens. It's all about emergent behaviour and such like. I think that's quite a distinct idea from DSGE, but it faces two problems: 1. it's hard to know exactly what features of your model are responsible for what you observe, and 2. who knows what the right rules to specify for your agents are. These aren't insurmountable, but that could well result in a swamp of competing models and no clear answers.
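
    To make "hit go and see what happens" concrete, here is about the smallest toy ABM I can write down (the rules are invented; it is nobody's serious model): agents meet in random pairs and re-split their combined wealth at a random ratio. No agent optimizes and no aggregate condition is imposed, yet a stable, highly skewed wealth distribution emerges from a perfectly egalitarian start.

    import random

    random.seed(0)
    N, STEPS = 1000, 100_000
    wealth = [100.0] * N                      # everyone starts identical

    for _ in range(STEPS):
        i, j = random.sample(range(N), 2)     # a random pairwise encounter
        pot = wealth[i] + wealth[j]
        eps = random.random()                 # random division of the pot
        wealth[i], wealth[j] = eps * pot, (1.0 - eps) * pot

    wealth.sort()
    below_mean = sum(w < 100.0 for w in wealth) / N
    print(f"richest: {wealth[-1]:.0f}, poorest: {wealth[0]:.2f}, "
          f"share below the (unchanged) mean: {below_mean:.2f}")

    Nothing in the rules mentions inequality, and you don't know the shape of that distribution until you run it; that, as I understand it, is the emergent-behaviour point.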

    ReplyDelete
    Replies
    1. Right, yes, I know DSGE employs representative agents, but that's really quite a different thing from the agents in ABM

      There are quite a number of DSGE models with agent heterogeneity - not representative agents.

      There are plenty of models that one could call simulations without being ABM.

      Of course, that's true!

      As I understand it, the defining feature is that an ABM consists of autonomous agents for whom you define some behavioural rules, particularly how they encounter and interact with other agents, and you don't know what kind of behaviour is going to emerge in the aggregate until you hit go and see what happens.

      That's also true of a DSGE model with heterogeneous agents and multiple equilibria...except you never get to "see what happens" unless you do some sort of simulation...

      It's all about emergent behaviour and such like. I think that's quite a distinct idea from DSGE

      Yeah. DSGE makes it impossible to see emergent behavior, because of the conditions the models impose to find the equilibrium...

      Delete
    2. "Yeah. DSGE makes it impossible to see emergent behavior, because of the conditions the models impose to find the equilibrium..."

      Lots of DSGE models have what the author might call "emergent behavior". They all get thrown out before a word of LaTeX is typed, because that "emergent behavior" (A) doesn't lead to equilibrium (as Noah points out), but often also (B) makes no sense.

      Delete
    3. Sorry, I am sowing confusion by expressing myself poorly.

      First, my simple point: there is this thing called ABM. It is quite distinct from mainstream methodology in many important respects. Many people think it's a candidate for replacing mainstream economics. Yet I suspect it may suffer from the same morass of competing models as DSGE does.

      I am under the impression that het-agent DSGE is quite different from ABM. One normally works with distributions and summary statistics. You cannot say that agent 3104 encountered agent 9860 at 10.30am and offered an apple in return for an orange, and was declined. Which you can - sort of! - with ABM. Plus, whilst of course you are correct that for many models you don't know what's going to happen until you hit go, there is a distinction here (I think this is what emergent behaviour means): as you allude, you don't do things like impose market-clearing conditions and such like.

      Delete
  35. Yes, but Noah, DSGE models are very RIGOROUS!

    ReplyDelete
  36. Really excellent discussion. One of the main reasons physicists and engineers and economists eschew nonlinear solutions is that "funky stuff happens," i.e., bizarre behavior pops out that is extremely path-dependent and highly sensitive to initial conditions. This makes nonlinear solutions non-generalizable, and physicists and engineers and economists typically want generalized solutions that apply widely to a vast range of initial conditions.

    That's understandable. The problem? Evidence appears to be mounting that the real world is highly nonlinear outside a relatively narrow range of boundary conditions. One example is the thyristor, best modeled as a cubic-parabola Shockley curve of current vs. voltage: when the voltage rises beyond a certain point you get reduced current out, but if the voltage rises high enough you suddenly get a lot more current out. This models pretty well the behavior of bandgap materials like silicon carbide, which actually become highly conductive when subjected to strong voltage spikes. (It's how the surge suppressor in your power strip works: a big voltage spike makes the SiC conductive and dumps the surge current to ground.) Or again, consider the behavior of brain receptors for neurotransmitters: current neuroscience suggests that an increase up to a certain level of a certain type of neurotransmitter produces an excitatory response by the receptors, but increasing the neurotransmitter level beyond that point generates an inhibitory response.
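
    The generic point fits in a few lines (the cubic response curve below is made up, a stand-in rather than a model of any real device or economy): a linearization that is excellent near the operating point can get even the sign of the response wrong once the input leaves that neighborhood.

    response = lambda v: v - v**3 / 3.0   # invented non-monotonic "true" curve
    linearized = lambda v: 1.0 * v        # slope of the true curve at v = 0

    for v in [0.1, 0.5, 1.0, 2.0]:
        print(f"input {v:3.1f}: true {response(v):+.3f}, linearized {linearized(v):+.3f}")

    At 0.1 the two agree to three decimal places; at 2.0 the linear model says +2.000 while the "true" response is -0.667.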

    We see this kind of nonlinearity most spectacularly in the breakdown of game "theory" (so-called) during the initial tests with RAND corporation secretaries in the 1950s. When the secretaries failed to behave as game "theory" predicted and the expected Nash equilibrium failed to materialize, the RAND economists merely dismissed the behavior of real people in the real world with hand-waving and doubletalk (i.e., the RAND scientists claimed that the behavior of real people would be different "if the rewards were larger" -- without providing a scintilla of evidence to support that baseless speculation). When game "theory" goes out the window and gets revealed as pseudoscience, the basic assumption that investors act as rational agents optimizing their utility functions breaks down. This leads to chaotic, bizarre behavior in capital markets under extreme conditions, as investors eagerly take irrational moderate present losses in order to avoid expected large future losses. Kahneman and Tversky of course found a large variety of such cognitive biases in their research, which can only be understood by looking at nonlinear economic models.

    ReplyDelete
  37. I will copy and paste Cosma Shalizi comparing old textbooks on (condensed matter) Physics and Macroeconomics:

    More relics from graduate school, revisited as part of the book purge. These books are now 15--20 years old; I'm sure that there're more up-to-date treatments of both topics, though I feel like I'd know if something had displaced Chaikin and Lubensky from its niche, and I don't.
    Blanchard and Fischer is about "modern" macro, models based on agents who know what the economy is like optimizing over time, possibly under some limits. This is the DSGE style of macro, which has lately come into so much discredit — thoroughly deserved discredit. Chaikin and Lubensky is about modern condensed matter physics, especially soft condensed matter, based on principles of symmetry-breaking and phase transitions. Both books are about building stylized theoretical models and solving them to see what follows from the model assumptions; implicitly they are also about the considerations which go into building models in their respective domains.
    What is very striking, looking at them side by side, is that while these are both books about mathematical modeling, Chaikin and Lubensky presents empirical data, compares theoretical predictions to experimental results, and goes into some detail into the considerations which lead to this sort of model for nematic liquid crystals, or that model for magnetism. There is absolutely nothing like this in Blanchard and Fischer — no data at all, no comparison of models to reality, no evidence of any kind supporting any of the models. There is not even an attempt, that I can find, to assess different macroeconomic models, by comparing their qualitative predictions to each other and to historical reality. I presume that Blanchard and Fischer, as individual scholars, are not quite so indifferent to reality, but their pedagogy is.
    I will leave readers to draw their own morals.

    ReplyDelete
  38. "Biggest difference between a weather forecast and an economic forecast (whether using DSGE or not) is that an economic forecast can and will influence future economic outcomes, i.e., alter the probability of the next economic crisis or boom."

    I'm pretty sure this is wrong...forecasting methods that make good out-of-sample forecasts have implicitly taken their own existence and use into account, otherwise their out-of-sample performance would be terrible.

    "Models have pros and cons. Bashing one or the other endlessly doesn't help economics as a whole, especially when professors disagree and show it off publicly in an uncompromising way."

    This seems to me to run counter to centuries of academic tradition. It's not the way biology works, or physics, or psychology. You test other people's models to failure by being critical of them. Peer review is an institutionalized form of this, as are conferences and seminars, but informal bashing has always been part of the process. Look at Bohr and Einstein, for example.

    You want to turn econ into a cozy cartel where nobody ever has to be wrong, and we just publish our buddies' papers, and cooperate in telling the outside world that all of us are worth our salaries and they should hire more and more of us and pay us more and more? Sounds like fun, but doesn't sound like science...more like parasitism.

    "If anyone can tell the economic future like palm reading, that will be the end of economics."

    No.

    ReplyDelete
  39. Good critique of the limits of linearity in economic modeling, Noah Smith.

    But what sort of microfoundations do you hope to see built into the dynamic models of the future? Ones based on Maxmin Expected Utility, or Choquet integrals, or the decision rule Daniel Ellsberg proposed in his doctoral dissertation, Risk, Ambiguity and Decision?

    ReplyDelete
  40. Andrew Klaassen9:40 PM

    "Chaos pretty much ensures you don't have forecasting power."

    I like the weather-forecasting analogy. If you look at it closely, though, you see that weather forecasters need a huge amount of near-real-time data in order to predict the weather just a few days in advance. Even when they've got that, predicting exactly where a hurricane is going to go or where a tornado is going to appear remains out of reach.

    You might be able to accomplish something similar in economics if you could point the firehose of data from daily Visa transactions, Walmart sales, and illegal drug purchases at a high-performance computing cluster; as it stands, though, government statistical services are still trying to figure out what happened in the economy - not why it happened, but ~what~ happened - a couple of years ago, putting out regular corrections to the data they gathered then.

    When you do get that daily data, though, don't expect to do much better than the weather forecasters. Weather is a chaotic system; so, probably, is the economy. That's why forecasting is hard. But if it's chaotic, that's what it is, right? No need to be wedded to clean predictive models if they don't and can't work.
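
    The standard toy demonstration (the logistic map, the canonical chaotic system; nothing economic about it): two "forecasts" whose initial conditions agree to eight decimal places track each other for a while and then become completely unrelated, no matter how well you know the model.

    def step(x):
        return 4.0 * x * (1.0 - x)    # logistic map in its chaotic regime

    a, b = 0.4, 0.4 + 1e-8            # "measurement error" of one part in 10^8
    for t in range(1, 51):
        a, b = step(a), step(b)
        if t % 10 == 0:
            print(f"t={t}  a={a:.6f}  b={b:.6f}  gap={abs(a - b):.6f}")

    Here the model is known exactly and has a single parameter, and the forecast still dies at a horizon set by the measurement error. That's the sense in which chaos caps forecasting power, Visa firehose or not.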

    ReplyDelete