I eagerly await Kling’s history lesson. Meanwhile, a real life working macroeconomist has this to say about the emergence of modern macro (which he calls New Macro… I guess I’ll have to defer):
Dynamic equilibrium theory made a quantum leap between the early 1970s and the late 1990s. In the comparatively brief space of 30 years, macroeconomists went from writing prototype models of rational expectations (think of Lucas, 1972) to handling complex constructions like the economy in Christiano, Eichenbaum, and Evans (2005). It was similar to jumping from the Wright brothers to an Airbus A380 in one generation.
A particular keystone for that development was, of course, Kydland and Prescott's 1982 paper "Time to Build and Aggregate Fluctuations." For the first time, macroeconomists had a small and coherent dynamic model of the economy, built from first principles with optimizing agents, rational expectations, and market clearing, that could generate data that resembled observed variables to a remarkable degree. Yes, there were many dimensions along which the model failed, from the volatility of hours to the persistence of output. But the amazing feature was how well the model did despite having so little of what was traditionally thought of as the necessary ingredients of business cycle theories: money, nominal rigidities, or non-market clearing.
Except for a small but dedicated group of followers at Minnesota, Rochester, and other bastions of heresy, the initial reaction to Kydland and Prescott's assertions varied from amused incredulity to straightforward dismissal. The critics were either appalled by the whole idea that technological shocks could account for a substantial fraction of output volatility or infuriated by what they considered the superfluity of technical fireworks. After all, could we not have done the same in a model with two periods? What was so important about computing the whole equilibrium path of the economy?
It turns out that while the first objection regarding the plausibility of technological shocks is alive and haunting us (even today the most sophisticated DSGE models still require a notable role for technological shocks, which can be seen as a good or a bad thing depending on your perspective), the second complaint has aged rapidly. As Max Planck remarked somewhere, a new methodology does not triumph by convincing its opponents, but rather because critics die and a new generation grows up that is familiar with it. Few occasions demonstrate the insight of Planck's witticism better than the spread of DSGE models. The new cohorts of graduate students quickly became acquainted with the new tools employed by Kydland and Prescott, such as recursive methods and computation, if only because of the comparative advantage that the mastery of technical material offers to young, ambitious minds. And naturally, in the process, younger researchers began to appreciate the flexibility offered by the tools. Once you know how to write down a value function in a model with complete markets and fully flexible prices, introducing rigidities or other market imperfections is only one step ahead: one more state variable here or there and you have a job market paper.
Hey, I represent that last remark!