Archive for April, 2014

The Reality of the Present and the Challenge of the Future: Fagg Foster for the 21st Century

L. Randall Wray | April 23, 2014

(Here is a presentation I gave at the University of Denver at the annual J. Fagg Foster honors ceremony. Most of you will not know of Foster, but you should. While he did not publish much, he was the professor of a number of prominent institutionalists who attended DU in the early postwar period. I was lucky to have studied with his student, Marc Tool, and was introduced to Foster’s work at the very beginning of my studies of economics. My presentation below is based on two of Foster’s articles: J. Fagg Foster, “Understandings and Misunderstandings of Keynesian Economics,” Journal of Economic Issues, vol. XV, no. 4 (1981), pp. 949–957; and “The Reality of the Present and the Challenge of the Future,” Journal of Economic Issues, vol. XV, no. 4 (1981), pp. 963–968. Both are from 1966, republished in a 1981 special issue of the Journal of Economic Issues. You should read them.)

Is this the age of Keynes? That’s the question raised by Fagg Foster in 1966.

In the 1960s the answer seemed obvious. Keynes dominated economics—or, at least, macroeconomics—and Keynesianism dominated policy. And it worked! Or, so most thought.

Foster wasn’t sure. While he agreed that “[t]here probably has been no instance in history in which a pattern of ideas has had so much effect on the everyday life of everyone in so short a time,” he thought most of Keynes’s followers misunderstood his theory.

Further, Foster wasn’t convinced the theory provided a firm basis for policy.

Finally, he lamented that “among all post-Keynesian economists, the institutionalists seem to have been least affected by Keynes’s theory…. The institutionalists have not even contemplated the possibility of any generic relationships between the Keynesian theory and their own.”

A decade later, so-called Keynesian economics was in disarray, a casualty of the apparent failure of policy to fine-tune the economy. Stagflation at the end of the 1970s delivered the final blow, and fueled the rise of increasingly preposterous approaches such as Rational Expectations, Real Business Cycle theory, and the Efficient Markets Hypothesis, and from there to DSGE models with a single representative agent standing in for the whole economy.

In truth, even in the heyday of Keynesianism, policy was directed to stimulate the sentiments of business undertakers—precisely what Keynes recommended against—with supply-side tax cuts and a cornucopia of subsidies to the captains of industry.

While a parallel approach developed calling itself New Keynesian, the only thing new was the adoption of the craziest “new” orthodox ideas (witness rational expectations). And the only thing “Keynesian” was the presumption that sticky wages and prices prevent instantaneous market clearing—which was actually the old Neoclassical explanation of unemployment that Keynes had dispatched.

With friends like these, Keynes doesn’t need enemies.

In retrospect, Foster might have been a bit hard on the institutionalists.


Distribution, Stagnation, and Macro Policy in an Interactive Model

Greg Hannsgen | April 21, 2014

The funny-shaped surface in the Wolfram “CDF” below (software download link) depicts excess demand for goods. The flat one represents the zero plane, where supply and demand are equal. On each axis is a variable that affects the degree to which demand outpaces or falls short of supply: (1) firms’ share in the price of goods after paying wages, which equals the pricing markup m divided by (1 + m); and (2) the income and production generated by the private sector, measured by capacity utilization. The height dimension measures excess demand for goods.

The sliding levers at the top of the CDF allow one to change (1) (“chi”) the percentage of disposable income spent by the wealthy households who own most stock, as well as all government-issued securities; (2) the rate of production by the public sector, which hires workers to produce services; and/or (3) the annual compound real interest rate (yield) on government securities. All of the other parameters are held constant as you move the levers. Click on the “plus” sign next to a lever, and further information appears.

[Embedded Wolfram CDF: 3D excess-demand surface (static fallback image: 3D-excess-demand-graphN5.png)]

Click here for a much larger, easier-to-read version of this CDF on a webpage of its own.

At the curved line where the two surfaces intersect (the edge of the dark blue region when viewed from above), aggregate demand is just equal to private-sector output, and there is no tendency for capacity utilization to change. Finding this intersection gives us the set of combinations of output and the distributional parameter at which all newly produced units are being sold, and no new goods orders are stacking up unfilled. Experimenting with the CDF, one finds that capacity utilization is usually higher: (1) when the share of the “K-sector”, or capital-owning sector, (m/(1 + m)) is lower, (2) when that sector spends a greater percentage of its disposable income, or (3) when government production and payrolls are larger.
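A back-of-the-envelope Python sketch can make these comparative statics concrete. This is not the actual model from working paper 723; the spending propensities and functional forms below are invented placeholders chosen only to mimic the qualitative behavior described above:

```python
# Illustrative sketch only: a simplified excess-demand function in the
# spirit of the model described above. Numbers and functional forms are
# placeholders, not those of Levy working paper 723.

def firms_share(m):
    """Firms' share of the goods price implied by pricing markup m: m/(1+m)."""
    return m / (1.0 + m)

def excess_demand(u, m, chi=0.6, g=0.2):
    """Excess demand for goods at capacity utilization u.

    u   : capacity utilization (0..1), proxy for private-sector output
    m   : pricing markup
    chi : share of disposable income spent by K-sector households
    g   : government production and payrolls (adds to demand)
    """
    share = firms_share(m)
    wage_income = (1.0 - share) * u      # workers' income, assumed fully spent
    profit_spending = chi * share * u    # K-sector spends only a fraction chi
    demand = wage_income + profit_spending + g
    return demand - u                    # positive => demand outpaces supply
```

Setting this sketch’s excess demand to zero gives an equilibrium utilization of u* = g / (s(1 − chi)), where s = m/(1 + m): utilization rises when government production is larger or the K-sector spends more, and falls when the K-sector’s share is higher, matching the three findings listed above.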

One should keep in mind the simplification required to construct such a “small” model, which in graphical form represents only an imaginary economy; the numbers are not intended to mirror those of any particular country or data set, but the economic system portrayed in the CDF is meant to be similar in many of its essentials to that of large industrialized nations with their own currencies, huge companies, liquid securities markets, floating exchange rates, etc. Another possible way to interpret this highly “stratified” industrial system is as an entire global economy in just three sectors: workers; firms/wealthy households; and government/central bank.

A larger version of the model featured an unemployment benefits system. To come: a discussion of the movements over time that may or may not bring the economy closer to the line where excess demand just reaches the flat surface and no higher. The model still has only a rudimentary financial system, with no private borrowing. Hence, the interest rate lever acts upon the economy solely by changing the amount of interest payments from the government to households, making it a distributional and fiscal variable in its own right (an insight associated with MMT). (Business investment depends on capacity utilization and the gross after-tax profit rate.) The model is drawn more or less directly from Levy Institute working paper 723 (see this previous post), as revised recently for the academic journal Metroeconomica.


Minsky and Financial Reform’s “Never Ending” Struggle

Michael Stephens | April 18, 2014

In a new policy brief, Jan Kregel looks at a lesser-known, early period of Minsky’s work on financial reform. In the ’60s, Minsky was a consultant to a number of government agencies, including the Federal Reserve, on issues related to financial regulation. In this context, he came up with a new approach to bank examination, which he called “cash-flow based.” The new approach evaluated bank liquidity, not as an innate feature of a particular class of assets, but as a function of the balance sheet of the institutions under examination, the markets for those assets, the state of the macroeconomy and the financial system as a whole, and much else. In fact, as Kregel explains, what Minsky was after here was related to an early form of what we now call “macroprudential regulation.”

The evolution of Minsky’s thought on this approach to bank examination is interesting enough in itself, but it’s also a reflection of Minsky’s broader thinking about financial regulation and reform. Minsky developed his regulatory proposals in the ’60s and ’70s with an eye to what was to become his well-known “financial instability hypothesis,” which is to say, his proposals were informed by a theory of endogenous financial instability: a theory in which financial crises are not only possible, but are to be expected; generated as a result of the “normal” functioning of the financial system. Without such a theory, as Kregel points out, it’s hard to formulate effective regulation:

As Minsky was fond of pointing out, the bedrock of mainstream theory is a system of self-adjusting equilibrium that provides little scope for the discussion of a systemic crisis, since, in this theory, one could not occur. It was thus extremely difficult to formulate prudential regulations to respond to a financial crisis if one could only occur as the result of random, external shocks, or what Alan Greenspan would consider idiosyncratic, nonrational (fraudulent) behavior. The only basis for regulation would be to concentrate on the eradication of the disruptive behavior of bad actors or mismanaged financial institutions. From this initial presumption, the formulation of regulations and supervisory procedures required the assessment of the activities of individual banks—without any reference to their relations with other institutions or the overall environment in which they functioned.

One consequence of being informed by a proper theory of financial instability, Minsky maintained, is that regulation has to be responsive to innovations in the financial system; innovations that are often reactions to new regulatory frameworks. What this calls for, then, is not just the right set of rules, whether your preferred model is Glass-Steagall or something else, but also an adaptive, “dynamic” framework that’s attuned to the evolution of the financial system. This is from the preface:

the challenge for reform is not just the proper formulation and implementation of specific rules, but the development of an approach that is sensitive to the potential of actors in the financial system to adapt and innovate, creating new practices that threaten the stability of the system in ways that may not become apparent until the next crisis hits. Financial regulation and examination procedures need to be constantly reassessed in order to avoid becoming obsolete. And in that sense, as Minsky recognized, “the quest to get money and finance right may be a never ending struggle.”

There’s a lot more here, including Kregel’s take on the ongoing debates about imposing specific capital and liquidity ratios on financial institutions:

While the imposition of minimum liquidity and capital ratios is an improvement over the prior risk-based approach, such target ratios are not macroprudential regulations in Minsky’s sense. Similarly, stress tests of banks’ capital positions are applied to banks individually, rather than in a systemic interaction. Neither approach to macroprudential regulation takes into account the dynamic macro factors that impact the bank’s position-making assets and liabilities and the secondary markets in which they trade, or the ongoing institutional and policy changes that are a natural part of the economic system.

Download it here: “Minsky and Dynamic Macroprudential Regulation”


Charles Evans on Missing the Fed’s Targets

Michael Stephens | April 17, 2014

Chicago Fed President Charles Evans spoke at last week’s Minsky conference, and news reports have focused on his comments regarding the expectation that the Federal Reserve will wait at least six months after the end of QE before beginning to raise interest rates (Evans: “It could be six, it could be 16 months”; “If I had my druthers, I’d want more accommodation and I’d push it into 2016,” but “the actual, most likely case I think is probably late 2015”).

But his speech might also be of interest to those who have been following the debate over whether the Federal Reserve is, let’s say, equally passionate about the two sides of its “dual mandate” (price stability and maximum employment). Right now, the Fed is missing both of its ostensible targets, with inflation below 2 percent and unemployment above the Fed’s estimate of the “natural” rate, which ranges from 5.2 to 5.6 percent (for Evans, it’s 5.25 percent). Many have suggested that the Fed appears much more concerned about inflation rising above 2 percent than it does about high unemployment, or below-target inflation, for that matter.

In the video below, Evans shares his view of how the Fed should “score” its hits and misses on unemployment and inflation:

the 9 percent unemployment rate we faced back in September 2011 can be depicted in “inflation-loss equivalent units” by showing the inflation rate that gives an equivalent loss when unemployment is at its sustainable rate. So what is that rate? If unemployment was at its natural rate, what would be the inflation rate that would make you equally uncomfortable as if you were facing the 9 percent [unemployment] rate? The answer is 5-1/2 percent inflation. […]

I think we need continued strongly accommodative monetary policy to get inflation back up to 2 percent within a reasonable time frame. After all, notice that the red and green regions of the bull’s-eye chart [posted below] show modest inflation above 2 percent is much more acceptable than even 6 percent unemployment.

[Chart: Evans’s “bull’s-eye” accountability chart]
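Evans’s translation of an unemployment gap into “inflation-loss equivalent units” can be sketched with a symmetric quadratic loss function. This is an assumption for illustration (the excerpt above does not spell out his exact weighting), using a sustainable rate of 5.5 percent, which lies within the Fed’s 5.2–5.6 percent range and reproduces his quoted answer:

```python
def inflation_loss_equivalent(u, u_star=5.5, pi_star=2.0):
    """Inflation rate whose squared gap from target equals the squared
    unemployment gap, under an equal-weighted quadratic loss:
    (pi - pi_star)^2 = (u - u_star)^2. All rates in percent.
    The defaults for u_star and pi_star are assumptions."""
    return pi_star + abs(u - u_star)

# September 2011: unemployment at 9 percent
print(inflation_loss_equivalent(9.0))  # 5.5
```

With an unemployment gap of 3.5 points, the equally painful inflation rate is 2 + 3.5 = 5.5 percent, matching the figure Evans cites in the excerpt.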

Here he is on the outlook for inflation:

Despite current low rates, I still often hear people say that higher inflation is just around the corner. I confess that I am somewhat exasperated by these repeated warnings given our current environment of very low inflation. Many times, the strongest concerns are expressed by folks who said the same thing back in 2009, and then in 2010, and … well, you get the picture. […]

[A]nother potential source of inflationary pressures would be rising inflation expectations. Here, I mean a breakout of inflation expectations separate from any fundamentals that might accompany the previously discussed cases of rising commodity prices and stronger bank lending. One could think of this as the spontaneous combustion theory of inflation. The story goes like this: Households and businesses simply wake up one day and expect higher inflation is coming without any further improvement in economic fundamentals. Without appealing to esoteric economic theories of sunspots, these expectations don’t seem sustainable in the current environment.

The rest of the videos of speakers and panelists from the conference will be posted here.


Working Paper Roundup 4/15/2014

Michael Stephens | April 15, 2014

Minsky and the Subprime Mortgage Crisis: The Financial Instability Hypothesis in the Era of Financialization
Eugenio Caverzasi
“The aim of this paper is to develop a structural explanation of the subprime mortgage crisis, grounded on the combination of two apparently incompatible financial theories: the financial instability hypothesis by Hyman P. Minsky and the theory of capital market inflation by Jan Toporowski. …
… we firmly reject the idea that ‘black swans’ or exogenous shocks of any type might have caused the crisis. We believe that the pathogens which led to the crisis were congenital to U.S. capitalism and that the bursting in the mortgage market happened for specific reasons. This is what is meant in this paper by ‘structural interpretation’: the identification and the understanding of the endogenous forces which made the U.S. economy progressively reach an unsustainable financial position, making the crisis an inescapable event.”

Growth with Unused Capacity and Endogenous Depreciation
Fabrizio Patriarca and Claudio Sardoni
“This paper contributes to the debate on income growth and distribution from a nonmainstream perspective. It looks, in particular, at the role that the degree of capacity utilization plays in the process of growth of an economy that is not perfectly competitive. The distinctive feature of the model presented in the paper is the hypothesis that the rate of capital depreciation is an increasing function of the degree of capacity utilization. This hypothesis implies analytical results that differ somewhat from those yielded by other Kaleckian models. Our model shows that, in a number of cases, the process of growth can be profit-led rather than wage-led. The model also determines the value to which the degree of capacity utilization converges in the long run.”

Structural Asymmetries at the Roots of the Eurozone Crisis: What’s New for Industrial Policy in the EU?
Alberto Botta
“In this paper, we analyze and try to measure productive and technological asymmetries between central and peripheral economies in the eurozone. We assess the effects such asymmetries would likely bring about on center–periphery divergence/convergence patterns, and derive some implications as to the design of future industrial policy at the European level. … All in all, future EU industrial policy should be much more interventionist than it currently is, and dispose of much larger funds with respect to the present setting in order to effectively pursue both short-run stabilization and long-run development goals.”

Quality of Statistical Match and Employment Simulations Used in the Estimation of the Levy Institute Measure of Time and Income Poverty (LIMTIP) for South Korea, 2009 *
Thomas Masterson
“The quality of match of the statistical match used in the LIMTIP estimates for South Korea in 2009 is described. The match combines the 2009 Korean Time Use Survey (KTUS 2009) with the 2009 Korean Welfare Panel Study (KWPS 2009). The alignment of the two datasets is examined, after which various aspects of the match quality are described. The match is of high quality, given the nature of the source datasets. The method used to simulate employment response to availability of jobs in the situation in which child-care subsidies are available is described. Comparisons of the donor and recipient groups for each of three stages of hot-deck statistical matching are presented. The resulting distribution of jobs, earnings, usual hours of paid employment, household production hours, and use of child-care services are compared to the distribution in the donor pools. The results do not appear to be anomalous, which is the best that can be said of the results of such a procedure.”
* Related: “Time Deficits and Hidden Poverty in Korea” (pdf), by Kijong Kim, Thomas Masterson, and Ajit Zacharias


On the Alleged Pains of the Strong Euro

Jörg Bibow | April 9, 2014

Since its most recent low of $1.20, reached in the heat of the summer of 2012, the euro has appreciated by 15 percent against the US dollar and by more than 10 percent in inflation-adjusted terms against a broad basket of currencies representative of the euro area’s main trading partners. This amounts to a significant loss in international competitiveness, and representatives from a number of euro area member states have aired fears that euro strength might undermine the area’s recovery from gloom. Members of the ECB’s governing council, too, expressed concerns about the euro’s exchange rate. ECB president Mario Draghi recently argued that the strengthening of the euro was partly responsible for the bank’s conspicuous miss of its 2-percent price stability norm by an embarrassingly large margin, adding that the euro’s strength was “becoming increasingly relevant” in the ECB’s assessment of price stability.

In truth, euro appreciation should attract neither fears nor blame. The euro area’s dangerously low rate of inflation owes primarily to domestic sources. Instead of debating the euro’s external value, it is high time for euro policymakers to concentrate on getting their own house in order. A sober assessment reveals that the supposedly too-strong euro is at risk of turning into yet another scapegoat, one that covers up euro policymakers’ unenviable record of staggering policy blunders.

Ultimately, the single most relevant factor for price stability in an economy as large as the euro area is wage inflation corrected for productivity growth. The outstanding fact is that euro area wage inflation is approaching zero. Unit labor costs, and business costs more generally, are flat or falling. It is therefore no surprise at all that the ECB is failing on its price stability mandate. Rather, what is surprising is that euro policymakers keep on clobbering wages without remorse, apparently wishing to drive them ever lower. Seemingly justified by some holy calling to please the gods of austerity and competitiveness, euro policymakers keep digging the hole they are trapped in ever deeper.
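The relationship Bibow invokes reduces to a one-line approximation. The numbers below are hypothetical, chosen only to illustrate the direction of the effect:

```python
def ulc_growth(wage_inflation, productivity_growth):
    """Approximate growth of unit labor costs (percent per year):
    wage inflation minus labor productivity growth."""
    return wage_inflation - productivity_growth

# Wage inflation near zero with ~1% productivity growth implies
# falling unit labor costs, dragging inflation below the 2% norm.
print(ulc_growth(0.5, 1.0))  # -0.5
```

With wage inflation stuck near zero, any positive productivity growth pushes unit labor costs down, which is the domestic source of disinflation the post points to.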


A Minsky Moment on the BBC

Michael Stephens | April 1, 2014

For those of you who haven’t seen it already, Duncan Weldon did a feature on Hyman Minsky for the BBC last week, including this short article and a 30-minute piece for BBC radio.

In the radio segment, Adair Turner says this about Minsky’s contribution and his departure from the mainstream (a description of the pre-crisis orthodoxy which is probably baffling to many unfamiliar with the field):

“The dominant strain of modern economics had assumed, before the crisis, that you could largely ignore the details of the financial system and banks in particular. The phrase that was used was that finance was simply a sort of veil through which relationships between savers and borrowers passed and it didn’t have an influence, and at the … core of Minsky’s analysis is the fact that financing contracts and banks in particular have a crucial influence.”

Weldon devotes a great deal of the program to the “financial instability hypothesis,” for which Minsky is, perhaps, best known, but Minsky also offered an approach to re-regulating the financial system that makes his work as useful as a prescription for a more stable capitalism as it is as a diagnosis of financial crises. (The Levy Institute’s short ebook, Beyond the Minsky Moment (pdf), includes a survey of Minsky’s views about how to reconstitute the financial structure and explains why Dodd-Frank falls well short. The Minsky archive has also been digitized to provide access to many of Minsky’s unpublished papers and notes.)

The annual conference inspired by Minsky’s work will be held at the National Press Club in Washington, DC next week.