Seven Things that Economists Could Usefully Do or Call For Over the Next Several Years

October 14, 2014

from Richard Parker

Without solving . . . Pikettyan meta-issues, the question arises: Are there other things we as economists can do if, like Piketty, we’re concerned (alarmed? appalled?) about current levels and trends of inequality? How – absent meta-solutions – should we or could we move an inequality-reduction agenda forward? What issues or strategies or agendas might help advance absorption of Piketty’s focus on distribution and reframe a mainstream professional and public discourse still fixed almost monocularly on aggregate, rather than distributionally differentiated, GDP?

As I contemplated that question in Athens this summer, several possible projects occurred to me as worth at least consideration and debate. Some readers will no doubt find these suggestions too small, too pallid, too technical, or too bureaucratic, but I’m motivated to raise them – rather than more sweeping or heroic responses to Capital – in part by my reading of the ways The General Theory’s lessons were absorbed, initially by academics, then policymakers, and then by elements of the press and wider public, during the first 25 years or so after its publication (about which more shortly).

What academic, government, and policy NGO economists could, in my opinion, usefully do or call for over the next several years includes, at very least, the following:

  1. Academic economists could begin teaching macro courses (undergraduate and graduate), and pursuing research and publishing programs, focused on the big distributional questions, or at least on the old growth questions reframed and disaggregated by distributional ones, in order to ground our students and our colleagues in the relationships between growth and inequality.
  2. Behavioral economists in particular could expand their own teaching and research on, for example, the field’s early findings about the effects of positional goods and relative incomes. As one example, David Moss at Harvard Business School and the Tobin Project are already doing some pioneering work here.[1]
  3. Cross-disciplinary teaching and research – in cooperation with political scientists, sociologists, social psychologists, historians, and moral philosophers – present a fertile range of opportunities for teaching and research. I’ve participated for several years, for example, in the multi-disciplinary Harvard Inequality and Social Policy program, just one model of how this could be done or approached.[2]
  4. Academic and policy NGO economists could start calling on government colleagues and statistical agencies to do (including spending) more to improve their collection and processing of income and wealth distribution data worldwide, given the myriad inherent limits of tax returns, social security files, and household survey data now in use. As we enter the era of Big Data – for better or worse – the sheer quantity of information available that could vastly supplement and enrich our attempts to measure and answer distributional questions – as well as the graphical means to make our findings more easily understood to audiences outside our profession – seem untapped.
  5. Organized calls from economists and other social scientists could press the IMF, OECD, Eurostat, the UN and the like to prioritize greater harmonization of the definitions and indices of inequality, to allow more meticulous comparison internationally. Mme. Lagarde has already publicly said that the IMF must “do more” about inequality (though without much precision about what it might do)[3]; one precedent here is the pioneering role the UN and IMF played in spreading national income accounting around the globe in the early 1950s.
  6. It seems important to me to find ways to elevate and “normalize” public reporting of the distribution issue in a super-condensed headline form, aimed not at economists but at the press, politicians, and the public. Let me call this simply the need, for want of a more elegant formulation, for “GDP-plus-Gini” – in lieu of GDP alone as the single-number metric of a nation’s economic performance.

     The Gini coefficient has numerous problems of which we’re all aware, of course, though in this it is no different than GDP. Its advantage lies, I would argue, in the power of that one number’s ability to reach a wider public and to shape policy and politics beyond our own limited world of classroom and peer journals. No president or prime minister runs on a platform promising to lower or even maintain current GDP.

     How might we best get a one-number summary of inequality before the public as GDP has done for aggregate growth? The World Bank, among others, already regularly ranks countries by their Gini coefficients – and its website allows users an easy choice of display as table, graph, or map.[4] There, one can quickly learn that the Bank ranks Sweden, at 0.25, as the world’s most egalitarian nation and South Africa, at 0.63, as the most inegalitarian.[5] The United States, at 0.41, hunkers down among a host of developing economies such as Turkey, China, and several West African states – and, needless to say, far behind every other high-income developed country (Germany is at 0.28, France and Canada both at 0.33, the European Union as a whole at 0.31).

     The merit of such rankings – if they were presented annually by national statistical authorities alongside GDP performance – is the way Gini’s simple single number translates rankings into performance that can, alongside traditional GDP, be reported on the evening news or Internet or debated in the halls of Congress.
(Piketty himself has casually noted his own preference for calculating two separate Gini coefficients – one based on labor income, one on capital income; on this, I’m for now agnostic, though I take his important point.)

A graphic supplement to such Gini rankings would be to disaggregate annual income and wealth changes by quintile or decile (with special attention to the top 1% and 0.1%, à la Piketty). There are already many variants of such presentations (see below); the point would be to elevate them to the prominence that reporting of GDP itself enjoys today.
    [Figure: annual income and wealth changes disaggregated by income group]
  7. Far more research and debate on the intersections of growth and equality could in turn lead to a clearer understanding of what a band of “democratic growth” or “egalitarian growth” paths among the variety of growth paths might look like, calculated both in terms of private income and wealth distribution and the divide between public and private income and spending. Jonathan Ostry and colleagues, to cite one example, have already published three quite solid IMF papers (“Redistribution, Inequality, and Growth”, “Inequality and Unsustainable Growth,” and “Efficiency and Equity”) that have begun opening up this sort of research, in a rich but preliminary way, to serious academics and policymakers. (I mention only these because fully enumerating the various strategies and sub-topics in this field would require its own separate paper.)
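The “GDP-plus-Gini” idea in item 6 is helped by the fact that the Gini coefficient is straightforward to compute from income microdata. Here is a minimal, plain-Python sketch (an illustration only, not any statistical agency’s official methodology):

```python
def gini(incomes):
    """Gini coefficient from a list of incomes.

    Uses the sorted-data identity
        G = (2 * sum(i * x_i) / (n * sum(x))) - (n + 1) / n
    with x sorted ascending and i = 1..n. Returns 0 for perfect
    equality, approaching 1 as one recipient takes everything.
    """
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return (2.0 * weighted) / (n * total) - (n + 1.0) / n

print(gini([10, 10, 10, 10]))  # perfect equality -> 0.0
print(gini([0, 0, 0, 100]))    # one person holds everything -> 0.75
```

On this scale the World Bank figures quoted above (Sweden 0.25, the United States 0.41, South Africa 0.63) can be read directly; an official series would of course also weight survey households and adjust for top-income under-reporting.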

[1] On Moss and the Tobin Project, see “How Income Inequality Shapes Behavior” at http://hbswk.hbs.edu/item/7283.html

[2] On the Harvard Inequality and Social Policy program, see http://www.hks.harvard.edu/inequality/index.htm

[3] See Lagarde, “Economic Inclusion and Financial Integrity – an Address to the Conference on Inclusive Capitalism”, at https://www.imf.org/external/np/speeches/2014/052714.htm

[4] World Bank Gini rankings at http://data.worldbank.org/indicator/SI.POV.GINI/countries/all?display=graph

[5] I omit here a number of micro-states.

Richard Parker, “Reading Piketty in Athens”, real-world economics review, issue no. 69, 7 Oct 2014, pp. 58-73, http://www.paecon.net/PAEReview/issue69/Parker69.pdf

From the comments to the Piketty issue of the RWER: Newtownian is right.

October 11, 2014

In the comments to the 69th (Piketty) issue of the RWER, Newtownian states:

Try as I might, in 182 PDF pages I could not find a single reference to the natural environment or climate change, peak oil, degrowth, or the impacts of exponential economic growth – only a small discussion of oil and a single footnote with the word ‘ecosystem’.

Which I suggest indicates the real natural world is still not seen as relevant to economic thought and critique even for ‘real world’ economists.

This comes concurrently with the release of Naomi Klein’s new book “This Changes Everything” (http://www.theguardian.com/books/2014/sep/19/this-changes-everything-capitalism-vs-climate-naomi-klein-review), which makes this omission even more bizarre.

From a mainstream economics group I would have expected this. But I thought the RWER was progressive. This omission is so depressing – for me personally in part after having spent two days last week at a degrowth conference which itemized how bad economic policy is regarding sustainability and future generations, and how desperately new economics ideas are needed.

I agree. But the concept of capital used by Piketty is not as bad in this respect as the ‘mainstream’ textbook growth concept (which, as shown below, is hugely influential in policy circles). Economists like Solow, Harrod and Domar, and people inspired by them, tend to equate physical capital with depreciable capital: stuff we can replace. As can be read in my article, the national accounts concept used ‘hook, line and sinker’ by Piketty also includes land and natural resources. Other economists equate ‘capital’ with financial capital. The Piketty approach uses balance sheets to combine these two kinds of capital, which is a step in the direction suggested by Newtownian. This does not just enable a much better analysis of the distribution of capital income (including rents); it also enables an analysis which puts a heavier emphasis on sustainability. An example is this recent VoxEU article by Samuel Wills and Rick van der Ploeg, which endorses such an analysis:

At present, Norway has designed its fund [i.e. its ‘sovereign wealth oil fund’, M.K] according to the principles of modern portfolio theory. These principles would see Norway construct a highly diversified equity portfolio, choose the size of that portfolio based on its risk preferences, and consume a fixed proportion of the fund’s assets each year (Merton 1971). This is almost exactly what Norway does. Its equity and bond benchmarks are highly diversified indices. The size of each depends on Norway’s risk appetite, with the mix changing in 2009 when the country decided to accept more risk for a higher return. Finally, Norway has committed explicitly to a rule which states that, on average, 4% of the fund’s balance is spent each year.

However, when the fund was established these principles didn’t give due consideration to oil beneath the ground. In fact, in the 5000 words of the investment mandate, oil is not mentioned once (NBIM 2013). This poses an important problem because oil wealth is a substantial and very volatile part of Norway’s total wealth. Not only will below-ground assets alter the fund’s allocation and spending rule, above-ground assets should also affect the speed at which oil is extracted.

I agree – this is just a first step. But the Piketty (i.e.: national accounts) concept of capital enables this. And a journey has to start with a first step.
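The fixed-proportion spending rule described in the quoted passage is easy to simulate. The sketch below uses purely illustrative numbers (my assumptions, not Norway’s actual fund data) to show why the 4% figure matters relative to the fund’s return:

```python
def simulate_fund(balance, annual_return, spend_rate, years):
    """Evolve a fund that earns a return and then spends a fixed
    share of its balance each year; return the yearly balances."""
    path = [balance]
    for _ in range(years):
        balance *= (1.0 + annual_return)   # investment return
        balance *= (1.0 - spend_rate)      # fixed-proportion spending
        path.append(balance)
    return path

# Spending matched to the (real) return roughly preserves the fund;
# spending above it steadily erodes the principal.
sustained = simulate_fund(100.0, 0.04, 0.04, 30)
eroded = simulate_fund(100.0, 0.02, 0.04, 30)
```

Wills and van der Ploeg’s point, roughly, is that the relevant balance should also include the oil still in the ground, so the effective spending rate on total wealth differs from the 4% applied to the financial fund alone.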

The riddle of induction

October 9, 2014

from Lars Syll

Recall [Russell’s famous] turkey problem. You look at the past and derive some rule about the future. Well, the problems in projecting from the past can be even worse than what we have already learned, because the same past data can confirm a theory and also its exact opposite …

For the technical version of this idea, consider a series of dots on a page representing a number through time … Let’s say your high school teacher asks you to extend the series of dots. With a linear model, that is, using a ruler, you can run only a single straight line from the past to the future. The linear model is unique. There is one and only one straight line that can project a series of points …

This is what philosopher Nelson Goodman called the riddle of induction: we project a straight line only because we have a linear model in our head — the fact that a number has risen for 1,000 days straight should make you more confident that it will rise in the future. But if you have a nonlinear model in your head, it might confirm that the number should decline on day 1,001 …

The severity of Goodman’s riddle of induction is as follows: if there is no longer even a single unique way to ‘generalize’ from what you see, to make an inference about the unknown, then how should you operate? The answer, clearly, will be that you should employ ‘common sense’.

Nassim Taleb

And economists standardly — and without even the slightest justification — assume linearity in their models …
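Goodman’s riddle can be made concrete with a toy numerical example (my own construction, not Taleb’s): two models that agree exactly on every observed point, yet give opposite projections for the next one.

```python
observed_x = [1, 2, 3, 4, 5]  # "the number rose five periods straight"

def linear(x):
    return x  # the ruler model: y = x

def nonlinear(x):
    # Add a polynomial term that vanishes at every observed point,
    # so this model fits the same data perfectly.
    bump = 1.0
    for x0 in observed_x:
        bump *= (x - x0)
    return x - bump

# Identical in-sample fit ...
assert all(linear(x) == nonlinear(x) for x in observed_x)
# ... wildly different out-of-sample forecasts on "day 6":
print(linear(6))     # 6
print(nonlinear(6))  # 6 - 5! = -114.0
```

The data alone cannot choose between the two; only a prior commitment to a model class does, which is exactly Goodman’s (and Taleb’s) point.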

Behavioral economics

October 4, 2014

from Neva Goodwin

Neoclassical economics claims to be based entirely on a view of human nature which is not only morally repugnant, but which also leaves out a great deal about how people actually do operate, while bringing in seriously contrary-to-fact assumptions about what people are capable of. The latter have included assumptions about consistency (including that preferences change slowly, if at all, and that if A is preferred to B and B is preferred to C, then C cannot be preferred to A); about information (people are able to act as if they have perfect information); about self-knowledge (people know what they want, and are best served by getting what they want); and about influence, or power. The last of these assumptions includes the idea that human wants and preferences are endogenous, generated entirely from within; it ignores the extent to which people’s choices and decisions may be manipulated by those who have an interest in persuading the public to buy certain things, or vote in certain ways. It ignores the reality that market economies are rife with powerful actors who do have such an interest, in both the economic and the political spheres.

A paper on the sociology of economics[1] would describe how these unrealistic assumptions have been fostered in a profession with skewed motivations. Promotion and tenure in college and university economics departments depend on publication in a short list of acceptable journals. The editorial boards of those journals have an interest in keeping the ideology unchanged, for several reasons: Their status depends in part on the mystification of arcane language and hard-to-swallow assumptions; these characteristics, as well as emphasis on difficult mathematics, erect barriers to entry to the profession; control over the supply of economists results in an ability to command higher salaries than most other academics, as well as the possibility of much higher pay in the service of business or politics, where there is also an interest in maintaining the status quo[2].

In the last few decades the narrow economic view of human behavior has been challenged by a strong alternative called behavioral economics. Studies in this area suggest that a more sophisticated model of human motivations is required to explain such behaviors as those that lead to stock market swings, the ways that people react to good and bad fortune, and why people often seem to act against their own self-interest.

Perhaps the most famous contemporary behavioral economist is not an economist at all. Trained as a psychologist, Daniel Kahneman won the 2002 Nobel Memorial Prize in economics for work done largely with his frequent collaborator Amos Tversky. Kahneman’s research has found that people tend to give undue weight to information that is easily available and vivid – a detour from 20th century assumptions of economic rationality that he calls the “availability heuristic.” For example, suppose college students are deciding which courses to take next semester, and they see a summary of evaluations from hundreds of other students indicating that a certain course is very good. Then suppose they watch a video interview of just one student providing a negative review of the course. Even when students are told in advance that the negative review was atypical, they tend to be more influenced by the vivid review than by the summary of hundreds of evaluations.

Kahneman has also shown that the way a decision is presented to people can significantly influence their choices, an effect he refers to as “framing.” For example, consider a gas station that advertises a special 5-cent per gallon discount for paying cash. Meanwhile, another station with the same prices indicates that they charge a 5-cent per gallon surcharge to customers paying by credit card. While the prices are exactly the same, experiments show that consumers respond more favorably to the station advertising the apparent discount.

An effect similar to framing is known as “anchoring,” in which people rely on some not necessarily relevant piece of information as a reference point in making a decision. In a real-world example, a high-end kitchen equipment catalog was selling a particular bread maker for $279. Sometime later, the company began offering a “deluxe” model for $429. While they did not sell many of the deluxe model, sales of the $279 model almost doubled because now it seemed like a relative bargain.

A conventional view of rationality is that emotions get in the way of good decision making, as they tend to interfere with logical reasoning. Again, however, research from behavioral economics suggests a more nuanced reality. Logical or rational reasoning is most effective when making relatively simple economic decisions, but for more complex decisions we can become overwhelmed with too much information. For example, Ap Dijksterhuis, a psychologist in the Netherlands, surveyed shoppers about their purchases as they were leaving stores, asking them how much they had thought about items prior to buying them. A few weeks later, he asked these same consumers how satisfied they were with their purchases. For relatively simple products, like small kitchen tools or clothing accessories, those who thought more about their purchases tended to be more satisfied. But for complex products, such as furniture, those people who deliberated the most tended to be less satisfied with their purchases.

[1] See, e.g., Goodwin, Neva, 2008 “From Outer Circle to Center Stage: The maturation of heterodox economics”.

[2] To give just one example of the latter point: standard utility theory, as portrayed in 20th century economics, could be used to show that a high degree of economic inequality is bad for economic stability, and reduces overall well-being. This conclusion is too rarely drawn in the standard literature, even though a few writers such as Robert Frank make plain the logic.

Neva Goodwin, “The human element in the new economics: a 60-year refresh for economic thinking and teaching”, real-world economics review, issue no. 68, 21 August 2014, pp. 98-118,
http://www.paecon.net/PAEReview/issue68/Goodwin68.pdf

 

Modern macroeconomics and the perils of using ‘Mickey Mouse’ models

October 15, 2014

from Lars Syll

The techniques we use affect our thinking in deep and not always conscious ways. This was very much the case in macroeconomics in the decades preceding the crisis. The techniques were best suited to a worldview in which economic fluctuations occurred but were regular, and essentially self correcting. The problem is that we came to believe that this was indeed the way the world worked.

To understand how that view emerged, one has to go back to the so-called rational expectations revolution of the 1970s … These techniques however made sense only under a vision in which economic fluctuations were regular enough so that, by looking at the past, people and firms (and the econometricians who apply statistics to economics) could understand their nature and form expectations of the future, and simple enough so that small shocks had small effects and a shock twice as big as another had twice the effect on economic activity. The reason for this assumption, called linearity, was technical: models with nonlinearities—those in which a small shock, such as a decrease in housing prices, can sometimes have large effects, or in which the effect of a shock depends on the rest of the economic environment—were difficult, if not impossible, to solve under rational expectations.

Thinking about macroeconomics was largely shaped by those assumptions. We in the field did think of the economy as roughly linear, constantly subject to different shocks, constantly fluctuating, but naturally returning to its steady state over time …

From the early 1980s on, most advanced economies experienced what has been dubbed the “Great Moderation,” a steady decrease in the variability of output and its major components—such as consumption and investment … Whatever caused the Great Moderation, for a quarter century the benign, linear view of fluctuations looked fine.

Olivier Blanchard

Blanchard’s piece is a confirmation of what I argued in my paper Capturing causality in economics and the limits of statistical inference — since “modern” macroeconom(etr)ics doesn’t content itself with only making “optimal” predictions, but also aspires to explain things in terms of causes and effects, macroeconomists and econometricians need loads of assumptions — and one of the more important of these is linearity.

So bear with me when I take the opportunity to elaborate a little more on why I — and Olivier Blanchard — find that assumption of such paramount importance that it ought to be much more carefully argued for — on both epistemological and ontological grounds — if it is to be used at all.

Limiting model assumptions in economic science always have to be closely examined, since if we are going to be able to show that the mechanisms or causes that we isolate and handle in our models are stable in the sense that they do not change when we “export” them to our “target systems”, we have to be able to show that they do not only hold under ceteris paribus conditions and a fortiori only are of limited value to our understanding, explanations or predictions of real economic systems. As the always eminently quotable Keynes wrote (emphasis added) in Treatise on Probability (1921):

The kind of fundamental assumption about the character of material laws, on which scientists appear commonly to act, seems to me to be [that] the system of the material universe must consist of bodies … such that each of them exercises its own separate, independent, and invariable effect, a change of the total state being compounded of a number of separate changes each of which is solely due to a separate portion of the preceding state … Yet there might well be quite different laws for wholes of different degrees of complexity, and laws of connection between complexes which could not be stated in terms of laws connecting individual parts … If different wholes were subject to different laws qua wholes and not simply on account of and in proportion to the differences of their parts, knowledge of a part could not lead, it would seem, even to presumptive or probable knowledge as to its association with other parts … These considerations do not show us a way by which we can justify induction … [p. 427] No one supposes that a good induction can be arrived at merely by counting cases. The business of strengthening the argument chiefly consists in determining whether the alleged association is stable, when accompanying conditions are varied … [p. 468] In my judgment, the practical usefulness of those modes of inference … on which the boasted knowledge of modern science depends, can only exist … if the universe of phenomena does in fact present those peculiar characteristics of atomism and limited variety which appears more and more clearly as the ultimate result to which material science is tending.

Econometrics may be an informative tool for research. But if its practitioners do not investigate and make an effort of providing a justification for the credibility of the assumptions on which they erect their building, it will not fulfill its tasks. There is a gap between its aspirations and its accomplishments, and without more supportive evidence to substantiate its claims, critics will continue to consider its ultimate argument as a mixture of rather unhelpful metaphors and metaphysics. Maintaining that economics is a science in the “true knowledge” business, yours truly remains a skeptic of the pretences and aspirations of econometrics. So far, I cannot really see that it has yielded very much in terms of relevant, interesting economic knowledge.

The marginal return on its ever higher technical sophistication in no way makes up for the lack of serious under-labouring of its deeper philosophical and methodological foundations that Keynes already complained about. The rather one-sided emphasis on usefulness and its concomitant instrumentalist justification cannot hide that neither Haavelmo, nor the legions of probabilistic econometricians following in his footsteps, give supportive evidence for their considering it “fruitful to believe” in the possibility of treating unique economic data as the observable results of random drawings from an imaginary sampling of an imaginary population. After having analyzed some of its ontological and epistemological foundations, I cannot but conclude that econometrics on the whole has not delivered “truth”. And I doubt if it has ever been the intention of its main protagonists.

Our admiration for technical virtuosity should not blind us to the fact that we have to have a cautious attitude towards probabilistic inferences in economic contexts. Science should help us penetrate to “the true process of causation lying behind current events” and disclose “the causal forces behind the apparent facts” [Keynes 1971-89 vol XVII:427]. We should look out for causal relations, but econometrics can never be more than a starting point in that endeavour, since econometric (statistical) explanations are not explanations in terms of mechanisms, powers, capacities or causes. Firmly stuck in an empiricist tradition, econometrics is only concerned with the measurable aspects of reality. But there is always the possibility that there are other variables – of vital importance, and although perhaps unobservable and non-linear, not necessarily epistemologically inaccessible – that were not considered for the model. The variables that were considered can hence never be guaranteed to be more than potential, rather than real, causes. A rigorous application of econometric methods in economics really presupposes that the phenomena of our real world economies are ruled by stable causal relations between variables. A perusal of the leading econom(etr)ic journals shows that most econometricians still concentrate on fixed parameter models and that parameter values estimated in specific spatio-temporal contexts are presupposed to be exportable to totally different contexts. To warrant this assumption one, however, has to convincingly establish that the targeted acting causes are stable and invariant, so that they maintain their parametric status after the bridging. The endemic lack of predictive success of the econometric project indicates that this hope of finding fixed parameters is a hope for which there really is no other ground than hope itself.

Real world social systems are not governed by stable causal mechanisms or capacities. As Keynes wrote in his critique of econometrics and inferential statistics already in the 1920s (emphasis added):

The atomic hypothesis which has worked so splendidly in Physics breaks down in Psychics. We are faced at every turn with the problems of Organic Unity, of Discreteness, of Discontinuity – the whole is not equal to the sum of the parts, comparisons of quantity fails us, small changes produce large effects, the assumptions of a uniform and homogeneous continuum are not satisfied. Thus the results of Mathematical Psychics turn out to be derivative, not fundamental, indexes, not measurements, first approximations at the best; and fallible indexes, dubious approximations at that, with much doubt added as to what, if anything, they are indexes or approximations of.

The kinds of “laws” and relations that mainstream econ(ometr)ics has established are laws and relations about entities in models that presuppose causal mechanisms being atomistic and linear (additive). When causal mechanisms operate in real world social target systems they only do it in ever-changing and unstable combinations where the whole is more than a mechanical sum of parts. If economic regularities obtain, they do so (as a rule) only because we engineered them for that purpose. Outside man-made “nomological machines” they are rare, or even non-existent. Unfortunately, that also makes most of the achievements of econometrics — like most of contemporary endeavours of mainstream economic theoretical modeling — rather useless.
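The linearity assumption that both Blanchard and Keynes worry about can be illustrated with a deliberately simple toy model (my own, not drawn from either author): in a linear model doubling a shock exactly doubles its effect, while even a one-line threshold nonlinearity breaks that proportionality.

```python
def linear_response(shock):
    return 2.0 * shock  # effect always proportional to the shock

def threshold_response(shock, trigger=1.0):
    # Below the trigger the economy absorbs shocks proportionally;
    # beyond it (think a housing-price collapse) effects amplify.
    excess = max(0.0, abs(shock) - trigger)
    sign = 1.0 if shock >= 0 else -1.0
    return 2.0 * shock + 10.0 * excess * sign

for s in (0.5, 1.0, 2.0):
    print(s, linear_response(s), threshold_response(s))
# Doubling the shock from 1.0 to 2.0 doubles the linear effect
# (2.0 -> 4.0) but multiplies the nonlinear one sevenfold (2.0 -> 14.0).
```

Note that data generated in calm times (all shocks below the trigger) would make the two models observationally identical — Blanchard’s Great Moderation point in miniature.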

 

 
