Saturday, March 25, 2017

Asian-American representation in Hollywood

With the casting controversies over the live-action Ghost in the Shell movie and the Marvel Netflix series Iron Fist, the outcry over "whitewashing" of Asian characters in American entertainment has reached a fever pitch. So I thought I'd write a post about that.

Why care about whitewashing?

Why do I, who am not Asian, care about whitewashing? Well, there's a not-so-important reason and a very important reason. The not-so-important reason is that I have a lot of Asian-American friends, and it pisses me off to see movies depicting an America in which they don't seem to exist. But that's very unimportant compared to the real issue, which is racial integration. 

Most of America's immigration now comes from Asia, meaning that the nation's future will be greatly affected by how well we integrate Asian-Americans into American culture and society. Keeping Asian-Americans invisible will cause non-Asian Americans to keep seeing them as perpetual foreigners and outsiders, while denying them representation in the mass media will make Asian-Americans themselves feel disaffected and anti-nationalistic.

To see what I mean, watch this short film by Chewy May and Jes Tom. A lack of Asian-American heroes on the silver screen has made many Asian-Americans feel that their country doesn't really consider them normal, mainstream citizens. That's unacceptable. 

Why changing Hollywood will be hard

If it were easy for popular outcries to change Hollywood whitewashing, it would have happened already. There must be some deep reasons it hasn't yet worked. 

One reason is that Asian people, being only about 6.5% of the U.S. population, are a small part of the American movie-going public. If everyone demands to see characters of their own race on screen, then movies directed at American audiences will feature mostly white, Hispanic and black people. Even if this same-race preference is only a slight one, it's probably enough to make many risk-averse studio execs shy away from putting Asian people on screen.

But the American audience is not the only important one anymore. The Chinese box office is increasingly crucial for U.S. film studios, especially in the face of the ongoing U.S. shift toward home viewing. And Chinese audiences may even more strongly prefer to see white people on the screen. Chinese moviegoers, used to seeing Chinese people in film, might view Hollywood as a chance to see exotic-looking white heroes. The Chinese-made movie The Great Wall, starring Matt Damon, could be an indicator of this.

Finally, Hollywood studio execs, in addition to being a bunch of old racist white guys, might simply be stubborn and contrary. All the protesting and criticism may have just caused them to assert their control more strongly by doubling down on whitewashing. No one likes to be pushed around by angry bloggers and Twitter trolls if they can help it (believe me, as a blogger, I know). 

An alternative path

I've often criticized the Millennial generation (of which I'm technically a part, just barely) for relying too heavily on "appeals to liberal authority" as a way of bringing about change. Educated people of my generation and younger have grown up under more benevolent and more liberal institutions than anyone in America's past - public schools, universities, the Obama administration, the media, corporations trying to look good for the media, etc. When something about society is wrong, we instinctively appeal to authority for a redress of the injustice. We make demands on university administrations, Silicon Valley venture capitalists, big companies, Hollywood execs. And when there's no obvious power to appeal to, we call out injustices to society at large, imagining that there must be someone listening with the power to respond.

I'm not saying it's wrong or bad to complain about whitewashing on Twitter or The Verge or Kotaku. I just think this approach has serious practical limitations. The problem with appeals to liberal authority is that there won't always be a liberal authority to hear and respond. Often, the authority isn't as liberal as we would like to think. And often, authorities have less power than we implicitly assume. Yes, I realize this is a grumpy-old-man critique. But sometimes the grumpy old men are onto something.

Maybe there's a different way to end whitewashing and get Asian-American actors onto the screen. Maybe the answer is not to demand representation, but simply to seize it. Maybe the solution is for Asian-Americans, and also those non-Asian Americans who (like me) want to see more Asian-Americans on screen, to make and distribute movies themselves.

That sounds crazy, but it isn't actually crazy. Hear me out.

Hollywood is ripe for overthrow

The U.S. big-budget film industry is an industry in crisis. Ticket sales are in relentless decline. Revenues are up (which must be due to soaring ticket prices if sales are down), but profits are hurting. Hollywood has to spend more on marketing and expensive spectacle every year just to cajole an increasingly bored public to see its low-quality product. The studios have adopted an insanely risk-averse attitude, focusing almost entirely on sequels and remakes. Meanwhile, Americans are sensibly shifting to Netflix and Amazon and HBO streaming TV, because that's where all the quality is.

Meanwhile, it has never been cheaper to make a movie. I just bought a used camera for $1000. That camera, which can also shoot digital video, was one of the cameras used to film the IMAX movie Jerusalem, which won awards for its cinematography. One thousand dollars. And I bet if I had tried, I could have found the same model for cheaper. One of the top films at the 2015 Sundance Film Festival was shot on an iPhone.

Editing software is also cheap, and the price of high-quality computer graphics is falling relentlessly. This doesn't mean making a movie is cheap or easy, but it's a lot cheaper and easier than before. In 2014, the average independent film cost $750,000 to make. That's not peanuts, but for the price of one house in San Francisco you could make three indie films.

Moonlight, this year's Best Picture winner at the Oscars, was made for $1.5M and grossed $55M.

Get Out, by Jordan Peele, was made for $4.5M and has grossed over $140M so far.
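A quick back-of-envelope calculation, using only the figures cited above, shows just how eye-popping these margins are compared to the average indie budget:

```python
# Gross-to-budget multiples for the two indie hits mentioned above,
# plus how many average-budget indie films one San Francisco house buys.
# All figures in millions of dollars, as cited in the text.
films = {
    "Moonlight": {"budget": 1.5, "gross": 55},
    "Get Out": {"budget": 4.5, "gross": 140},
}

for name, f in films.items():
    multiple = f["gross"] / f["budget"]
    print(f"{name}: grossed roughly {multiple:.0f}x its budget")

avg_indie_budget = 0.75          # average indie film cost, 2014
sf_house_price = 3 * avg_indie_budget  # implied by "three indie films per house"
print(f"One ${sf_house_price}M house = "
      f"{sf_house_price / avg_indie_budget:.0f} average indie films")
```

Even rounding down, both films returned their budgets more than thirty times over, which is the kind of margin that gets risk capital's attention.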

As for distribution, this isn't nearly as big of a problem as you might think. With the rise of streaming, it's possible to create new video distribution channels (streaming services, or even entirely new business models people haven't thought of yet) much more easily than in decades past. Only a few can succeed, but those will succeed big.

Netflix and Amazon and Hulu are desperate for new content. TV is often a stepping stone to the movies, and is where all the quality is nowadays anyway.

And traditional channels for independent movies still exist - plenty of Hollywood directors and producers got their start from indie hits, and that will probably continue to be true. 

And capital costs in the United States (and much of the world) are near all-time lows. Bond rates are historically low, the stock market is at all-time highs, money is flowing out of China and looking for somewhere to go, and venture capitalists are pushing up the valuations of unicorns like Uber. More and more capital is chasing smaller and smaller returns. That doesn't mean capital is easy to get, but it means it's out there in large quantities. 

To sum up, we have just experienced technological revolutions in video production and distribution, at a time when capital costs are low and incumbents are vulnerable. It's time for some disruption.

Who will do it?

Obviously most people who care about Hollywood whitewashing have other careers to keep them occupied; most people aren't going to throw their other plans away and launch a quixotic quest to make movies with Asian leads. I'm not going to be a filmmaker, and most of you probably aren't either.

But a few of you might be. The entertainment industry is an exciting place to be right now. Here's a small anecdote just to illustrate. In the summer of 2009, just for fun, my friend Peter Chang and I went to make a documentary in Japan. We never finished it. But Peter realized how cheap indie filmmaking had become for someone technologically savvy and artistically gifted (both of which he is), and he went on to start his own film production company, Golden Gate 3D. He's now shooting movies in Cuba and Greenland, and about to launch more projects. He's commercially successful, and works on the bleeding edge of filmmaking technology (which is one reason he's successful). Peter's interest is in documentary rather than narrative film (at least for now), but if he can do this sort of thing in the documentary space, other people can do it in the narrative film space. 

Peter's operation is still pretty small, and I single him out because he's my friend and because I got to see his success up close. The people making the immediate changes would be bigger, more established folks. There's no lack of Asian-American filmmakers out there. Justin Lin and Joseph Kahn are out there doing awesome stuff. And there's a rising tide of Asian-American acting talent. What's needed is for some of these or other filmmakers to turn into big-time film producers, and for entrepreneurs to start innovative new production and distribution companies.

What I think American entertainment needs is a Pro-Diversity Mafia. The PDM would be a loose network of funders, entrepreneurs, content creators and industry workers who share creative ideas, technology, funding leads, networks, and resources. It would include Asian-Americans, members of other "invisible" groups, and others who are supportive of greater diversity and inclusion in visual media. There are many examples of this sort of "mafia" allowing marginalized groups to break into an industry. Don't be ashamed of doing this sort of thing; this is how capitalism works, not the idealized frictionless market of an economist's model. (In fact, this "mafia" would help not just Asians, but other marginalized groups break into the visual media world - Muslims, for example. Intersectionality!)

If this type of thing shows signs of being successful, of course, Hollywood is going to want a piece of the action. If Asian-American actors are starring in surprise indie-hit films made on shoestring budgets and demonstrating eye-popping margins, it won't be long before the big dumb studios come calling. But by then, Asian-Americans will be able to negotiate from a position of strength.

By then, though, pro-diversity filmmakers won't need Hollywood. With a disruptive business model in hand, the creators of the PDM could simply muscle in on Hollywood's territory, going upmarket into big-budget films and beating the tired, boring sequel-mongers at their own game, stealing eyeballs and dollars with new distribution channels. Many times in history, a tight-knit subculture of highly talented people frozen out by discrimination has created a hotbed of creativity that eventually took over the industry that once shut them out.

Really? All this, just to put Asian-Americans on screen??

No, of course not. To do all this just to put Asian-Americans and other underrepresented groups on screen would be overkill. What's also at stake is a potential revitalization of American visual media. Movies are going down the tubes. They need new blood, new geniuses, new perspectives, and new business models. They need to be revitalized creatively, in the way that Lucas, Spielberg, Coppola and others revitalized them in the 1970s and 1980s. And they need to be revitalized technologically, in terms of both production and distribution. The people who are bold enough to put Asian-American actors on screen will also be bold enough to experiment and improve movies and TV in a huge number of other ways.

And the payoff to whoever does this won't just be making the world a better place. There's a lot of money to be made here, both on the production and distribution side. The disruption of Hollywood's old-economy oligopoly is a revolution that is long overdue, for more reasons than just this. And young Asian-American and pro-diversity entrepreneurs and artists have the smarts and the creativity to make that revolution happen and grab that pot of gold.

And you know what? I might be wrong about all this. Maybe none of this is necessary to get Asian-Americans on screen. Maybe the outcries and the Twitter trolling will work, and we're on the verge of seeing Asian-American actors headline superhero movies and big-budget Hollywood adaptations. And if that does happen, great. But then we wouldn't get a film renaissance out of the deal.

Fight injustice and make money

Capitalism is about taking what you can get. Until some imagined future day when we all live under the protective wing of an immortal, invincible, benevolent liberal authority, capitalism will have to do. It isn't fair, but it isn't the ossified hierarchy of power and injustice that its critics make it out to be, either.

The lack of Asian-Americans on the silver screen isn't just an injustice - it's the sign of an overlooked business opportunity. It is money being left on the table. Someone needs to pick up that money. And when someone does, whitewashing will soon be relegated to the history books - possibly along with Hollywood itself.

Thursday, March 23, 2017

Facts or EconoFacts?

There's a new website out called EconoFact, and while I respect the people involved and I get what they're trying to do, I think they're currently going about it wrong. The site needs some major changes.

In the age of Trump, "alternative facts", and "fake news", there's an understandable desire to bring truth back to a post-truth world. The new website describes its mission thus:
EconoFact is a non-partisan publication designed to bring key facts and incisive analysis to the national debate on economic and social policies. It is written by leading academic economists from across the country who belong to the EconoFact Network... 
Our mission at EconoFact is to provide data, analysis and historical experience in a dispassionate manner...Our guiding ethos is a belief that well meaning people emphasizing different values can arrive at different policy conclusions. However, if in the debate we as a society can’t agree on the relevant facts, then the nation itself loses a common base for constructive debate and policy will suffer. 
EconoFact does not represent any partisan, personal or ideological point of view...Our network of economists might disagree with each other on policy recommendations, but all will similarly rely on widely agreed upon facts in their analysis.
The posts on the site are basically just short-form explainers, like Vox articles condensed into sound bites. Each has a short backgrounder section called "The Issue", followed by a bullet-point list called "The Facts" and a conclusion section called "What This Means". Confusingly, the main website displays a short form of each post, while clicking on a post will open a page with a longer form.

My main problem with the site, though, isn't aesthetic. It's the idea that the public will buy a "just the facts" approach. Many readers will suspect that what they're getting is not a simple recounting of incontrovertible facts, but a mix of received wisdom, theory, and carefully cloaked ideology. And they won't entirely be wrong about that.

As an example, let's take "Agricultural Implications of Mr. Trump's Policies", by Menzie Chinn. Menzie is one of my favorite bloggers, and this post of his is typically excellent. It's well-informed, clear, succinct, and backed up by evidence and theory. None of the things I'm about to complain about are his fault.

But forcing Menzie to label his conclusions "The Facts" forces him to oversell his own work. Here are some of the assertions that the EconoFact website forces Menzie to call "facts":
The fiscal policy of the Trump Administration will likely result in a stronger dollar and hence lower commodity prices. The combination of expansionary fiscal policies – tax cuts and spending increases –  and the Fed’s actions to stabilize output and inflation will result in higher interest rates and consequently a stronger dollar. 
This assertion relies heavily on theory. First, there's the assumption about the Fed's reaction function - what the Fed will do in response to fiscal policy. Second, there's the theory about how fiscal deficits affect interest rates.

It's fine to cite these theories, but they're not facts. The Fed might accommodate Trump's fiscal deficits, or it might tighten to head off potential inflation. We don't really know. Also, the idea that fiscal deficits raise interest rates, while present in a lot of macro models, doesn't have solid empirical support - just look at how long Japan has been running massive deficits without rising interest rates. Or look at what Obama's temporary but huge deficits did to lift U.S. interest rates (nada).

Corporate tax reform threatens to exacerbate the dollar appreciation. The House of Representatives is considering a tax on cash flow that would treat exports and imports differently, popularly known as the “border adjustment tax”. One likely consequence of passing such a tax is that the dollar would experience additional pressures for appreciation, on top of those arising from the configuration of macroeconomic policies. The appreciation would put further downward pressure on export prices.
Fair enough, this seems pretty clearly true to me. But are statements like "threatens to" and "likely consequence" really "facts"?

Net farm income is already being squeezed. The lower prices received for farm output arising from a stronger dollar will depress already declining farm incomes. The farm sector is a net debtor, so rising interest rates will raise the cost of borrowing for farmers. In addition, higher interest rates would likely drive down the already declining prices of farmland since the borrowing costs to mortgage these purchases would rise. In the past, the Federal government has provided income support by way of commodity price supports.  But the Administration’s budget outline limits this type of assistance in the future.
These are mostly predictions, not facts. What if increased export volume more than compensates for falling prices, and farmers' incomes go up? What if interest rates don't actually rise from government borrowing (as they have failed to rise so often, for so many countries, in recent decades)?

The more aggressive policies promised in the Administration’s New Trade Strategy threatens to provoke retaliation against U.S. exports, including agricultural commodities. The trade document states that the Administration will violate World Trade Organization (WTO) rules when it is in America’s economic interest. Trading partners will in such cases be permitted to retaliate against American exports, and agricultural goods are likely targets.
This is not just a prediction, it's a political prediction about what China's government will do. Economists don't really know what governments will do (otherwise what would be the use in giving advice to governments?). Maybe no one does. Menzie's assertion about China's response seems intuitively plausible, but intuitively plausible predictions are not facts.

Mass Deportation would negatively affect agricultural output. One estimate places the number of undocumented workers in the agricultural sector at 350,000, out of a total of 2 million workers in the sector. Complete removal of all undocumented workers would measurably reduce output particularly in the fruit, vegetable and dairy sectors.
Sure, sounds legit. But again, not a fact. What if the loss of cheap labor induces agribusinesses to invest in labor-saving automation, driving down costs over the long run and raising output? Isn't that a leading theory of why the Industrial Revolution happened in the first place?

Again, let me stress that Menzie's post is a very good post. I agree with it (except maybe for the "deficits raise interest rates" part). It's not his post I have a problem with, it's the site's labeling of theory-based assertions as "facts". When Tyler Cowen or Narayana Kocherlakota or I write these sorts of posts for Bloomberg View, they are rightfully called "op-eds".

Lay readers know when they're being talked down to. They know the difference between theory and fact. They have learned, through long decades of painful experience, to be extremely leery, even dismissive, of economists' pronouncements. There's a reason the profession's prestige has been falling in recent years. When many people read EconoFact, their skeptical defenses against economist overconfidence will immediately be triggered.

Economists need to understand this, and to change their approach. Some economists seem to think that lay readers are either so dumb or so ignorant that they just need to be spoonfed the "facts" by smart, well-informed EconoSages. This is why some economists keep making condescendingly simple arguments for free trade, while discussing modern theories and empirical evidence only behind closed doors. But as a publicizer and explainer of economics, I have found that lay readers tend to be very smart, and that many of them can grasp most empirical results or theoretical arguments relatively quickly. Econ has a lot less esoteric, hidden knowledge than its practitioners like to think.

Therefore, the publishers of EconoFact should change the site's format. Just make it a regular econ blog. Stop throwing around the word "fact" unless what you're dishing out are actually facts. Ditch the humblebrag mission statement about "just the facts", and say the truth, which is that you're trying to bring econ research and ideas to the public debate. And encourage the contributors to put in appropriate caveats and qualifications. Otherwise, you'll probably just drive people away from the idea that economists know any facts at all, and exacerbate the problem of the post-truth world.

Saturday, March 18, 2017

Thoughts on Will Wilkinson's post on cities

Will Wilkinson, one of the greatest essayists working today, has a wonderful article in the Washington Post about two competing visions of America - one cosmopolitan and polyracial, the other exclusive and insular. Here are some great excerpts:
[Trump] connected with these voters by tracing their economic decline and their fading cultural cachet to the same cause: traitorous “coastal elites” who sold their jobs to the Chinese while allowing America’s cities to become dystopian Babels, rife with dark-skinned danger — Mexican rapists, Muslim terrorists, “inner cities” plagued by black violence. He intimated that the chaos would spread to their exurbs and hamlets if he wasn’t elected to stop it... 
To advance his administration’s agenda, with its protectionism and cultural nationalism, Trump needs to spread the notion that the polyglot metropolis is a dangerous failure... 
When Trump connects immigration to Mexican cartel crime, he’s putting a menacing foreign face on white anxiety about the country’s shifting demographic profile... 
Suppose you think the United States — maybe even all Western civilization — will fall if the U.S. population ever becomes as diverse as Denver’s. You are going to want to reduce the foreign-born population as quickly as possible, and by any means necessary. You’ll deport the deportable with brutal alacrity, squeeze legal immigration to a trickle, bar those with “incompatible” religions. 
But to prop up political demand for this sort of ethnic-cleansing program — what else can you call it? — it’s crucial to get enough of the public to believe that America’s diversity is a dangerous mistake. 
I think this is all pretty much true. Though this might be Bannon's strategy - or an accidental strategy - more than Trump's explicit idea; Trump himself probably mostly just knows things he remembers from the 1980s, when America's big cities really were mostly failing, crime-ridden places.

But though I agree with Will's overall message - and his call for an inclusive definition of American-ness - I think he glosses over a few important things.

First, it's not really cities that are doing well, but certain kinds of cities, suburbs, and towns. It's really the places with high levels of human capital. To understand the real pattern, read Enrico Moretti's The New Geography of Jobs. The engineer-heavy suburbs of Fremont or Milpitas are doing great, as are college towns like Ann Arbor and Gainesville. Meanwhile, big cities like Baltimore and St. Louis are still stagnating and crime-ridden, while others such as Detroit and Cleveland have only just now started climbing up out of their Rust Belt doldrums. It's not city vs. country, it's innovation hubs vs. old-economy legacy towns.

Also, Will depicts cities as diverse, tolerant places. That's true in some ways - you're not going to become a tech hub without a bunch of engineers from India and China, and people who live in cities do tend to develop more cosmopolitan attitudes. But in some important ways the picture is wrong. Many American cities remain extremely segregated, especially between black residents and others. Chicago is a thriving, diverse, fun, relatively safe metropolis - unless you go to the poor black areas, in which case you're in "Chiraq". New York is a pretty great place to live whether you're a poor person in the Bronx or a rich person on the Upper East Side; Chicago is a totally different experience depending on whether you're a white, Asian, or Hispanic person living in the north, or a black person living in the south of town.

By Nate Silver's measure, the most segregated cities in America include places like Chicago, Milwaukee, Philadelphia, St. Louis, Baltimore, and Cleveland. Those are precisely the places that are having the most difficulty adapting to the new, innovation-based economy. And those tend to be the places where crime rates have rebounded to their early 1990s highs, or never really fell in the first place.

So I'd focus less on the urban-suburban-rural distinction, and more on the division between new economy and old.

But anyway, I really like Will's message at the end of his post:
Honduran cooks in Chicago, Iranian engineers in Seattle, Chinese cardiologists in Atlanta, their children and grandchildren, all of them, are bedrock members of the American community. There is no “us” that excludes them. There is no American national identity apart from the dynamic hybrid culture we have always been creating together. America’s big cities accept this and grow healthier and more productive by the day, while the rest of the country does not accept this, and struggles. 
In a multicultural country like ours, an inclusive national identity makes solidarity possible. An exclusive, nostalgic national identity acts like a cancer in the body politic, eating away at the bonds of affinity and cooperation that hold our interests together.
That's exactly the message we need to be repeating. It's the only thing that can hold this country together. Either America succeeds as a polyracial nation, or it doesn't succeed at all.

Thursday, March 16, 2017

Beware of "thinking like an economist"

The idea of "thinking like an economist" is, in principle, a good one. It often helps to think of the world in terms of incentive compatibility, constrained optimization, supply and demand, competition vs. barriers to entry, strategic interactions, present value, marginal vs. average effects, externalities, etc. These are all things that economists think about. And there are empirical techniques that mostly aren't specific to econ, but which empirical economists use a lot, that are also good to think about - endogeneity, omitted variables, conditional vs. unconditional probabilities, signal extraction, and so on. And it's generally cool and useful to be able to think like an applied mathematician in general - to be able to construct formal, quantitative models to represent ideas you have about the way the world works. So there are many ways that it can be useful, fun, and mind-expanding to learn to think like an economist. 

And there really are some books out there that will help you do this. Tim Harford's The Undercover Economist is my favorite of these, but there are other good ones as well. In fact, I'd like to see more of these books, dealing with more sophisticated and modern econ concepts - I think much of the general public can handle it. 

But this good stuff isn't necessarily what people mean when they say "thinking like an economist." The aura of mystique and esoteric wisdom that the econ profession was (unjustly) awarded from the 1980s through the early 2000s has, unfortunately, allowed a bunch of people to pass off lazy, sloppy thinking and pure political ideology as "thinking like an economist." 

For example, take Tom Sargent's 2007 graduation speech at Berkeley. Sargent gives what he claims is a "summary of economics", but it's mostly just a list of potential reasons to distrust government intervention in the economy. This was just free-market ideology masquerading as econ, since econ research hasn't given conclusive support to most of the assertions Sargent makes. In his book Economics Rules, Dani Rodrik backs me up on this point.

As another example, take this 2003 talk by Penn State economist Russell Cooper. Here are Cooper's six basic "principles of economics": 
  1. Individuals (including households and firms) act optimally
  2. Competition works
  3. Measurement matters
  4. No free lunches
  5. Government intervention with caution
  6. Correlation is not causality
Points 1, 2, and 4 aren't even right. Individuals don't always act optimally, else there would not be a field called behavioral economics. And Cooper's justifications for assuming individuals act optimally - for example, his assertion that those who act suboptimally will be driven out of the market - are well known to be false. Competition doesn't always work, else there would not be a field called industrial organization. And "free lunches" obviously do exist in many cases; they're called Pareto improvements. As for point 5, it's just the same ideology Sargent was dishing out. 

There are plenty of examples of people, including some economists themselves, trying to pass off free-market ideology as economic intuition. Other times, people say "thinking like an economist" when what they really mean is "ignoring social norms in public discussions". 

But I think the biggest danger of the idea of "thinking like an economist" is that it promotes the idea of economists (or people trained in econ programs) as sages or gurus possessing esoteric wisdom. Smart, sensible people who are perfectly capable of thinking about incentives, constraints, externalities, strategic interactions, etc. are often at a loss when some economist (or, more often, some econ writer or think-tanker) blithely contravenes them, assuring them that if they were truly able to "think like an economist," they would see the error of their ways, but failing to explain exactly what that error actually is. 

This is what I worry about when I see Russ Roberts write something like this:
But an economist when considering a policy of banning autonomous vehicles can think of a lot of other impacts besides the jobs saved and the continuing deaths from human driven cars if such a ban is put in place. One of the things we would think about is how such a ban will [affect] the incentives to discover future innovation that might also [put] people out of work. We would think about how putting more power in Washington would encourage lobbying for protection. We would think about the children and grandchildren of today’s workers and how restricting technology and changing incentives would affect things. These ideas are not rocket science. But they come easily to economists and not so easily to non-economists. Thinking like an economist is very useful.
I'm certainly not in favor of banning autonomous vehicles! But I fail to see why the issues and questions Russ raises wouldn't be accessible and even obvious to a non-economist. Does it take an economist to worry about technology bans reducing the incentive for innovation? No. Does it take an economist to worry about the power of lobbyists? No. 

Suppose someone supported a ban on self-driving cars, because they believed the disincentives for innovation and the incentives for lobbying wouldn't be too severe. Should they change their mind just because someone who claims to be able to "think like an economist" (but who presents no formal model or empirical study) says otherwise? I say no. That is placing way too much faith in esoteric wisdom. If an expert or guru can't point to some research that supports his pronouncements, you shouldn't trust that his brain just works better than yours. 

So while "thinking like an economist" is in principle a good thing, beware of people who claim to know how to do it. There's really nothing magical or esoteric about it. And if you think learning to do it means indoctrinating yourself with free-market ideology, you've been conned.

Wednesday, March 15, 2017

Misuses of empirical econ

Tyler Cowen has a new post in the ongoing blog discussion about the value of empirical economics. Most of Tyler's post is about the potential misuse of empirical economics. He writes:
The political process does not select for humble versions of empiricism.  Those end up with virtually no political influence, whereas some of the more dogmatic form of empiricism may find some traction.
Absolutely true. Ideologues just pick and choose results that support them. If 100 studies show the impact of immigration on labor markets is small, and one George Borjas study says it's large, the anti-immigrant people will wave around the Borjas study. (And then the anti-empiricists will say "See? No one can really know who's right!")

A lot of the bias in empirical methods comes simply from which questions are asked/answered. Post Trump and De Vos, I see plenty of commentators and researchers reporting “vouchers don’t raise test scores” and virtually no “vouchers increase parental satisfaction.” Is that empiricism? In isolation, maybe. In terms of reflecting the broader spirit of science, not so much. It is also not humility.
This is certainly true, though I'm not sure that focusing on certain questions and forgetting about others signals a lack of humility. Groupthink, maybe, but not arrogance.

I find a very common pattern among both researchers and commentators.  They first form...judgments about social systems, based on overall views of history, current politics (too much)...They then view very particular empirical debates through the broader lenses they have chosen.  For instance, views on politics used to correlate with views on the interest elasticity of money demand.  Today views on politics correlate with views on minimum wage elasticity, and so on.
Yep, this is certainly going on all over the place. There's evidence of ideologically motivated reasoning in econ research, though I don't think the evidence shows very much of it. The problem is almost certainly worse for us commentators, and worst of all for politicians. In general, the less people know about how econ research works, the more they seem to pick and choose results based on which ideology those results seem to support. Stopping people from doing this is a Sisyphean task for those of us commentators who are dedicated to a more dispassionate analysis of the facts - and even we aren't always the good guys.

BUT, consider the other approaches to understanding the world. There's formal theory. There's intuition formed from exposure to theory ("thinking like an economist"). And there's intuition formed from exposure to casual observation and stylized facts (which Tyler calls "relatively general empirical judgments").

I'd argue that all of these are equally or more likely to be misused by commentators, ideologues, and politicians in exactly the same way Tyler describes empirical results being misused. People who want to justify fiscal austerity will wave around DSGE models that support fiscal austerity. People who want to kill the minimum wage will use theoretical intuition to claim that the minimum wage hurts employment ("Demand curves slope down, DUH!", etc.). People who want industrial policy will use stylized facts - which Tyler calls "broad empiricism" - to point out that development successes like Korea and Japan usually have a lot of industrial policy in their past. And so forth.

The point is: Ideologues gonna ideologue. No economic analysis method will completely or even mostly put a stop to motivated reasoning. The hope for empirical economics is that a weak signal of reality is better, over the long term, than no signal at all. Over time, the hope is that the solid studies push out the shaky ones, that economists' natural rationality and desire to know the truth inches the academic consensus toward a better correspondence with the facts.

I'm optimistic - I think the empirical revolution in econ is no fad. Yes, there will be setbacks, as hot new techniques are over-applied, and as prominent results fail to replicate. But even better techniques will be developed, meta-analyses will weed out spurious results, and armies of smart, fair-minded grad students will comb through methodology sections and data sets to separate the wheat from the chaff. Eventually, theory will follow the weak but insistent tug of evidence - theories that don't fit the facts so well will gradually fall into disuse, while those that explain the most solid results will inch into the limelight. Classes will teach these more popular theories, and the next generation of econ majors will learn intuition that corresponds just a little more closely with observable fact. Eventually, people who rely on the "broad empiricism" of casual observation will start noticing the stylized facts that agree with their new, better theoretical intuition. Thus, progress will crawl forward, bit by bit, never reaching truth, but always headed in more or less the right direction.

Monday, March 13, 2017

White supremacism is not nationalism

If you spend any time at all talking to rightist immigration restrictionist types on Twitter or elsewhere, you'll notice that they've taken to calling themselves "nationalists." They contrast this with "globalism", which they associate with rootless cosmopolitans pushing open-borders policies on countries to which they have no allegiance. Lots of people on the left take these folks at their word - after all, weren't the Nazis nationalists? Didn't nationalism cause WW2? Etc.

But I've always been suspicious of the "nationalist" label. American rightists have always seemed to me like part of an international, borderless white supremacist movement - a sort of global white-ist Ummah. They always seem to have much more allegiance to their co-racialists in other countries than they do to their own non-white countrymen. 

For members of a movement that purports to focus on putting American interests first, American nationalists seem to spend an awful lot of time obsessing about Europe. 
Europe's birthrates are too low and Europe has Muslim minorities that it's not integrating well, Republican Rep. Steve King of Iowa complains, by way of defending his tweet that said, "We can't restore our civilization with somebody else's babies."
Josh thinks that the nativist right's obsession with Europe is a rhetorical tactic. He thinks that immigration works so well in the U.S. that the only way restrictionists can avoid saying the truth - that they just don't like nonwhite people - is to point at Europe, which is far worse at handling immigration. That's probably true. 

But I think it's more than that. I think America's white-nationalists feel a natural kinship with Europe. They always speak not of American civilization, but of Western civilization. They erupt in outrage over stories of white people (supposedly) victimized by nonwhites in far-off countries, while expressing little or no outrage when nonwhite American citizens are attacked. They are just as likely to complain about immigration to the UK or Germany or Sweden as to the U.S. Pepe the Frog, the Celtic Cross, and old European paintings are replacing the American flag as online markers of rightist identity.

Rightists often claim that American Muslims won't be loyal to the United States, but to a global Muslim community. To me, this clearly seems like a case of psychological projection. White supremacists see themselves as part of an international borderless racial community first and foremost, rather than citizens of a nation-state, so they naturally imagine that everyone else sees themselves the same way.

That's a generalization. I'm sure some of our white-nationalists really are just "restrictive nationalists" - the type of people who feel real American pride, but who also view whiteness as part of the essential definition of American-ness. I bet Steve King and many older people are like that. 

But I also think that the internet is breaking boundaries between national cultures, and forging trans-national loyalties. Twitter, Reddit, and forums like 4chan put European and American rightists in contact every day. Go on 4chan and check out the country flags on white supremacist posts - you'll notice that more than half are from outside of America. That constant contact with international fellow-travelers tends to erode national and local allegiances and create borderless identity groups defined by race, religion, and ideology. This was what happened with ISIS, al-Qaeda, and other global Islamist movements, and I think it's now happening with white-ist movements in Europe, America, and the Anglosphere countries.  

This is ironic; white supremacists embody the same thing they claim to be afraid of - an erosion of national loyalty. It's also scary, because it means that technology has fundamentally changed the game of politics in ways that few anticipated. When people spend most of their time online instead of engaged in their local communities, they naturally lose allegiance to the people near them and gain allegiance to the people on their screens. Whereas in past centuries comrades-in-arms built loyalty fighting with guns and tanks for borders and land, camaraderie is now forged in meme wars and flame wars, among comrades sitting in front of screens thousands of miles apart. 

That worries me. Nation-states might have fought each other in wars, but they were incredibly effective in the 20th century in terms of providing public goods, improving social justice, and giving people a feeling of togetherness and commonality. International racial and religious movements will almost certainly be much worse at the first two of those tasks. A world defined not by borders but by online identity groups will be a deeply dysfunctional world, I predict. Whatever the sins of nationalism, I think history shows that militant trans-national movements are far more dangerous - they also commit mass violence, but they fail to provide the public goods and institutions that make life good in peacetime. 

Saturday, March 11, 2017

Book review: Phishing for Phools

I finally got around to reading Akerlof and Shiller's latest book, Phishing for Phools. In fact, I saw Akerlof give a talk by that name at an INET conference back in 2011. The book was basically a longer version of that talk.

Phishing for Phools has one big, important idea: that deception is fundamental to market economies - that it's not just the result of certain models or certain situations, but is always present to some degree. The reason is that companies are always looking for new ways to trick people out of their money - new information asymmetries, bounded rationality, suboptimal behavior patterns, and legal loopholes to exploit. Given that companies are always looking for these, there will always be some degree of trickery and mistakes in any market. This is just an obvious result of costly monitoring (though I'm not sure Akerlof & Shiller actually mention that explicitly). They call this a "phishing equilibrium".

I wish the book had spent a lot more time on this idea. Obviously not all of our economic interactions consist of trickery and mistakes (right?). What affects how much or little "phishing" an equilibrium entails? How can we detect which of the different kinds of phishing ("informational phishing", "behavioral phishing", etc.) is going on? What are the principles of market design that minimize phishing? How can one detect, empirically, how much phishing is present? How can individuals learn to better avoid phishing?

Sadly, the book mostly eschews these questions and spends almost all of its pages telling anecdotes about situations that the authors think involved lots of phishing. The 2008 financial crisis, the Vioxx recall, political lobbying, advertising, and so forth. This is sort of interesting, but a lot of these are stories we've heard before. And they usually don't cite much research verifying that phishing was indeed central to these screw-ups.

That's a shame. That research often exists. I've reported on it. But instead, Akerlof & Shiller sort of assume that the stories speak for themselves - that any reasonable person listening to these tales would instantly realize that phishing was the only explanation in each case. I wish the authors had seen fit to include a little more proof and a little less rehashing of well-known events.

Another problem I had with the book was the tone. It's full of the stilted phrasing of academic econ papers - "In the following chapter, we will show", and so forth. I think that's just bad practice for a pop book - the general public is probably turned off by that lingo, and economist readers don't really need it. Phishing for Phools was not nearly as bad in this regard as The Assumptions Economists Make (shudder). But if you're an economist writing a popular book, just go ahead and ditch the "we will show"s and the "it has been demonstrated that"s.

And what on Earth told Akerlof and Shiller that calling people "phools" was a good idea? No one's phooled by that "ph", you know (especially not in the audio version). You're calling people fools. You're saying that they're too irrational and/or dumb to make it in a market economy. Even if that's true, isn't there some nicer way to say that? I mean, presumably Akerlof and Shiller would like people to support government policies that curb "phishing". But if people think they're admitting stupidity by calling for those policies, I bet they'll often oppose the policies just to save face.

Anyway, to sum up, I think Phishing for Phools is an interesting book that injects one very interesting, very important idea into the economic discussion. But I think that the choices of topic and tone lessen the effectiveness of the message. We need both more academic econ papers and catchier pop books about this topic.

Friday, March 10, 2017

Anti-empiricism is not humility

"Faith is the substance of things hoped for, the evidence of things not seen."
- Hebrews 11:1

Empirical economics is taking over the profession. It's very hard to make it in the field these days without doing a hefty amount of empirical work. Lots of job market papers are still theory papers (cough! signaling! cough!), but the number of economists who can make it as pure theorists is shrinking to a rarefied, brilliant sliver.

I see that as a very good thing. That's what natural science looks like - a small number of theory papers, supported by a very large base of applied theory and empirical work. It's the sign of a mature field. I also think it's going to be very important in helping the economics profession recapture some of the public respect that it's lost over the last decade. When people start to see economists as fact-driven scientists grounded in observable reality, rather than mathematical philosophers dispensing Olympian received wisdom, the profession will lose much of its accumulated stigma. And almost every young economist I talk to thinks the same - everyone's excited about new data sources. The kids these days seem to want to know facts about the world, instead of just "organizing their thinking" with models. The future looks bright.

But not everyone is on board. A few older folks, who grew up during econ's Age of Theory, are not so happy about the change. One of these is Russ Roberts, host of the excellent podcast EconTalk. In a recent blog post, Russ explains at length why he thinks the new empirical economics is overrated:
A lot of professional economists...will tell you how many jobs will be lost because of an increase in the minimum wage or that an increase in the minimum wage will create jobs. They will tell you how many jobs have been lost because of increased trade with China and the amount that wages fell for workers with a particular level of education because of that trade... 
[T]here is no simple way to resolve differences in analysis done by professional economists...[T]here is no way of knowing reliably if the consensus reflects the truth...Most economics claims are really not verifiable or replicable...
I am arguing that the math and science of economic predictions and assessments are nothing like the math and science of space travel. Economics provides the illusion of science, the veneer of mathematical certainty...
He even goes further, and says that empirical economics isn't even really economics at all:
[M]ost of the people I am talking about are not economists. They are really applied statisticians. Economics is primarily a way of organizing one’s thinking in considering incentives and costs and the interactions between individuals that we call a market but is really emergent behavior with feedback loops.
Adam Ozimek has a patient and reasonable response to Russ, noting that even when empirical economics doesn't settle questions definitively or provide reliable point estimates, it narrows the scope of debate and rules out obvious wrong answers. That's certainly true. But I want to go further than Adam. The alternative to empiricism in economics is not agnostic humility, but intuitionism - the idea that we can know about the world by thinking about how it works, and that exposure to evidence will only pollute the truths that we divine from our own minds. And that's something I think economists need to avoid.

Consider the minimum wage issue. Suppose that a city like Seattle is considering hiking the minimum wage. How can we - economists, policymakers, and the general public - predict what the effect of the hike will be?

One approach would be to use theory. Basic Econ 101 labor supply-and-demand theory tells us that the effect will depend on the elasticities of labor supply and demand, which have to be estimated empirically. An economic geography theory might predict that the effect will be overcome by the strength of agglomeration effects, and therefore small. A search theory might predict that search frictions will preclude any sort of large short-term effect in labor markets.

How about stylized facts? Russ says that stylized facts are the only things that economists can really "know":
It is useful to know that 40% of the American work force was in agriculture in 1900 and now the number is 2%. It is useful to understand that that transition (which was much faster in the first half of the 20th century than the last half) did not lead to mass unemployment and starvation. There are indeed roughly 5 million fewer manufacturing jobs today than in 2000.
OK. So what do the stylized facts tell us about the minimum wage? Well, they tell us that places that raise the minimum wage don't tend to lose jobs. Look throughout American history. You won't find any cases where there was a big minimum wage hike and the unemployment rate soared. If we rely on stylized facts rather than careful controls and natural experiments, we'd conclude, as minimum wage proponents do, that the minimum wage isn't dangerous.

A third option is to rely on the kind of empirical studies Russ pooh-poohs. Most empirical studies say the short-term impact of the minimum wage on employment is small. 

A fourth option is to rely on casual intuition - not really theory, but a sort of general gestalt idea about how the world works. If we're of a free-market sort of persuasion, our casual intuition would tell us that minimum wage is government interference in the economy, and that this is bound to turn out badly. Russ seems to be advocating for this when he writes that "economics is primarily a way of organizing one’s thinking in considering incentives and costs." "Organized thinking" seems like just another term for intuition. 

As I see it, the fourth option is by far the worst of the bunch. Theories can be wrong, stylized facts can be illusions, and empirical studies can lack external validity. But where does casual intuition even come from? It comes from a mix of half-remembered theory, half-remembered stylized facts, received wisdom, personal anecdotal experience, and political ideology. In other words, it's a combination of A) low-quality, adulterated versions of the other approaches, and B) motivated reasoning. 

If we care about accurate predictions, motivated reasoning is our enemy. And why use low-quality, adulterated versions of theory and empirics when you can use the real things?

As I see it, a rational predictor should use a combination of theory and empirics. But theory should also be informed by data - there are lots of theories, and in general they can't all apply to the same situation, so you need evidence to tell you which one(s) to use. So a rational predictor's predictions should always be tied as closely as possible to empirical evidence. Discounting empirical evidence, as Russ does, seems inevitably to lead to the use of casual intuition (or to even worse things, like pure ideology).

Anyway, just in case you were curious, Seattle went ahead and hiked the minimum wage, and whether you measure by stylized facts or carefully controlled empirical studies, any negative effect on employment was small or zero. Of course, if you want, you can say that the empirical studies weren't controlled well enough, and the stylized facts are illusions, and the minimum wage hike must have hurt employment because government intervention always hurts employment la la la I can't hear you, but if you say that, who's going to respect you intellectually?

Now I want to turn to a second claim: the idea that discounting evidence represents "humility". Russ writes:
We economists should be more humble and honest about the reliability and precision of statistical analysis.
John Cochrane, in a blog post praising Russ' post as an exercise in "economic humility", writes:
[L]et's call [Russ' attitude] Hayekian humility. This is the hardest one for so many economists to admit, as we all like to play central planner.
This seems to be a bit of a change from when John wrote that "the stars in their 30s are scraping data off the internet." Or when he himself got famous and respected partly for doing careful empirical studies of asset prices. But anyway.

We'd all like economists to be more humble, right? Sure, count me in. But discounting empirical evidence in favor of "organized thinking" is probably not what most people have in mind when they call for economists to be more humble. 

Which is more humble: To try as hard as you can to assess the facts? Or to throw up your hands and say we'll never know the facts for sure, so we should rely on our own intuition about what people's incentives are? 

Russ writes:
[A]n economist when considering a policy of banning autonomous vehicles...would think such a ban will effect the incentives to discover future innovation that might also people out of work. We would think about how putting more power in Washington would encourage lobbying for protection...These ideas are not rocket science. But they come easily to economists and not so easily to non-economists. Thinking like an economist is very useful.
Does that sound humble to you? To me it sounds like the exact opposite of humility. To say that an economist has special insight into simple ideas sounds to me like the opposite of humility. To say that an economist's intuition can yield an understanding of the incentives governing innovation, or the effect of lobbying, and that checking this intuition against the facts would only pollute the truth it yields, sounds to me like the opposite of humility.

Anyway, one final point. Russ cites an empirical disagreement between David Autor and Jonathan Rothwell over the impact of trade on jobs. He writes:
Is Rothwell correct? I have no idea. Here is what I do know. There is likely to be no way of knowing which view is correct with anything close to reliability or certainty.
No! No, Russ, you do not know that there is no way of knowing who's right. How could you possibly know that it's impossible to know something?? You can't prove a negative! This is the argument-from-ignorance fallacy. Just because a matter isn't settled doesn't mean it can't be settled.

But even worse than argument-from-ignorance would be an argument-from-personal-ignorance. It doesn't sound to me like Russ has tried very hard to determine the particulars of the Autor-Rothwell dispute. It doesn't sound like he has read the papers closely, studied and understood the statistical methodology, or done anything beyond observing that the two researchers disagree. I don't want to put words in Russ' mouth here, but "two people disagree, so there must be no way to tell who's right" is pretty anti-rational. 

Imagine if two researchers did experiments to determine the mass of the electron. The first researcher says the mass is 9.1e-31 kg, and the second says it's 4.6e-31 kg. After hearing these two conflicting results, do you say "Here is what I do know. There is likely no way of knowing the mass of the electron with anything close to reliability or certainty."???

No. That is not what you say. Not if you're rational, at any rate. If you're rational, you might say "Let me take a look at these two experiments and see if one of them got something wrong." Or you might say "I'm going to wait until scientists figure out which one of these two experimenters got something wrong, and defer judgment until then." Or you might even say "I trust one of these labs, since they have a great track record, so I'll tentatively favor their result until more evidence comes out." But what you would not say is "Huh, it must be impossible for physicists to determine the mass of the electron."
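To make the rational response concrete, here's a toy numerical sketch. The error bars are invented for illustration (the thought experiment above gives only the central values): the sensible move is to quantify the tension between the two results, which tells you that one experiment must be wrong, rather than concluding the quantity is unknowable.

```python
# Illustrative sketch with invented error bars: when two measurements of
# the same quantity disagree, quantify the disagreement instead of
# declaring the quantity unknowable.
m1, s1 = 9.1e-31, 0.2e-31   # result and (hypothetical) 1-sigma error, lab 1
m2, s2 = 4.6e-31, 0.3e-31   # result and (hypothetical) 1-sigma error, lab 2

# Tension between the two results, in units of their combined error:
tension_sigma = abs(m1 - m2) / (s1**2 + s2**2) ** 0.5

# If the results were compatible, a precision-weighted average would be
# the way to combine them; a tension this large instead says that one of
# the experiments made a mistake somewhere.
w1, w2 = 1 / s1**2, 1 / s2**2
combined = (w1 * m1 + w2 * m2) / (w1 + w2)

print(f"tension between labs: {tension_sigma:.1f} sigma")
```

With these made-up error bars the two labs disagree by more than ten sigma, which is exactly the signal to go look for the error, not to give up on measuring electrons.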

So I believe economists can do a lot better than Russ seems to think. They can do better than relying on intuition and throwing up their hands at any empirical disagreement. And by and large, they are doing better. Let's hope that trend continues, and doesn't regress.

Tuesday, February 28, 2017

Can't we all just get along?, Econometrics edition

Some academic fights I understand, like the argument over whether to use sticky prices in DSGE models. Others I have trouble comprehending. One of these is the fight between champions of structural and quasi-experimental econometrics. Angrist and Pischke, the champions of the quasi-experimental approach, waste few opportunities to diss structural work, and the structural folks often fire back. What I don't get is: Why not just do both?

Each approach has its own strengths and unavoidable weaknesses. Francis Diebold explains these in a nerdy way in a recent blog post. I tried to explain them in a non-nerdy Bloomberg View post a year ago.

The strength of the structural approach, relative to the quasi-experimental approach, is that you can make much bigger, bolder predictions. With the quasi-experimental approach, you typically have a linear model, and you estimate the slope of that line around a single point in the space of observables. As we all remember from high school calculus, we can always do that as long as the function is differentiable at that point.

But as you get farther from that point, extrapolation of the curve becomes less accurate. The curve curves. And just knowing the slope of that tangent line at that one point won't tell you how quickly your linear approximation becomes useless as you move away from that point. 

So this means that quasi-experimental methods have limited utility, but we can't really know how limited. Suppose we found out that minimum wage has a very small effect on jobs when you go from $4.25 to $5.05. How much does that tell us about how bad a $7.50 minimum wage would be? Or a $12.75 minimum wage? In fact, if all we have is a quasi-experimental study, we don't actually know how much it tells us. 

Quasi-experimental results come with basically no guide to their own external validity. You have to be Bayesian in order to apply them outside of the exact situation that they studied. You have to say "Well, if going from $4.25 to $5.05 wasn't that bad, I doubt going to $6.15 would be that much worse!" That's a prior.
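Here's a toy simulation of that extrapolation problem. The "true" response curve is completely made up (and so are the dollar amounts' effects); the point is only qualitative: a slope estimated from a small hike near $4.25 predicts perfectly at the point that was studied, but the extrapolation error explodes far away, and nothing in the local estimate warns you how fast that happens.

```python
# Toy sketch with an invented functional form: a quasi-experimental
# study estimates the local slope of an unknown nonlinear response
# curve; the tangent line is exact where it was estimated and useless
# far away.
import math

def true_effect(wage):
    # Hypothetical "true" response, unknown to the researcher
    return -math.exp(0.5 * (wage - 4.25)) + 1.0

# Slope estimated from a small hike, $4.25 -> $5.05 (finite difference)
w0, dw = 4.25, 0.80
slope = (true_effect(w0 + dw) - true_effect(w0)) / dw

def linear_prediction(wage):
    # What the quasi-experimental estimate alone lets you predict
    return true_effect(w0) + slope * (wage - w0)

for wage in (5.05, 7.50, 12.75):
    err = abs(linear_prediction(wage) - true_effect(wage))
    print(f"${wage:.2f}: extrapolation error = {err:.2f}")
```

By construction the error is zero at $5.05 (that's the hike that was studied), modest at $7.50, and enormous at $12.75 - and a researcher holding only the local slope estimate has no way to know which regime they're in.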

If you want to believe that your model works far away from the data that you used to validate it, you need to believe in a structural model. That model could be linear or nonlinear, but "structural" basically means that you think it reflects factors that are invariant to conditions not explicitly included in the model. "Structural," in other words, means "the stuff that (you hope) is really going on."

The weakness of structural modeling is that good structural models are really, really rare. Most real-world situations in economics are pretty complicated - there are a lot of ins, a lot of outs, a lot of what-have-you. When you make a structural model you assume a lot of things away, and you assume that you've correctly specified the parts you leave in. This can often leave you with a totally bullshit fantasy model. 

So just test the structural model, and if the data reject it, don't use it, right? Hahahahahahaha. That would kill almost all the models in existence, and no models means no papers means no jobs for econometricians. Also, even if you're being totally serious and scientific and intellectually honest, it's not even clear how harsh you want to be when you test an econ model - this isn't physics, where things fit the data to arbitrary precision. How good should we even expect a "good" model to be? 

But that's a side track. What actually happens is that lots of people just assume they've got the right model, fit it as best they can, and report the parameter estimates as if those are real things. Or as Francis Diebold puts it:
A cynical but not-entirely-false view is that structural causal inference effectively assumes a causal mechanism, known up to a vector of parameters that can be estimated. Big assumption. And of course different structural modelers can make different assumptions and get different results.
So with quasi-experimental econometrics, you know one fact pretty solidly, but you don't know how reliable that fact is for making predictions. And with structural econometrics, you make big bold predictions by making often heroic theoretical assumptions. 

(The bestest bestest thing would be if you could use controlled lab experiments to find reliable laws that hold in more complex environments, and use those to construct reliable microfounded models. But that's like wishing for a dragon steed. Keep wishing.)

So why not do both things? Do quasi-experimental studies. Make structural models. Make sure the structural models agree with the findings of the quasi-experiments. Make policy predictions using both the complex structural models and the simple linearized models, and show how the predictions differ. 
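A minimal sketch of what that combined workflow could look like (the functional form and every number here are invented): calibrate a structural model, check that its local behavior agrees with the quasi-experimental slope estimate, and only then use it for far-from-data predictions, clearly labeled as resting on the structural assumption.

```python
# Toy "do both" workflow, all numbers invented: discipline a structural
# model with a quasi-experimental finding before trusting its bold
# out-of-sample predictions.
import math

w0 = 4.25                 # wage level where the natural experiment happened
quasi_exp_slope = -0.61   # hypothetical quasi-experimental slope estimate at w0

def structural_effect(wage, b):
    # Assumed one-parameter structural form: effect(w) = exp(-b*(w - w0)) - 1.
    # "Structural" here means betting that this shape holds far from w0.
    return math.exp(-b * (wage - w0)) - 1.0

# Calibrate: this model's slope at w0 is -b, so match it to the estimate.
b = -quasi_exp_slope

# Consistency check against the quasi-experimental finding. (Here it
# passes by construction; with real data and several quasi-experiments
# at different points, this step has teeth.)
eps = 1e-6
model_slope = (structural_effect(w0 + eps, b) - structural_effect(w0, b)) / eps
assert abs(model_slope - quasi_exp_slope) < 1e-3

# Only now make the far-from-data prediction, flagged as resting
# entirely on the assumed exponential form:
print(structural_effect(12.75, b))
```

The quasi-experiment pins down the local fact; the structural form supplies the (heroic, explicitly stated) assumption that lets you extrapolate; and reporting both makes clear which part of the prediction is evidence and which part is assumption.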

What's wrong with this approach? Why should structural vs. quasi-experimental be an either-or? Why the academic food fight? If there's something that needs fighting in econ, it's the (now much rarer but still too common) practice of making predictions purely from theory without checking data at all.

Monday, February 27, 2017

Historical cycle theories are silly...or are they?

I have a soft spot for theories I thought of when I was 14. Back then I consumed a lot of epic fantasy books (and video games, and TV shows), in which an ancient evil is often just now returning after being banished (typically for a period of 1000, 5000, or 10000 years), and new heroes must arise to defeat it again, etc. etc. I reflected that my grandfathers had defeated cosmic evil, back in WW2, and that before that, my American forebears had defeated cosmic evil in the Civil War, so at some point we were due for another showdown with the ever-returning Forces of Darkness. I also figured that each generation after the war would be a little softer and more complacent than the last, and that this weakness would be one thing that encouraged the Forces of Darkness to make their comeback. And since it was about 75 years from the Civil War to WW2, I figured that each cycle lasted about four generations, and that it would be the generation after mine who would have to bear the brunt of the fight the next time.

It's fun to be 14. If you've never done it, I suggest you try it.

I recently found out that the authors Neil Howe and William Strauss already published a very detailed version of a very similar theory, back in 1991 (well before I turned 14!). I found this out via Steve Bannon, who according to news reports is a fan of their theory. Recently, Howe wrote a Washington Post op-ed explaining the theory. The basic idea is that there's a four-generation cycle. A "crisis" generation creates social unity and builds up national institutions, and each successive generation challenges and degrades those institutions, until four generations later the institutions collapse and there's another crisis. According to Howe, the Millennials are the ones who will have to renew our society this time.

That's a cool theory. But like all periodic theories of history, it's easily falsified.

Why? Because lots of crises are externally imposed. The Black Death's arrival in Europe had little to do with the strength of European institutions. The Japanese invasion of the Philippines was unrelated to the Philippines' position in any generational cycle. The Industrial Revolution and the Mongol invasions blindsided every nation on the planet. And so forth. Exogenous shocks obviously happen, and they disrupt the timing of any generational cycle. So the nice smooth even periodicity that Howe and Strauss posit can't exist, even if there are forces tending in that direction.

Also, if you look at history, you see both some very long periods of crisis-free stability and some very long periods of continuous dramatic social upheaval. For example, China's "century of humiliation" involved about 110 years of almost continuous rebellion, civil war, invasion, mass killing, and political upheaval. France during the years from 1789 to 1945 experienced two empires, three republics, a large number of revolutions and counter-revolutions, many foreign invasions, and millions of violent deaths. 

On the flip side of the coin, Britain from the Glorious Revolution of 1688 to the start of World War I experienced over two centuries of stability with no real regime change or total war (the Napoleonic Wars being the closest thing, but ultimately not even requiring mass conscription). China during the Ming Dynasty and Japan under the Tokugawa shogunate were similarly stable. 

If you hunted around and looked closely, you might be able to look at those long stable centuries and find some minor social disruptions loosely corresponding to the Strauss-Howe four-generation cycle. But think how many other such minor disruptions you'd be ignoring! (Were the 1960s a "crisis" for America? We had a bunch of assassinations, race riots, and a major war, after all.) Apophenia is a powerful temptation. But don't be fooled - by any objective measure you can find, history is aperiodic.
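The exogenous-shock argument can be made concrete with a toy simulation: take a perfectly periodic four-phase "generational cycle," occasionally reset it with a random outside shock, and measure how much of the periodicity survives. The shock probability and cycle shape are invented purely for illustration.

```python
import math
import random

random.seed(0)

def cycle_series(n_steps=400, shock_prob=0.0):
    """A 4-phase 'generational cycle'; with probability shock_prob
    per step, an exogenous shock resets the cycle to phase zero."""
    phase = 0
    out = []
    for _ in range(n_steps):
        out.append(math.sin(2 * math.pi * phase / 4))
        phase += 1
        if random.random() < shock_prob:
            phase = 0  # a crisis imposed from outside, off-schedule
    return out

def lag_corr(x, lag):
    """Pearson correlation between the series and itself shifted by lag."""
    a, b = x[:-lag], x[lag:]
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = math.sqrt(sum((u - ma) ** 2 for u in a))
    sb = math.sqrt(sum((v - mb) ** 2 for v in b))
    return cov / (sa * sb)

clean = cycle_series(shock_prob=0.0)
shocked = cycle_series(shock_prob=0.15)

print(lag_corr(clean, 4))    # essentially 1: a pure 4-period cycle
print(lag_corr(shocked, 4))  # much weaker: shocks scramble the period
```

Even a modest rate of outside shocks sharply weakens the lag-4 autocorrelation, which is the point: the cyclical force can be real while the observed record still looks aperiodic.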

So formally, in the rigorous sense, Strauss-Howe theory is wrong. BUT, I still think it could be describing some important processes at work. Just because history is aperiodic doesn't mean it's random.

First, there's the idea of institutional decay, as put forth in Mancur Olson's The Rise and Decline of Nations. The idea here is that institutions developed to solve the problems of one era eventually become powerful incumbents who resist needed institutional changes later on down the road. If crises cause a "reset" of this cycle - the necessary fall of ineffective incumbent institutions, and their replacement with newer, more effective ones - the result could look a lot like a Strauss-Howe cycle. If the time it takes for institutions to go from effective to parasitical is a few decades, then it could even look periodic for countries that experience few external shocks (like the U.S., perhaps?). 

Second, there's the idea of a cycle of globalization. If free capital and labor flows tend to cause instability to build up in global economies - through excessive leverage, economic financialization, difficulty absorbing large cohorts of immigrants, the creation of an unsustainable "reserve currency" regime, etc. - then there could be repeated periods of globalization and retrenchment. Obviously, since there has only really been a modern global economy for a century and a half or so, this sort of cycle can't be reliably observed or confirmed yet. And no one has suggested that the cycle lasts a fixed number of generations or decades. But there are plenty of parallels between 1890-1929 and 1980-2008. And there are also parallels between the Great Depression and the Great Recession. And you could be forgiven for believing there are parallels between the politics of the 1930s and the politics of today.

So I wouldn't totally toss out the idea of a predictable social crisis. Whether it comes from generational attitude changes, institutional decay, or the instability of globalization, it's certainly possible that eras of stability tend to lead to crises eventually.

Saturday, February 25, 2017

Why human capital is capital

Economists tend to use the word "capital" pretty loosely. It just means "anything you can spend resources to build, which lasts a long time, and which also can be used to produce value." That's really broad. For example, it could include society itself. It also typically includes "human capital," which refers to people's skills, talents, and knowledge.

Why do most economists define "capital" this way? Really, it's just a convenient way to make the kind of models they like to make. I tried to explain this in a wonky post a couple of years ago.

But there are people out there who really don't like this broad definition of "capital". For example, the economist Branko Milanovic has repeatedly argued against use of the term. So has Matt Bruenig. And Paul Krugman agrees with them. They would rather restrict the word to mean what economists typically call "physical capital" - machines, buildings, and the like.

Who's right? In general, I don't like to boss anyone around with regard to vocab choices. Use words the way you want to use them, and just let people know what you mean. I would personally have preferred a different term, like "skill capital". But I think the term "human capital" is useful because it helps convey some important truths about the world. Here are some facts about the world that I think the term "human capital" helps remind us of:

1. It's worse to be uneducated, unskilled and poor than to be educated, skilled and "poor".

Imagine that you're 22, educated, and poor. You have a bunch of student loan debt, but no money in the bank. You have book learning and credentials, but no immediately employable skills, very little on your resume, and not much of a network. You're sleeping in your friend's spare room and buying the cheapest food you can find.

Congratulations, you're me! Was I poor? By many measures, yes. But I certainly didn't feel poor. I knew that my Stanford degree and general intellectual skill (I could do math well and write well) would eventually let me get a good-paying job. In fact, I had no qualms whatsoever about my economic future. Zero fear.

But imagine yourself in the same situation with no degree, and without that general writing and math skill. Imagine yourself as a 22-year-old with the same debt and the same empty bank account. What are your future employment prospects? Construction worker? Landscaper? Day laborer?

The second situation is much worse, right? Both 22-year-olds - my hypothetical uneducated counterpart and I - have the same official amount of wealth. But despite the fact that I was scrounging for cheap food and sleeping in a friend's spare room, I didn't really feel like a poor person, and with good reason - I knew my future wasn't a future of poverty.

The word "human capital" gets at this distinction. It's a way of saying "education and skills are a form of wealth." If you ignore this wealth, you end up treating penniless grad students the same as honest-to-goodness poor people.

2. Some people make more money than others from the same amount of labor.

Opponents of the term "human capital" tend to say that human capital is really just labor. They like to define "capital" as anything that gives you passive income - in other words, anything that gives you money without work.

But consider me vs. an uneducated grocery store worker the same age as me. I don't work any harder than a grocery store worker - less hard, if the truth be told. I get up, read books, read papers, read Twitter and the news. I write some articles. The grocery store worker is moving groceries from the stock room to the shelves and back, checking inventory, working the cash register, answering people's questions, etc. Who is putting in more labor input, more effort and strain? Probably the grocery store worker.

But, as I'm slightly ashamed to admit, I make more money.

I view that difference as a form of passive income. Without spending any more effort than my grocery store worker counterpart, I get more money. This is just as passive as owning stocks, bonds, or real estate. My education and my skills (and my human networks, and my knowledge of the labor market) are a form of wealth that delivers me income for no extra effort.

It makes sense to me to have a word for this other sort of passive income-earning power. And "human capital" refers to exactly that.

3. Government spending on education represents investment for the future.

The government spends a lot of money educating the populace. We have universal public school. We have state-supported universities and colleges. Families themselves pay a lot on top of that, for university tuition, room and board, tutors, etc.

Is that spending a form of consumption? Is it just fancy day-care and subsidized partying? Some cynics would say yes, but I think the answer is very clearly no. A lot of that spending represents an investment in the future. The spending today will pay off tomorrow, in the form of a more productive populace.

When you spend money today and get back more than you spent, I say you're building wealth. And "capital" is really the same thing as "wealth" - it's the ownership of something that can deliver you income (passively!) in the future. Education creates no physical stores of value - no trucks or buildings or machine tools. But skills and knowledge are durable - you remember how to program a computer, or how to think like a lawyer.

When investment creates durable stores of value that produce income in the future, it makes sense to me to call it "capital". In my opinion, one big problem in the United States is that government doesn't invest enough. I think that depicting education spending as an investment in productive capital is helpful for making the case that government should spend more on education.
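The "spend today, get back more tomorrow" logic is just a present-value calculation. Here's a minimal sketch: the tuition cost, wage premium, time horizon, and discount rate are all made-up illustrative numbers, not estimates of the actual returns to education.

```python
def npv(cashflows, rate):
    """Net present value of a sequence of yearly cashflows,
    discounted at the given annual rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical investment: 4 years of education at $25k/year,
# followed by 40 working years earning a $15k/year premium.
cost_years = [-25_000] * 4
premium_years = [15_000] * 40
flows = cost_years + premium_years

# A positive NPV means the education "asset" returns more than it
# cost - the same test you'd apply to a machine or a building.
print(round(npv(flows, 0.05)))
```

At these (invented) numbers the investment comes out well into positive territory, which is precisely the sense in which education spending is capital formation rather than consumption.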

So there are three reasons why it often makes sense to think of skills and education as a form of capital. What about the objections? One objection people give is that income from human capital isn't passive - but, as I explained in point 2, it really is passive, since it allows you to get more income without any more effort - or the same amount of income for less effort.

A second objection is that people's education and skills can't be bought and sold, because we don't have indentured servitude. That's basically true (though there are some gray areas, like long-term contracts, noncompete agreements, or wage garnishment for student loans). But that's just a law. We could easily pass a law saying that office buildings can never ever be bought or sold, but must be owned forever by the people or companies that built them in the first place. Under this law, they could only be rented out, and only using month-to-month leases.

Would that law make office buildings any less a form of "capital"? I say no. By the same token, laws against indentured servitude change how human capital gets used in the economy, but they don't really change what it is. I strongly support laws against indentured servitude. But they don't change anything about the physical nature of education and skills. They don't change the fact that these are durable investments that produce passive income.

So by using the term "human capital", we remind people of several important truths:

1. We remind people that educated "poor" people aren't really as poor as uneducated poor people.

2. We remind people that skilled workers don't really work harder than unskilled workers.

3. We remind people that government spending on education is an investment for the future.

I think those are good and important things to keep in mind.