I've had a model of higher education rolling around in my head for quite a while now, and I never had the time or energy to put it on paper. But then I read this post by Frances Woolley, which contains some ideas that are extremely similar to mine, so I thought I'd sketch out the basic idea of the model in a blog post.
Woolley asks why research is more important for a professor's career than teaching:
[W]ithin academia, research has higher status than teaching. The question is, why?...
Perhaps research is highly valued because it is in short supply...But scarcity cannot explain why dime-a-dozen mediocre researchers are accorded higher status than excellent teachers...[and e]ven a scarce commodity will have a low price if there is not much demand for it...
I think [research has higher status than teaching] is because research output is a signal of ability...Teaching just does not work as a signal in the same way. First, top rate teaching is extraordinarily difficult to measure...
Second, I don't know if teaching performance is as highly correlated with intelligence, creativity, and originality as research performance is.

Basically, Woolley conjectures that research is valuable as a signal of unobservable teaching skill. I think that this is an excellent answer, for a reason that Woolley doesn't even mention: past research is paid more than future research.
Think about Joe Stiglitz's salary. Stiglitz has, by almost any measure imaginable, done a huge amount of great research. And he gets paid a very high salary. But how much great research do we expect Stiglitz to do in the future? He's old! And he's busy with other things, like speaking and weighing in on policy debates. He is not really getting paid to do research. And, crucially, Stiglitz is getting paid a lot more than any economist whose best research years are ahead of her! If you look at total salary expenditure by universities, my bet is that you will find the same pattern: much more money being spent on past research than on future research.
Of course, from the labor supply side, this still functions as an incentive to do good research (so you can get paid more in the future). But from a demand side perspective, why the heck should a university pay professors for work they did in the past, when they were employed somewhere else? Unless universities are voluntarily internalizing the positive externality from research - i.e. unless universities just want to do good for the world by making research a well-rewarded activity - we must conclude that universities are not actually paying for research.
What are they paying for? I conjecture that they are paying for prestige. If Joe Stiglitz works at my university, it raises my university's prestige.
Why would a university want to raise its prestige? Well, if Woolley's conjecture is correct, prestige is a signal of teaching quality: a Columbia education is generally assumed to be better than an Ohio State education, in part because Columbia has more prestigious professors. So suppose that human capital is very important, but also difficult to observe. In this case, the prestige of your alma mater signals how valuable an employee you're likely to be, because education matters, not in spite of it (as in the typical "signaling model" of education).
So here's a question: Why would universities care about prestige? Well, it might allow them to charge undergrads higher prices; Columbia tuition is certainly higher than Ohio State tuition. That in turn might lead to administrators (i.e. the people who make the hiring decisions) getting paid more, particularly if the number of administrators needed is proportional to the number of undergrads (so that higher tuition means more revenue per undergrad, which means more revenue per administrator). Administrators trying to maximize their own salaries would then have an incentive to pay a lot of money for someone like Joe Stiglitz. (Full disclosure: I like and admire Joe Stiglitz. And naturally I can't pass up a chance to bag on Ohio State.)
So, to reiterate, here is a sketch of the Noah Smith (or perhaps Smith/Woolley?) Model of Higher Education:
1. The human capital benefit of an undergraduate education is highly sensitive to unobservable differences in teacher quality.
2. Past research accomplishments are a strong signal of teacher quality.
3. Thus, professors with stronger research records allow a university to charge more tuition per undergrad, increasing the salaries of administrators.
4. Although human capital signaling is itself inefficient, this system benefits society by providing a subsidy for the production of research, which, as an almost completely nonrival good, is underprovided by the private sector. If teaching quality were observable, this research subsidy would not exist.
5. Extension: America's "legacy student" system (basically, auctioning off a few false ability signals for huge amounts of money, at only a small cost to the school's prestige) gives American universities an edge over foreign universities in terms of prestige, but it also increases the amount that America's university system subsidizes research. This may mean that the legacy system is good for the world.
Anyone can, of course, feel free to take this model and run with it if you like it (and if you do, include me as a co-author, or not, as you like). It definitely needs some serious empirical work to support each link in the chain. But notice that this model wouldn't just answer the question of why research is paid highly; it would (partially) answer the question of the value of universities to society as a whole, AND the question of why the dual research/teaching structure of the university has proven so durable over the years. And it would possibly point to ways in which the system could be tweaked to boost the degree to which it subsidizes basic research (some of these ways might seem very counterintuitive, e.g. admitting legacy students!).
And if you can see reasons why this model I've sketched is obviously wrong, please let me know, of course.