Mark C. Taylor’s yawp of pain about academia in The New York Times yesterday is a handy compendium of virtually every complaint currently circulating about the American university system. We are, he claims, overspecialized, obsolescent, irrelevant, and rigid. We learn more and more about less and less, while mercilessly exploiting successive generations of graduate students whom we then cast out into unemployment or the wilderness of adjuncting. In short, we stand with the auto manufacturers and (one might add) newspapers in the ranks of ill-adapted social dinosaurs awaiting extinction. “Graduate education is the Detroit of higher learning,” writes Taylor, who is a professor of religion at Columbia University. And to deal with this crisis, Taylor calls for a revolution of Jacobin proportions. Abolish tenure! Impose mandatory retirement! Eliminate departments! Introduce new regulations! Above all, replace our current system with a supple, flexible set of interdisciplinary research webs that can be focused on the most pressing problems of the day: Academia 2.0, so to speak. To add insult to injury, Taylor offers, as an exemplar of everything that is wrong with Academia 1.0, a graduate student who is apparently writing (please hold your guffaws) a dissertation on the medieval theologian Duns Scotus’s use of citations.
It’s worth noting a few glaring contradictions in the piece. Taylor calls for bringing representatives of different disciplines to bear on such pressing problems as water supplies, even as he is demanding the elimination of the disciplines themselves. Yet the very word “interdisciplinary” implies a disciplinary base. Presumably Taylor himself would not want academics or policy-makers to address (say) the issue of radical Islam without the sort of knowledge of Islamic history and theology that a Department of Religion is best able to provide. Nor do I think he would want dams constructed by engineers who have degrees in Water, as opposed to Engineering. There is also the problem of simultaneously abolishing tenure and introducing sweeping new external regulatory systems on universities. Might not this combination have a certain, unfortunate effect on academic freedom? But then, academic freedom is a concept that goes singularly unmentioned in Taylor’s piece.
More fundamentally, it’s worth asking if the American university, and its system of graduate education, is really in quite the dire position that Taylor describes. Highly placed academics do have a tendency these days to decry their own supposed obsolescence. The former president of my own university, William Brody, liked to compare academia to the buggy whip industry. But where, exactly, is the proof of this obsolescence? Admissions to top American universities and colleges remain as competitive as ever--no matter how much, it seems, tuition rises. Despite an academic job market that has been anemic at best and disastrous at worst for more than 35 years, top Ph.D. programs still receive far more qualified applicants than they can hope to admit, including a rising proportion from overseas. America’s position in basic research, as measured in such things as Nobel Prizes, seems unchallenged. European academics generally regard the American academic system with untrammeled envy, while their own university systems go through crises that make ours look minor in comparison. Academia has suffered from the current economic downturn, just like virtually every other sector of the American economy. But this is the sort of “obsolescence” that Chrysler and The New York Times can only dream of.
Yes, the internet is certainly changing the way students learn. But those who prophesy the simple displacement of the university as we know it by online learning often know very little about how online learning actually works. Those of us who actually oversee online learning programs know that when they are done well, they involve just as much faculty effort and expertise per student, and just as much investment per student, as classroom learning. There are no simple economies of scale. So not surprisingly, the institutions leading the way in effective online learning are in fact the traditional universities, and they have, if anything, gained strength from the process, not the reverse. In my own university, these new courses of study generate profits that help to support, yes, traditional forms of graduate education.
As for the argumentum ad Duns Scotus, there are two responses. First, it is absurd to think of academic overspecialization as a peculiarly modern disease, and to conjure up some lost scholarly golden age in which every professor wrote articles of broad sweep and import. The quickest glance through the back issues of any academic journal shows that scholarship has always tended, for better or worse, to advance in slow, methodical increments. Is writing about Duns Scotus’s use of citations any worse than writing about the exact placement of a particular regiment at the battle of Fontenoy, or the precise language of a particular medieval capitulary, or any of the other subjects that obsessed earlier generations of scholars? What matter are the overall projects and contexts that a particular piece of scholarship contributes to. Maybe we don’t really need to learn more about this particular thinker’s use of citations--but perhaps the study will contribute to a larger point about how modern forms of scholarship, verification, and knowledge itself developed, and that would not be so trivial. The historian Anthony Grafton once published a history of the footnote, and a grand (and entertaining) piece of scholarship it was.
It is always easy to take scholarship out of context and mock it as trivial and irrelevant. We could mock Mark Taylor himself just as easily. Here, for instance, is a sentence taken out of context from his book Journeys to Selfhood: “While the interpretation of the discontinuous moment represents an account of the fullness-of-time alternative to Hegel’s pregnant present, and the insistence on the irreducibility of the Absolute Paradox is directed against Hegel’s implicitly rational Mediator, Kierkegaard develops the notion of contemporaneity to correct what he regards as problematic implications of the Hegelian sociocultural and philosophical mediation of the God-Man’s positivity.” While certainly ripe for parody, to mock this passage for its obscurity is actually a philistine and ignorant gesture, no better than mocking expressionist painters for their dribs and drabs of paint. Taylor is a serious scholar, whose language here suits the serious philosophical purpose of his book. Perhaps his young Duns Scotus-studying colleague is a serious scholar as well, who doesn’t deserve mockery on the op-ed page of The New York Times.
American universities obviously face serious challenges--all the more since the recession began. But to collapse all of those challenges into one single, facile analogy to Detroit does no one any good, except the yahoos on the right who delight in dismissing academia not simply as trivial and obsolescent, but as morally corrupting and unpatriotic. These critics would love not simply to restructure the humanities, but to get rid of them altogether. Let’s not make their work easier for them.
David A. Bell is a professor at Johns Hopkins University, where he is dean of faculty, and a contributing editor at The New Republic.