How Business Schools Fail Up

The rise of the STEM-obsessed, corporate-partnered university


“The higher educational system of the industrial society,” University of California president Clark Kerr wrote in 1960,

stresses the natural sciences, engineering, medicine, managerial training… There is a relatively smaller place for the humanities and the arts, and the social sciences are strongly related to the training of managerial groups and technicians for the enterprise and the government. The increased leisure time of industrialism, however, can afford a broader public appreciation of the humanities and the arts.



Except for the long-forgotten hope of all this free time to enjoy books and art, Kerr’s technocratic vision for what he called the “multiversity” could come from the mouth of almost any research-university president today. Rankings-hungry universities across the country tout their STEM programs and their connections to internships and industry. Helping themselves to trendy tech buzzwords, they brand themselves as centers of innovation that drive economic growth and offer students salvation from the robot apocalypse. Academics in the humanities and social sciences, who have seen the collapse of their job markets despite steady demand for their teaching, generally feel the modern university to be in crisis, often struggling to articulate what exactly has happened to their place in the world.

Two new books try to explain how we ended up here. In Nothing Succeeds Like Failure, Steven Conn suggests that it was the rise of the business school, and the spread of its values through the university more broadly, that started this process of transformation. Ethan Schrum’s The Instrumental University takes a wider view, describing a movement among influential university leaders after World War II to make the research university an engine of knowledge for economic development and other dimensions of “national purpose,” breaking with the university’s traditional values of truth-seeking.

Both books consider the realities behind the soaring ideals in which universities drape themselves and through which they are too often uncritically imagined by the educated public. That universities are tightly entangled in political power, the production of elites, and the administration of society has been a fact for centuries, whether the curriculum was designed to produce masters of classical Latin or shareholder value. But what happened in the twentieth century to render American universities, in particular, such enthusiastic accessories to capitalist technocracy?

Conn places his cards on the table at the beginning of his book. “Most of us,” he writes of his humanist colleagues, “simply don’t believe that teaching business techniques constitutes the real work universities ought to do.” But he acknowledges that his own view that the university should be centered on liberal education does not fit with the history of education in the United States. Some of America’s first colleges were founded to train a numerically tiny, white, Northeastern Protestant elite. When relatively larger numbers of Americans began receiving higher education in the late nineteenth century, it was the result of the 1862 Morrill Act, signed by President Abraham Lincoln, which designated 30,000 acres of federal land per member of Congress for the states to found colleges or universities. Many of these “land-grant universities,” like Penn State and Texas A&M, originally focused on education for agriculture and mechanics, though the Morrill Act also stipulated “other scientific and classical studies.”

America’s first business schools, known in the nineteenth century as “commercial colleges,” provided to young men employed in industry the sort of formation that more traditional higher education, better suited to lawyers or doctors, did not. Because of their practicality and egalitarianism, commercial colleges often hosted a spirit of rebellion against the elitism and stodgy classicism of academia. Industrialists, flush with wealth from the late–nineteenth-century transformation of the American economy, were acutely aware of the commercial colleges’ lack of social capital, and they pushed to establish business in elite universities in order to make the businessman a socially respectable figure. Joseph Wharton, a titan of the Philadelphia mining and metal industries, gave his name and fortune to the first university business school in the United States, the University of Pennsylvania’s Wharton School, which opened in 1881.

In the 1910s, when waves of business schools were founded, society was widely perceived to have grown more complex, and science was everyone’s answer to how to master its problems and challenges. Business schools exploded at the same time that upstart social sciences like psychology and sociology were seeking to explain—and sometimes to contain—the social conflict roiling industrial capitalist society. Progressive reformers were found in all of these new pursuits, and some of them welcomed business education into the university in hopes that a more scientifically trained commercial world would be less rapacious and anti-social. Louis Brandeis declared in 1912 that, thanks to higher education,

“Big business” will mean professionalized business, as distinguished from the occupation of petty trafficking or mere money-making. And as the profession of business develops, the great industrial and social problems expressed in the present social unrest will one by one find solution.

The hopes of figures like Brandeis, Conn writes, exhibit “an almost poignant faith in the socially transformative power of education”—one that lives on in liberals’ perpetual resort to education as an answer to the contradictions of capitalist society.

Business schools, however, were to prove a disappointment: Through the middle of the twentieth century, they fell cyclically in and out of favor with the business world itself, and were periodically criticized by the rest of the academy—as well as external organizations, like the Ford Foundation—for having low intellectual standards that called into question their presence alongside “real” academic pursuits. Economists scoffed that business schools were more interested in private gain than in understanding how the economy worked: While economists studied the great crises of the twentieth century, Conn argues, business schools chugged along with little self-reflection or sense of responsibility for the damage that business did to society.

These flaws constitute the “failure” in Conn’s title. But failure coexisted with great success: The number of business schools continued to grow, and, as anyone who teaches at the university level today knows, students have never more hotly pursued business majors. Almost as soon as business school intellectuals began to emerge in the 1920s, they were players in broader social and scientific debates; they saw themselves, as figures like Peter Thiel do today, as philosophers, social theorists, and “thought leaders,” and were keenly interested in developments in other disciplines. In the first half of the twentieth century, figures like Elton Mayo, Chester Barnard, and Peter Drucker brought intellectual sizzle to managerial ideologies and legitimized them among other kinds of social elites, including university leaders.

The stars finally aligned for business schools with the deregulation of the economy in the 1980s, which opened the floodgates of finance and created a tight nexus of business schools, Wall Street, and consulting firms like McKinsey that gobble up both the graduates and the ideas produced by business schools. According to Conn, the neoliberal economist Milton Friedman’s defense of shareholder value as the central purpose of business provided the “big idea” that finally brought business schools, economists, and the corporate world together.

“By the turn of the twenty-first century,” Conn writes, “business school education had aligned itself with the demands of business more than at any time since Wharton began offering classes in 1881.” Business schools finally stopped trying to prove they had rigorous intellectual standards and were a social good, because the money they brought in—and universities’ increasing adoption of managerial ideas in their own time of neoliberal crisis—spoke for itself. Running universities like corporations, forcing academic departments to justify their existence, and allocating resources on the basis of “productivity” and “value-added” all brought academic administration and business school ideas into closer sync.

Ethan Schrum’s The Instrumental University does not lay all the blame on business schools. He suggests that the university was already beginning to change in the early twentieth century, as the progressive movement aimed to orient the academy toward solving social problems. While this vision was gaining influence before World War II, the Cold War provided the models and money to implement it on a much larger scale. During the war, the government had mobilized American universities, deploying their research on massive arms-development projects. This work, Schrum argues, convinced academic administrators that their institutions had a grand role to play in reshaping American society and the world. In the 1950s and 1960s, they frequently wrote of transforming the university into an “instrument” of social change, of “using” it to solve pressing problems. Universities would produce technical knowledge for economic development, urban planning, and industrial relations. Schrum calls this vision the “instrumental university.” The ideal was an unapologetically technocratic adaptation of the earlier progressive impulse: The university’s goal was to provide the knowledge and the people who could, in Clark Kerr’s phrase, “administer the present.”

The visionaries of the instrumental university subscribed to a form of American exceptionalism that historians call “American modernity”—a widespread belief among American elites, invigorated by triumph in World War II, that the scientific and technological might of the United States made it the exemplary modern nation with a mission to serve as a model for the world. The consequence of American modernity was a concerted effort to remake all domains of knowledge in the mold of wartime military research. To this end, both public and private research universities established “organized research units” to produce practical research on problems identified by government, industry, and private philanthropists. These institutes bypassed universities’ department structures and created direct relationships with outside “clients,” attracting funding that sometimes dwarfed that of traditional departments.

The postwar academic entrepreneurs who championed the new institutes—Kerr at the University of California system, Penn president Gaylord Harnwell, and Samuel Hayes Jr. at the University of Michigan, among many others—saw the research university as a potential engine of economic development; liberal education and traditional academic research were of secondary importance, if not altogether obsolete. James G. March, who designed the social sciences program at the new Irvine campus of the University of California, declared that reading was only a secondary skill for social scientists, whose primary activity was abstraction and model-building, and he eliminated reading requirements in favor of problem sets. March’s vision for the social sciences as a whole tracked closely with the abstract modeling becoming popular in military strategy, based on quantification and systems theory. At Penn, Harnwell, a physicist who had participated in organized military research during the war, was obsessed with the university producing “manpower” to fuel the need for knowledge in “science, technology, business management, and human relations.”

Clark Kerr saw his own field of industrial relations, Schrum writes, as an effort to “steer labor protest movements in the underdeveloped world away from communism” and to create “a general strategy to inform U.S. tactics in that area.” The participation of dozens of American universities in government-funded economic development projects in the postcolonial world was likewise part of a broadly conceived effort of national defense against communism. When the Berkeley Free Speech Movement made Kerr its central villain, it was precisely because his vision of the university was so thoroughly associated with the military-industrial complex and the horrors of the Vietnam War.

Kerr’s vision soon attracted another more powerful set of enemies. In a 1970 screed, the Southern neoliberal economist James Buchanan attacked the University of California’s free tuition on the grounds that it produced “revolutionary terrorists” who had no respect for private property. According to Buchanan, education should be made scarce and expensive, rather than universal; he was one of the first to propose a punitive student loan system to yoke students to the discipline of the market. Advisers to Ronald Reagan, then governor of California, convinced him to fire Kerr on the grounds that the UC system had become a hotbed of communism. Backlashes to the public policy of the 1960s, including the nationwide “tax revolts” that wrote curbs on state spending into law, marked a changing of the political winds.

Across the decades that followed, state funding for higher education fell, and the ideas of neoliberal economists moved into the mainstream. Universities, scrambling to replace their lost public funding, raised tuition. For students, education became an individual consumer good, paid for increasingly with loans and seen as an “investment” in their own future productivity. Universities also entered into closer and more openly commercialized relationships with business. Kerr’s instrumental university provided something of a blueprint for the neoliberal university, now shorn of its public-spirited and egalitarian impetus. But unlike its postwar predecessor, the neoliberal university has two primary concerns: giving its students a ticket into the top tier of the polarized labor market, and pursuing projects preferred by its corporate partners.

In becoming instrumental, Schrum writes, the postwar university “lost some of what made it special. It became more like other large institutions, caught up in the economics and politics of the day, rather than a place intentionally set apart from those currents so that scholars could pursue truth.” In his conclusion, Schrum gives voice to conservative critics like the sociologist Robert Nisbet, who used the term “academic capitalism” long before it became a common descriptor of the neoliberal university. Schrum, like Nisbet, is nostalgic for an era when the university was smaller, more insular and communitarian, and when the humanities had a firmer place at the center of education.

There is no doubt that the progressive instrumental university, especially as it oriented itself toward winning the Cold War, laid some of the groundwork for the later subordination of universities to business. But that ultimate result was not an automatic consequence of universities getting too entangled in the world outside. It is inconceivable that universities could have remained untouched by the highly technical industrial economy that triumphed in the late nineteenth century; the important question was who they imagined themselves to be serving and how they did so. While some progressive reformers were elitist technocrats, others saw the university as a public-oriented counterweight to the power of big business. We owe planned cities, ambitious and capable government administration, and state economic management in part to progressive universities’ efforts to apply knowledge to social problems.

In the late nineteenth century, a new generation of economists, who had returned from training in Germany to challenge the laissez-faire orthodoxy of the American Gilded Age, gradually rose to prominence at Wharton. They argued that the government should intervene to address the widening inequality of industrial capitalism. One of the school’s directors, Simon Patten, described himself as a “revolutionist” in economics and called for his colleagues to place themselves “on the firing line of civilization.” Patten hired the black sociologist W.E.B. Du Bois—who published his famous study The Philadelphia Negro while at Penn in 1899—and trained Rexford Tugwell, one of the economic architects of the New Deal.

The Wharton economists thought it was their responsibility to serve society’s needs, and, unlike both Kerr’s generation and today’s neoliberal university presidents, they held a notion of “service” that challenged the dominant beliefs and trends of American elites. Because they questioned the preferences of the Penn trustees, they paid a price: Patten was eventually forced into early retirement due to his strident opposition to World War I, while one of his faculty hires, Scott Nearing, was fired by the trustees for “advocating the ruthless redistribution of property.”

For those concerned by the state of today’s universities, the answer cannot be a retreat from the world outside, but a reassertion of the university’s independence from the forces that have managed to hijack its activities for private gain. And it cannot ignore the importance of “instrumental” knowledge: Organized policy research in areas like economic development and urban planning can genuinely serve the common good (just as, conversely, humanistic inquiry can nourish reactionary elitism). If universities are to be more than corporate research labs and employer training programs, they need a renewed vision of what it means to serve a truly democratic society—and they need a society that demands they do so.