Form and Fortune

Steve Jobs’s pursuit of perfection—and the consequences.

Steve Jobs
By Walter Isaacson
(Simon & Schuster, 627 pp., $35)

I.

In 2010, Der Spiegel published a glowing profile of Steve Jobs, then at the helm of Apple. Jobs’s products are venerated in Germany, especially by young bohemian types. Recently, the Museum of Arts and Crafts in Hamburg presented an exhibition of Apple’s products, with the grandiloquent subtitle “On Electro-Design that Makes History”—a good indication of the country’s infatuation with the company. Jobs and Jony Ive, Apple’s extraordinary chief of design, have always acknowledged their debt to Braun, a once-mighty German manufacturer of radios, record players, and coffeemakers. The similarity between Braun’s gadgets from the 1960s and Apple’s gadgets is quite uncanny. It took a Syrian-American college dropout—a self-proclaimed devotee of India, Japan, and Buddhism—to make the world appreciate the virtues of sleek and solid German design. (Braun itself was not so lucky: in 1967 it was absorbed into the Gillette Group, and ended up manufacturing toothbrushes.)

The piece about Jobs in Der Spiegel shed no light on his personality, but it stood out for two reasons. The first was its title: “Der Philosoph des 21. Jahrhunderts,” or “The Philosopher of the Twenty-First Century.” The second was the paucity of evidence to back up such an astonishing claim. Jobs’s status as a philosopher seems to have been self-evident. It is hard to think of any other big-name CEO who could win such an accolade, and from an earnest German magazine that used to publish long interviews with Heidegger. So was Steve Jobs a philosopher who strove to change the world rather than merely interpret it? Or was he a marketing genius who turned an ordinary company into a mythical cult, while he himself was busy settling old scores and serving the demands of his titanic ego?

There are few traces of Jobs the philosopher in Walter Isaacson’s immensely detailed and pedestrian biography of the man. Isaacson draws liberally on previously published biographies, and on dozens of interviews that Jobs gave to the national media, beginning in the early 1980s. He himself conducted many interviews with Jobs (who proposed the project to Isaacson), and with his numerous colleagues, enemies, and disciples, but as one nears the end of this large book it’s hard not to wonder what it was that Isaacson and Jobs actually talked about on those walks around Palo Alto. Small anecdotes abound, but weren’t there big themes to discuss?

That the book contains few earth-shattering revelations is not necessarily Isaacson’s fault. Apple-watching is an industry: there exists an apparently insatiable demand for books and articles about the company. Apple-focused blogs regularly brim with rumors and speculation. Ever since its founding—but especially in the last decade, when Apple-worship reached its apogee—Apple has been living under the kind of intense public scrutiny that is usually reserved for presidents. Jobs relished such attention, but only if it came on his own terms. He did his best to manage Apple’s media coverage, and was not above calling influential tech reporters and convincing them to write what he wanted the world to hear. Not only did Jobs build a cult around his company, he also ensured that it had its own print outlets: Apple’s generous subsidy allowed Macworld—the first magazine to cover all things Apple—to come into being and eventually spawn a genre of its own.

As Isaacson makes clear, Jobs was not a particularly nice man, nor did he want to be one. The more diplomatic of Apple’s followers might say that Steve Jobs—bloodthirsty vegetarian, combative Buddhist—lived a life of paradoxes. A less generous assessment would be that he was an unprincipled opportunist: a brilliant but restless chameleon. For Jobs, consistency was truly the hobgoblin of little minds (he saw little minds everywhere he looked), and he did his best to prove Emerson’s maxim in his own life. He hung a pirate flag on top of his team’s building, proclaiming that “it is better to be a pirate than to join the Navy,” only to condemn Internet piracy as theft several decades later. He waxed lyrical about his love for calligraphy, only to banish the stylus as an input device. He talked up the virtues of contemplation and meditation, but did everything he could to shorten the time it takes to boot an Apple computer. (For a Buddhist, what’s the rush?) He sought to liberate individual users from the thrall of big businesses such as IBM, and then partnered with IBM and expressed his desire to work only with “corporate America.” A simplifier with ascetic tendencies, he demanded that Apple’s board give him a personal jet so that he could take his family to Hawaii. He claimed he was not in it for the money and asked for a salary of just $1, but he got into trouble with the Securities and Exchange Commission for having his stock options backdated, a move that gave him millions. He tried to convince his girlfriend that “it was important to avoid attachment to material objects,” but he built a company that created a fetish out of material objects. He considered going to a monastery in Japan, but declared that, were it not for computers, he would be a poet in the exceedingly unmonastic city of Paris.

How serious was he about that monastery thing? Isaacson recounts well-known anecdotes of Jobs’s quest for spirituality and seems to take them all (and many other things) at face value. The story of Jobs’s youth—his pilgrimage to India, the time he spent living on a farm commune, his fascination with primal scream therapy—does suggest that his interest in spirituality was more than a passing fad. But how long did it last, exactly? Did the more mature Jobs, the ruthless capitalist, feel as strongly about spirituality as his younger self did? Surely there were good reasons for the mature Jobs to cultivate the image of a deeply spiritual person: Buddhism is more than just a religion in America, it is also a brand. And one of Apple’s great accomplishments was to confer upon its devices a kind of spiritual veneer.

Jobs was quite candid about his vanishing interest in matters of spirituality as early as 1985. When a Newsweek reporter inquired if it was true that he had considered going to a monastery in Japan, Jobs gave a frank answer: “I’m glad I didn’t do that. I know this is going to sound really, really corny. But I feel like I’m an American, and I was born here. And the fate of the world is in America’s hands right now. I really feel that. And you know I’m going to live my life here and do what I can to help.” In a more recent interview with Esquire he claimed that he did not pursue the monastery route in part because he saw fewer and fewer differences between living in the East and working at Apple: “Ultimately, it was the same thing.”

Jobs’s engagement with politics was quite marginal—so marginal that, except for his lecturing Obama on how to reset the country, there are few glimpses of politics in this book. He did not hold politicians in anything like awe. We see him trying to sell a computer to the king of Spain at a party, and asking Bill Clinton if he could put in a word with Tom Hanks to get him to do some work for Jobs. (Clinton declined.) When he was ousted from Apple, Jobs may have flirted with the idea of running for office but was probably discouraged by all the pandering it required. “Do we have to go through that political bullshit to get elected governor?” he reportedly asked his publicist. In an interview with Business Week in 1984, he confessed: “I’m not political. I’m not party-oriented, I’m people-oriented.”

But “not political” may be the wrong term to describe him. There is a curious passage in his interview with Wired, in 1996, where he notes:

When you’re young, you look at television and think, There’s a conspiracy. The networks have conspired to dumb us down. But when you get a little older, you realize that’s not true. The networks are in business to give people exactly what they want. That’s a far more depressing thought. Conspiracy is optimistic! You can shoot the bastards! We can have a revolution! But the networks are really in business to give people what they want. It’s the truth.

There is a hint of contempt, even of misanthropy, in this observation; and also of the marketer’s view of the world. Reform as a category of thought did not seem to exist in Steve Jobs’s universe, even if he was always tweaking his products. That an entirely different institutional and political arrangement might be possible; that it might result in better television; that this new kind of television might still be enjoyed by the population and play an important civic role in the national discourse—none of this occurred to him. Had he only bothered to look across the Atlantic, he would have discovered that a different television—the BBC, or the Franco-German Arte—was not only possible, but attainable. Jobs claimed to have liberal leanings, but he chose to live in an intellectual bubble that was decidedly pre-political. In that bubble, there were only two kinds of people to be reckoned with: producers and consumers. Norms, laws, institutions, politics—none of that larger context mattered. Jobs was a revolutionary, but a limited one; and never did so limited a revolutionary create so vast a revolution.

II.

“PURE” WAS THE ultimate compliment that Steve Jobs could bestow. The word and its derivations appear often in Isaacson’s book. “Every once in a while,” says Jobs, “I find myself in the presence of purity—purity of spirit and love—and I always cry.” For Jobs, ideas and products either have purity—and then they are superior to everything else—or they do not, and then they must be rejected or revised. He wants Apple computers to be “bright and pure and honest.” He orders the walls of an Apple factory to be painted “pure white.” The iPad, he says, must embody “the purest possible simplicity.” He is deeply moved by “artists who displayed purity,” and describes an ex-girlfriend as “one of the purest people I’ve ever known.” Apple, he claimed in 1985, “was about as pure of a Silicon Valley company as you could imagine.” Ive, Apple’s master of design, loves purity as well. He wants his devices not in plain white but in “pure white,” because “there would be a purity to it.” A clear coating on the iPod nano would ruin “the purity of his design.” Ive believes—and says that Jobs shared this belief—that products need to look “pure and seamless.”

Neither Jobs nor Ive tells us exactly what he means by “pure,” and Isaacson is not much help here. It appears that “pure” products exhibit a perfect correspondence between their form and what both Jobs and Ive refer to as their “essence.” Ive notes that “we don’t like to think of our knives as being glued together. Steve and I care about things like that, which ruin the purity and detract from the essence of something like a utensil, and we think alike about how products should be made to look pure and seamless.” It is a kind of industrial Platonism. All knives have an essence, and if the form of a given knife corresponds to that essence, then the knife, the designed object, is perfect, or pure. Nothing compound or cobbled together; only the integrity of a single substance in a simple form. Pure products are born, not made; any visible signs of human assembly—say, screws—would make it hard to believe in the higher integrity, the perfection, of the product.

Isaacson’s discussion of the idea behind Toy Story—the wildly successful computer-animated film that was made in 1995 by Pixar, then run by Jobs—provides a further glimpse into how Jobs thought about essences. Jobs and John Lasseter, the film’s director, shared a belief that

products have an essence to them, a purpose for which they were made. If the object were to have feelings, these would be based on its desire to fulfill its essence. The purpose of a glass, for example, is to hold water.... The essence of a computer screen is to interface with a human. The essence of a unicycle is to be ridden in a circus. As for toys, their purpose is to be played with by kids, and thus their existential fear is of being discarded or upstaged by newer toys.

But Isaacson’s equation of “essences” with “purposes” only complicates matters further, since products can be made for purposes that have nothing to do with their essences. (Toys can be made for the purpose of making money.) Later Isaacson writes that “as usual Jobs pushed for the purest possible simplicity. That required determining what was the core essence of the [iPad]. The answer: the display screen.” But by this logic, the essence of a unicycle is the wheel rather than, as Isaacson himself claims above, being ridden in a circus.

I DO NOT MEAN to be pedantic. The question of essence and form, of purity and design, may seem abstract and obscure, but it lies at the heart of the Apple ethos. Apple’s metaphysics, as it might be called, did not originate in religion, but rather in architecture and design. It’s these two disciplines that supplied Jobs with his intellectual ambition. John Sculley, Apple’s former CEO, who ousted Jobs from his own company in the mid-1980s, maintained that “everything at Apple can be best understood through the lens of designing.” You cannot grasp how Apple thinks about the world—and about its own role in the world—without engaging with its design philosophy.

Isaacson gets closer to the heart of the matter when he discusses Jobs’s interest in the Bauhaus, as well as his and Ive’s obsession with Braun, but he does not push this line of inquiry far enough. Nor does he ask an obvious philosophical question: since essences do not drop from the sky, where do they come from? How can a non-existent product—say, the iPad—have an essence that can be discovered and then implemented in form? Is the iPad’s essence something that was dreamed up by Jobs and Ive, or does it exist independently of them in some kind of empyrean that they—by training or by visionary intuition—uniquely inhabit?

The idea that the form of a product should correspond to its essence does not simply mean that products should be designed with their intended use in mind. That a knife needs to be sharp so as to cut things is a non-controversial point accepted by most designers. The notion of essence as invoked by Jobs and Ive is more interesting and significant—more intellectually ambitious—because it is linked to the ideal of purity. No matter how trivial the object, there is nothing trivial about the pursuit of perfection. On closer analysis, the testimonies of both Jobs and Ive suggest that they did see essences existing independently of the designer—a position that is hard for a modern secular mind to accept, because it is, if not religious, then, as I say, startlingly Platonic.

This is where Apple’s intellectual patrimony—which spans the Bauhaus, its postwar successor in the Ulm School of Design, and Braun (Ulm’s closest collaborator in the corporate world)—comes into play. Those modernist institutions proclaimed and practiced an aesthetic of minimalism, and tried to strip their products of superfluous content and ornament (though not without internal disagreements over how to define the superfluous). All of them sought to marry technology to the arts. Jobs’s rhetorical attempt to present Apple as a company that bridges the worlds of technology and liberal arts was a Californian reiteration of the Bauhaus’s call to unite technology and the arts. As Walter Gropius, the founder of the Bauhaus, declared, “Art and technology—a new unity.”

Bauhaus ideas inspired Jobs throughout his career. Speaking at a design conference in Aspen in 1983, Jobs proposed that, in contrast to Sony’s heavy high-tech look, “new design should take Bauhaus as its starting point and relate more to the functionality and true character.” (“True character” was another term for “essence.”) Jobs went on to say that “what [we at Apple are] going to do is make the products high-tech, and we’re going to package them cleanly so that you know they’re high-tech. We will fit them in a small package, and then we can make them beautiful and white, just like Braun does with its electronics.” Isaacson claims that it was at Aspen that Jobs was exposed to “the spare and functional design philosophy of the Bauhaus movement,” since the buildings of the Aspen Institute (of which Isaacson is president and CEO) were co-designed by the Bauhaus veteran Herbert Bayer. Perhaps it was from Bayer—who in the mid-1920s designed a typeface that eschewed uppercase letters—that Jobs got the habit of signing his name in lowercase.

In all likelihood Jobs was familiar with the Bauhaus before he visited Aspen, since Bauhaus ideals—in a somewhat modified form—were embodied in the consumer electronics designed by Braun. The design philosophy of Dieter Rams, Braun’s legendary designer, has shaped the feel and the look of Apple’s latest products more than any other body of ideas. Soon after joining Braun in 1955, Rams—who likes to describe his approach to design as “less, but better”—began collaborating with the faculty at the Ulm School of Design, which tried to revive the creative spirit of the Bauhaus with a modicum of cybernetics and systems theory. Eventually Rams produced his own manifesto for what good design should accomplish. His “ten principles of good design” encouraged budding designers to embrace innovation and make products that were useful but environmentally friendly, thorough but simple, easy to understand but long-lasting, honest but unobtrusive. Rams wanted his products to be like English butlers: always available, but invisible and discreet.

One such budding designer was Ive, born in 1967 in London, who had been fascinated by the elegant simplicity of Braun products ever since his parents bought a Braun juicer in the 1970s. “It was the essence of juicing made material,” Ive writes in a moving foreword to a recent book on Rams. To him, Rams’s products “seem inevitable, challenging you to question whether there could possibly be a rational alternative. It is this clarity and purity that leads to the sense of inevitability and effortlessness that characterizes his work.” Ive’s tribute to his hero casts light upon the origin of Apple’s obsession with essence and purity. When Ive observes that “Rams’s genius lies in understanding and giving form to the very essence of an object’s being—almost describing its reason for existence,” he is plainly expressing his own credo, too.

The seventy-nine-year-old Rams, who claims not to like computers or to use them, and prefers to be called a “Gestalt engineer” rather than a designer, greatly appreciates the fact that his design legacy lives on in Apple. (Ive used to send him new Apple products.) In almost every interview, Rams is asked about Apple—and he is always quick to lavish praise on the company. Rams also likes to emphasize the hidden spirituality of his own products, and even claims that “my basic design philosophy and my ten principles are similar to the Zen philosophy.” Rams’s products were once exhibited in Kyoto’s historic Kenninji temple.

THE BAUHAUS LIVES ON in Apple in other ways as well. In addition to its minimalism, the Bauhaus championed an obsession with functionalism—the idea, revolutionary in its time, that form follows function. The Bauhaus enthusiasm for “function” is the precursor of Apple’s enthusiasm for “essence.” But how did the Bauhaus designers and architects explain the functions of their products and structures? Where did they come from, and how were they discovered? In a superb essay from 1995 on the question of why the notion of “form follows function” was so attractive to the Bauhaus, the design historian Jan Michl observed that, instead of appealing to religious deities or nature to justify the product’s functions, modernist designers appealed to something abstract and objective: the spirit of the times. “Functionalists proper,” Michl writes, “those of the 1920s and 1930s, did not refer to God but rather to demands of the ‘Zeitgeist’, ‘Modern Epoch’, or ‘Machine Age’.... They referred to Purposes of an other-than-human Intelligence. Such Purposes, in not being human purposes, were allegedly no longer subjective but objective, and as such they sanctioned the vision of objective design.” Writing in 1926, Gropius, director of the Bauhaus at the time, proclaimed that “the Bauhaus seeks—by the means of systematic theoretical and practical research into formal, technical and economic fields—to derive the form of an object from its natural functions and limitations.”

The task of the designer, then, was not to please or to innovate. It was to uncover and to reveal—rather like scientists; for design is just a tangible, natural, and objective byproduct of history. As Michl put it, “Functional forms do not simply appeal to taste, because they are a matter of truth—and truth does not pander to taste.” It is no wonder that the functionalists loved to tout the supposed timelessness of their forms: truth, after all, has only one timeless form, or so many members of the Bauhaus believed. (That the modern epoch may eventually come to an end and be followed by an epoch with a different set of values and needs did not occur to them.) No wonder, either, that the highest compliment Jony Ive could bestow on Dieter Rams was to call the design of his products “inevitable.”

Functionalism gave modernist designers the illusion of working outside of the crass realities of the commercial marketplace—they were in the truth-seeking business, after all—while also infusing them with a sense of autonomy that elevated them to the ranks of poets and artists: they were no longer mere instruments in the hands of big business. And functionalism had something to offer their customers as well. By pretending that customers did not exist, functionalism paradoxically elevated them to a kind of priestly status. In what is probably the best explanation of Apple’s astonishing appeal to consumers, Michl notes about the Bauhaus’s ideology that

to be defined as a user worthy of the functionalist architect’s attention one had first to qualify as Modern Man, i.e., a person whose likes and dislikes were practically identical to those of the modernist architect himself. The functionalist references to users never suggested a readiness to consider the users’ wishes, demands, and needs on their own merits. The individual client, who until the arrival of modernism was thought to have a legitimate say in both functional and aesthetic matters, was now on his way to becoming an unperson. This was only logical: since forms were claimed to be intrinsic to functional solutions, there was no reason to take the form- or function-related preferences of clients and users seriously.

For consumers, to embrace such products was to embrace the higher spirit of modernity—a lesson that Steve Jobs understood all too well. Jobs famously expressed his utter indifference to the customer, who in his view does not really know what he wants. Apple’s most incredible trick, accomplished by marketing as much as by philosophy, is to allow its customers to feel as if they are personally making history—that they are a sort of spiritual-historical elite, even if there are many millions of them. The purchaser of an Apple product has been made to feel as if he were taking part in a world-historical mission, in a revolution; and Jobs was so fond of revolutionary rhetoric that Rolling Stone dubbed him “Mr. Revolution.”

There was hardly an interview in which Jobs did not dramatize, and speak almost apocalyptically about, the stakes involved in buying Apple’s products. If Apple were to lose to IBM, “we are going to enter sort of a computer Dark Ages for about twenty years.” Or: “If Apple falters, innovation will cease. We will go into a ‘dark ages’ in computing.” He saw revolutionary potential in the most obscure things; he even claimed he had “never seen a revolution as profound” as object-oriented programming—a niche field that was the focus of his work before he returned to Apple. Occasionally his revolutionary rhetoric spread to non-Apple products as well; Time once quoted him as saying that Segway—yes, Segway—was “as big a deal as the PC.”

No wonder that the counterculture fizzled in the early 1980s: everyone was promised they could change the world by buying a Macintosh. Linking Apple to the historical process (Hegel comes to Palo Alto!), and convincing the marketplace that the company always represented the good side in any conflict, broke new ground in promotional creativity. Jobs turned to the power of culture to sell his products. He was a marketing genius because he was always appealing to the meaning of life. With its first batch of computers, Apple successfully appropriated the theme of the decentralization of power in technology—then also present in the deep ecology and appropriate technology movements—that was so dear to the New Left a decade earlier. If people were longing for technology that was small and beautiful—to borrow E.F. Schumacher’s then-popular slogan—Jobs would give it to them. Apple allowed people who had missed all the important fights of their era to participate in a battle of their own—a battle for progress, humanity, innovation. And it was a battle that was to be won in the stores. As Apple’s marketing director in the early 1980s told Esquire, “We all felt as though we had missed the civil rights movement. We had missed Vietnam. What we had was the Macintosh.” The consumer as revolutionary: it was altogether brilliant, and of course a terrible delusion.

THE LAST DECADE—Apple’s most successful—has been even more intriguing, for the company once again built products that perfectly responded to the spiritual and aspirational demands of the day—or at least it did an excellent job convincing people that this was the case. Amid all the current brouhaha about the liberating impact of social media, it is easy to forget that, as far as technology was concerned, the last decade began on a rather depressing note. First came the bursting of the dot-com bubble, which all but shattered the starry-eyed cyber-optimism of the 1990s. It was quickly followed by September 11—hardly an occasion to celebrate the wonders of modern technology. The hijacked airplanes, the collapsing twin towers of the World Trade Center, the failure of the American military—the most technologically savvy force in the world—to do much about it, the invisible surveillance of our electronic communications in AT&T intercept facilities: it seemed as if technology was either malfunctioning or profoundly repressive.

Apple’s response to the mood of the time was to build technology that was easy to use and worked flawlessly. While its gadgets looked plain enough to match the sober spirit of the country, they also teased their users with the promise of liberation. “It just works”—Jobs’s signature promise at product launches—was soothing to a nation excited and addled and traumatized by technology. Nothing could go wrong: Apple had thought of everything. The technology would work as advertised; it was under total control; it would not get hacked. Apple promised a world in which technology would be humane, and used to ameliorate—rather than undermine—the human condition. For much of the last decade, Apple was not just selling gadgets; it was also selling technologically mediated therapy—and America, a nation that likes to cure its ills with redemptive shopping, could not resist the temptation.

It should be noted that Apple’s therapeutic role was not unique. The British historian Paul Betts argues that Braun’s products played a similar role in postwar West Germany, helping it to move beyond the fascist aestheticization of politics by means of grotesque political rallies and grandiose architecture to the postwar aestheticization of everyday life by means of sleek and efficient consumer electronics. As Betts puts it in The Authority of Everyday Objects, his magnificent cultural history of West German industrial design in the 1950s, “‘Neofunctionalism’ ... derived its moral authority from the specific postwar situation. In a country devastated by war and the crushing shortage of necessary goods and materials, the call for simple, practical, and long-lasting design was hailed as the very expression of a new postwar moral economy, one that did not squander precious resources or bow to black market pressures to pass off shoddily designed goods.” Braun’s products just worked.

The best designers of that era—and they were to be found teaching at Ulm and working at Braun—tapped into the same themes of humanity and spirituality that Apple would express and exploit half a century later. “The ’50s world culture of high design was very keen on uniting design practice and humanist culture,” Betts remarks. “The design ware was redefined as a distinctive ‘cultural good’ possessing certain ethical qualities and even a spiritual essence.” The parallels with Apple are obvious. Dieter Rams claimed that “designers can render a very concrete and effective contribution towards a more humane existence on earth in the future,” and Steve Jobs propounded something quite similar a few decades later. One of his more important goals in life, he said, was “creating great things instead of making money, putting things back into the stream of history and of human consciousness as much as I could.” In this endeavor, Jobs excelled. His machines would enhance—or so he insisted—nothing less than human life. No one ever mastered the art of retailing humanism—in beautiful boxes sold in beautiful stores—better than he did.

APPLE’S LINKS to the Bauhaus, Ulm, and Braun suggest that the company has always operated in a much richer intellectual tradition than is generally recognized. The conventional view—that Apple is unique, so exceptional and so unpredictable that it defies easy categorization—says more about the inability of technology analysts to cut through Apple’s design philosophy than about the company itself. That philosophy, while dense, has been quite consistent over time; and while it has produced a bevy of beautiful products that are tremendously popular with the general public, it would be wrong to ascribe Apple’s success to superior design alone. Jobs never hid the fact that ultimately he was in the business of selling not computers but dreams. He was quite sincere about this. However harsh his business practices were, in his beliefs there was not a trace of cynicism.

Explaining the sad plight of his company in the early 1990s—when Apple was on a path to self-destruction while he was away—Jobs said that “Apple didn’t fail.... We succeeded so well, we got everyone else to dream the same dream.... The trouble is the dream didn’t evolve. Apple stopped creating.” It was a curious admission that Apple was not merely in the business of fulfilling customers’ dreams, but also of creating them from scratch. Apple is in fact a textbook example of “consumer engineering”—a concept eagerly embraced by the American advertising industry in the 1930s to try to get America out of the Depression. “Consumer engineering,” wrote Roy Sheldon and Egmont Arens in their classic work on the subject, “is the science of finding customers, and it involves the making of customers when the findings are slim.” The motto embraced by Regis McKenna, Apple’s p.r. firm, in the early 1980s was not all that different: Markets are made, not won.

It is hard—but not impossible—to reconcile such a view of business with the Bauhaus-inspired rhetoric of purity, essence, and function. If gadgets and devices in their ideal form exist independently of reality and are to be discovered by epistemologically privileged designers, then the science of making customers out of thin air can be justified only on the grounds that the designer is a kind of prophet who has access to a higher truth and needs to spread that truth as widely, as evangelically, and as remorselessly as possible.

Those who attack Apple as a quasi-religion are more correct than they know: the company does function on the assumption that its designers, and Steve Jobs above all, are qualitatively different from the rest of us. The cult of the designer is the foundation of Apple’s secular religion. And there is a way for the rest of us to participate in the truth upon which the design is based, and to rise to the human level of the designers themselves: it is to buy an iPhone or an iPod. This is how Jobs explained the superiority of the iPod over other MP3 players: “We won because we personally love music. We made the iPod for ourselves.” Nothing ambiguous about that. Apple products are built by gods for gods. And in a free market, this privilege is available to anyone with the understanding and the money to acquire it.

Jobs, the Modern Man par excellence, did not do market research; all he needed was to examine himself. An Apple manager once described the company’s marketing research as “Steve looking in the mirror every morning and asking himself what he wanted.” That is not just a description of narcissism; it is also the natural consequence of viewing the designer as the medium through which the truth speaks to the world. So what if customers did not like some of his products? Jobs’s mirror told him not to worry. As Isaacson puts it, “Jobs did not believe the customer was always right; if they wanted to resist using a mouse, they were wrong.” He also notes that one of Jobs’s ex-girlfriends recalled that they had “a basic philosophical difference about whether aesthetic tastes were fundamentally individual, as [she] believed, or universal and could be taught, as Jobs believed.” “Steve believed it was our job to teach people aesthetics, to teach people what they should like,” she said. This smacks more of Matthew Arnold and Victorian Britain than of Timothy Leary and California in the 1970s.

As a philosophy, of course, this borders on paternalism, if not authoritarianism—something that Der Spiegel failed to grasp when it celebrated Jobs as a philosopher; but philosophy has no place on corporate spreadsheets. For many people, Apple’s success itself justifies Apple’s philosophy. What’s more intriguing, though, is how a company infused with such conservative ideas emerged as the most authentic representative of the counterculture, with Jobs arguing that he was simply extending the “power to the people” fight to the world of computing. Jobs wanted every household in the world to have an Apple product so that he could teach the bastards proper aesthetics: this was emancipation from the top down. It is a strange way to promote empowerment.

No wonder, then, that when he was asked about his life’s goal, Jobs replied that it was “to seek enlightenment—however you define it.” Here he is once again indebted to the functionalist ideology of the Bauhaus and its successors. Dieter Rams sees the mission of the designer in similar terms: “We can only expect constructive progress from companies and designers that ... accept that they play an economic, civilizing, and ... cultural role in society.” Economic and cultural—sure. But civilizing? Was the world before Apple uncivilized? Do machines bring civilization? This is the same missionary agenda that gave us modernist apartment blocks with pre-installed blinds and furniture: it may have maximized the purity of the form, but only by disregarding individual idiosyncrasies. The difference is that Apple induces us to buy its civilizing products through clever marketing, while some people were forced by government fiat to move into Le Corbusier-inspired large-scale housing projects.

The Bauhaus was over before the modern advertising industry—at least as we know it today—came of age. Gropius, Breuer, Itten, and the other Bauhaus luminaries never properly theorized the relationship between design and advertising, even if they did print a lot of stylish ads. (Things were a bit more complicated with the New Bauhaus run by Moholy-Nagy out of Chicago.) Of course, neither the Bauhaus nor Ulm shied away from commercialism—both worked very closely with industry, and Braun is an apt illustration of how fruitful such collaborations were; but it is difficult to imagine either school being as obsessed with marketing as Jobs, who, at a weekly meeting, would approve every new commercial, advertisement, and billboard. (The left-wingers and socialists at the Bauhaus would certainly have looked askance at such an emphasis on advertising, or capitalist propaganda.) Jobs’s enormous creativity extended also to the practice of marketing. He set a new tone for product launches—his well-rehearsed unveiling of the Macintosh in 1984 was reportedly inspired by his reading about the tremendous success of Star Wars on its opening day—and he made American advertising look like art. (Apple’s celebrated “1984” commercial was the first American ad to win a Grand Prix at Cannes.)

Jobs never lost an opportunity to embellish his story or to connect dots that did not exist. The Economist’s cover on the week that Jobs died introduced him as “The Magician,” but a more accurate description would have been “The Mythmaker.” While there hardly exists an interview in which Jobs did not emphasize Apple’s origins in a garage—he liked to ruminate about “the purity of the garage,” and he described his rebellious Macintosh project as “the metaphysical garage”—Apple’s other co-founder, Steve Wozniak, always maintained that the garage played a very marginal role in how the first Apple computer was built. “I built most of it in my apartment and in my office at Hewlett-Packard,” he told Rolling Stone in 1996. “I don’t know where the whole garage thing came from.... Very little work was done there.”

It is odd that a man so enraptured by purity never seemed worried that the obsession with marketing might dilute the pristine nature of his products. Jobs’s most impressive achievement was to persuade the shackled masses that they could see the Platonic forms without ever leaving their caves. Marketing—with its shallowness and its insidious manipulation of the consumer—would normally be relegated to the inferior realm of appearances, but it took on a different function in Jobs’s business metaphysics: it played the gospel-like role of showing us the way to the true, natural, and pure products that have not yet been spoiled by the suffocating and tasteless ethos of faceless corporations such as IBM and Microsoft. That Jobs could launch a campaign against capitalism by using capitalism’s favorite weapon—and get away with it!—was truly remarkable.

III.

APPLE’S EXTRAORDINARY success in the last decade owes a great deal to its dogged and methodical commitment to understanding and avoiding the failures of other technology companies. Apple respects business history like no other company. Perhaps it was his early fascination with Hewlett-Packard—when he was twelve, Jobs cold-called Bill Hewlett and ended up getting a summer job at the company—that left Jobs always measuring Apple against veterans of the computer industry. As early as 1985 he was struggling with the existential question of why great companies stop innovating and die. “Ten to 15 years ago, if you asked people to make a list of the five most exciting companies in America, Polaroid and Xerox would have been on everyone’s list,” he ruminated to Playboy. “Where are they now? They would be on no one’s list today. What happened?” The case of Polaroid’s Edwin Land—a college dropout who built an extremely innovative company that eventually fell apart—haunted Jobs for a long time.

The answer Jobs offered at the time—that “companies, as they grow to become multibillion-dollar entities, somehow lose their vision [and] insert lots of layers of middle management between the people running the company and the people doing the work”—seems banal, but the thorny question of legacy, of what kind of company Apple would become after he was gone, stayed with him for decades. He told Isaacson that one of his goals was “to do what Hewlett and his friend David Packard had done, which was create a company that was so imbued with innovative creativity that it would outlive them.”

After his triumphant comeback, Jobs’s preferred way of ensuring that his company did not turn into another Polaroid was to put Apple in a permanent state of emergency—a state where normal business rules would not apply, where the color code for terror would always be red, where no business partners would be trusted, and where every Apple product would be treated as if it were its last. Inside Apple, a new book by Adam Lashinsky about the culture of the company, suggests that new Apple hires are often assigned to fake or ambiguous products so that their loyalty can be tested before they are put on real projects. “Don’t get too comfortable” is the spirit that Jobs wanted to cultivate at Apple on his return.

It is hardly surprising that Apple has accumulated almost $100 billion in reserves. Since Jobs could not know who would stab him in the back next, he wanted to have on hand enough resources for a battle. He was always expecting a battle. His legendary single-mindedness was partly a refusal to be distracted from the Darwinian environment, to desist from the fight against the internal decay or the external competition that he feared would strike Apple at any moment. That his role model Hewlett-Packard was beginning to implode much like Polaroid must have only intensified his paranoia. (He told Isaacson that it was “tragic” that H.P. was “being dismembered and destroyed.”) So instead of watching Apple lose its innovative edge and get crushed by some arrogant start-up, Jobs took two pre-emptive measures. One was to ensure that Apple would never get too attached to a particular product line, no matter how profitable it was. The other was to maximize its autonomy in the marketplace and ensure that it could not be pushed around by others.

HERE JOBS was drawing on another intellectual tradition that Isaacson mentions at least in passing. Jobs’s thinking about how great companies come to live, innovate, and die was influenced by Clayton M. Christensen’s book The Innovator’s Dilemma, which appeared in 1997. Christensen, a theorist of innovation at Harvard Business School, postulated that many companies fall victim to their own success: having discovered a product that could turn their business into a gold mine, they hold on to it for too long. In the meantime they become too slow, inattentive, and complacent to see what might disrupt their own business. Thus, camera-makers were too slow to grasp the implications of digital photography and cellphones; map-makers missed the advent of GPS technology; search engines failed to grasp the importance of social networking.

Seen through the prism of The Innovator’s Dilemma, Apple’s business strategy is easier to discern. A conventional technology company might hesitate to launch a phone that does everything that its own highly profitable music player is already capable of doing, because the sales of the phone might cannibalize the sales of the player. Likewise, a conventional technology company might be reluctant to launch a tablet computer that would compete with its own profitable line of laptops and desktops. But Apple defied such conventions. It has consistently been taking risks—internecine risks, competing against itself. Not only does it introduce products that vie with each other, but it is not afraid to say so: one of the first ads for the iPhone noted that “there has never been an iPod that can do this.” Apple’s reasoning seemed to be that, while sales cannibalization may eat into short-term profits, it is not the worst thing that can happen to a great company. Whatever it may lose in sales, the company would gain in innovation—that is, its designers and engineers would never get a chance to slack off—and in branding: that is, new products (released with impeccable regularity) would guarantee regular press coverage and produce an even stronger association of its brand with progress and innovation.

Even though Apple is often criticized for screening which apps are allowed to run on its phones and its tablets, it has been far more permissive—especially when it comes to the apps of its competitors—than one would expect. Today anyone with an iPad can stream music from Spotify and watch films from Netflix, which bypass Apple’s iTunes store and directly undermine one of Apple’s revenue streams; and anyone can read newspapers, magazines, and books through Amazon’s Kindle app, again bypassing Apple’s own iBooks store and App Store. Apple does not seem to mind. (It did not initially allow apps from Skype and Google Voice, but this was probably driven by its commitment to AT&T, its exclusive carrier partner, rather than by its fear of eroded business models.)

Of course, innovation potential and smart branding may not be immediately reflected in a company’s stock price. For many firms, the lure of easy profits from existing products is too great to resist—especially given that they are run by hired managers who do not much care about legacy and will find a cushy new job anyway. Contrasting Apple’s cavalier attitude to profit with the rest of the corporate world, Christensen has noted that “most companies cannot bring themselves to make decisions that result in the market for their existing core products being completely destroyed. When they consider it from a financial perspective, it just doesn’t make sense to create new products at the risk of jeopardizing your profitable, existing products.... It’s exactly that fear that has led many great companies to leave themselves vulnerable to disruption from others.” Obsessed with issues of legacy, Jobs was different in that he went in for the long term; and by focusing on innovation and branding rather than on the immediate bottom line, he delivered more value to his company than any other CEO in recent history. If you had invested $100,000 in Apple in 1997 when Jobs returned to the company, your shares would have been worth $6.86 million on his retirement.

INNOVATION IS one thing, but making money off it is another. Were Apple to stick to hardware manufacturing—putting all its energy into designing its phones, tablets, and computers while someone else was building the operating system—it would need to ensure that all its latest hardware innovations would actually be supported by the next version of someone else’s operating system. This, of course, might not happen, or it might happen too slowly. Jobs sought to avoid such dependence at all costs. As he saw it, Apple in its short history had already accumulated too many scars—most of them from competitors and former collaborators. It would take risks and enter into business relationships only in situations where it could expect to gain autonomy, or at least the upper hand.

Jobs understood that abandoning control at any point in the chain—he would eventually build his own retail stores to manage the very last stage in the process—comes with risks, whatever the savings. For Apple to innovate at its own pace, which was the only option acceptable to Jobs, it had to control everything or get out. As Jobs put it, “We didn’t want to get into any business where we didn’t own or control the primary technology because you’ll get your head handed to you.” This does not mean that Apple is an autarky. Its goal has never been self-sufficiency; it does not aim to produce every single component in-house. Its ultimate goal is innovation, which, in Jobs’s view, could only come with relative independence. To that end, working with other companies is acceptable as long as it does not give any single supplier outsize influence on the company. Apple’s list of suppliers—released for the first time earlier this year—names 156 companies, which together account for 97 percent of its procurement expenditures.

Can Apple’s obsession with innovation become a liability, blinding it to numerous ethical issues involved in the manufacture of its products? This is an increasingly burning question. Isaacson recounts a visit that Danielle Mitterrand, then France’s first lady, paid to Apple in the early 1980s. When she asked Jobs about overtime pay and vacation time for workers, Jobs got annoyed and told her interpreter that “if she’s so interested in their welfare, tell her she can come work here any time.” Isaacson fails to relate this episode to Apple’s more recent troubles with working conditions in the factories of its China-based suppliers, a subject that—inexplicably—is never broached in the book. The controversy is casting a growing cloud over Apple’s image. But Jobs promised aesthetic purity, not moral purity. Apple has certainly not been deaf to the mounting public concern over how its gadgets are manufactured—it has recently joined the Fair Labor Association, a respected non-profit; but it is not clear how far it is willing to go in revamping its supply chain. Given his ambition to minimize his company’s dependence on any one player, it is surprising that Jobs was willing to place so many of Apple’s eggs in the basket of Foxconn, its largest and most disturbing partner in China.

IV.

JOBS NEVER PRODUCED a coherent theory of technology, but his curt responses to interviewers reveal that he was indeed prepared to think about technology philosophically. In one interview after another, Jobs comes off as a pragmatic but sophisticated thinker—certainly not your average one-dimensional tech-loving engineer. Jobs accepted that his products would be used and modified in unforeseeable ways: “People are creative animals and will figure out clever new ways to use tools that the inventor never imagined.” He was aware of the pernicious appeal and the low effectiveness of technological fixes (“What’s wrong with education cannot be fixed with technology. No amount of technology will make a dent. It’s a political problem”), and he seemed to believe that much of the digital revolution was being overhyped (“It’s a disservice to constantly put things in this radical new light—that it’s going to change everything. Things don’t have to change the world to be important”).

While most technologists believe that technology is value-neutral, Jobs was prepared to talk about the values embedded in the gadgets and the appliances that we use. This was most evident when he described how he went about buying a washing machine:

We spent some time in our family talking about what’s the trade-off we want to make. We ended up talking a lot about design, but also about the values of our family. Did we care most about getting our wash done in an hour versus an hour and a half? Or did we care most about our clothes feeling really soft and lasting longer? Did we care about using a quarter of the water? We spent about two weeks talking about this every night at the dinner table.

Jobs’s meticulous unpacking of the values embedded in different washing machines, and his insistence on comparing them to the values he wanted to live by, would be applauded by moralistic philosophers of technology from Heidegger to Ellul, though it may be a rather arduous way of getting on with life. But Jobs understood the central point that philosophers of technology had tried (and failed) to impart: that technology embodies morality.

Jobs himself was never shy about the value that Apple products were to embody: it was liberation—from manual work, from being limited to just a few dozen songs on your music player, from being unable to browse the Internet on your phone. Yet liberation is hardly the only value that matters. We need to identify the other moral instructions that may be embedded in a technology and that it promotes, directly or indirectly. And this fuller analysis requires going beyond studying the immediate impact on the user and engaging with the broader—let us call it the “ecological”—impact of a device. (“Ecological” here has no environmental connotations; it simply indicates that a technology may affect not only its producer and its user, but also the values and the habits of the community in which they live.)

Whether a washing machine uses a quarter of the water or more matters morally only if its users can establish a causal connection between water use and climate change, ocean depletion, or some other general concern. Jobs understood this. The problem was that Jobs, while perfectly capable of interrogating technology and asking all the right questions about its impact on our lives, blatantly refused to do so when it came to his own products. He may have been the ultimate philosopher of the washing machine, but he offered little in the way of critical thinking about the values embedded in the Macintosh, the iPod, and the iPad. When he discussed his own products, he switched from philosophical reflection on the effects of consumer choices to his Bauhaus mode of the vatic designer.

Tellingly, it was not the washing machine that Jobs invoked to promote his gadgets, but the automobile. In the early 1980s, he regularly compared the computer to the automobile, stressing the emancipatory power of the latter. The Macintosh was pitched as a Volkswagen, and the more expensive Lisa model as a Maserati. Jobs spoke of “the Crankless Computer”: in a memo he declared that “personal computers are now at the stage where cars were when they needed to be cranked by hand to be started.” In his Playboy interview, he said that “people really don’t have to understand how computers work. Most people have no concept of how an automatic transmission works, yet they know how to drive a car.” He persistently praised Henry Ford for making cars accessible without following the whims of his customers; it was apparent that he also liked to think of himself that way. On the whole, the media found Jobs’s car analogy persuasive. In 1982, The New York Times reported that Apple was “to the personal computer what the Model T Ford was to automobiles.”

On the surface, the car analogy seems flawless: both technologies allowed customers to do what they wanted, and boosted their autonomy, and gave them more choices about how to live their lives. But as any environmentalist, urban-planning activist, or committed cyclist can attest, liberation was only one part of the impact that the automobile had on how we live, especially in America. Congestion, pollution, suburban sprawl, the decline of public transportation, the destruction of public space in the name of building more highways—these are only some of the less discussed effects of the automobile. Of course, the automobile did not have the same effects everywhere—compare how easy and pleasant it is to get around without a car in Portland versus Dallas—so simple appeals to technological determinism, or to the zeitgeist, or to the canonical myths about how the automobile would transform and liberate our culture, do not explain very much. Some cities and communities simply approached the automobile with the kind of philosophical sensibility that Jobs applied to his washing machine, and others did not.

NOW, WHAT DOES all this tell us about Apple? What is its ecological impact, and should we fear it? It is tempting to point to the impending death of bookstores and music stores and suggest that Apple—along with Amazon, Google, and Netflix—is culpable. Technologists already have a well-rehearsed rejoinder: that this is all just “creative destruction,” and what needs to be preserved is the content, not the distributional channel. At least Apple, they say, has been something of a savior to record labels and book publishers. Well, perhaps—but bookstores have other important functions for the individual and the community that will not be easily replicated online.

Yet this is not the most interesting and troubling aspect of Apple’s ecological impact. Ironically enough, the most consequential of Apple’s threats is not to the physical but to the virtual: the company may eventually suffocate the Internet. Apple’s embrace of the “app paradigm”—whereby activities that have been previously conducted on our browsers shift to dedicated software applications on our phones and tablets—may be destroying the Internet in much the same way that the automobile destroyed the sidewalks and the playgrounds.

The idea of the Internet is still too young to produce strong anti-app sentiment. We do not yet have an adequate understanding of cyberspace as space. While it is safe to speculate that different design arrangements of the online world give rise to different aesthetic experiences, we still do not know the exact nature of this relationship. Nor do we know enough about how the design and the interconnection of online platforms affect the distribution of civic virtues—solidarity, equality, and flânerie, to name just a few—that we may wish to promote online. Just as we recognized many of the important civic functions of the sidewalk only after it had been replaced by the highway, so we may currently be blind to those virtues of the Internet—its inefficiency, its unpredictability, its disorder—that may ultimately produce a civic and aesthetic experience that is superior to the “automatic, effortless, and seamless” (one of Apple’s advertising slogans) world of the app.

The point is not that we should forever cling to the shape and the format of the Internet as it exists today. It is that we should (to borrow Apple’s favorite phrase) “think different” and pay attention to the aesthetic and civic externalities of the app economy. Our choice is between erecting a virtual Portland and sleepwalking into a virtual Dallas. But Apple under Steve Jobs consistently refused to recognize that there is something valuable about the Web that it may be destroying. Jobs’s own views on the Internet stand in stark contrast to how he thought about the washing machine. Asked about the future of the Internet in 1994, he was clearly reluctant to think about it in ecological terms:

Rolling Stone: Let’s talk more about the Internet. Every month, it’s growing by leaps and bounds. How is this new communications web going to affect the way we live in the future?
Jobs: I don’t think it’s too good to talk about these kinds of things. You can open up any book and hear all about this kind of garbage.
Rolling Stone: I’m interested in hearing your ideas.
Jobs: I don’t think of the world that way. I’m a tool builder. That’s how I think of myself. I want to build really good tools that I know in my gut and my heart will be valuable. And then whatever happens is ... you can’t really predict exactly what will happen, but you can feel the direction that we’re going. And that’s about as close as you can get. Then you just stand back and get out of the way, and these things take on a life of their own.

Had Henry Ford been asked about the impact of his cars on the quality of urban life in America, he would probably have given the same answer.

Standing back, getting out of the way, and letting things take on a life of their own may be a sensible way to think about a wildly successful product, but it is not a variety of moral reflection. The total and exclusive focus on the tool at the expense of its ecosystem, the appeal to the zeitgeist that downplays the producer’s own role in shaping it (“whatever happens is ... ”; “feeling the direction”), the invocation of the idea that technology is autonomous (“these things take on a life of their own”)—these are all elements of a worldview that Lewis Mumford, in criticizing the small-mindedness of those who were promoting car-only travel in the 1950s, dubbed “the bankruptcy of social imagination.”

Should we hold the Ford Motor Company responsible for the totality of its impact on our lives, or just for the part that deals with liberation and autonomy? Perhaps it would set the bar too high to hold it accountable for pollution, congestion, and the disappearance of public space. But Apple’s brand, its lofty conception of itself, has been built on the idea that it is not a company like other companies. It was Apple that insisted that it wanted to think different, and that it was not dominated by “suits” who care only about quarterly earnings. So it was Apple that set this bar so high—and Apple that seems to have fallen short of it.

CURIOUSLY, A YEAR AFTER his tirade in Rolling Stone, Jobs gave another interview—to Newsweek—and offered a more cogent (and more disturbing) prediction about the future of the Internet:

The way to look at the Web is, it’s the ultimate direct-to-customer distribution channel.... You’re going to see more and more Web sites where you feel like you’re driving, where you’re asking questions.... You won’t be looking at a Web page that 3,000 other people are looking at. You’re looking at one that’s exactly what you want to see, whether it’s information on that new Chrysler Neon that you want to buy, with exactly the color you want and the dealers that have it in your area, or whether it’s Merrill Lynch showing you your portfolio of stock, updated every time you check in. It’s really going to be customized.

This is an accurate description of what Apple is doing at the moment—except that instead of customized Web pages, the customization is taking the form of personalized apps. But as the interview makes clear, Jobs outright rejected the possibility that there may be a multiplicity of irreconcilable views as to what the Web is and what it should be. For him, it was only a “direct-to-customer distribution channel.” In other words, Jobs believed that the Web is nothing more than an efficient shopping mall, and he proceeded to build his business around what he believed to be the Web’s essence.

That the Web did become a shopping mall fifteen years after Jobs made his remark does not mean that he got the Web right. It means only that a powerful technology company that wants to change the Web as it pleases can currently do so with little or no resistance from anyone. If one day Apple decides to remove the built-in browser from the iPad, as the Web becomes less necessary in an apped world, it will not be because things took on a life of their own, but because Apple refused to investigate what other possible directions—or forms of life—“things” might have taken. For Jobs, with his pre-political mind, there was no other way to think about the Internet than to rely on the tired binary poles of supply and demand.

This is not to say that Apple’s embrace of apps at the expense of the Web is bad for innovation—a charge that is often leveled at the company by Internet academics and advocates of the vague ideal of “Internet openness.” The concern of thinkers such as Jonathan Zittrain (who uses the term “generativity” to refer to innovation) is that Apple—which inspects and approves every single app submitted to its app store—may not be able to recognize the next Wikipedia when it is sent its way. They propose that something—it is rarely specified what—needs to be done so that Apple loses its fantastic gatekeeping powers. Anyone not steeped in the world of Internet theory would find the idea that Apple harms innovation quite ridiculous. If Harvard’s Christensen is right, moreover, there will inevitably come a moment when the Internet itself gets disrupted—by apps or something else—and such disruption would probably be good for innovation. Yet the promotion of innovation cannot be the sole determinant of how our digital future will be mapped. Ethical and aesthetic considerations should also serve as an important impetus for regulation and activism. But since most discussions about the future of the Internet have been dominated by lawyers and venture capitalists, innovation remains the overriding concern. And as long as innovation is the value that dominates the public debate about the Internet, Apple has nothing to fear.

What is most troubling is that Apple is not doing anything to examine its online footprint. Perhaps Apple’s design mentality—combined with its messianic self-portrayal as the only company in the world that is fighting some anonymous corporate menace (even as it is one of the most valuable companies in the world!)—has worn down its ability to ask the sort of big-picture questions that Jobs was so prone to asking in his youth. Apple, with its total fixation on the user and its complete disregard of the community in which that user is grounded, does not seem well-equipped to identify and evaluate the threats that it poses to the Internet, let alone do something about them. You would be hard-pressed to see Apple—the largest technology company in the world—sponsor events, festivals, think tanks, books, or any other kind of research or debate about technology. It is as if the company were convinced that the intellectual justifications of its work are all self-evident. After all, why revisit truth?

Even Google, with its naïve technocratic ethos, is more committed to questioning the impact that it is having on the Internet and the world at large. It funds a bevy of academic and policy initiatives; it has recently launched a Berlin-based think tank dedicated to exploring the social impact of the Internet; it has even started a quarterly magazine. Granted, Google is doing this partly in response to mounting regulatory pressure, but even so one must acknowledge that Google has not shied away from engaging many of its critics. Apple, by contrast, holds itself above the fray. It seems to believe that such discussions of meanings and consequences do not matter, because it is in the design business, and so its primary relationship is with the user, not with society. This may be what some parochial designers thought about themselves until the 1970s, but today the advent of design that is critical, value-sensitive, and participatory has exposed the great moral void of the rigid functionalist paradigm. Apple, alas, remains stuck in the most conservative, outdated, and bizarre interpretation of the Bauhaus, which was, ironically, a movement that flaunted its commitment to social reform and utopian socialism. Would a job applicant who spends weeks pondering the morality of washing machines get a job at Apple now?

Unfortunately, most of us are too addicted to Apple’s products to demand or to expect anything more of the company. As long as Apple can ship new devices every quarter, much as a dealer would ship new drugs, few questions are asked. How little has changed since Lewis Mumford complained that

For most Americans, progress means accepting what is new because it is new, and discarding what is old because it is old. This may be good for a rapid turnover in business, but it is bad for continuity and stability in life. Progress, in an organic sense, should be cumulative, and though a certain amount of rubbish-clearing is always necessary, we lose part of the gain offered by a new invention if we automatically discard all the still valuable inventions that preceded it.

As for Jobs, his own tragic limitations were on full display when Rolling Stone asked him about the future of technology—whether genetic research and cloning “were pushing it all too far.” He rolled his eyes. “You know—I’d rather just talk about music. These big-picture questions are just—zzzzzzzz,” he said, and started snoring. The philosopher of the twenty-first century, indeed.

Evgeny Morozov is the author, most recently, of The Net Delusion: The Dark Side of Internet Freedom (PublicAffairs). This article appeared in the March 15, 2012 issue of the magazine.