The Language of War:
Literature and Culture in the U.S.
From the Civil War Through World War II
by James Dawes
(Harvard University Press, 300 pp., $39.95)
"The real war," Walt Whitman wrote soon after Appomattox, "will never get in the books." In "The Wound Dresser" and other poems, Whitman tried to transcribe his Civil War experience in a Washington hospital, where he tended the dismembered and the dying. But he sensed that there was something new about the carnage of modern war, something that resisted literary convention and ultimately language itself. He was not alone. The crowds who gathered to hear casualty reports, which had been telegraphed to the metropolitan newspapers, seemed also to realize the incapacity of words to make mass death meaningful. "For the benefit of some who had no papers, one of us read the telegram aloud, while all listened silently and attentively," Whitman recalled. "No remark was made by any of the crowd, which had increased to thirty or forty, but all stood a minute or two, I remember, before they dispers'd. I can almost see them there now, under the lamps at midnight again."
Under those lamps at midnight, a modern sensibility was being born. Whitman was present at the emergence of an important vernacular perspective, one that included modernist writers as well as unlettered citizens. It was a point of view that recognized the horrors of modern war as truly unspeakable, that refused the false comfort of vacant utterance. Its classic expression was in Ernest Hemingway's A Farewell to Arms (1929), when the protagonist Frederic Henry hears a fellow soldier say their comrades could not have died in vain.
I was always embarrassed by the words sacred, glorious, and sacrifice and the expression in vain. We had heard them, sometimes standing in the rain out of earshot, so that only the shouted words came through, and had read them, on proclamations that were slapped up by billposters over other proclamations, now for a long time, and I had seen nothing sacred, and the things that were glorious had no glory, and the sacrifices were like the stockyards at Chicago if nothing was done with the meat except to bury it.... Abstract words such as glory, honor, courage, or hallow were obscene beside the concrete names of villages, the numbers of roads, the names of rivers, the numbers of regiments and the dates.
This was the modernist recoil from vaporous abstraction, situated amid the mindless slaughter of the Great War. But it was also the infantryman's view of war, juxtaposed against the vapid oratory of staff officers and politicians. And it became the view of some civilians as well. As Paul Fussell observed in The Great War and Modern Memory, a distrust of empty verbiage marked the inscriptions on the tombstones at the Somme, not the ones written by Rudyard Kipling but the ones composed by the families of the dead themselves: "In addition to the still hopeful ones about dawn and fleeing shadows, we find some which are more 'modern,' that is, more personal, particular, and hopeless.... And some read as if refusing to play the game of memorial language at all: 'A sorrow too deep for words.'" This vernacular modernism resurfaced in the minimalist vocabulary of ordinary American soldiers during World War II—so many of whom "just had a job to do" and "just didn't want to talk about it" when they returned home. This was more than inarticulateness in the face of the unimaginable; it was also an implicit recognition of the valor of silence.
The modernist disdain for conventional pieties had subtle political consequences. To be sure, a rejection of platitudinous patriotism did not require an embrace of absolute pacifism. To acknowledge the grim absurdity of organized violence was not to deny its occasional necessity—even, sometimes, its justice. Yet the modernists' linguistic sobriety, like the veterans' eloquent silence, did imply an important conclusion for public policy debate: that any responsible discussion of modern war must try honestly to confront its concrete actuality. The language of war and the politics of war were intertwined.
JAMES DAWES DISCUSSES their relationship in his new book. As he reminds us, the idea that war is lost in its translation to words can be linked to a larger intellectual tradition—a way of thinking about the relation between language and violence that is embedded in modern liberal thought. From this perspective, language and war are at opposite ends of a continuum. Violence is not only unspeakable; it is what happens when "talks break down." Diplomacy is—almost always—preferable to military action. Talking is the best (and maybe the only) alternative to fighting. These notions find elaboration and legitimation in liberal political theory, from Hannah Arendt and Jürgen Habermas to the tradition of international law embodied in the Geneva Conventions. That tradition depends on the assumption that language can ultimately master violence, that disputes can be resolved through what Habermas calls "the unforced force of the better argument." Behind this hope lies a powerful faith—some would say a naïve faith—in human rationality.
Alongside the liberal rationalists' "emancipatory model" of language, with its belief in words as instruments for promoting agreement on universal values, Dawes places a "disciplinary model" that defines language as an expression of violence rather than an alternative to it. "The emancipatory model presents force and discourse as mutually exclusive," whereas "the disciplinary model presents the two as mutually constitutive." Dawes aims to press the disciplinary model into the service of the emancipatory model, to deepen and to darken the optimistic vision of liberal rationality by stressing the uses of language as force, without losing the rationalists' faith in language altogether.
His enterprise is clouded from the outset by conceptual confusion. Consider the ambiguity of the word "force." It takes many forms—rhetorical, logical, moral, emotional; and most of these forms can be expressed through language. Dawes knows this, and in the end he deploys this ambiguity to bring the two models together. Yet for most of his book he keeps them apart by presenting the disciplinary model in its most dogmatic and least defensible form. Throughout much of Dawes's account, the disciplinary model simply reduces linguistic force to physical violence, and refuses to acknowledge the distinction between a bureaucratic memorandum and a burst of machine-gun fire. Dawes does little to clarify or even to call attention to this fundamental confusion, and as a consequence his book is enveloped by an oddly unreal atmosphere. It is the rarefied air that one finds on the thirty-sixth floor of the humanities building.
THE IMPULSE TO blur the boundaries between language and violence—between words and deeds, even if sometimes words are deeds—is not simply an academic malady. It has possessed a whole tradition of thinkers whose work has had decidedly non-academic consequences: from Clausewitz the "realist," who defined war as "another form of speech or writing," to Goebbels the propagandist, who showed how language could be manipulated to promote the systematic extermination of an entire people. But in recent decades the discovery that words can wound has acquired a new ideological purpose and a new academic legitimacy. The disciplinary model has emerged in the work of such cultural theorists as Georges Bataille, Michel Foucault, Louis Althusser, and Judith Butler, who have defined language as by its very nature a form of coercion.
In the postmodern academy, one does not have to be a Nazi to conflate language and violence. During recent decades this tendency has spread on the cultural left, as an outgrowth of the "Western Marxist" effort to leaven the lump of materialist explanation. How did the ruling class continue to rule? Clearly its success involved the command of language as well as the command of the means of production. Indeed, the devotees of the disciplinary model began to argue that human subjectivity itself was constituted through language—a language over which the human subject had no control. Althusser's concept of "subject interpellation," as Dawes observes, "depicts the individual as constituted through the ideology and language of a culture in much the same way that a pedestrian is hailed and accosted on the street by a police officer." In this formulation, as elsewhere in the disciplinary model, the grain of an insight was engulfed by a relentless reductionism. Yes, language can perform coercive functions for the powerful, but surely it is not reducible to those coercions—nor can they be routinely equated with physical brutality.
Despite those difficulties, the disciplinary model became the norm for left-leaning humanists in the academy. The idea that "words are weapons" justified campaigns against "hate speech" and ever more elaborate definitions of sexual harassment. The language of the seminar room became charged with anomalous melodrama, as Dawes unwittingly demonstrates when he quotes Butler's first-person account of coming into existence through language. It is, she says, a primal trauma. "To be named by another," Butler observes, "is traumatic: it is an act that precedes my will, an act that brings me into a linguistic world in which I might then begin to exercise agency at all. A founding subordination, and yet the scene of agency, is repeated in the ongoing interpellation of social life."
The implications of this passage are revealing and disturbing. It invokes the central poststructuralist maxim, derived from Jacques Derrida—the idea that naming is an act of "originary violence" (to use Derrida's term), the opening move in authority's unending effort to categorize otherness, to devalue difference, to establish hierarchy. There is an important idea here, of course: no thoughtful person would deny the capacity of naming to legitimate unequal power relations, or the pain that systems of classification can inflict on those labeled backward or deviant. But that is hardly the end of the story. Naming can also imply value, dignity, and attachment. Like other linguistic practices, it has multiple meanings depending on historical circumstances; and its complexity mirrors the imperfections and the possibilities of social existence.
In Butler's comments, however, this multiplicity is swept away by a flood of hyperbole and by a simplistic obsession with power. Butler's outrage at "an act that precedes my will" reveals the radical individualism at the heart of so much postmodern thought—the Nietzschean celebration of an almost god-like self. Her description of what amounts to a fall into language seems to me to be merely a fall from fantasies of perfect autonomy into interdependent life as a social being. One is tempted to say, unphilosophically: get over it.
YET THE DISCIPLINARY model involves more than linguistic libertarianism. As Dawes occasionally hints, it derives from other modes of thought that lead in more interesting directions. Over the past decade, Butler and other theorists have promoted an emphasis on the "performativity" of language—and, more broadly, on the performance of culture. These ideas have been cropping up everywhere in the academic humanities, especially in cultural studies and queer theory, which depend heavily on the assumption that such categories as race or gender are socially performed rather than biologically given. The philosophical origin of performativity is the work of the Oxford philosopher J.L. Austin, who in a modest little book (actually a posthumously published series of lectures) called How to Do Things With Words (1962) argued that under certain circumstances—say, a marriage ceremony—language could have "illocutionary" as well as merely "locutionary" force. That is, it could constitute reality, rather than simply naming or describing it.
One can only imagine the Oxford don's bemusement at the uses to which his ideas have been put. Hugely influential for decades among professional philosophers, Austin's thought has now become hip, the accompaniment to clever bohemian weddings (as reported in the August issue of Harper's) as well as to abstruse analyses of cross-dressing. Yet his work has a significance that most interpreters of culture have not sufficiently explored. If it is detached from the reductionist implications of the disciplinary model, the Austinian tradition presents itself as a bracing alternative to positivism. Austin's emphasis on the "illocutionary" force of language allows for more creative interpretations of gestures, incantations, and other cultural expressions that cannot simply be evaluated in terms of their empirical truth or falsity, but that—if they resonate with normative traditions and individual needs—can foster a different kind of truth, an emotional or moral truth. As a number of anthropologists have discovered in their accounts of sacred ritual, the Austinian tradition opens up the possibilities of language beyond the duality of emancipation and discipline. To consider the language of war in this larger context would be to explore the boundaries of poetry and music, to ponder what resources we have when words fail us in their conventional forms, to listen to the voices of silence in their vernacular as well as their modernist versions.
Dawes does not really consider the larger implications of this view of language. His aims are more modest, though still significant. He wants to marry the emancipatory model of language to the disciplinary model, in the cause of counteracting international violence. Disciplinary force, after all, can also be for the good. Dawes admires the commonsense decency of the emancipationist model, its faith in universal values and in the capacity of language to promote agreement on them; but he wants it shorn of its rationalist naïveté and its potential for complicity with status quo power relations. That is where the disciplinary model comes in, to show that performative language can promote the universalist goals of the liberal rationalists. Finally, at the very end of his book, Dawes rescues the disciplinary model from its equation of language and violence. Behind the dogmatic assertions, he finds a more defensible intellectual strain: an emphasis on language as the power to constitute reality, rather than merely to reflect it; a recognition that language can embody violence, but also other more beneficent forms of force—including the moral force of international law.
THE PIVOTAL FIGURE in this endeavor is William James. James's insistence on the pragmatic consequences of thought combined with his radical empiricism to underwrite his belief in the power of belief: if we act as if our convictions are true, that action will contribute to making them true. ("My first act of free will shall be to believe in free will," he wrote in a youthful moment of crisis.) In a thinker without James's tragic sense of "life's bitterer flavors," the belief in belief could degenerate into mere positive thinking. But it could also lead in more fruitful directions, as Dawes's argument suggests.
After many textual detours, The Language of War finally enlists James to preside at the marriage of the emancipatory and disciplinary models. Dawes concludes with a Jamesian flourish: "it is the act of treating [moral categories] as real that makes them real." Bringing two warring academic traditions together in the service of a benign ethical aim is surely a worthy goal, a necessary counterpoint to the demented dualisms that plague our intellectual life. And Dawes eventually pulls it off, on his very last page. But the overall effect of the book is unsatisfying: there are too many interesting ideas left barely articulated. Dawes often seems more concerned with demonstrating his erudition than with developing a coherent argument.
Part of the problem is Dawes's crabbed academic style, which rarely allows him to make a straightforward statement. A single example will suffice. It concerns the reason (as if there were only one) why Northern soldiers fought in the Civil War: "As James McPherson reveals," Dawes writes, "Union soldiers of all ranks, ethnicities, and levels of education were motivated to fight because they perceived secession as an unacceptable subversion of the hallowed idea that a generalized communicative consensus buttressed by a verbal artifact could achieve a force equivalent to the physical coercion that attacks monarchy." Whatever this muddle may mean, it is a ludicrously rarefied account of soldiers' motives in wartime. In its wild abstraction, it is a little insulting to the experience of battle.
The literariness of Dawes's approach leads him to focus on a handful of texts, which provide a thin and arbitrarily selective record of "literature and culture" between the Civil War and World War II. It is hard to imagine how any interpretation of the American language of war could ignore the rhetoric of Protestant Christianity, especially its faith in Providence. The idea that the United States has a divinely ordained destiny in the sacred drama of world events has pervaded our public life from the Revolutionary era down to the present, suffusing the language of war—or at least the official language of war—with a sense of redemptive purpose. (World War II, the most easily justifiable of our wars, was also the least susceptible to this rhetorical strategy.)
The Providentialist tradition could be inflected in different idioms, from melodrama to tragedy. The Reverend Henry Ward Beecher demonstrated the melodramatic mode in his oration at the ceremony commemorating the re-taking of Fort Sumter in April 1865, as he imagined the "mighty miscreants" of the Confederacy being "whirled down to perdition" at the Last Judgment. Only a few weeks before Beecher cast the Confederates into Hell, Lincoln had demonstrated the power of Providence as tragedy in his second inaugural address, when he characterized the Civil War as a national expiation—common to North and South alike—for the sin of slavery. Providentialist rhetoric could flatten or deepen the language of war; it could foster arrogance in victory but also humility in the face of unimaginable loss.
By ignoring the religious dimensions of the language of war and concentrating on a handful of literary and philosophical texts, Dawes creates a curiously abstract and fragmented version of American culture. And by restricting his range to the emancipatory and disciplinary models of language, he produces an impoverished account of human expression under duress. We learn much about the violent force of language, and much about its moral force as well, but very little about its aesthetic force—its capacity to take us, however fleetingly, to a world elsewhere, beyond the realm of morality and violence altogether.
STILL, WITHIN HIS limits, Dawes is often provocative. A pattern emerges, though the reader has to work hard to discern it. Judging by Dawes's examples, the writers whose work we remember—Crane, Hemingway, Heller—tack between the two ways of thinking about language that he has described. They emphasize the gulf between talking and fighting, the honor of silence in response to the unspeakable; and they recognize that language can create alternatives to violence. But they also see that language can itself be a form of coercion, an instrument of violence—especially when it is unsutured from dialogue and even from concrete referents. Propagandistic slogans and bureaucratic directives reduce language to a prisoner of the powerful, but writers are still plotting its escape. In the struggle between words and war, they remain partisans of language.

The appalling and unprecedented casualties of the Civil War posed a serious challenge to Western traditions of describing combat. General William Tecumseh Sherman adopted a successful and characteristically modern method of evasion: quantification. In his Memoirs, Dawes writes, Sherman represented the chaos of battle as "a finite collection of clean, containable units of information." A bank manager before the war, Sherman relied on the comforting power of statistics, the modern numerology. He referred to "valuable" men (officers) by name (in his tradition, naming was a mark of dignity rather than oppression), while "those who do not count [enlisted casualties] are simply counted." This approach has served senior officers well through the subsequent century; it is the language of the official report and the authorized biography—the sort of writing that gives claims to objectivity a bad name.
Ulysses S. Grant developed a more refined version of objectivity. The war showed him, he said, "how little men control their own destiny." "Circumstances always did shape my course different from my plans," he remarked in his Memoirs, which is characterized by the use of the passive voice and the systematic suppression of the author's self. Through this self-obliteration, Grant achieved the kind of objectivity that is tinged with empathy—his detachment fostered the recognition, for example, that the enemy was as fearful of him as he was fearful of the enemy.
"Grant's objectivity," Dawes writes, "demands a self-transcendence that elevates him precisely by treating him as a fallible, potentially expendable object." Habermas could not have asked for a better demonstration of liberal rationality in the public sphere. "This self emptied of subjectivity," Dawes comments, "or rather continually outstripping its subjectivity, becomes the model of what a public man ought to be," one capable (like Grant) of "a personality-erasing thought experiment, in which we view the world from behind a veil of ignorance about ourselves, thereby severing our special connections to 'private' interests." One does not have to be a Freudian to see that such a project is fated to fail. This is a tragic contradiction at the heart of Grant's ideal of objectivity: the impossibility of achieving true self-transcendence, combined with the necessity of continuing to try.
Grant recorded his thought-experiment when he was an old man dying of cancer, who in spite of his pain had managed to achieve a stoical serenity. Younger generations, coming of age in the shadow of the Civil War, had more difficulty cultivating Grant's Olympian perspective on the insignificance of human life (including his own). The philosopher Josiah Royce, perhaps the last Absolute Idealist in America, devoted his career to searching for unified meaning against the backdrop of "a world of brute natural fact" determined by "vain chances" and "caprices"—"a chaos of unintelligible fragments and scattered events." Royce wanted to reduce the confusion of the all to the simplicity of the one, to relieve the misery of our separation from the Absolute (or God) by declaring the universe to be the "unified idea of a single creative will."
To which William James responded, "Damn the Absolute!" James hated meaningless pluralism as much as Royce did, but he resisted Royce's totalizing solution. Yes, life presents us with situations that are fundamentally absurd, situations in which we cannot reconcile our human aspirations with what we know to be philosophically true of the world. But to act effectively in the world we have to make coherence from absurdity. This was the strategy that allowed James to move from skepticism to pragmatism: the notion that belief is a performative act rather than an act of recognition. "Belief, in other words, changes the conditions of existence," Dawes writes, "making real what had previously been neither real nor unreal." It is a cheering thought—but it is a thought easily vaporized in the heat of battle.
Or so one would think. The knowledge of death, after all, constitutes the original source of our sense of the absurd—the realization that all our desperate longings and ambitions are temporary, time-bound, destined for dust. And the Civil War made untimely death a banal and potentially meaningless fact of everyday life. The theater of war was a theater of the absurd, a daily reminder of our own insignificance. Stephen Crane recognized this and explored the terrain retrospectively in (among other writings) his story "An Episode of War." A Union army lieutenant is seriously wounded in his right arm. He wanders about the battlefield in a daze, receives some distracted and incompetent attention from a fellow officer, and eventually encounters a bullying surgeon who at first refuses to amputate and later (offstage) proceeds to do so. When the lieutenant returns home, "his sisters, his mother, his wife, sobbed for a long time at the sight of the flat sleeve. 'Oh well,' he said, shamefaced amid these tears, 'I don't suppose it matters so much as all that.'"
By recognizing his own insignificance, Dawes suggests, the lieutenant has developed the capacity to see himself from outside. This objectivity, like Grant's, can paradoxically produce meaning through ironic distance. A severed arm matters, but amid the piles of corpses at Fredericksburg or Antietam it does not matter all that much. Crane's lieutenant becomes the standard character of modern literature in the ironic mode—the man, as Fussell writes, "whom things are done to." Irony was a weary, chastened version of James's response to absurdity. And irony was the characteristic self-protective move of war novelists through the twentieth century.
FOR AMERICAN MODERNISTS as for their European counterparts, World War I exposed the lies of the language of progress, above all the equation of industrialization and moral advance. Yet even the most disillusioned among them recognized that words preserved a connective dimension as well as a destructive one: as Hemingway and his contemporaries understood, language in wartime could foster care as well as harm—a sense of human significance as well as human insignificance. Alongside the revulsion from the empty rhetoric of public life ("the words sacred, glorious, and sacrifice and the expression in vain"), there arose the recognition that language could heal the psychic wounds inflicted by war.
W.H.R. Rivers, the British psychiatrist who pioneered the treatment of battlefield trauma, argued that the soldier's experience
should be talked over in all its bearings. Its good side should be emphasized, for it is characteristic of the painful experience of warfare that it usually has a good, or even noble side, which in his condition of misery the patient does not see at all, or greatly underestimates. By such conversations an emotional experience, which is perhaps tending to become dissociated, may be intellectualized and brought into harmony with the rest of the mental life ... the relief afforded to the patient by the process of talking over his painful experience, and by discussing how he can readjust his life to the new conditions, usually gives immense relief and may be followed by a great improvement, or even by the rapid disappearance of his chief symptoms.
This is the classic case for talk therapy—mentioning the unmentionable, dragging trauma from the murky depths of unconscious memory to the reassuring light of day. Sometimes the valor of silence concealed crippling pain. Rivers found ways to relieve it.
The difficulty, and Rivers knew it, was that in "curing" the victims of shell shock he was returning them to combat, the source of the trauma. It is the dilemma of all therapies that aim at "readjusting" the patient to conditions that promoted "maladjustment" in the first place. And it is part of the larger paradox involved in any attempt to use language as a means of making trauma coherent. In giving voice to suffering, we render it somehow more acceptable. Sometimes we even make it beautiful, as Dawes notes, citing Frederic Henry's description of an Austrian artillery barrage: "Like strange flowers blossoming in the 'wet autumn country,' the explosions are 'soft puffs with a yellow white flash in the center.'" Still, the aestheticizing of violence is at best a holding action. "The material world's pressure against language and memory is implacable," Dawes writes, "like the blood that drips onto Frederic from the soldier helplessly bleeding to death in the stretcher above him." Unlike the theorists who blithely equate language and violence, Dawes has a healthy respect for the intractable actualities of biological existence.
Yet biological experience can have philosophical consequences, which Dawes explores in his discussion of luck. War involves not the survival of the fittest but the survival of the fortunate. The inexplicable vagaries of chance—sparing this one, destroying that one—are pervasive and inescapable. On this point, from the common soldier's point of view, Clausewitz's acumen is undeniable. War is a card game, he wrote, "a meaningless exercise in calculating chance." No wonder men in combat festoon themselves with amulets and ceaselessly perform luck rituals. The sovereignty of luck on the battlefield plays havoc with traditional morality—particularly the Kantian tradition that emphasizes the centrality of moral intentions and the irrelevance of actual consequences. If we act in accordance with universal moral law, says the Kantian, we will have acted morally and thus achieved true freedom, whatever the practical results of our action.
But World War I made it harder to be content with good intentions. Four years of explosions shook the foundations of universal law. When the explosions stopped, John Dewey (among others) decried Kantian metaphysics as a denial of chance. After the war, as Dawes points out, a number of naturalistically inclined philosophers joined Dewey in rejecting Kantian intentionalism in favor of "a consequentialist embrace of contingency." The consequentialist view held that we should obey a moral law even as we realize that there is no philosophical basis for it, provided that obedience will lead to desirable consequences. This was a skeptical, naturalistic version of pragmatism, without James's belief in the creative powers of belief. In the end, even Dewey sensed that this approach underwrote an impoverished, instrumentalist vision—one that justified just about any action if it contributed to a desired end.
For Dawes, the alternative to Kantian metaphysics and Deweyan pragmatism is the existentialist vision that Hemingway developed in A Farewell to Arms. Frederic Henry is an implicit Kantian: he cultivates moral discipline and plays by the rules. But he leaves himself open to chance by falling in love with Catherine Barkley, a consequentialist who disdains conventional morality and tries to persuade him to desert the army. Frederic is morally unlucky: he deserts, abandoning his conventional moral code, but then Catherine dies in childbirth. Frederic takes a consequentialist view: "It was the wrong thing because something bad happened," he says. Yet in the final scene, when Frederic kisses the dead Catherine and concludes that "it wasn't any good," Dawes finds an affirmation of longings for connection to "the good" in Hemingway. I am not convinced, but I am willing to give Dawes the benefit of the doubt. Maybe Hemingway really does both deny the existence of "the good" and at the same time assert its potentially transcendent reality. In that case, A Farewell to Arms can indeed be shelved with the likes of Simone de Beauvoir. "Meaning is never fixed," she wrote, "it must be constantly won."
THE TRAUMA OF world war broke down all sorts of boundaries—not only Kantian traditions of universal moral law, but also the very notion of an independent, separate self. Men who descended daily into a stew of human decay were not likely to preserve exalted notions of individual autonomy. (Pat Barker's recent novel Regeneration describes a man, one of Rivers's patients, who fell into a rotting corpse and could not stop smelling or tasting it for months.) Objective standards of truth were not likely to hold up well either. By the 1930s, the fascist rhetoric of reality was making a mockery of the ideal of linguistic transparency. Epistemological categories as well as moral ones were thrown into disarray.
There were two sorts of literary reaction, according to Dawes. One was the modernist assertion of "difficulty" against the false transparency of fascism. Oddly, he cites the work of Maurice Blanchot as an example, when he might have stayed closer to home and mentioned the entire editorial board of Partisan Review, men such as Clement Greenberg and Dwight Macdonald, who were constantly celebrating the values of "difficulty" against the philistinism of Right or Left totalitarianism. The other literary reaction was the reassertion of the need for genuine transparency, a return of language to its rightful role as depiction of reality. "Good prose is like a windowpane," George Orwell wrote. Orwell (also unmentioned by Dawes) was probably the most influential and honorable exponent of this philosophically naive but ethically bracing view. John Hersey, who reported with unflinching candor from Guadalcanal and Hiroshima, was probably its ablest American practitioner.
The desire to revive plain speech was intensified by the bureaucratic denaturing of language under way in the giant military organizations created by World War II. This moment marks in many ways the birthplace of the disciplinary model of language. The blank, impersonal face of modern authority hovers over the traumatic fall into language imagined by contemporary theorists—the fear that being named is akin to being hailed by a policeman. During the 1930s and 1940s, public language was pressed into the service of unprecedented violence. In Modernity and the Holocaust, Zygmunt Bauman convincingly shows how the organizational vocabulary of functional rationality diffused any sense of personal moral responsibility among Nazi officials, any sense of engagement with actual human beings. Goebbels's propaganda was similarly divorced from mutuality and referentiality.
Yet the fascists had no monopoly on denatured language. In Joseph Heller's Catch-22, the American Army Air Corps deploys a language system characterized by the absence of any evidentiary rules. Authority rather than referentiality determines the way events are represented. As a consequence the novel creates a grotesque linguistic atmosphere full of floating signifiers (the name "T.S. Eliot" that characters keep repeating sotto voce without any rhyme or reason) and mystified performatives (the meaningless phrase "bomb pattern" that General Peckem thinks will enhance his authority). By exposing such perversions of language, novelists such as Heller and Kurt Vonnegut helped to explain (and implicitly to justify) the silence of the men who came back from the war.
YET THE PARTISANS of language refused to retreat to silence. (Intellectuals will talk; it's what they do.) Hersey, Arendt, Orwell, and others continued to assert the need to give voice to silence (as, indeed, Heller and Vonnegut did in their own way)—to revive objectivity and accuracy in representational language. And the same effort characterized the growing influence of international law in postwar international diplomacy. The formalist tradition that had suffered such a blow with the failure of the League of Nations was exhumed, out of desperation, by nations weary of slaughter. The growth of international law, the spread of a global human rights movement, and the whole attempt to police nations' behavior in accordance with the Geneva Conventions: all of these developments have resonated in the realm of language.
The Geneva Conventions are rooted in the liberal rationalist faith that language can master indiscriminate violence. This notion of objectivity (unlike Grant's) rests on the humanist assumption that individual people matter. Naming, in this tradition, is not a crude power move; it is a foundation of human dignity. "Every child shall be registered immediately after birth and shall have a name," announces the International Covenant on Civil and Political Rights, seeking to ensure that states grant personhood to all their citizens. This is a far cry from the chic pessimism of Derrida and Butler.
The chief significance of the Conventions, in Dawes's view, is not substance but procedure. The myriad rules magnify the importance of language and multiply the opportunities for dialogue. Against the arbitrariness of power and the confusions of war, the Conventions offer precise definitions, consistent rules, an overlapping vocabulary between belligerents. Emotionally charged propaganda becomes neutral reference, enemies become combatants, vermin become civilians. Behind all this benign language is the easily parodied rationalist assumption that "war devolves into savagery not because savagery is the nature of humans but because war confuses us." Violent acts, from this view, are based on category mistakes—such as American commanders conceiving the bombing of civilian neighborhoods in Japan as simply "saving American lives." By overlooking more visceral motives—longings for vengeance and self-preservation intensified by the possibility of racial hatred—the rationalist etherealizes war-making beyond recognition.
THE LAWS OF war are dependent on the assumption that they can be universally applied—that one can always depend, for example, on the ability of war-makers to discriminate between military and civilian targets. Of course it is always possible to dismiss the very idea of "rules of war" as a schoolboyish fantasy of fair play: "This is not a game of cricket," the Japanese Colonel Saito says to the British Colonel Nicholson in The Bridge Over the River Kwai, after Nicholson has handed Saito a worn copy of the Geneva Conventions. Yet more sustained objections can also be raised. The Conventions' intent to avoid the murkiness of subjectivity can lead to an abstract universalism that overlooks idiosyncratic situations and context-dependent decisions. International law can unintentionally legitimate war by accepting the rhetoric of "clean, smart bombs" or "appropriate" levels of casualties. Violence, as always, can be legitimated by bland, universalist language.
But that is no reason to throw out universalist language altogether. "Whether or not one finally accepts these moral categories as objectively valid and universal in scope, procedurally grounded in the workings of our autonomy as Kant argued or in the interactive structure of discourse as Habermas argues," Dawes writes, "it is at the very least in our collective self-interest to treat them as if they were so." The Conventions must by now be exerting at least a negative pressure, as states go to "torturous lengths" to show they are not in violation.
So we are left with the idea of language as performance—not the melodramatic notion that reduces all language to coercion, but the modest insight that to articulate a principle forcefully is to give it a palpable reality. The language of universal rights, however flawed, is the best alternative we have to unspeakable slaughter. Talking really is better than fighting. Whitman was right, but so was James. The real war will never get into the books, but we have to keep trying to put it there.
This is a humane and sensible conclusion, but it leaves something out. There are more languages of war than are dreamt of in Dawes's philosophical universe. To sense them we need to return from law to literature. In The Ghost Road, Pat Barker's protagonist Billy Prior echoes Hemingway's Frederic Henry in asserting that only the names of places have any meaning left: "Mons, Loos, the Somme, Arras, Verdun, Ypres." But then he looks around at the "linked shadows" of himself and his men and remembers "another group of words that still mean something. Little words that rip through sentences unregarded: us, them, we, they, here, there. These are the words of power, and long after we're gone, they'll lie about in the language, like the unexploded grenades in these fields, and any one of them'll take your hand off."
Prior dies in the last week of the Great War, but Barker has one unexploded grenade left. From Rivers's memoirs she fashions a scene of a dying soldier, an idealistic young officer named Hallett.
The whole left side of his face drooped. The exposed eye was sunk deep in his skull, open, though he didn't seem to be fully conscious. His hair had been shaved off, preparatory to whatever operation had left the horseshoe-shaped scar, now healing ironically well, above the suppurating wound left by the rifle bullet. The hernia cerebri pulsated, looking like some strange submarine form of life, the mouth of a sea anemone perhaps. The whole of the left side of the body was useless. Even when he was conscious enough to speak the drooping of the mouth and the damage to the lower jaw made his speech impossible to follow. This, more than anything else, horrified his family. You saw them straining to understand, but they couldn't grasp a word he said. His voice came in a whisper because he lacked the strength to project it. He seemed to be whispering now.
"Shotvarfet," Hallett seems to say. "Shotvarfet." Finally Rivers realizes what he is saying: "it's not worth it." The cry spreads across the ward. "A buzz of protest not against the cry, but in support of it, a wordless murmur from damaged brains and drooping mouths. 'Shotvarfet. Shotvarfet.'" The cry goes on and on, until in the end the mangled words fade into silence, and Hallett dies.
This is another language of war, a language closer to ritual incantation than to reasoned discourse—not the sort of language you could use in political debate. Without question there are times when political language is appropriate, when decent and intelligent people can decide that a particular war is "worth it," particularly when their country is under attack. In recent months, arguments for a "just war" against terrorism have acquired unprecedented cogency, though the nature of that war and the character of that justice remain legitimate subjects for public controversy.
In this uncertain climate, to ponder the actual consequences of organized violence is not to endorse pacifism; it is only to think responsibly and realistically about war. Amid the official language of public policy debate, it may sometimes be worth recalling that darkened ward of broken men and their murmured chant of hopelessness. To ignore that cri de coeur would be worse than a category mistake.