9/11 and the “Good War” on Terror
How the Greatest Generation helped pave the road to Baghdad
Christopher Hayes
On September 11, 2001, George W. Bush wrote the following impression in his diary: “The Pearl Harbor of the 21st century took place today.” He wasn’t alone in this assessment. In the days after the attacks, editorialists, pundits and citizens reached with impressive unanimity for this single historical precedent. The Sept. 12 New York Times alone contained 13 articles mentioning Pearl Harbor.
Five years after 9/11 we are still living with the legacy of this hastily drawn analogy. Whatever the natural similarities between December 7, 1941, and September 11, 2001, the association of the two has led us to convert – first in rhetoric, later in fact – a battle against a small band of clever, murderous fundamentalists into a worldwide war of epic scale.
The toll has been steep: more than $1 trillion will be spent on the ongoing combat and occupation in Afghanistan and Iraq; 2,900 dead American soldiers, 20,000 wounded, and somewhere between 50,000 and 150,000 dead Iraqi and Afghan civilians. We have detained hundreds of “enemy combatants” in Guantánamo, denying them due process and, until recently, habeas corpus. The terms “black sites” and “extraordinary rendition” have entered our lexicon, respective euphemisms for secret U.S. prisons abroad where torture occurs and for the practice of transferring prisoners to other countries that employ torture. Polls show international opinion of the United States at record lows.
How did we get here?
The best place to look for the answer is not in the days after the attacks, but in the years before. Examining the cultural mood of the late ’90s allows us to separate the natural reaction to a national trauma from any underlying predispositions. During that period, the country was in the grip of a strange, prolonged obsession with World War II and the generation that had fought it.
The pining for the glory days of the Good War has now been largely forgotten, but to sift through the cultural detritus of that era is to discover a deep longing for the kind of epic struggle the War on Terror would later provide. The standard view of 9/11 is that it “changed everything.” But in its rhetoric and symbolism, the WWII nostalgia laid the conceptual groundwork for what was to come – the strange brew of nationalism, militarism and maudlin sentimentality that constitutes post-9/11 culture.
To fully understand what has gone wrong since 9/11, it is necessary to rewind the tape to that moment just before.
Before the storm
The late ’90s was a strange time in American history. With the Cold War over, the country faced no overarching enemy for the first time in decades. The United States seemed possessed of no greater national purpose than making money through IPOs and an ever-expanding Dow. Our politics were dominated by the petty and trivial: from school uniforms to the president’s sex life.
Memories of former glory rushed in to fill this vacuum. In 1994, the 50th anniversary of D-Day prompted both an NBC special commemoration hosted by Tom Brokaw and the publication of historian Stephen Ambrose’s D-Day June 6, 1944: The Climactic Battle of World War II, which would go on to sell 800,000 copies. The book attracted the attention of Steven Spielberg – a man with a preternatural sense of the zeitgeist – who would launch the pop cultural phenomenon in all its excess in 1998 with Saving Private Ryan, which opened to rave reviews and grossed $433 million.
An explosion of associated products came on the heels of Saving Private Ryan’s commercial success: Brokaw’s three “Greatest Generation” books (which sold 5 million copies), a book about veterans of the Pacific Theater called Flags of Our Fathers (a film adaptation produced by Spielberg and directed by Clint Eastwood will be released this fall), and a clunking Bruce Willis vehicle called Hart’s War. With such an irresistible financial incentive, Ambrose would generate 10 more books between 1994 and 2001, including a distilled history of the war for “young readers” called The Good Fight. Tom Hanks, who starred in Saving Private Ryan, became a kind of WWII commemoration crusader, cutting a series of radio ads that advocated for a World War II memorial to be built on the Mall. After a seven-year campaign, it was dedicated in 2004.
Nostalgia quickly descended into kitsch: In 1999, People named “The World War II Soldier” one of its “25 Most Intriguing People,” right next to Ricky Martin and Ashley Judd. But unlike so many pop culture phenomena, this one had legs, extending into the new millennium when Hollywood released the summer blockbuster Pearl Harbor in May 2001. Months later, HBO broadcast with great fanfare “Band of Brothers,” a miniseries based on Ambrose’s eponymous book about the exploits of the famed “E Company” as it fought its way across Europe. Produced by Tom Hanks and Steven Spielberg, the series debuted on Sept. 9, 2001.
The flag of our fathers
Explaining why he made Saving Private Ryan, Steven Spielberg told an interviewer, “The most important thing about this picture is that I got to make a movie about a time that my dad flourished in.” During the Vietnam War, Spielberg explained, he resented people like his father who were proud to be American and displayed the flag. “Only when I became older did I begin to understand my dad’s generation,” Spielberg said. “I went from resenting the American flag to thanking it.”
That American flag receives loving treatment in Saving Private Ryan’s opening moments, when it stiffly, proudly flutters across the screen. In fact, the flag, which had become a legendary culture war symbol after being torched during Vietnam protests, enjoys an earnest revival throughout the literature of the WWII nostalgia. In Flags of Our Fathers, James Bradley writes that the image of his father and his fellow soldiers raising the flag at Iwo Jima “transported many thousands of anxious, grieving, and war-weary Americans into a radiant state of mind: a kind of sacred realm, where faith, patriotism, mythic history, and the simple capacity to hope intermingled.”
In The Greatest Generation, Brokaw also celebrates this simple, old-fashioned patriotism. “They love life and love their country,” Brokaw writes of his subjects, before adding, “and they are not ashamed to say just that.”
“If there’s a common lament of this generation,” he notes later, it is “where is the old-fashioned patriotism that got them through so much heartache and sacrifice?”
It’s not just patriotism, though, that distinguishes “the Greatest Generation any society has ever produced.” According to Brokaw, members of it share “a sense of duty to their country” that is not “much in fashion anymore.” Due to the “military training and discipline” they received during the war, they are models of self-control, and complain that “the way you’re told to raise your kids now, there’s no discipline.” They are allergic to conspicuous consumption, humble and stoic, “refusing to talk about [the war] unless questioned and then only reluctantly.” They are “self-sufficient,” and characterized by “a sense of personal responsibility and a commitment to honesty.”
If this litany of values seems familiar, it’s because in the oppositional vocabulary of the culture war, they are virtues that, like the flag itself, conservatives claim as their own. In conservative mythology, it was the baby boomers – undisciplined, self-indulgent, unpatriotic – who unmoored the country from the traditional values of their forebears. Because the right has spent the better part of three decades pillorying the cultural legacy of the ’60s, it’s impossible for any work that celebrates the WWII generation not to serve a tacit culture war function.
Even before 9/11, Karl Rove understood this all too well. In his essay “Operation Enduring Analogy: World War II, the War on Terror and the Uses of Historical Memory,” David Hoogland Noon, a history professor at the University of Alaska Southeast, writes that even in his first campaign George W. Bush “consistently referenced World War II not simply to justify his own policy aims, but more importantly as a cultural project as well as an ongoing gesture of self-making,” positioning himself as “an heir to the reputed greatest generation of American leaders.”
“In the world of our fathers, we have seen how America should conduct itself,” Bush said in a 1999 speech at the Citadel. Now, the moment had come “to show that a new generation can renew America’s purpose.” Throughout both his campaigns, Bush would go out of his way to criticize the dominant ethos of “If it feels good, do it,” instead calling for a “culture in which each of us understands we’re responsible for the decisions we make.”
Bush’s allusions to the Greatest Generation were so persistent that the press came to see him – a Boomer child of privilege known for his youthful carousing – as a kind of throwback. Reporting on Bush’s first inaugural address, Newsweek’s Evan Thomas wrote that “Bush wants the White House to recover some of its dignity, to rise above baby-boomer self-indulgence and aspire to the order and self-discipline prized by the Greatest Generation.”
After 9/11 it seemed as if the entire country was ready to adopt the Greatest Generation values that Bush had so assiduously claimed as his own. We celebrated the manly heroism of the cops and firefighters who sacrificed their lives to save people. Editorials proclaimed the “death of irony” and a return to earnest patriotism. The flag that Spielberg had once resented and later come to love seemingly now hung from every home.
Bush, then, emerged as a kind of prophet. Because his image-makers had already portrayed him as having abandoned Boomer frivolity for Greatest Generation discipline, he seemed the natural choice to lead the country through its trials. In 2002, after congressional Democrats suffered losses in the mid-terms despite heavy campaigning from Bill Clinton, Time’s Margaret Carlson concluded this was due to a post-9/11 “shift in the culture,” in which “Clinton-era values are no longer America’s.”
“Though a baby boomer,” Carlson observed, “Bush rejects the instant-gratification ethic embraced by Clinton, the nation’s first baby boomer President. … [Bush] often laments not being one of the Greatest Generation he so admires. … Whereas Clinton liked going on MTV with 18-year-olds, Bush urges them and their parents to return to an ‘era of responsibility.’”
The new militarism
It is impossible to separate the values celebrated in the Greatest Generation nostalgia from the experience of war itself, for the soldiers’ experiences formed the core of the entire liturgy.
Stephen Ambrose, whose work serves as the foundation for the canon, documents the minutest details of soldiers’ battle experience, expressing “awe” at what they were able to endure. When Ambrose’s account was dramatized in Saving Private Ryan, critics hailed its unvarnished look at the mayhem of battle. Janet Maslin’s review in the New York Times summed up the consensus. While “the combat film has disintegrated into a showcase for swagger, cynicism, obscenely overblown violence and hollow, self-serving victories,” she wrote, Spielberg’s film “simply looks at war as if war had not been looked at before.” This description suffices for the film’s opening sequence, but when applied to the film’s overall meaning, it obscures much more than it reveals.
In the film, a small company of American soldiers survives the D-Day invasion and is then led by its commander, John Miller (Tom Hanks), on a quest to find Private Ryan. Ryan’s three brothers have, unbeknownst to him, all recently died in combat, and the Army’s high command has decided to find the lone surviving Ryan boy and get him home to his grieving mother. Miller and his company, comprising a charmingly diverse assemblage of white guys, wander the French countryside still dotted with Germans, looking for the elusive private, who had parachuted ahead with the airborne.
But the film’s real message revolves not around Ryan, but Cpl. Timothy Upham. We first meet Upham when Miller goes to fetch him from his desk where he is poring over maps and translating communiqués from French and German. Young and wispy, with hair brushing his upper lip, Upham is a translator, not a fighter: He hasn’t fired a gun since basic training and wants to take his typewriter with him. He quickly earns the unit’s ire by annoyingly chatting everyone up and quoting books and poetry.
At one point, after assaulting a German machine-gun nest whose crew kills one of their own men, the American soldiers capture the lone surviving German and force him to dig his own grave in preparation for his execution. As the German pathetically mutters nonsensical English phrases, Upham objects to Miller. “Captain, this isn’t right,” he says, “You know this. He’s a prisoner, he surrendered. He surrendered, sir.” Miller is skeptical, but ultimately swayed. He blindfolds the German and tells him to walk 1,000 paces and then turn himself in to the first American soldiers he sees. The other men grumble.
It’s not the last we see of the German. In the film’s climactic battle, as the Americans try to hold a bridge under a heavy German attack, this same former prisoner returns to shoot and kill Captain Miller. Meanwhile, during the battle, Upham is paralyzed by a fear so total that, as his Jewish comrade wrestles hand-to-hand with a menacing Nazi, he can only cower in the stairwell below, crying as the Nazi plunges a knife into the Jewish soldier’s chest.
The message is clear. In the great struggle for the future of the free world, the intellectual cannot be trusted. His concern for the laws of war means he is weak and cowardly, and will contribute to defeat. Only the true soldier can win the war. This is the ethos of the Cult of the Soldier, which would come to entirely dominate our politics in the years to follow.
“For it has been said so truthfully that it is the soldier, not the reporter, who has given us the freedom of the press,” Zell Miller boomed during his keynote speech at the 2004 Republican National Convention. “It is the soldier, not the poet, who has given us freedom of speech. It is the soldier, not the agitator, who has given us the freedom to protest. It is the soldier who salutes the flag, serves beneath the flag, whose coffin is draped by the flag who gives that protester the freedom he abuses to burn that flag.”
The Cult of the Soldier wasn’t confined solely to the Republican Party. Just a month earlier, the Democratic National Convention had been converted into a four-day military pageant, with home movies of John Kerry as a young soldier, his Swift Boat crew assembled on stage on the convention’s final night, and the nominee opening his acceptance speech with a stiff salute and the words, “John Kerry, reporting for duty.”
It didn’t work. Whatever points Kerry scored from his military valor were negated by ceaseless attacks on his character: from the incessant charge of flip-flopping to the slander of the Swift Boat Veterans for Truth. More devastatingly, Kerry’s personal story didn’t fit the idealized notion of honorable, dutiful, courageous combat, because after his service he returned home to question the war’s purpose and to testify about the war crimes of his fellow soldiers.
If he played Miller in the war’s first act, he played Upham in its second.
But even without the particulars of Kerry’s own moral journey, the strategy was still destined to fail. Reality can’t compete with the power of these established symbols. To reinforce the Cult of the Soldier is to reinforce the same set of oppositional culture war clichés that undergird our current political discourse. You’re either with the war or you are against the troops.
Not everyone was so naive as to miss this. Even before 9/11, historian Howard Zinn, himself a WWII bombardier, wrote in The Progressive that he refused to celebrate the Greatest Generation “because in doing so we are celebrating courage and sacrifice in the cause of war. And we are miseducating the young to believe that military heroism is the noblest form of heroism. … Indeed, the current infatuation with World War II prepares us – innocently on the part of some, deliberately on the part of others – for more war, more military adventures, more attempts to emulate the military heroes of the past.”
The experience of Vietnam had largely succeeded in cleansing Americans of whatever romantic notions of military heroism they may have once held dear. For neoconservatives, our collective suspicion of war was a weak-kneed impediment to fulfilling our imperial calling, a national illness they diagnosed as “Vietnam syndrome.” Searching for a cure took up no small amount of conservative energy, but it was the centrists and liberals who produced the WWII nostalgia who ultimately provided it.
It is a grand irony: Spielberg repeatedly claimed that his entire motivation for making Saving Private Ryan was to deconstruct the simplified version of WWII that Americans had come to accept. “All wars,” he said in a typical interview, are “chambers of horrors.” And that’s certainly true of the film’s opening and of the gruesome descriptions in Ambrose’s books and Brokaw’s recounting. But what emerges from these works is a picture of war as a chamber of physical horrors – torn limbs, exposed viscera, muck, blood. Absent completely are the moral horrors of combat, the horror of taking a life, of feeling the killer within. There’s a good deal of evidence that suggests the most traumatic experience of war isn’t being the target of violence, but rather being its agent. A 1994 study of post-traumatic stress in veterans of World War II, Korea and Vietnam found that “responsibility for killing another human being is the single most pervasive, traumatic experience of war.”
So when, as Spielberg and Brokaw both point out, WWII veterans refuse to say they are heroes, it may not be due to any generational humility, but rather because, in their view, they really aren’t heroes. Taking another human life may sometimes be necessary, but it is rarely, if ever, heroic.
In fact, the more recent Greatest Generation texts by and large display far less moral nuance than the classic World War II literature produced by the men who fought in it. In Catch-22, to name just one example, there is no glory or moral clarity, only surreal, horrific absurdity. At one point, as Yossarian is about to embark on a bombing run, he asks his comrades, “Do you guys realize, we are going to bomb a city that has no military targets, no railroads, no industries, only people?”
The WWII that emerges from these late-’90s accounts is one scrubbed clean of its moral complexity. There is no mention of American big business financing the build-up of the Nazi war machine, no America First campaign determined not to shed American blood for European Jews, no firebombing of civilians in Dresden. The war was difficult, yes, and bloody, but pure and just: a battle, not to put too fine a point on it, between good and evil.
In the hands of the men who would come to dominate American military policy in the Bush administration, this Manichean framework was a useful template to apply indiscriminately to any and all of the military confrontations they had long sought. To the neocons and some breakaway lefties, al-Qaeda members are “Islamofascists,” 21st century heirs to the murderous ideologies of Nazism, fascism and totalitarianism. It is always Munich 1938, every dictator is a “tyrant,” and anyone opposed to a state of perpetual war is guilty of “appeasement.”
“In the 20th century, some chose to appease murderous dictators, whose threats were allowed to grow into genocide and global war,” Bush said in a March 17, 2003, address that would herald the beginning of the bombing of Iraq. “In this century, when evil men plot chemical, biological and nuclear terror, a policy of appeasement could bring destruction of a kind never before seen on this earth.”
Making WWII the touchstone for martial combat allowed the militarists we politely call “neoconservatives” to imbue all wars with the same moral purpose. The Greatest Generation nostalgia helped subtly shift the burden of proof, such that wars were presumed innocent and righteous, as opposed to the far saner position that war is guilty until proven innocent.
If there’s a single guiding ethos for the Bush administration’s foreign policy, it is this: that contrary to the age-old insight about the “fog of war,” war brings moral clarity even as it clouds the senses. In the first days of the escalating missile and rocket strikes between Israel and Hezbollah, Dan Bartlett, a White House aide, explained that “[The president] mourns the loss of every life. Yet out of this tragic development, he believes a moment of clarity has arrived.”
Through the crucible of battle, evil and good announce themselves. In the absence of violence, they remain hidden.
The perils of unity
The people who produced the books and movies that would come to define WWII nostalgia were by no means reactionaries. Spielberg is famously liberal, Brokaw widely rumored to be a Democrat, and Ambrose an establishment centrist who in 1995 penned an op-ed calling for Colin Powell to run for president.
So whatever the nationalistic and militaristic effects of the symbolic vocabulary they built, war and patriotism weren’t the primary aims. No, what seems to motivate the soft-focus reflections on the ’40s is the unparalleled experience of unity that the Good War created. “The one time the nation got together was World War II,” says Sen. Daniel Inouye (D-Hawaii) in The Greatest Generation. “We stood as one. We spoke as one. We clenched our fists as one.”
By appealing to an era of broad national consensus, Brokaw, Spielberg and Ambrose tapped a popular urge to rise above the social striations and fissures of post-’60s upheavals. After 30 years of culture war, they were calling for a truce. And as the initial reaction to 9/11 showed, Americans were ready for one.
On the 60th anniversary of Pearl Harbor, with the country still just three months removed from the 9/11 attacks, George W. Bush invoked, as he would many times in the years that followed, the unwavering unity America had displayed during World War II. “During four years of war,” he said, “no one doubted the rightness of our cause; no one wavered in the quest of victory.”
A state in which “no one doubts the rightness” of its cause is a state in which politics has ceased to exist. In retrospect, that is what the nation sought in the waning days of the 20th century. Crowding into theaters to watch Saving Private Ryan, curling up to read The Greatest Generation, Americans were longing for something greater, more noble and less petty than mere politics. But mere politics turns out to be the only bulwark we have against the collective madness that war engenders. When politics dies, when it is suffocated underneath the warm blanket of patriotic consensus, the conscience of the republic dies along with it.