"Without the Cold War," Rabbit Angstrom asks in John Updike's Rabbit at Rest, "what's the point of being an American?" Rabbit's question, which he posed in 1990, anticipated something in the national mood during the decade that followed. In 1995, social critic Christopher Lasch wrote that the United States had descended into a "democratic malaise," the most telling symptom of which, Harvard public policy scholar Robert Putnam wrote, was a decline in civic engagement. In his famous essay and then book, Putnam amassed a mountain of evidence--measuring everything from rates of church attendance to participation in bowling leagues--and pronounced that Americans were "bowling alone." A survey conducted by pollster Daniel Yankelovich in 1995 reported that Americans felt "a sickness in the very soul of society to which they cannot give a name." For conservatives especially, the '90s were wasted years, the decade's signature traits being narcissism, cultural rot, and sheer purposelessness. The coarseness of the public square "has shattered America's traditional confidence about itself, its mission, its place in the world," morality czar William Bennett wrote in Commentary.
Weekly Standard writer David Brooks diagnosed the condition, too. And, in 1997, he came up with a cure. In a cover story titled "A Return to National Greatness," Brooks echoed concerns raised by Alexis de Tocqueville more than a century and a half before. "Democracy has a tendency to slide into nihilistic mediocrity if its citizens are not inspired by some larger national goal," Brooks cautioned. More than anything else, his essay took aim at the trend toward civic disengagement that had been encouraged, in his telling, by the twin failures of cultural liberalism and Newt Gingrich-era conservatism. If it is to reverse this trend, Brooks elaborated, "the first task of government is to convey a spirit of confidence and vigor that can then spill across the life of the nation." The means to this end were largely beside the point. "It almost doesn't matter what great task government sets for itself," he concluded, "as long as it does some tangible thing with energy and effectiveness." With that, public figures eager to repair the civic damage of the '90s--mostly neoconservatives, but also officials in the Democratic Leadership Council orbit, Senator John McCain, and, later, certain members of the Bush team who still employ the national greatness lexicon today--struggled to devise a mission worthy of the name. And failed. Manliness, monuments, Mars--the more exotic the proposals to enhance civic vigor became, the more it became clear that what really counted was not the particular cause but its by-product.
The need for a moral equivalent of the Cold War evaporated on September 11. Having failed to reverse the equation during the '90s, the architects of national greatness would henceforth make a virtue out of necessity. If most opinion-makers concentrated on the war abroad, the potential benefits at home were never far from the minds of others. Celebrating the end of America's "holiday from history," columnist Charles Krauthammer wrote that "this land of 'bowling alone,' of Internet introversion, of fractious multiculturalism developed an extraordinary solidarity. ... It turned out that the decadence and flabbiness were just summer wear, thrown off immediately." As to what awaited the United States on its return from this holiday, Commentary Editor-at-Large Norman Podhoretz wrote, "Beyond revenge, we crave 'a new birth' of the confidence we used to have in ourselves and in 'America the Beautiful.' But there is only one road to this lovely condition of the spirit, and it runs through what Roosevelt and Churchill called the 'unconditional surrender' of the enemy." President Bush put the point somewhat more bluntly: "For too long, our culture has said, 'If it feels good, do it.' Now, America is embracing a new ethic and a new creed: 'Let's roll.'"
The significance of national greatness was never the movement it spawned, but rather the moment it encapsulated--a minute, really, in which it was hoped that something good might come from bad. What its adherents anticipated after September 11 was really less a return to national greatness than a return to basic national goodness, a civic quality the excesses of the '90s seemed to have corroded. Civic attachments, a sense of shared purpose, a propensity to sacrifice for the common good--if historical precedent offers any guide, all of these should have been renewed in the aftermath of September 11. As Harvard's Theda Skocpol noted in her 2001 study, "Patriotic Partnerships: Why Great Wars Nourished American Civic Voluntarism," "America's civic vigor was greatly enhanced, both following the national fratricide of the 1860s and amidst the plunge into global conflict between 1917 and 1919." The pattern held during World War II and the Cold War, conflicts that boosted everything from membership in voluntary associations to the fortunes of the civil rights movement. And, yet, not only has everything not changed since September 11; nothing has. According to a mountain of attitudinal and behavioral data collected in the past four years, the post-September 11 mood that former Homeland Security Secretary Tom Ridge dubbed "the new normalcy" resembles nothing so much as the old normalcy.
That matters at home, where the toll of civic disengagement, which has ranged from the loss of community life and its benefits at the local level to the hollowing of participatory democracy at the national level, has been well-chronicled by Putnam and others. And it matters abroad, too. An erosion of the common good, after all, can easily shade into an erosion of common purpose, even more so if that purpose demands public hardship--something the war on terrorism requires.
However intuitive the idea that September 11 ought to have sparked a return to civic engagement, the form that engagement should take is less obvious. A draft, for example, makes sense only as a response to military necessity, of which there is none today. The war in Iraq may be thinning the ranks of the all-volunteer military, but hardly to the point of requiring the conscription of tens of millions of young men--and, this time, women. Vague proposals for national service, while appealing as a means to promote social cohesion, run into the same problem. The war on terrorism doesn't require mass mobilization. Nor, in terms of civic renewal--that is, patriotism as an activity rather than merely a sentiment--should it. The vigorous citizen is not a helpless dependent awaiting a summons. That is an incompetent citizen. As political scientist Alan Wolfe puts it, "However important leaders, policies, and programs may be, greatness will not come about unless Americans care enough about it to will it into existence." For young Americans, meeting this condition could have meant at least entertaining the option of military service, a stint in law enforcement, or any number of philanthropic vocations. For others, it could have been expressed through activities as basic as volunteerism, attendance at public meetings, or membership in local organizations. For the rest of us, it could have simply been a contribution to the common good, whether through service to one's fellow citizen, one's neighborhood, or one's nation. In short, national greatness means citizens caring deeply about the fate of the nation. And, more important, acting like they do.
That so many of us seem not to has something, although not everything, to do with the quality of our leadership. Bush may employ high-minded rhetoric about America's purpose. But his rhetoric entails no obligation to act. The counterexample of Franklin Roosevelt has become a favorite cliche among the president's critics. And for good reason. If Bush has, in some respects, followed in the footsteps of FDR abroad, he has governed more like Warren Harding here at home. "No less than December 7, 1941," then-National Security Adviser Condoleezza Rice announced in October 2003, "September 11, 2001, forever changed the lives of every American." About 1941 she was correct. As the famous World War II poster put it: "Remember Pearl Harbor: Work--Fight--Sacrifice!" And Americans did. Legions volunteered to join the military (and millions more were drafted), while, on the home front, millions of others labored directly in support of the war effort. They did so, in part, because their president asked them to. In 1943, FDR declared that "Doctor New Deal has been replaced by Doctor Win the War." And, through scrap drives, rationing, war bonds, and a doubling of their tax burden, the public responded in kind. "You see those bombers in the sky," the Irving Berlin tune went, "Rockefeller helped build them and so did I. I paid my income tax today."
Ironically, as historian David Kennedy documents in his book, Freedom from Fear: The American People in Depression and War, 1929-1945, FDR felt "let down" by the American people. In 1943, the president complained that too many Americans were "laboring under the delusion that the time is past when we must make prodigious sacrifices." What would he make of the present era? The circumstances that required mass mobilization during World War II are, of course, not the circumstances the United States confronts today. In Bush's telling, however, the war on terrorism requires something closer to mass demobilization. "Get on board," he urged in the immediate aftermath of September 11. "Fly and enjoy America's great destination spots. Get down to Disney World in Florida. Take your families and enjoy life." As for sacrifice, the president elaborated, "I think the American people are sacrificing now. I think they're waiting in airport lines longer than they've ever had before." Nor, in the ensuing four years, has Bush asked ordinary Americans to sacrifice much of anything else.
On one of the few occasions he did--the 2002 State of the Union Address, in which Bush summoned Americans to commit 4,000 hours to volunteer work during their lifetimes--the proposal, at least to judge by static rates of volunteerism, went nowhere. Bush's national service bill, which would have expanded the ranks of AmeriCorps, never even reached the floor of Congress. Not only has the president failed to offer even a symbolic initiative along the lines of war bonds, whose purpose was mostly to create the impression of participation in a larger effort, he has failed to offer even necessary measures. Never mind AmeriCorps: Bush began the war on terrorism by opposing the creation of the Department of Homeland Security. And, with the Armed Forces pinned down in two foreign wars and starved for manpower, he has yet to devote a speech merely to urging young Americans to consider the benefits of military service.
Bush's attempt to restore normalcy in the days after September 11 derived, according to a White House official involved in the effort, from a justifiable fear of a financial crisis. But, as the days stretched into weeks, the question of national sacrifice increasingly revolved around calculations of political risk--risks that then-special adviser Karen Hughes, in particular, urged the president not to take. And, as the weeks stretched into months, political inclinations hardened into philosophical certainties, of which Bush's multiple tax cuts were the clearest symptoms. At a time when U.S. troops are fighting two land wars abroad, the decision to slash government revenue clearly illuminated the president's understanding of the role of the state. "The unstated message [of Bush's tax cuts] was that we were not all in this together," Putnam says, "and, whatever the economic merits or demerits of the policy, the civic implications of that policy were abominable."
The president, it turns out, has little use for the "national" in national greatness. On the one hand, for the architects of Bush's "armies of compassion" and his "4,000-hour" initiatives, it has always been an article of faith that government action, even in wartime, frustrates the growth of civil society, which must come from below. At the time of the September 11 attacks, says Bill Galston, director of the University of Maryland's Center for Information and Research on Civic Learning and Engagement, "You had a Tocquevillian band in the White House and people like [compassionate conservatism guru] Marvin Olasky hanging around who wanted voluntary action but no public service." On the other hand, the Republican Party of Tom DeLay and Dick Armey--and, with it, the libertarian strain exemplified in the 1994 Contract with America and in what GOP activist Grover Norquist called the "leave us alone coalition"--remains the Republican Party of, well, Tom DeLay and Dick Armey. The laissez-faire dogma evident in the administration's domestic priorities--captured most recently by White House Chief of Staff Andrew Card, who, at an event hosted by the Partnership for Public Service, wondered aloud why applicants would "even want a job with the federal government or one of our agencies"--has deprived the president of the vision and even the vocabulary necessary to urge the nation toward greatness. It is as if the Bush administration, sensing the birth of a common purpose after September 11, consciously squashed it.
But is Bush alone responsible for the stillbirth of the national greatness project? Hardly. The insistence that, but for Bush's silence, we were all prepared to search the caves of Tora Bora together is nonsense. If the United States means to renew itself, there have to be at least a few stirrings from below. There are none.
Describing the national mood of a century ago, Walter Lippmann noted that "we have changed our environment more quickly than we know how to change ourselves." He could just as easily have been summarizing our own predicament. If Bush has shown no inclination to foot the national greatness bill, after all, neither have most Americans. "To the extent there is a call for enduring sacrifice, people just aren't in the mood," says James Davison Hunter, a professor of sociology at the University of Virginia and author of The Death of Character. "Shared notions about the public good simply don't exist anymore." Those notions have been eroding for the better part of four decades. The pressures of two-career families, suburbanization, television, the demise of the World War II generation--Putnam identifies these as the long-term culprits. Whatever civic benefits September 11 may have generated were also overwhelmed by what national greatness types decried in the '90s as unchecked habits of consumption and permissiveness--the first encouraged after September 11 by economic conservatives like the president himself, the second a product of the same cultural trends from which no American with electricity remains immune. The reluctance to abide any measure that might constrain personal autonomy, the subordination of the public good to private wants, the conflation of rights and duties--in the aftermath of September 11, the combination of these yielded, at best, an uncertain commitment to the fate of the nation. And that was true even before the bitterness from the Iraq war set in--in late 2001 and early 2002, when opinion surveys were anointing Bush the most admired man in the nation and Upper West Side brownstones were still festooned with American flags.
Beginning with the ultimate gesture of sacrifice--enlisting in the Armed Forces--all the anecdotes about young Americans rushing down to their local recruiting offices in the aftermath of September 11 add up to nothing more than a myth. In fact, military recruitment numbers during the months following September 11 actually dipped from where they were at the same time the previous year. According to Pentagon figures, between October 1, 2001, and October 1, 2002, the number of entry-level Army recruits actually shrank by 32 percent. Defense Department surveys of young Americans found that, by early 2002, even the propensity to enlist had declined to summer 2001 levels. Meanwhile, a poll by Harvard's Institute of Politics reported that, even in the immediate aftermath of September 11, large pluralities of undergraduates would evade military service if asked to serve. "There was an eagerness to punish the enemy, but it wasn't enough of a factor to motivate people who had other plans in their lives," says National Defense University's Alan Gropman. As for less exalted endeavors, the University of California's annual survey of American freshmen found that the percentage of students who either volunteered or performed community service remained static from 2000 to 2002.
Nor did the country as a whole stir itself to much greater heights than its children did. According to Tom Smith, director of the National Opinion Research Center's General Social Survey, "Virtually every measure that shot up after 9/11 declined within three to six months as it became a historical remembrance." Rates of regular volunteering never budged at all. Neither, as a survey Putnam conducted after September 11 finds, did attendance at public meetings or membership in organizations. Charitable giving, according to the annual survey Giving USA, rose slightly in 2001 before declining again in 2002. As for the popular notion that September 11 had stimulated "one of the greatest spiritual revivals in the history of America," as Pat Robertson put it, by November 2001, according to Gallup, rates of weekly church attendance had returned to exactly their pre-September 11 levels.
Attitudes about government followed the same trend lines. Gallup found that the percentage of poll respondents who trust the government to "do what is right" dropped to pre-September 11 levels in 2002. Similarly, it reported that the percentage of Americans saying the government was doing too much, which declined after September 11, had returned to its libertarian norm by 2002. A Los Angeles Times poll found that the percentage of Americans willing to surrender some civil liberties to curb terrorism, which stood at 49 percent in 1995 and skyrocketed to 63 percent in 2001, had settled back to 49 percent in 2002--just as other polls showed that Americans' willingness to endure airport security screenings and random ID checks had declined as well. In a September 2002 article in the academic journal PS, Skocpol, relying on data compiled by Putnam, compared changes in civic attitudes just after September 11 with changes in civic behavior. She found that "Americans suddenly displayed new attitudes of social solidarity and trust in government, while barely changing their patterns of civic participation."
As for predictions by national greatness theorists that America's new sense of purpose would express itself through cultural sobriety, a glimpse at the most telling barometer of all reveals that they have gotten things exactly backward. A generation's worth of survey data has demonstrated a causal link between levels of TV viewing and civic disengagement. But what we watch matters nearly as much as how much we watch. During the '90s, the DDB Needham Life Style survey, which tracks viewer preferences alongside civic habits, showed that viewers who imbibed the trashiest fare were the least likely to be engaged in their communities, while those who watched the news were the most involved. Alas, while the amount of television that Americans watched increased after September 11, a Pew survey released in 2002 found that the "public's news habits have been largely unaffected by the Sept. 11 attacks and subsequent war on terrorism." So what have Americans been watching since September 11? Garbage. And, to judge by the Nielsen ratings of the past four years, more of it than ever before. "As long as you eliminate that two months after September 11," says Robert Thompson, professor of television and popular culture at Syracuse University, "we were back to 'Fear Factor'; after six months, we had a celebrity boxing match between Tonya Harding and Paula Jones." True, Americans have been indulging in wartime escapism ever since the proliferation of carnivals during the Civil War. But "Fear Factor" isn't escapism. It's the hallmark of a society that feels it has nothing from which to escape.
If hopes for national greatness were never realized after September 11, they were decisively put to rest in Iraq. The extent to which greatness abroad can be transmitted to the home front depends, needless to say, on actually achieving something like greatness abroad. Who, after all, has ever heard of military defeat giving way to national renewal? The French experience in Algeria, the Soviet experience in Afghanistan, the U.S. experience in Vietnam--what these wars brought home was something else altogether. And, while the Iraq war, if only because of the small numbers of Americans fighting it, seems unlikely to introduce anything like the poison that Vietnam injected into the body politic, it certainly has done nothing to arrest the pre-September 11 trends that alarmed national greatness types in the first place. From levels of trust in the government and trust in fellow citizens to measures of public spiritedness, civic pride, and social cohesion--far from enhancing any of these metrics, post-2003 polls show that Iraq has eroded them even more.
Even if Iraq had greeted Americans with something other than bullets and roadside bombs, there was always something problematic about the idea of greatness abroad fostering greatness at home, especially in this day and age. The war on terrorism may have seemed to some like the ideal means to accomplish national renewal but for one fact: Very few Americans have any role in it. Quoting a veteran of the Second World War at the dedication of the World War II Memorial last year, Bush said, "This was a people's war, and everyone was in it." But that was then. Those who most enjoy the benefits and freedoms of this country now serve it the least. There is a very simple reason for this: With the coming of an all-volunteer military in 1973--applauded at the time as a gesture of heightened moral awareness--the definition of American citizenship narrowed to the point of excluding the obligation to defend one's country. Aside from the 1.4 million men and women in the Armed Forces, their families, and members of law enforcement, virtually no one participates in today's effort.
Some national greatness critics actually tout this as progress. The portrayal of war as a means to civic renewal, after all, boasts a lousy pedigree--mostly the result of pronouncements made by intellectuals who, bored by the emptiness and trivialities of the Belle Époque, welcomed war as something to wipe out the malaise of civilian life. (In His Last Bow, set on the eve of World War I, Sherlock Holmes tells his sidekick, "It will be cold and bitter, Watson, and a good many of us may wither before its blast. But it's God's own wind none the less, and a cleaner, better, stronger land will lie in the sunshine when the storm has cleared.") In the era of mass armies, the flaw in this sort of reasoning was that, whatever war's civic effects may have been, they paled next to its carnage and certainly didn't amount to a justifiable casus belli. In terms of war's impact at home, the problem today is nearly the reverse: The ordinary American has been assigned the role of spectator, and, if surveys tell us anything, that suits him just fine.
As a result, rather than accomplishing the goals of the national greatness theorists, we have come closer to fulfilling the predictions of another conservative camp from the '90s: the pessimists. Less concerned with the effects of U.S. foreign policy here at home than they were with the impact of U.S. domestic life on its conduct abroad, these foreign policy "realists" argued during the '90s that it was inconceivable that the same Americans who had grown so complacent at home would continue to reign supreme in the international arena. In a 1997 essay in The National Interest, former Defense Secretary James Schlesinger wrote that the ability to sustain U.S. preeminence was "being undermined by internal weaknesses," principal among them the disintegrating "elements and the habits of mind of the American democracy." Likewise, Samuel Huntington, the eminent political scientist and Clash of Civilizations author, complained that "Westerners identify their civilization with fizzy liquids, faded pants, and fatty foods" and are ill-suited to great crusades.
However vituperative they may have been, the pessimists at least grasped that, in a democracy such as this one, the home front cannot be ignored. The fact is, greatness at home does not require greatness abroad, but greatness abroad does require greatness, or at least some level of exertion, at home. Yet the pessimists' answer, which was to accommodate and adapt U.S. foreign policy to what they saw as the degradation of American civic life, provides no adequate response to the imperatives of the war on terrorism. Hence the need to be candid about the costs and purposes of U.S. strategy: Once the demands of the war on terrorism exceed the public's readiness to sacrifice--in terms of the willingness to put even a volunteer army in harm's way or to pay the cost of what will be a decades-long enterprise--the effort will become unsustainable.
Alas, Bush still acts as though national life can somehow be compartmentalized, with a nation of couch potatoes footing the bill for ambitious foreign and military policies. Thus has the White House invited Americans to indulge in the conceit that distant wars obviate the need for broad sacrifice or the mobilization of national power. Unfortunately, the wider scope of action permitted by waging war on the cheap is illusory. The whole business reflects the administration's insistence on concealing the costs of a "hard" foreign policy from what members of the Bush team presume, with some reason, to be a "soft" American public. It was precisely this dilemma that obliged the president and his chief associates to offer assurances that any military action in Iraq would be quick and painless. Now that every American recognizes the costs, Bush has been reduced to pleading for "more sacrifice and continued resolve," as he did last week in his radio address. The time for such a request was four years ago. But it's never too late to heed.