As the U.S. economy fails to recover, there is a growing fear that the United States has entered a phase of long-term decline. Conservatives blame “big government” for throttling entrepreneurship; liberals tend to take aim at Wall Street. Rolling Stone writer Matt Taibbi memorably described Goldman Sachs as “a great vampire squid wrapped around the face of humanity, relentlessly jamming its blood funnel into anything that smells like money.” Among less inventive critics, the term in vogue is “financialization.” According to author Kevin Phillips, who popularized this notion, financialization is “a process whereby financial services, broadly construed, take over the dominant economic, cultural and political role in a national economy.”
Elements of this thesis can be found in scores of books, articles, and blog posts on the state of the U.S. economy. Phillips blames financialization not just for the “Great Recession,” but for “excessive debt, great disparity between rich and poor, and unfolding economic decline.” In their book, 13 Bankers, former International Monetary Fund (IMF) chief economist Simon Johnson and James Kwak blame financial factors for the “anemic growth” in the overall economy prior to the crash. And, in an influential essay—titled “What Good Is Wall Street?”—The New Yorker economics writer John Cassidy pointedly contrasts the period when regulators restrained the growth of the finance sector (when wages, investment, and productivity grew, lifting “tens of millions of working Americans into the middle class”) with the period of growth experienced by the finance sector since the early ’80s (when “financial blowups have proliferated and living standards have stagnated”).
One thing is clear: Financialization, in some form, has taken place. In 1947, manufacturing accounted for 25.6 percent of GDP, while finance (including insurance and real estate) made up only 10.4 percent. By 2009, manufacturing accounted for 11.2 percent and finance had risen to 21.5 percent—an almost exact reversal, which was reflected in a rise in financial-sector employment and a drop in manufacturing jobs. It is also clear that high-risk speculation and fraud in the financial sector contributed to the depth of the Great Recession. But Phillips, Johnson, and the others go one step further: They claim that financialization is the overriding cause of the recent slump and a deeper economic decline. This notion is as oversimplified, and almost as misleading, as the conservative attack on the evils of big government.
Much of what liberals blame on financialization is a result of profound changes to both the United States and the global economy that date from as early as 1968—well before the onset of financialization. In fact, the growth of the finance sector was partly a product of these developments. It is true that the speculative disruptions caused by financialization have to be addressed if we don’t want to suffer another crash down the road. But, if policymakers truly want to arrest America’s decline in the world and attend to the various ills that have accompanied it, then they must come to terms with the much broader story of what has happened to American industry and global capitalism in the last four decades. Simply cracking down on Wall Street won’t be enough.
THE FOUNDATION OF the postwar international economy was laid at Bretton Woods, New Hampshire, in 1944, when the United States and Britain agreed to a new international monetary system based on the dollar. The dollar’s value was fixed at $35 for an ounce of gold; other countries could only revalue their currencies in relation to the dollar with IMF approval. As long as American industry reigned supreme in the world, and dollars were in demand to buy U.S. goods, other countries had no incentive to exchange their dollars for gold. But, in 1968, the U.S. balance of trade began to plummet. In 1971, the United States imported more than it exported for the first time in the twentieth century. Except for two recession years, the balance of trade has remained negative ever since.
The immediate cause for this reversal was the Vietnam war. Already worried about public support for the venture, the Johnson administration was initially unwilling to raise taxes to fund it. Instead, it ran deficits at a time of full employment, which increased the demand for goods, but—with employment full—not the capacity to meet it. The growing gap in America’s trade balance, along with Vietnam war-induced inflation and overseas expenditures, threatened the value of the dollar. European countries, led by France, began exchanging their dollars for gold. Because the value of the dollar was based on America’s possession of sufficient gold reserves to redeem dollars, the greenback came under intolerable strain.
In 1971, President Richard Nixon finally detached the dollar from the gold standard and also slapped a temporary 10 percent surcharge on imports in order to pressure Japan and Germany to raise the relative price of their exports by revaluing their currencies. That December, the United States and other industrial nations tried to restore Bretton Woods with a dollar pegged at $38 per ounce of gold, but the agreement collapsed, and the dollar and other currencies began to float in relation to each other. No one intended that result, nor the instability it introduced into the world economy.
Under this arrangement, the dollar itself—no longer backed by gold—became the de facto global currency. Importers in, say, Chile exchanged pesos for dollars to buy goods from Italy that were priced for export in dollars rather than lire. But, like Bretton Woods, the new system harbored a contradiction. Under Bretton Woods, war-torn countries acquired, through U.S. foreign aid, investment, and military expenditures, the dollars to buy U.S. goods and to exchange goods with each other. But, when they began accumulating dollars beyond the capacity of the United States to redeem in gold, Bretton Woods began to totter.
Similarly, in the new era of floating exchange rates, countries acquired dollars when the United States ran trade and payments deficits. The United States needed to run these deficits to make the system work. But, if these deficits grew too large, then countries—fearing that the dollar would fall sharply in value from supply outrunning demand—might want to unload their U.S. currency. This could lead to a run on the dollar and bring the global economy crashing down.
What kept the trade balance in the red after the Vietnam war was cost competition overseas, as Western Europe and Japan—and later China and other Asian nations—sought to replicate at a lower cost what was already being produced in the United States. As UCLA economic historian Robert Brenner demonstrated in The Economics of Global Turbulence, the resulting overcapacity led to a dramatic fall in the rate of profit for U.S. manufacturing. To make matters worse, the new overseas producers often protected their products with formal and informal trade barriers.
Many U.S. companies, unable to compete abroad, also became unwilling to compete against imports at home by lowering their profit margins. Over the next decades, American industry abandoned many lines of production, such as cameras, televisions, and other consumer electronics, which it had previously dominated. In the late ’70s, as U.S. trade deficits mounted, the entire system appeared on the verge of collapse.
IN 1979, AT a time of soaring inflation, President Jimmy Carter appointed Paul Volcker to head the Federal Reserve. Over the next three years, carrying over into the Reagan administration, Volcker tightened the money supply, raising interest rates to more than 20 percent. That killed off inflation and strengthened the dollar, as other countries and private investors exchanged their currencies for dollars to take advantage of the high interest rates. But the same measure that protected the dollar also further undermined U.S. exports by raising their prices in relation to products priced in other currencies.
Officials in Ronald Reagan’s Commerce Department advocated helping American industry, and Gary Hart and other Democrats called for an “industrial policy,” but Reagan Budget Director David Stockman and White House economists declared that the decline in America’s export edge was inevitable. According to then-Commerce Department official Clyde Prestowitz, Reagan himself dozed off at a White House meeting convened to discuss the plight of high-tech industry.
Nothing was done, and, as a result, the trade deficit shot up in 1982, almost doubled in 1983, and almost doubled again in 1984. The need to finance the budget deficit also put pressure on the United States to raise interest rates again in order to sell Treasury bills, which would have choked off the recovery. It was a genuine dilemma, but the Japanese provided a temporary solution—although not one the Reagan administration expected or to which it ever formally agreed.
By 1985, Japan accounted for about 40 percent of the trade deficit, thanks to its lower costs, superior quality control, and a trade strategy that used formal and informal barriers to maintain a surplus. Had the Japanese been cashing in their surplus dollars for yen, the value of the dollar would have fallen, and U.S. products would have become more competitive. But the Japanese did not want to see their exports priced out of the U.S. market, so they invested their surplus dollars back in the United States—in Treasury bills, real estate, and factories. This practice, described in R. Taggart Murphy’s The Weight of the Yen, preserved the dollar’s place in the new monetary arrangement, helped finance the deficit, and kept real interest rates low. But it also put domestic American industry at a significant disadvantage.
For much of the next 30 years, the United States accepted this bargain, but there were two periods when it sought to alter it. In September 1985, then-Treasury Secretary James Baker negotiated the Plaza Accord, which devalued the dollar in relation to the yen. At the Pentagon’s urging, the administration sought to boost high-tech industry; and Congress pushed through legislation mandating retaliation against unfair foreign trade practices. In its first two years, the Clinton administration took a similar tack. These measures held down the trade deficit and encouraged a boom in information technology.
But, by mid-’95, Bill Clinton’s secretary of the Treasury, Robert Rubin—who was already uncomfortable with the administration’s aggressive trade strategy—became increasingly worried that the unfolding recession in Japan would halt America’s recovery. He reduced the pressure on Japan to remove its trade barriers, and he let the value of the dollar rise against the yen. As a result, the U.S. trade deficit began growing in 1996 and quadrupled by 2000. And, by then, Japan was no longer the only country running and retaining surpluses. After the East Asian financial crisis in 1997, East Asian nations, fearing another run on their currencies, retained surplus dollars rather than converting them to domestic currencies. More important, China eclipsed Japan as the leading surplus country, retaining dollars in order to keep the yuan’s value low and Chinese exports cheap. Convinced that integrating China into global capitalism would promote democratic change, the Clinton and George W. Bush administrations chose to downplay China’s managed currency and subsidized industries.
Fed chair Alan Greenspan and Clinton officials believed that the new economy would sustain the boom, but, by the end of Clinton’s second term, the global market in computer electronics had become saturated, leading to the bursting of the dot-com bubble and setting the stage for a decade of slow growth and recession. The United States had initially managed to escape the contradictions of the post-Bretton Woods economy, but, by the early 2000s, the U.S. economy faced daunting challenges even greater than those it had faced in the early ’80s.
Washington’s informal arrangement with Japan and China had institutionalized the decline of American industry, and, particularly, of American manufacturing, which found itself priced out of many foreign markets. American firms could only have competed effectively if Washington had abandoned this arrangement. That would have meant, however, recasting the world monetary system and potentially losing some of the advantages of the dollar as a world currency, including the ability to run deficits and to finance military operations abroad without substantial tax increases. It might have also upset America’s post-cold-war strategy in Asia by provoking Japan and China.
But the consequences of retaining the arrangement were profound. There was the erosion of middle-class living standards, for one thing. To defend against fierce competition from abroad, American firms that produced tradable goods and services attempted to hold down wages in part by going on the offensive against private-sector labor unions. (Other firms simply moved production overseas.) As a result, real wages failed to grow much at all in these industries. According to the Economic Policy Institute, average hourly wages for production workers fell 6.2 percent between 1979 and 1989 after having risen steadily for most of the previous three decades. Wages only rose slightly in the ’90s and even less in the 2000s.
The arrangement also produced anemic growth. From 1949 to 1979, GDP grew at an average annual rate of 3.9 percent; from 1980 to 2010, growth averaged about 2.7 percent. This slowdown underlay the recessions of 2001 and 2007 and the flaccid recovery from 2002 to 2006, in which private non-residential investment in 2005 was only 2 percent higher than it had been in 2000. It was the weakest recovery since World War II—until the current one.
During the current recovery, private non-residential investment remains lower than it was in 2008. The Federal Reserve reports that, while banks are easing their lending terms, businesses are reluctant to take out loans to expand. That’s partly due to the overhang of private debt created by the financial crisis, but it’s also because American producers are unwilling to compete in a global economy in which their profit rates have plummeted. If financialization had been at the heart of the Great Recession, then the economy would have fully recovered once the banks got back on their feet two years ago. That it hasn’t suggests that the most fundamental problems of the U.S. economy lie elsewhere.
CRITICS OF FINANCIALIZATION usually blame the bank lobby and irresponsible federal officials who did its bidding—led by Greenspan, Clinton Treasury Secretaries Rubin and Lawrence Summers, and Bush Securities and Exchange Commission head Christopher Cox. These men do deserve some of the blame for the speculative excesses that resulted in the crash of 2007-2008. However, financialization was also a product of pressures created by post-Bretton Woods American capitalism—a dependent rather than an independent variable in the U.S. economic equation.
After Bretton Woods was replaced with a system of floating exchange rates, the United States, Europe, and later parts of Asia and Latin America gradually removed controls on the mobility of capital and the value of their currencies. That gave the world’s leading banks and insurance companies, as well as a host of hedge funds like the infamous Long-Term Capital Management, new ways to make money. In the United States, banks clamored to get larger so that they could compete internationally with Japan’s or Europe’s mega-banks.
The arrangement with Japan and then China brought in foreign capital, and, at the same time, it kept real interest rates low by maintaining the demand for Treasury bills. This allowed banks to borrow short-term at low rates in order to lend at higher rates, fueling both the volume and rate of financial profits. By the end of the ’90s, according to figures compiled by the French economists Gérard Duménil and Dominique Lévy, finance’s annual rates of profit were almost double those of non-financial corporations.
Higher profits meant greater investment, which meant a larger financial sector. Many non-financial firms also got into the game. The financial arms of these firms grew steadily from 1982 onward and, by 2000, accounted for more than 20 percent of their parent companies’ total profits. Companies like Ford, General Motors, and General Electric (GE) devoted a growing amount of their business to their financial divisions: GE Capital Financial Services became the most profitable division of GE.
In the early and mid-’90s, financial institutions had helped fund the new information economy, but after that, as opportunities in the real economy began to disappear, they looked elsewhere to invest—toward securitization (packaging subprime housing loans into saleable securities) and other derivatives. They had capital in hand, some of it from overseas—many of China’s dollars went to Fannie Mae and Freddie Mac, for instance. And interest rates were low—again, partly a result of foreign Treasury bill purchases.
If the Clinton administration had pressed hard for regulating derivatives, that might have discouraged reckless securitization. If the Fed had raised margin requirements on stock purchases in the late ’90s, that might have halted the dot-com bubble. And, if Greenspan had raised interest rates after the recovery in 2002, that might have prevented the housing bubble. But these missteps had less to do with irresistible pressure from the bank lobby on public officials than with the blind exuberance of the late ’90s boom or with real fears in the 2000s that, by raising interest rates, the Fed could choke off a fragile recovery. They were more traceable to the permutations of the business cycle than to the machinations of Goldman Sachs.
IF FACTORS OTHER than financialization are principally responsible for America’s long-term economic ills, there are important implications for what Washington should do. First, the primary focus should be on reviving American industry, which includes everything from machine tool factories to software producers and from auto companies to biotech labs. Doing that will entail modifying our arrangement with Asia. Making these kinds of changes can be difficult, but it has happened before—during Reagan’s second term and in the first two years of Clinton’s presidency, when the United States got tough with its trading partners and subsidized innovation and growth.
Conservatives often dismiss government intervention as “industrial policy.” The Reagan-Clinton approach was different, however, from the industrial policy of the early ’80s, which was often aimed at reviving older American industries. Reagan sought to boost the emerging semiconductor industry by preventing Asian competitors from dumping cheap computer chips on the U.S. market and by funding research into new technologies. Emerging industries are not subject to global overcapacity, which is why a boom in information technology and software could take place in the ’90s.
The Obama administration initially embraced a similar approach by promoting green technology—a broad category that includes new kinds of construction, food production, electric cars, and transformation of the electrical grid. President Obama also pressured the Chinese to revalue their currency. But, in the face of Chinese recalcitrance and Republican opposition to “industrial policy,” the administration has now retreated into a defensive crouch.
This could spell disaster. If the United States continues to accept the status quo, then it can look forward, as Phillips has warned, to a fate similar to Dutch capitalism in the eighteenth century or British capitalism in the twentieth century. American industry will continue to wither away until rising trade deficits lead other countries to seek alternatives to the dollar. At that point, the United States would no longer enjoy the advantages of the dollar as a universal currency—and, with much of its industry having given up or moved elsewhere, would not be in a position to enjoy the advantages of a devalued dollar.
Some critics of financialization have insisted that breaking up the banks is the key to reviving the U.S. economy. Charles Munger, the vice chairman of Berkshire Hathaway, has said, “We would be better off if we downsized the whole financial sector by about eighty percent.” It’s certainly true that, if the derivatives market isn’t thoroughly regulated and if reserve requirements for banks aren’t raised, then a very similar crash could happen again. The Dodd-Frank bill goes part of the way toward accomplishing this, though it leaves too much to the discretion of the Treasury, the Federal Reserve, and regulatory agencies.
But, unless the United States takes the necessary measures to revive its industrial economy, radical downsizing of the financial sector could do more harm than good. It could even deprive the economy of an important source of jobs and income. Many mid- and large-sized cities—including New York, San Francisco, Jacksonville, Charlotte, Boston, Chicago, and Minneapolis—are now dependent on financial services for their tax bases. Instead of agitating for breaking up the banks, critics of financialization would do well to make sure that Republicans don’t gut Dodd-Frank.
Why, then, has financialization played such a starring role in explanations of America’s economic ills? One obvious reason is that the financial crash did turn what would have been an ugly recession into a “great” recession. This sequence of events is an almost exact replay of the Depression, which began with a short recession in 1926, caused in part by growing overcapacity in auto and other industries. That overcapacity diverted investment into stock speculation, producing a financial crisis on top of a slowing economy and a growing breakdown in the international economic system. In both cases, the financial crash played the most visible role.
Another reason is the centuries-old tendency in American politics to allow moral condemnation to outweigh sober economic analysis. Picturing bankers and Wall Street as a “parasite class” or as “vampires” is an old tradition in American politics. It goes back to Andrew Jackson’s war against the Second Bank of the United States and to Populist Party polemics against “a government of Wall Street, by Wall Street, and for Wall Street.” And currently it is one of the few ideological bonds between the Tea Party and left-wing Democratic activists. But, just as free silver wasn’t the answer to the depression of the 1890s, smashing the banks isn’t the answer to the Great Recession of the 2000s. The answer ultimately lies in the ability of U.S. businesses to produce goods and services that can compete effectively at home and on the world market.
John B. Judis is a senior editor at The New Republic. This article originally ran in the August 4, 2011, issue of the magazine.