The Forgotten History of the Financial Crisis

“September and October of 2008 was the worst financial crisis in global history, including the Great Depression.” Ben Bernanke, then the chair of the U.S. Federal Reserve, made this remarkable claim in November 2009, just one year after the meltdown. Looking back today, a decade after the crisis, there is every reason to agree with Bernanke’s assessment: 2008 should serve as a warning of the scale and speed with which global financial crises can unfold in the twenty-first century. 

The basic story of the financial crisis is familiar enough. The trouble began in 2007 with a downturn in U.S. and European real estate markets; as housing prices plunged from California to Ireland, homeowners fell behind on their mortgage payments, and lenders soon began to feel the heat. Thanks to the deep integration of global banking, securities, and funding markets, the contagion quickly spread to major financial institutions around the world. By late 2008, banks in Belgium, France, Germany, Ireland, Latvia, the Netherlands, Portugal, Russia, Spain, South Korea, the United Kingdom, and the United States were all facing existential crises. Many had already collapsed, and many others would before long. 

The Great Depression of the 1930s is remembered as the worst economic disaster in modern history—one that resulted in large part from inept policy responses—but it was far less synchronized than the crash in 2008. Although more banks failed during the Depression, these failures were scattered between 1929 and 1933 and involved far smaller balance sheets. In 2008, both the scale and the speed of the implosion were breathtaking. According to data from the Bank for International Settlements, gross capital flows around the world plunged by 90 percent between 2007 and 2008. 

As capital flows dried up, the crisis soon morphed into a crushing recession in the real economy. The “great trade collapse” of 2008 was the most severe synchronized contraction in international trade ever recorded. Within nine months of their pre-crisis peak, in April 2008, global exports were down by 22 percent. (During the Great Depression, it took nearly two years for trade to slump by a similar amount.) In the United States between late 2008 and early 2009, 800,000 people were losing their jobs every month. By 2015, over nine million American families would lose their homes to foreclosure—the largest forced population movement in the United States since the Dust Bowl. In Europe, meanwhile, failing banks and fragile public finances created a crisis that nearly split the eurozone.

Ten years later, there is little consensus about the meaning of 2008 and its aftermath. Partial narratives have emerged to highlight this or that aspect of the crisis, even as crucial elements of the story have been forgotten. In the United States, memories have centered on the government recklessness and private criminality that led up to the crash; in Europe, leaders have been content to blame everything on the Americans. 

In fact, bankers on both sides of the Atlantic created the system that imploded in 2008. The collapse could easily have devastated both the U.S. and the European economies had it not been for improvisation on the part of U.S. officials at the Federal Reserve, who leveraged transatlantic connections they had inherited from the twentieth century to stop the global bank run. That this reality has been obscured speaks both to the contentious politics of managing global finance and to the growing distance between the United States and Europe. More important, it forces a question about the future of financial globalization: How will a multipolar world that has moved beyond the transatlantic structures of the last century cope with the next crisis? 


One of the more common tropes to emerge since 2008 is that no one predicted the crisis. This is an after-the-fact construction. In truth, there were many predictions of a crisis—just not of the crisis that ultimately arrived. 

Macroeconomists around the world had long warned of global imbalances stemming from U.S. trade and budget deficits and China’s accumulation of U.S. debt, which they feared could trigger a global dollar selloff. The economist Paul Krugman warned in 2006 of “a Wile E. Coyote moment,” in which investors, recognizing the poor fundamentals of the U.S. economy, would suddenly flee dollar-denominated assets, crippling the world economy and sending interest rates sky-high. 

But the best and the brightest were reading the wrong signs. When the crisis came, the Chinese did not sell off U.S. assets. Although they reduced their holdings in U.S.-government-sponsored enterprises such as the mortgage lenders Fannie Mae and Freddie Mac, they increased their purchases of U.S. Treasury bonds, refusing to join the Russians in a bear raid on the dollar. Rather than falling as predicted, the dollar actually rose in the fall of 2008. What U.S. authorities were facing was not a Sino-American meltdown but an implosion of the transatlantic banking system, a crisis of financial capitalism. 

And the crisis was general, not just American, although the Europeans had a hard time believing it. When, over the weekend of September 13–14, 2008, U.S. Treasury Secretary Henry Paulson and other officials tried to arrange the sale of the failed investment bank Lehman Brothers to the British bank Barclays, the reaction of Alistair Darling, the British chancellor of the exchequer, was telling. He did not want, he told his American counterparts, to “import” the United States’ “cancer”—this despite the fact that the United Kingdom’s own banks were already tumbling around him.


The French and the Germans were no less emphatic. In September 2008, as the crisis was going global, the German finance minister, Peer Steinbrück, declared that it was “an American problem” that would cause the United States to “lose its status as the superpower of the world financial system.” French President Nicolas Sarkozy announced that U.S.-style “laissez faire” was “finished.” To Europeans, the idea of an American crisis made sense. The United States had allowed itself to be sucked into misguided wars of choice while refusing to pay for them. It was living far beyond its means, and the crisis was its comeuppance. But confident predictions that this was a U.S. problem were quickly overtaken by events. Not only were Europe’s banks deeply involved in the U.S. subprime crisis, but their business models left them desperately dependent on dollar funding. The result was to send the continent into an economic and political crisis from which it is only now recovering. 

Even today, Americans and Europeans have very different memories of the financial crisis. For many American commentators, it stands as a moment in a protracted arc of national decline and the prehistory of the radicalization of the Republican Party. In September 2008, the Republican-led House of Representatives voted against the Bush administration’s bailout plan to save the national economy from imminent implosion (although it passed a similar bill in early October); a few months later, after a lost election and at a time when 800,000 Americans were being thrown out of work every month, House Republicans voted nearly unanimously against President Barack Obama’s stimulus bill. The crisis ushered in a new era of absolute partisan antagonism that would rock American democracy to its foundations. 

Europeans, meanwhile, remain content to let the United States shoulder the blame. France and Germany have no equivalent of The Big Short—the best-selling book (and later movie) that dramatized the events of 2008 as an all-American conflict between the forces of herd instinct and rugged individualism, embodied by the heterodox speculators who saw the crisis coming. Germans cannot ignore that Deutsche Bank was a major player in those events, but they can easily explain this away by claiming that the bank abandoned its German soul. And just as the Europeans have chosen to forget their own mistakes, so, too, have they forgotten what the crisis revealed about Europe’s dependence on the United States—an inconvenient truth for European elites at a time when Brussels and Washington are drifting apart.

Lower Manhattan during a power outage, October 2012 (Eduardo Munoz / Reuters).


Europe’s persistent illusions were on full display in an August 9, 2017, press release from the European Commission. In it, the commission announced that the “crisis did not start in Europe” and that the underlying problem had been “exposure to sub-prime mortgage markets in the United States,” which triggered the deep European recession that followed. Brussels went on to take credit for mitigating that recession through the “strong political decisions” of EU institutions and member states.

The timing of the press release was significant. It came on the tenth anniversary of what most experts consider to be the true start of the global financial crisis—the moment on August 9, 2007, when the French bank BNP Paribas announced that it was freezing three of its investment funds due to volatility in asset-backed securities markets in the United States. This was the first indication that the downturn in housing prices, which had begun in early 2007, would have global ramifications. That same day, the European Central Bank (ECB) was sufficiently alarmed to inject $131 billion in liquidity into Europe’s banking system. 

The commission’s analysis of what happened in 2007 was telling. Set aside, for a moment, the fact that problems at a French bank were the occasion of the anniversary, that there were massive homegrown real estate busts in Ireland and Spain, and that Greece and Italy had accumulated dangerous debt stocks of their own. What, exactly, did the implosion of U.S. subprime mortgage markets expose? 

The United States’ mortgage system was obviously broken. Some of the lending was criminal. And the design of mortgage-backed securities, many of which bundled together bad mortgages yet earned the highest bond ratings, was flawed. But none of these problems explains why the downturn precipitated a global banking crisis. After all, investors lost more money when the dot-com bubble burst in 2000 and 2001, but that did not bring the global financial system to the brink of disaster. 

What turned 2008 into the worst banking crisis in history was a new business model for banks. Traditionally, most banks had funded their operations through what is known as “retail” banking, in which consumers lend money to banks in the form of deposits, which banks use to make loans. Beginning in the 1980s, however, banks across the world increasingly moved toward “wholesale” banking, funding their operations through large, short-term loans from other financial institutions, such as other banks and money market funds. The motive for this shift was profit and competitive survival. Wholesale funding gave banks the ability to borrow much larger sums of money than they could in the retail market, allowing them to become more leveraged—and thus more exposed to risk—than ever before. 
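The arithmetic behind that fragility is simple enough to sketch. The figures below are purely illustrative, not drawn from any actual bank's balance sheet:

```python
# Minimal sketch (hypothetical figures): why higher leverage made
# wholesale-funded banks more fragile. Gains on assets are magnified
# as returns on equity, but so are losses.
def return_on_equity(assets, equity, asset_return):
    """Profit (or loss) on assets, expressed as a return on the bank's equity."""
    return asset_return * assets / equity

# A deposit-funded bank: $100 of assets on $10 of equity (10x leverage).
# A wholesale-funded bank: $100 of assets on $2.5 of equity (40x leverage).
for equity in (10.0, 2.5):
    gain = return_on_equity(100.0, equity, 0.01)   # a 1% gain on asset values
    loss = return_on_equity(100.0, equity, -0.03)  # a 3% fall in asset values
    print(f"{100.0 / equity:.0f}x leverage: "
          f"+1% on assets -> {gain:+.0%} on equity, "
          f"-3% on assets -> {loss:+.0%} on equity")
```

At 40-to-1 leverage, a 3 percent fall in asset values more than wipes out the bank's equity, which is why the wholesale-funded banks were so exposed when markets turned.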

But the real threat to the global economy was not just that banks in the United States, Europe, and, to some extent, Russia and Asia were becoming overleveraged; it was also that much of these banks’ short-term funding involved currency mismatches. In order to do business in the United States, non-U.S. banks needed dollars, which they obtained from wholesale markets through a variety of methods: borrowing unsecured cash from U.S. sources, issuing commercial paper (essentially short-term IOUs), and, crucially, using currency-swap markets to receive short-term dollar loans in exchange for their own local currencies, with a promise to “swap” the currencies back at the end of the loan term. In short, foreign banks were racking up sizable liabilities that had to be paid in dollars. If the money markets where they obtained these dollars ceased to function, many of the world’s banks would immediately be at risk of failure. 

And in fact, that is precisely what happened. The first big bank to fail spectacularly was the British lender Northern Rock, in August and September 2007. It had no exposure to American subprime mortgages, but its funding model relied overwhelmingly on wholesale borrowing from around the world. What cut off Northern Rock’s access to funding was BNP Paribas’ August 9 announcement. This sent a signal to wholesale lenders that more banks were holding bad assets than anyone had previously understood. With the extent of the contagion unknown, wholesale lending ground to a halt. Five days later, Northern Rock informed British regulators that it would need assistance. 

The shutdown in bank funding quickly rippled across the global financial system, even reaching Russia and South Korea, countries remote from the subprime debacle but whose banks relied on the same wholesale markets now under stress. The world was witnessing a trillion-dollar, transnational bank run.


People tend to think of globalization as involving the rise of emerging markets such as China and India, and in manufacturing and commodities, these countries have indeed been the engines of growth. But in the early twenty-first century, financial globalization still revolved around the transatlantic axis, and it was between the United States and Europe that the real disaster threatened. The Bank for International Settlements estimated that all told, by the end of 2007, European banks would have needed to raise somewhere between $1 trillion and $1.2 trillion in order to cover the gaps on their balance sheets between dollar assets and dollar funding. In the good times, these banks had easily obtained funding through currency swaps and wholesale markets. Now, with interbank markets drying up, they were desperate for dollars. 

By the fall of 2007, officials in the United States had begun to fear that European banks, in a frantic bid to earn dollars to pay their bills, would liquidate their dollar portfolios in a giant fire sale. And because these banks owned 29 percent of all nonconforming, high-risk mortgage-backed securities in the United States, this was not just a European problem. The nightmare scenario for the Americans was that European banks would dump their dollar holdings, driving the prices of mortgage-backed securities to rock bottom and forcing U.S. banks, which held even larger quantities of those securities, to recognize huge losses. The resulting bank run would have overwhelmed the furious efforts of the U.S. authorities to restore stability. It was this risk of simultaneous implosion on both sides of the Atlantic that made 2008 the most dangerous crisis ever witnessed. 


With disaster threatening, the question became how to respond. In 2008, governments across the West rushed to bail out their ailing financial institutions. In the United States, Washington had already come to the aid of the investment bank Bear Stearns in March; in September, it rescued Fannie Mae, Freddie Mac, and the insurance giant AIG. The United Kingdom effectively nationalized HBOS, Lloyds, and the Royal Bank of Scotland. Belgium, France, Germany, Ireland, and Switzerland all took emergency measures to rescue their own banking sectors. 

As the trouble spread, crisis diplomacy kicked in. The inaugural G-20 leadership summit convened in November 2008, bringing together heads of state from developing countries such as Brazil, China, and India, in addition to those from the developed world. The birth of the G-20 reflected a multipolar world economy in which emerging markets had new weight. But it also made recourse to institutions such as the International Monetary Fund, which many developing countries viewed with hostility, all the more sensitive. No one in Washington wanted a repeat of the controversies of the Asian financial crisis in the late 1990s, when the IMF’s draconian loans came to be seen by their recipients as violations of national sovereignty. 

Behind the scenes, U.S. officials were putting an alternative rescue mechanism in place. The central problem was that the world’s banks needed dollar funding. And the only institution that could fill that need was the Federal Reserve. 

Officials at the Fed had already started to worry about European banks’ funding gaps toward the end of 2007. By December of that year, Bernanke and Timothy Geithner, then the president of the New York Federal Reserve Bank, had begun offering special liquidity programs to Wall Street, giving U.S. financial institutions access to cheap cash in the hopes of stabilizing their balance sheets and avoiding a ruinous selloff of mortgage-backed securities. Immediately, European banks started dipping into these funds. The Europeans took more than half of the $3.3 trillion offered through the Fed’s Term Auction Facility, which auctioned off low-interest short-term loans, and 72 percent of the deals provided through the Single-Tranche Open Market Operation, a little-publicized Fed liquidity program that ran from March to December of 2008. (Credit Suisse alone took one-quarter of that program’s funds.) 

For the Fed to be acting as lender of last resort to foreign banks was no doubt unusual, but these were desperate times, and it needed to avoid a European fire sale of U.S. assets at all costs. As the crisis intensified, however, the Fed’s leaders found that simply providing the European banks with access to the Wall Street liquidity programs would not be enough. Their funding needs were too great, and they lacked sufficient high-quality collateral in New York. So Geithner and the New York Federal Reserve resorted to an indirect mechanism for providing them with dollars, repurposing a long-forgotten instrument known as a “liquidity swap line.”

First responders: Henry Paulson and Ben Bernanke testifying in Washington, July 2008.


Liquidity swap lines are contracts between two central banks, in this case, the Fed and a foreign central bank, to temporarily exchange currencies: the Fed provides its counterpart with a fixed amount of dollars and in return receives an equivalent amount of that bank’s local currency. (The foreign central bank also pays a margin of interest to the Fed.) Liquidity swap lines had been used extensively in the 1960s to deal with tensions in the Bretton Woods system—which, by pegging currencies to a dollar that was itself convertible into gold, led to frequent currency imbalances—but had since been confined to emergencies, as when they were used to help the Bank of Mexico during the peso crisis of 1994–95. The revival of liquidity swap lines in 2007–8 ensured that there would be no dangerous spikes in the funding costs of key Asian, European, and Latin American banks. If interbank funding got too tight, the global financial system would receive dollars directly from the Fed. 
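In stripped-down form, a single swap draw works roughly as follows. The exchange rate, margin, amounts, and day-count convention below are invented for illustration, not actual Fed terms:

```python
# Toy model of one central-bank liquidity swap draw (all figures hypothetical).
def swap_draw(usd_amount, spot_fx, margin_annual, term_days):
    """At initiation, the Fed sends dollars and receives local currency at the
    spot rate; at maturity, the same principal amounts are swapped back, and
    the foreign central bank pays the Fed interest on the dollar leg."""
    local_leg = usd_amount * spot_fx
    interest = usd_amount * margin_annual * term_days / 360  # money-market day count
    return {
        "fed_pays_usd": usd_amount,
        "fed_receives_local": local_leg,
        "usd_repaid_at_maturity": usd_amount + interest,
        "local_returned_at_maturity": local_leg,
    }

# e.g. a hypothetical $10 billion, 28-day draw at a 1 percent annual margin
draw = swap_draw(10e9, spot_fx=0.75, margin_annual=0.01, term_days=28)
interest_due = draw["usd_repaid_at_maturity"] - draw["fed_pays_usd"]
print(f"interest due to the Fed: ${interest_due:,.0f}")
```

Because both legs settle at the initial spot rate, the foreign central bank, not the Fed, bears the exchange-rate risk, one reason the arrangement was palatable in Washington.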

The major beneficiaries of the swap lines were the central banks of Japan, Europe, and the major emerging-market countries, which could now take dollars from the Fed to pass on to their own struggling banks. The Fed introduced the liquidity swap lines in December 2007, and their permissible limit was rapidly increased to $620 billion. On October 13, 2008, they were uncapped, giving the major foreign central banks unlimited dollar drawing rights. By December 2008, the swap lines were the single largest outstanding item on the Fed’s balance sheet. The swap lines operated over various terms, ranging from overnight to three months. But if, for accounting purposes, each loan is standardized to a 28-day term, then between December 2007 and August 2010, the Fed provided its Asian, European, and Latin American counterparts with just shy of $4.5 trillion in liquidity, of which the ECB alone took $2.5 trillion. That the European banks’ giant funding gap did not escalate into a full-blown transatlantic financial crisis is thanks in large part to these swap lines. 
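The accounting convention can be made concrete. A minimal sketch, assuming the standardization works by counting a draw of D dollars outstanding for T days as D × T / 28 dollars of 28-day-term lending (the draws below are invented, not actual Fed data):

```python
# Hypothetical sketch of standardizing swap draws of mixed terms
# to 28-day equivalents. Draw amounts and terms are illustrative.
def to_28_day_equivalent(draws):
    """draws: list of (dollars, term_days) tuples; returns the total
    expressed in 28-day-term-equivalent dollars."""
    return sum(dollars * days / 28 for dollars, days in draws)

# An overnight draw barely registers; a three-month draw counts three times over.
example = [(50e9, 1), (100e9, 28), (200e9, 84)]
total = to_28_day_equivalent(example)
print(f"${total / 1e9:.1f} billion in 28-day-equivalent terms")
```

This is why the headline figure depends on the convention chosen: the same stream of overnight and three-month loans yields very different totals if counted at face value rather than term-weighted.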

Although the swap lines could be dismissed as technical in-house arrangements between central banks, they represented a fundamental transformation of the global financial system. The world’s central banks effectively became offshore divisions of the Fed, conduits for whatever dollar liquidity the financial system required. The Fed, that is, made itself into a global lender of last resort. Whereas before 2008 many had expected an imminent dollar selloff, the crisis ended up confirming the centrality of the Fed to the global financial system. And by successfully managing the crisis, the Fed reinforced the dollar’s attractiveness as the world’s commercial currency. 

But in establishing the swap-line system, the Fed also confirmed a hierarchy of central banks. The system included the obvious European central banks, such as the ECB, the Bank of England, and the Swiss National Bank, and those of Canada and Japan. But it also included the central banks of major emerging-market financial centers, such as Brazil, Mexico, Singapore, and South Korea. They were in; the likes of China, India, and Russia were not. Veterans of the swap-line program at the Fed, who spoke to me on the condition of anonymity, admitted that they knew that by rolling it out they were straying into geopolitical terrain. They carefully compiled a list of the 14 central banks that were to participate in the program, all of which had to be approved by the U.S. Treasury Department and the State Department. The Fed’s minutes from the meeting of the Federal Open Market Committee on October 29, 2008, record that at least two applicants were rejected, but their names were redacted. Not everyone was sufficiently important—or sufficiently politically and economically aligned with the United States—to qualify. 

The swap-line system wasn’t secret, but it wasn’t trumpeted, either. This was no Marshall Plan moment, and U.S. officials had no desire to publicize the fact that they were coming to the world’s rescue. The inability of Europe’s megabanks to come up with the trillions of dollars they owed posed such a risk to the U.S. economy that doing nothing was simply not an option. So discreetly, the Fed offered the Europeans a helping hand.


The liquidity swap lines wound down rapidly in 2009, as private credit markets began to recover. The full details of the liquidity programs were not disclosed until 2011, when the U.S. Supreme Court cleared the way for the Fed to release the data to reporters from Bloomberg. There was good reason for secrecy: central banks do not wish to stigmatize borrowers that avail themselves of support when they need it, and announcing that the world’s most important central banks were desperate for dollar funding could have frightened international markets. The result, however, is that the Fed’s actions to save the global financial system have largely been forgotten. An unprecedented intervention effectively disappeared down a memory hole.


Today, the swap lines are an obscure part of the narrative in the United States; in Europe, they have been forgotten altogether. The European Commission is free to peddle its story that it was prompt action by the European authorities that saved Europe from a crisis made in the United States. European banks such as Barclays and Deutsche Bank can proudly proclaim that, unlike their American counterparts, they came through the crisis without state assistance, despite the fact that they took hundreds of billions of dollars in liquidity from the Fed. Although such depictions are profoundly misleading, they speak volumes about the state of transatlantic relations in the early twenty-first century. The United States and Europe remain massively interdependent, but they lack a common story to glue the relationship together.

The year 2008 can thus be seen as a moment of transition. On the one hand, it marked a twenty-first-century global crisis. On the other hand, the management of that crisis relied on networks of interdependence shaped by the twentieth-century history of the transatlantic relationship—networks that were deep but that leaders on both sides of the divide now seem eager to leave behind. 

What are the implications for the future? Many predicted that in the aftermath of the crisis, the dollar would lose its status as the world’s leading currency, but the opposite has happened. According to figures compiled by the economists Ethan Ilzetzki, Carmen Reinhart, and Kenneth Rogoff, today the dollar is the anchor currency—the standard against which other currencies are pegged—for countries representing around 70 percent of global GDP, up from closer to 60 percent at the turn of the millennium. It was European, not American, finance that retreated. The events of 2008 left the European banks in a weakened position, and since then, they have repeatedly looked to Washington for support. When the eurozone crisis was at its most acute, in 2010, the Fed reopened its swap lines, and in November 2013, they were made permanent.

At the same time as the Fed tided the European banks over during the crisis, U.S. regulators began to take an increasingly dim view of their stability. During negotiations in the Basel Committee on Banking Supervision throughout 2010, U.S. and European officials clashed over tightening banking rules and capital requirements. And after Obama signed the Dodd-Frank financial regulations into law in July of that year, U.S. regulators began using the law’s provisions to force European banks in the United States to either comply with the tougher standards or exit the U.S. market. 
