During the 1980s and 1990s, the stock market followed a rising trajectory, with few significant interruptions, all of them brief. The last dozen years have presented quite a contrast to those exuberant times: financial markets have undergone staggering drops, then risen and fallen again and again, and there is still no end in sight to the painful oscillation. These recent developments, however, cannot be understood in isolation. They depend to a large extent upon the events that preceded them, and they are part of a great web of causes and effects shaped by human nature, by the laws of supply and demand, and by chains of actions and consequences that began many years ago.
The Stock Market Bubble
Imitation is one of the fundamental means of learning for human beings. This can be seen in adults as well as in children. The most prominent area in which this occurs among adults is the world of fashion. An influential person comes up with a slightly different style; that style is copied by others, and yet others, until a noticeable trend emerges. If the product lacks enduring practical value, of the sort t-shirts and jeans have proven to have, then that style will fade and be replaced by another one.
This pattern of imitation, peak, and then collapse can also be seen in the financial sphere, although generally with less frequency than in the world of fashion. By the mid 1990s, stocks in general had been doing well for many years. The spread of the internet and the increased adoption of personal computers brought technology stocks in particular to people’s attention. As more and more people heard about others who had made money on the stock market, they wanted to do the same thing. Money that had been generated as the result of the actual production of goods and services was invested in the increasingly high-priced stock market at an accelerating rate. This bubble seemed good for the economy at the time, and even helped to temporarily balance the budget of the federal government.
Unfortunately, those prices were not sustained by a commensurate increase in the production of goods and services, and the high values of the stock market could not be maintained. In March of 2000 the bubble burst, and for the next two years the value of the stock market declined. Hundreds of billions of dollars were lost, and the previous trend of upward economic growth was dramatically changed. This change was exacerbated by the events of 9/11, and by the subsequent wars in Afghanistan and Iraq.
If the government had done nothing after the bursting of the stock market bubble, the economy would have eventually recovered as people generated more income and invested the surplus in some facet of the economy. Unfortunately, this sort of recovery can take years and is not very popular with people who want a quick fix. Politicians like to take credit when the economy is going well, and they get blamed when the economy fares poorly, so solutions that take longer than an election cycle are unpalatable to them. To preserve their jobs, politicians felt they had to do something to speed up the recovery, and both the federal government and the Federal Reserve acted to make that happen.
One of the ways in which the government can stimulate the economy is with a long-term reduction in tax rates, provided that the tax cuts do not impair fulfillment of the government’s important roles of facilitating commerce, mediating disputes, and maintaining public safety. Long-term tax cuts allow more money to remain in the hands of the private sector. This generally results in an increase in the consumption of goods and services, and provides more money that entrepreneurs can use to initiate the creation of goods and services. Money that is spent by the government also creates activity, but the government has no way of knowing which people will supply the innovations that fuel prosperity, and political and bureaucratic cultures tend to stifle innovation and to spend wastefully.
During the first term of George W. Bush’s administration long-term tax cuts were implemented. These tax cuts kept more money in the private sector, and they also helped to create an attitude of confidence among large sectors of the business community and of the general public. This boost in confidence may have actually been an even greater boon to the economy than the tax cuts themselves. Unfortunately, those tax cuts were not offset by cuts in government expenditures, so the bill for those tax cuts will have to be paid by taxpayers in the future.
The Federal Reserve, which is ostensibly apolitical but which is inevitably influenced by political factors, felt that it too had to do something. That something was to lower interest rates with the expectation that doing so would stimulate economic growth. Consequently more money was available to the banking system, and mortgage rates dropped while taxpayers were taking home a greater percentage of their income.
The result of these actions by the federal government and the Federal Reserve was that, instead of the economy recovering over time from the effects of the 1990s stock bubble, a new bubble was funded.
The Housing Bubble
During the first century after the United States became independent from England, anyone who wanted land could simply go west to claim it. That was easier said than done, but land was plentiful. As the land became increasingly settled, and as the population of the United States continued to grow rapidly, the price of property increased. This was particularly true in urban areas. The industrial revolution, combined with large-scale immigration from Europe, increased the percentage of people living in urban areas. By 1920, approximately 50 percent of the population lived in urban areas1. When the Great Depression hit, housing for the poor became a very sensitive issue.
Another change that occurred was in the attitude of the people regarding the role of government. When the United States was founded, most of the people who established the framework of this country believed that the role of the government was to provide for the common defense and to create a regulatory framework in which people could engage fairly in trade. However, throughout our country’s history there has been tension between people who believed that the government should take a more active role in managing prosperity, and those who believed that the government should interfere as little as possible in economic affairs. As time passed, the ratio of those who believed in a “hands off” government to those who believed in active management slowly declined. The economic crises of the 1890s and the 1900s led to a notable increase in the popularity of the idea that the government should be more of a manager, and the presidencies of Theodore Roosevelt and Woodrow Wilson saw significant expansions of government power and activity. The Great Depression finally tilted the scales of public opinion in favor of an active government.
In order to make housing more affordable so that it would be available to more people, the government created Fannie Mae in 1938. Fannie Mae’s role was to help provide money via the government to finance housing. Fannie Mae was eventually expanded to include private investors, and in 1968 it became wholly investor owned. In 1970 the government created Freddie Mac to compete with Fannie Mae, and it was also investor owned. To further facilitate home ownership for those with less than stellar credit, and for those who didn’t have the requisite cash for a down payment, the federal government created the Housing and Community Development Act of 1992. Among other things, this required Government Sponsored Enterprises (e.g. Fannie Mae and Freddie Mac) to purchase quotas of mortgages made to lower-income individuals2.
In order to increase the percentage of homeowners by applying pressure on the banks, Congress passed the Community Reinvestment Act in 1977. This act was intended to eliminate the practice that some banks had of redlining, which was the denial of credit to people and businesses from certain geographic subdivisions within metropolitan areas. The banks claimed that those subdivisions were credit risks. These districts were usually populated by minorities, and racism was a concern regarding the practice of redlining. In 1994 the government passed a law allowing interstate branch banking, but declared that a bank’s compliance with the Community Reinvestment Act would be a factor in whether the bank would be permitted to retain branch banks outside of its home state3. This enlarged the effect of the Community Reinvestment Act and incentivized banks to increase sub-prime lending.
The combination of the factors listed above, along with the conviction among most people that real estate was an investment that would continually increase in value, and the desire to imitate those people that had had financial success selling houses in the rising market, turned a trend into a bubble.
The Credit Crisis
Bundles of mortgages were (and still are) bought and sold by large institutions, and those institutions depended on ratings agencies, primarily Moody’s and Standard and Poor’s, for assessing the risks of those assets. Unfortunately, the ratings agencies were being paid by the firms who were selling the assets, creating a major conflict of interest. This resulted in less than accurate assessments of the risks involved in purchasing those securities.
Many people making loan arrangements didn’t work for the companies which would ultimately purchase the contracts, since the mortgage contracts would be bundled together and sold off to an organization such as Fannie Mae or Freddie Mac. These mortgage brokers were rewarded for facilitating lots of loans, and since they weren’t the ones who would ultimately take the risks, they had less incentive to be careful regarding the creditworthiness of the borrowers. This created another weakness in the system.
Many institutional investors who wanted to reduce the risk of buying these securitized mortgages purchased credit default swaps, which were intended to act as insurance policies in the event that the purchased mortgages went into default. Most organizations that sold credit default swaps hedged their risks by purchasing offsetting positions elsewhere. One major insurance company, AIG, was happy to take the insurance premiums for securities that the ratings agencies declared were safe investments, and didn’t see the need to offset the majority of the positions it was selling. This lack of caution, as well as other poor management practices at AIG, in combination with the company’s size, created a very unstable node in the international financial network.
To someone on the outside, banks and other financial institutions seem to be completely separate, but they are actually dependent on each other. The investments that banks make are often long-term investments, and sometimes the need for cash for daily operations arises. Banks make short-term loans to each other to cover such shortfalls, and this system of interbank lending creates a web of dependency throughout the international financial community. Events that affect one large bank can ripple throughout the investment world. Because of the dependence of businesses and individuals on banks and other financial institutions, these ripple effects aren’t limited to just the financial community; they affect nearly everyone.
The situation in 2008 was made even more unstable because speculators were allowed to purchase credit default swaps for securities that they didn’t own, allowing those speculators to bet on the failure of certain securities. When a financial institution took credit default swap premiums from owners of the securities, it was acting in an insurance capacity. When it began accepting swap premiums from speculators who neither owned the securities being swapped nor were hedging their current swap positions, it went from acting as an insurer to acting as a bookie. When credit default swaps are purchased only by holders of the underlying securities, there is a limit to the amount of risk possible, that limit being the total value of the assets being securitized. When credit default swaps may be purchased by those who don’t own the underlying securities, the total systemic risk can become many times greater than the sum of the underlying assets.
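The arithmetic behind that last point can be sketched in a few lines. All of the figures below are hypothetical, chosen only to illustrate how naked swaps let total exposure exceed the face value of the bonds being insured:

```python
# Illustrative sketch (hypothetical numbers): when only holders of a
# security may buy credit default swaps, total protection sold is capped
# by the face value of the underlying bonds. Once speculators may buy
# swaps on bonds they don't own, total exposure can be a multiple of it.

UNDERLYING_FACE_VALUE = 100_000_000  # $100M of mortgage-backed bonds

# Each entry is the notional amount of one credit default swap contract.
holder_swaps = [40_000_000, 60_000_000]                      # hedging actual bonds
speculator_swaps = [100_000_000, 150_000_000, 250_000_000]   # naked bets

hedged_exposure = sum(holder_swaps)
total_exposure = hedged_exposure + sum(speculator_swaps)

print(f"Exposure from hedgers alone:  ${hedged_exposure:,}")
print(f"Exposure with speculators:    ${total_exposure:,}")
print(f"Multiple of underlying value: {total_exposure / UNDERLYING_FACE_VALUE:.1f}x")
```

With only hedgers, the seller of protection can never owe more than the $100M of bonds that exist; with the speculative contracts added, a default obligates it to pay out six times that amount.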
Another factor that worsened the financial crisis was a change in accounting rules that took effect in late 2007, and which affected record keeping for many companies. The Financial Accounting Standards Board (FASB), which sets the accounting standards to which companies registered with the SEC must adhere, enacted a new rule called FAS 157.4,5,6 This rule, and the SEC and FASB interpretations of this rule, required companies to mark most securities to market using the latest prices if prices were available, provided those sales were not distressed sales. These valuations were mandated regardless of whether the underlying cash flows had changed, and regardless of whether the holders of the securities intended to hold them long-term. Companies with liquidity problems were forced to sell their securities at a discount to what they would have asked had they not been compelled by circumstances to sell, and the market price for those securities fell. Consequently, other institutions holding similar securities were forced to mark down the value of their own holdings, amplifying the losses that the drop in housing values had itself caused.
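The feedback loop described above can be made concrete with a small sketch. The prices, quantities, and institution names here are hypothetical, used only to show how one distressed sale forces paper losses on every other holder of similar paper:

```python
# Illustrative sketch (hypothetical numbers) of the mark-to-market
# feedback loop: one distressed sale sets a new "last price," and every
# other holder of a similar security must write its book value down to
# that price, regardless of the security's cash flows.

book_value_per_bond = 100.0    # original carrying value per bond
distressed_sale_price = 60.0   # price a liquidity-strapped seller accepts

# Other institutions (names are made up) each hold 1,000 similar bonds.
holders = {"Bank A": 1000, "Bank B": 1000, "Fund C": 1000}

writedowns = {
    name: qty * (book_value_per_bond - distressed_sale_price)
    for name, qty in holders.items()
}

total_writedown = sum(writedowns.values())
print(f"Paper losses forced by one distressed sale: ${total_writedown:,.0f}")
```

None of the three institutions sold anything, and none of their bonds stopped paying, yet each must book a loss because a single forced seller reset the market price.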
Adjustable rate mortgages were another key weakness, possibly the most important.7 Institutions making loans apparently thought it was a good idea to sell loans that would pay them more money if interest rates rose. Unfortunately, many of the people who were getting the loans didn’t have the financial resources to meet the higher payments; nor did those people realize the risks inherent in the mortgages they’d just signed. They saved money in the short term, but paid for it when rates went up. If those sub-prime borrowers had had the ability to pull money out of thin air, such as the Federal Reserve is able to do, then rising interest rates wouldn’t have been a problem. In 2005, over one third of all mortgages were adjustable rate mortgages.8
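The size of the payment shock can be seen with the standard fixed-payment amortization formula. The loan amount and the two interest rates below are hypothetical examples, not historical figures:

```python
# A sketch of why adjustable-rate mortgage payments jump when rates
# rise, using the standard amortization formula
#     payment = P * r / (1 - (1 + r)**-n)
# where r is the monthly rate and n the number of monthly payments.
# The principal and rates are illustrative, not historical data.

def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Monthly payment on a fully amortizing loan."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

principal = 200_000.0
teaser = monthly_payment(principal, 0.04, 30)  # introductory 4% rate
reset = monthly_payment(principal, 0.07, 30)   # after a reset to 7%

print(f"Payment at 4%: ${teaser:,.2f}")
print(f"Payment at 7%: ${reset:,.2f}")
print(f"Increase:      {100 * (reset / teaser - 1):.0f}%")
```

On these assumed numbers a three-point rise in rates pushes the monthly payment up by well over a third, which is exactly the kind of jump that borrowers with no financial cushion could not absorb.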
The Federal Reserve began lowering interest rates two months before the stock market crash in 2000.9 They continued to lower those rates until mid-2003. They left them alone for a year, then began raising them in 2004, and they continued to raise them until the middle of 2006. The effects of these raises were not felt immediately because there was a delay between the rise in interest rates and the increase of the mortgage payment. As interest rates rose, eventually payments due on adjustable rate mortgages also rose, as did the foreclosure rate as a substantial number of people were no longer able to make their mortgage payments. As the rate of foreclosures climbed, the supply of houses on the market increased, causing prices to drop further. The amount owed on many homes was greater than the declining values of the homes themselves, and some people abandoned their homes rather than keep paying for an investment that had lost money. The Federal Reserve began lowering interest rates in August of 2007, but by then the damage had been done. The value of mortgage backed securities fell, bank balance sheets deteriorated, and the entire financial system was strained. The economy began to slow down, causing job losses and more foreclosures. Many financial institutions were acutely affected, and Lehman Brothers was the first major domino to fall.
The shockwave of Lehman’s collapse rocked the financial world. AIG was on the verge of implosion as well, threatening the entire financial system, so the federal government stepped in. It was too late to avoid a disaster, but the disaster could have been far worse had nothing been done. The federal government, in reacting to that crisis (a crisis which it helped to create with its housing policies, and which the Federal Reserve helped to finance), has incurred vast amounts of new debt. The interest paid on the federal debt in 2010 was over $400 billion11, and over a third of that money went to foreign holders of U.S. debt.12 Whether that debt helps to create a new and even greater problem remains to be seen.
The Sovereign Debt Crisis
In the twentieth century, it was amply demonstrated all over the world that central management of the economy produced a lower standard of living than free markets. Free markets were not markets without rules, but markets whose direction the government did not try to steer. We live in a country that has learned that government control of prices, such as existed over gasoline and airfares prior to the Reagan administration, is not in the consumer’s best interests. Perhaps our government’s housing policies and the central control over interest rates need to be rethought as well.
Currently exacerbating the already stressed markets is the problem of sovereign debt, particularly in Europe. A substantial portion of the public in both Europe and America has learned that it can vote itself money from the public treasury, but that segment hasn’t yet learned the lesson that money doesn’t appear out of thin air (except for the Federal Reserve), and it can’t be forcibly taken from others without consequences. Money that is added to the financial system without a commensurate generation of goods and services will decline in value. Money taken from the productive segment of the population reduces their incentive to increase productivity, and often to even maintain their current levels of production. Money given to people who haven’t earned it decreases the incentive of those people to be productive in the first place.
Some European countries, and Greece in particular, have voted themselves more resources than they are able to produce, and that process is sustainable only as long as people are willing to loan them money to make up the shortfall, and as long as the government is able to make the payments on the debt. When the government can no longer make payments on the debt, then both the borrowing nation and the enablers of that nation’s irresponsibility will have to pay a price, and that price has only begun to be paid.
Exacerbating nearly all of the financial crises of the past two centuries is the fractional reserve banking system. Fractional reserve banking is the practice of accepting deposits and telling customers that they can withdraw their money at any time, even though most of that money will actually be loaned out at any given time and will therefore be unavailable. This works fine most of the time, but it creates havoc when those depositors show up en masse for their money and the money isn’t there, as has often occurred during financial crises.
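The mechanism can be sketched numerically. With a reserve requirement of 10 percent, each deposit is mostly lent out, re-deposited, and lent again, so a single deposit ends up supporting roughly ten times its value in deposits across the system. The figures below are illustrative, not a model of any real banking system:

```python
# A sketch of fractional-reserve multiplication: with a 10% reserve
# requirement, the lendable 90% of each deposit is re-deposited and
# re-lent, forming a geometric series that converges to
#     initial_deposit / reserve_ratio.

def total_deposits(initial_deposit: float, reserve_ratio: float,
                   rounds: int = 1000) -> float:
    """Sum of deposits created as each loan is re-deposited and re-lent."""
    total, deposit = 0.0, initial_deposit
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)  # the lendable portion is re-deposited
    return total

created = total_deposits(1000.0, 0.10)
print(f"Deposits supported by $1,000 at a 10% reserve: ${created:,.2f}")
```

The flip side of that multiplication is the fragility the paragraph above describes: at any moment only the reserve fraction is actually on hand, so depositors arriving en masse cannot all be paid.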
The problems we face will not bring about the fall of civilization, or anything close to it, but the economic turmoil isn’t over yet.
2. Housing and Community Development Act of 1992, Title XIII, Subtitle A, Part 2, Subpart B, Section 1333.
3. Riegle-Neal Interstate Banking and Branching Efficiency Act of 1994, Sections 109 and 110.