The Birth of the Great Housing Bubble
by John Evans
About fifty-six years ago, when I was still an undergraduate student at the University of Texas, I decided to major in economics. In those days, the economics department at Texas was heavily oriented toward the study of economic history and was well-known for its strongly “institutional” bent and its benign view of leftist ideology. Its best known professor was Clarence Ayres, a man of immense learning and a great capacity to inspire awe in inquiring young minds, who seemed to feel that it was his special mission to disabuse students of any notions about the existence of a benevolent deity. Appropriately, his heroes were such scholars as Thorstein Veblen, John Dewey, and Charles Darwin. In my case, however, his message generated an intellectual resistance that has grown steadily stronger over the ensuing decades. Ayres and his colleagues did succeed, however, in arousing my intellectual curiosity and in helping to instill in me a passion for the study of economic history.
When I left Austin in September 1954 to report for duty at Fort Bliss, I made it a point during my two years of military service to spend part of my recreational time reading economic history, and my enthusiasm for the study of it has never flagged since then. Although I never formally taught a course in either economic history or institutionalism during my long career as a professor of economics and international finance, I always injected historical material into my courses. This background proved invaluable when I wrote my textbook in international finance, which was published in 1992.
I have now been fully retired from academia for over eight years and do not claim up-to-the-minute familiarity with the financial instruments and practices that have played a major role in bringing about the great financial crisis that has suddenly come to life on the world stage. On the other hand, I offer a background that I believe enables me to understand the historical origins and implications of the current crisis in considerably greater depth than many of the commentators in the mainstream media who are presented to the public as authoritative voices. So, for what it is worth, here are some of my thoughts on how we got into our current economic mess. I shall be following this article with a companion piece that completes the task of explaining how we brought on the present financial disaster and offers some speculations about where we are headed.
If we are to fully comprehend the current world financial crisis, we need to view it in the context of being an aspect of a profound spiritual crisis that has now come to a head and is certain to have a lasting impact upon our world. I am confident that we have now entered into one of the most memorable decades in all of human history. This does not mean, however, that I have joined the ranks of those who feel that the end of earthly things is at hand. I remain convinced that Christ “came” in AD 70, that He is with us today, and that He will act as a corrective force in guiding humanity toward an ultimate destiny that will yet require a considerable time for its fulfillment.
A convenient starting time for my analysis is the period that began with the election of John F. Kennedy to the presidency in 1960 and culminated in the collapse of the Bretton Woods System on August 15, 1971. Among the great events of that period which helped shape the American economy as its exists today were the enlargement of the Cold War with the Soviet Bloc, which led to American involvement in the Vietnam War; the Kennedy-Johnson tax cut legislation of 1964; the flowering of the Civil Rights Movement in the wake of the racial integration of the U. S. military forces and Brown v. Board of Education of Topeka in the 1950s; the great boom in higher education that became evident in the latter half of the 1960s; and the final collapse of the gold-based international monetary arrangements known as the Bretton Woods System.
Although I believe that a good case can be made that the United States could have prevailed in Vietnam and that its involvement in the conflict there bought valuable time that helped prevent Communism from being victorious in Thailand, Malaysia, and Indonesia, its defeat became politically inevitable in the face of the growth of media opposition to American involvement in the conflict and the parallel growth of opposition to the military draft upon which the government had relied to meet its manpower requirements. Given its determination to expand domestic spending in support of its ambitious Great Society project, the Johnson administration felt compelled to maintain the draft. This allowed it to pay the members of the military below market wages and benefits for their services. By maintaining the draft, however, the government assured the growth of militant opposition to the war and set the stage for a leftward shift in the already liberal domain of academia that has profoundly impacted American culture in the ensuing forty years. To rework the “inspiring” words of the Reverend Jeremiah Wright, the chickens that were hatched during the decade of the 1960s came home to roost a long time ago and have prospered exceedingly in the interim.
By April 4, 1968, the date of the assassination of Martin Luther King, it had become clear to me that the relatively moderate approach to the great challenge of fully integrating native-born blacks into American culture that was favored by Dr. King and his partner Ralph Abernathy was giving way to the more militant and confrontational tactics of Malcolm X and his ideological brethren; and even though I still considered myself to be a Democrat in those days, I sensed that as a nation we had started moving down a path that could ultimately lead to social and economic disaster. That America had a lot to answer for with regard to its treatment of its citizens with ancestral roots in Sub-Saharan Africa was obvious, but it was also obvious to those with clear vision that it had reached the point of no return with regard to pursuing the goal of having a fully integrated society. What remained to be determined was just how it would choose to reach that goal. With the benefit of hindsight, I am convinced that it did not choose wisely.
It was also in 1968 that I launched my career as a specialist on the economy of Mexico, which was to last for fourteen years. Prior to that year I had done a good deal of reading on Latin America’s economic development—or lack thereof—but it was not until the spring of 1968 that I finally began writing my doctoral dissertation on the evolution of the Mexican tax system. In subsequent years, I made several short research trips to Mexico City, and I spent calendar 1976 as a Fulbright Lecturer at the University of Nuevo León in Monterrey.
Among the consequences of my focus on Latin America in general and on Mexico in particular was a profound awareness of the extent to which leftist ideology and its associated religious skepticism have prevailed among the intellectual and political elites in that part of the world. I shall refrain from going into detail as to why this has been the case, but I suggest that the answer is to be found fundamentally in Latin America’s heritage from its colonial past and in the generally secular, i.e. materialist, worldview of many of its influential thinkers. A particularly relevant aspect of this secular worldview has been the readiness with which so many influential Latin Americans have eagerly participated in the popular blame game of attributing the region’s problems to the Colossus of the North. There have been exceptions, most notably Chile, where, after the passing from the scene of its Marxist president, Salvador Allende, in 1973, the adoption of free-market principles advocated by the economists known as the “Chicago Boys” has been followed by over thirty years of generally superior economic performance. Unfortunately, while the Chilean model has exercised some influence in Latin America during those thirty years, most Latin American nations have continued to favor approaches to economic development in which the state imposes barriers to private enterprise and involves itself heavily in the production of goods and services that would be produced under competitive conditions in the United States.
I think that it was in 1957 that I read a speech delivered in Rio de Janeiro by Princeton economist Jacob Viner which opened with the observation that “The world will always be grateful to the Perons—for showing how not to achieve economic development.” Unfortunately for Argentina—and for the rest of the world—the lessons that should have been learned from the failures of the Perons and many other state-directed approaches to economic development have not been taken to heart, with the result that leftist ideological movements continue to attract large followings. It seems as though whenever a particular political leader or movement on the left gains power and proceeds to fail economically, a new leader or movement emerges soon afterward that offers the latest version of a future secular paradise on Earth. We evidently need to update the maxim “There’s a sucker born every minute,” which has been wrongly attributed to P. T. Barnum, and apply it to the entire world, with due allowance for population growth. I am suggesting, for example, that as far as the secular left is concerned, a “sucker” is born every second somewhere in the world who can be sold on the idea that the solution to his or her problems is to be found by looking to the political left and big government for deliverance.
The leftist blame game tactic has a long and dishonorable history. It is prominent to at least some degree not only in Latin America, but in all parts of the globe. It was the key to the successes of the followers of Karl Marx, and it is an essential ingredient of numerous political movements, including liberation theology, that bear a close kinship to Marxism. One of its most characteristic manifestations is the singling out of the United States and its government as the cause of many of the world’s problems. In the “theology” of most of the political left, the United States—or at least its government—is a great barrier to human progress that deserves to be duly punished for whatever it is that the left regards as the equivalent of biblical sin.
That men are born sinners who need to strive inwardly and individually to overcome their sinful inclinations is an idea that the left summarily rejects. It prefers instead to promote the notion that under the tutelage of the elite leadership of a benign and caring government, a culture can be developed in which almost everyone behaves in conformance with societal norms. From the tactical perspective, it is often far easier to gain political influence by encouraging the “masses” to blame others for their failures and to regard themselves as victims who deserve generous compensation than to lecture them on their character shortcomings and encourage them to focus on what they can do individually to improve their lives. We need to recognize, of course, that there really are some bad guys in the world who exploit others, but we also need to recognize that the political left has carried its reliance on the blame game and the cultivation of victimology to logically absurd—though politically successful—lengths.
In a very recent article posted at americanthinker.com by Richard Berry, the author attributes the current market turmoil in the United States to a “Boomer Elite” that “has long exhibited . . . unbridled greed and hubris, exorbitant self-regard, breathtaking recklessness, insatiable appetite for immediate gratification, and a rollicking sense of entitlement.” I have no difficulty in agreeing that all of these behavioral traits have played a role in bringing about the current crisis, but I regard them as symptoms of a larger illness that affects all of America’s culture, namely a profound sickness of the spirit. And I would add to the list of the shortcomings of the “Boomer Elite” the consequences of the mishandling of the integration of America’s black citizens into its society and the political left’s immense talent for manipulating public opinion by playing the blame game, capitalizing on white guilt, and instilling norms of political correctness.
In looking back on the history of our nation’s black population since the King assassination, I am persuaded that the collective decision of America’s blacks to ally themselves with the Democratic Party has had unfortunate consequences for most blacks and for the American population as a whole, among which is the timing and extent of the current financial crisis. Given the broad scope of the fundamental trends that have been at work, I am confident that a major “readjustment” of the world’s financial institutions and practices would have become inevitable at some point in the not very distant future; but the eruption of a full-blown financial crisis in September 2008 is directly traceable to the adoption by the U. S. government of a particular set of financial policies designed to encourage the provision of “affordable housing,” particularly for those who can be portrayed as victims of discriminatory practices.
The Johnson administration embarked on a course of greatly expanding the role of the federal government in providing social services through both its spending and its oversight powers that encountered resistance during the Reagan administration but has generally flourished under other presidents, including both Bushes. Fortunately for the nation, the pronounced tendency of government spending to adversely affect economic growth once it expands beyond what I shall term its natural boundaries was offset in large part by other actions, most notably the tax cuts of the Kennedy-Johnson and the Reagan administrations.
In referring to what I term the natural boundaries of government spending, I have in mind what I regard as government’s legitimate role in a culture grounded in the belief that individuals should be allowed to have a high degree of personal freedom, including the freedom to own private property and operate profit-seeking businesses. Ideally in such a culture, government undertakes to do what private enterprise cannot do effectively and seeks to offset the market imperfections that economists call “externalities”; e.g. the pollution that results when toxic waste is dumped into a stream or the harm that society incurs if individuals are not required to be vaccinated at government expense against smallpox. Obviously, people can and do disagree as to precisely where to draw the line that government should not cross; but as the current financial crisis indicates, there can be little doubt that government has often engaged in activities whose costs to society have far exceeded any plausibly demonstrable benefits.
Some politicians, including the candidate who is currently favored to capture the presidency, are fond of complaining about the activities of political lobbyists and making grandiose promises about restraining their activities and influence. In reality, of course, the growth of lobbying activity is a natural byproduct of the growth of government. That activities aimed at influencing what government does will expand in scope and magnitude as government’s power to affect the wellbeing of the governed increases is so predictable that to suggest otherwise is an insult to our intelligence. The larger the role of government in the economy, the larger will be the proportion of the nation’s income devoted to spending on activities devoted to gaining political influence. Under these circumstances, denouncing lobbying can be compared in effectiveness to a dog barking at the wind. While we can subject lobbying to rules that hopefully will restrain some of its potential excesses, we delude ourselves if we think we are going to diminish its quantity.
Beyond some point, whose precise location can be debated, the expansion of government can affect a nation’s economy in a manner analogous to what can happen to a tree when a parasitical plant attaches itself to it. The more productive the private sector of the economy becomes, the greater the potential gain from using the power of government to capture some of the private sector’s “action.” Thus, while Marxists and their intellectual relatives like to vilify “capitalists”—particularly those who don’t pay sufficient amounts of protection money to the representatives of the political left—the reverse is often closer to the truth; i.e. the “workers” and their political representatives exploit the private sector for their benefit. Unfortunately for them, the exploitation process ultimately tends to become self-destructive.
Even in an era of growing government power and influence, it may sometimes be possible, for a while, to offset the impact on economic growth of the parasitical forces at work within the governmental apparatus. This is precisely what happened with the Kennedy-Johnson and Reagan tax cuts. If government continues over time to grow in size and power relative to the private sector, however, the time must inevitably come when the economy’s capacity to grow will diminish in the absence of reforms that give greater scope to the operation of market forces. In my judgment, we have reached a point in our history where the likelihood of economic stagnation is staring us in the face if we do not call a halt to the expansion of government’s influence.
In view of the established myth among members of the Democratic Party and the media elite that Franklin Roosevelt led the nation out of the Great Depression of the 1930s, I think it is worth briefly noting that the truth is that Roosevelt’s economic policies substantially prolonged the Depression. He accomplished this feat with some help from the Federal Reserve System, but most of the “credit” for it can be assigned to him and his Congressional allies, who insisted on taking actions that adversely impacted business confidence and undermined the incentives for private investment while neglecting opportunities to stimulate consumer spending.
It is only fair to note that Roosevelt was “assisted” in turning the Great Depression into the disaster that it became by Herbert Hoover and the Congressional Democrats, who joined forces to substantially increase taxes in 1932. For example, the highest marginal federal personal income tax rate was raised from 25 percent to 63 percent. The tax increase of 1932 was a remarkably stupid action, but it was not recognized as such by the successor Roosevelt administration. Indeed, under Roosevelt, taxes were increased further in the midst of the Depression. In 1936, for example, the highest marginal tax rate on personal income was raised to 79 percent. While it is true that the government used some of its increased tax revenues to help finance its highly ballyhooed make-work projects, the overall unemployment rate remained at high levels until the defense buildup associated with the outbreak of World War II. Taxes were raised again during World War II, and the high marginal income tax rates established at that time were essentially retained until 1964.
Although many of the political “progressives” of the first half of the twentieth century seemed inclined to deny that high marginal tax rates might adversely impact economic incentives to any significant degree, the political reality was quite different. The advocacy of soak-the-rich tax policy proved to be a successful vote-gathering tactic, but many of the politicians who went along with it realized that high tax rates adversely affect incentives and lead to large-scale tax avoidance maneuvers. They also quickly perceived that by offering tax relief for favored constituents, they could influence business activity and enhance their own power and influence. And thus came into flower the phenomenon of the tax “loophole” industry, whose growth is difficult to measure but which has surely been one of the most important sources of support for the livelihood of the nation’s political class during the past seventy-five years. The high nominal tax rates that remained in place for many years provided politicians with “proof” to the mass of voters of their devotion to “tax equity” and the goal of income redistribution while the expanded use of tax loopholes over time helped prevent the economy from lapsing into stagnation.
The Kennedy-Johnson tax cut of 1964 came about in response to the realization that the federal government’s tax policy had become a major impediment to economic growth. Walter Heller, the economist who was its architect, was politically sympathetic to the idea of having the federal government assume a more activist role in American life, but he also perceived that the high federal income tax rates in existence were exerting a restraining effect upon economic growth and that any loss in revenues resulting from a modest cut in income tax rates might be offset by the increase in revenues resulting from greater economic growth. The legislation of 1964 cut personal income taxes across the board, reducing the highest marginal rate on personal income from 91 percent (where it had been since 1950) to 70 percent. The tax rate applied to most of the income of corporations was lowered from 52 to 48 percent.
The tax cut of 1964 proved to be an outstanding success, but its effectiveness tended to diminish over time because of the influence of “bracket creep”; i.e. the increase in the average tax rate on income that occurs over time when tax brackets and personal exemptions are not adjusted for inflation. It is worth remembering, incidentally, that “creeping inflation” had become a firm but usually unstated aspect of the federal government’s conduct of economic policy. It did so under the influence of Keynesian theory, which advocated that when the economy was experiencing a rise of unemployment to unacceptable levels due to insufficient aggregate demand, the government was well-advised to increase total spending by borrowing rather than through tax increases.
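The mechanics of bracket creep can be seen in a purely illustrative arithmetic sketch. The brackets, rates, income, and inflation figure below are all made up for the example; they are not the historical U.S. schedule.

```python
# Illustrative sketch of "bracket creep": nominal brackets stay fixed
# while wages rise with inflation, so the average tax rate climbs
# even though real (inflation-adjusted) income is unchanged.
# Brackets and rates are hypothetical, not historical.

BRACKETS = [(0, 0.20), (20_000, 0.40), (50_000, 0.60)]  # (threshold, marginal rate)

def tax_owed(income):
    """Progressive tax computed against fixed nominal brackets."""
    tax = 0.0
    for i, (threshold, rate) in enumerate(BRACKETS):
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > threshold:
            tax += (min(income, upper) - threshold) * rate
    return tax

income = 40_000.0
inflation = 0.50  # 50% cumulative inflation; wages assumed to keep pace exactly

rate_before = tax_owed(income) / income
rate_after = tax_owed(income * (1 + inflation)) / (income * (1 + inflation))

print(f"average tax rate before inflation: {rate_before:.1%}")  # 30.0%
print(f"average tax rate after inflation:  {rate_after:.1%}")   # 36.7%
```

Real purchasing power is identical in both cases, yet the taxpayer's average rate rises simply because more nominal dollars land in the higher brackets.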
As originally developed by John Maynard Keynes, the body of economic theory named after him condoned the financing of government spending by borrowing when the economy lacked sufficient aggregate demand to bring about full employment, but it did not advocate deficit spending by government in other circumstances. By providing politicians (and ambitious economists) with a rationale for government borrowing, however, Keynes opened a door through which a great many influential politicians soon passed, a door leading to the wonderland of the ever-increasing national debt. It did not take long before national governments readily increased their net indebtedness even when their economies were prospering and rationalized the resulting inflationary pressure by adhering to the neo-Keynesian belief that maintaining full employment required the toleration of creeping, i.e. modest, inflation. The successful implementation of this policy required the deliberate deception of the public about the government’s real intentions, but since many politicians are highly skilled in the art of misleading the public, the creeping inflation approach to economic policymaking has enjoyed great success during the past five decades. To a considerable degree, however, the expectations of the public have adjusted to the reality of government policy, with the result that governments have been forced to place greater restraint upon their borrowing.
Inflation accelerated in the United States during the 1970s in response to the combination of increases in the price of petroleum engineered by OPEC and the continued reliance upon what I shall term neo-Keynesian monetary and fiscal policy. The bracket creep in tax rates resulting from the increase in inflation tended to nullify the beneficial effects of the tax cut of 1964. In 1979, under the leadership of its chairman, Paul Volcker, the Federal Reserve slapped the brakes on monetary growth, with the result that interest rates soon soared to historically high levels and a major recession set in. In due course, however, the approach favored by Volcker yielded abundant fruit by giving the nation a greater degree of price level stability than it had known for many years. The beneficial effect of the alteration in the Fed’s monetary policy was reinforced by the Reagan tax cuts of 1981 and 1986.
Under Reagan, personal and corporate income tax rates were substantially lowered and tax law was simplified. By the end of his administration, the marginal personal income tax rate applicable to the highest incomes had fallen to 28 percent, while the rate applied to most of the income of large corporations was lowered to 34 percent. In pushing for these tax cuts, Reagan wholeheartedly endorsed supply-side economics and the associated concept of the Laffer curve, which simply indicates that as marginal tax rates are increased, a point must eventually be reached where further increases in tax rates will lower total tax revenues because of their effects on taxpayers’ behavior. Since it is reasonable to believe that at a marginal tax rate of 100 percent or more, the incentive to collect additional taxable income will cease to exist—unless most high-income recipients are masochistic supporters of Joe Biden’s version of patriotism—we can be reasonably certain that the Laffer curve conforms to economic reality. In its treatment of the Laffer curve and supply-side economics, however, the political left has repeatedly demonstrated that it is quite prepared to ignore reality in exchange for political gains.
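The qualitative shape of the Laffer curve can be exhibited with a toy model. The assumption that the tax base shrinks linearly as the rate rises is a made-up functional form, chosen only to produce the curve's characteristic rise-and-fall; it is not fitted to any data, and real-world behavioral responses are far messier.

```python
# Toy illustration of the Laffer curve: revenue first rises with the
# tax rate, then falls as the shrinking tax base dominates, reaching
# zero at a 100% rate. The base's linear response to the rate is a
# hypothetical functional form, used only to show the qualitative shape.

def revenue(rate, full_base=100.0):
    """Tax revenue when reported taxable income shrinks linearly
    toward zero as the marginal rate approaches 100%."""
    base = full_base * (1.0 - rate)  # incentive/avoidance effect
    return rate * base

rates = [i / 10 for i in range(11)]  # 0%, 10%, ..., 100%
revenues = [revenue(r) for r in rates]

peak_rate = rates[revenues.index(max(revenues))]
print(f"revenue at a 100% rate: {revenue(1.0):.1f}")  # zero: no incentive left
print(f"revenue-maximizing rate in this toy model: {peak_rate:.0%}")
```

In this particular toy specification revenue peaks at a 50 percent rate, but the location of the peak depends entirely on the assumed behavioral response; the robust point is only that revenue must fall to zero at both a 0 percent and a 100 percent rate, so a revenue-maximizing rate exists somewhere in between.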
In the years since the end of the Reagan administration, marginal tax rates on personal and corporate incomes have been increased by a few percentage points and the deductions and relief provisions allowed to higher income individuals have been reduced somewhat. On the whole, however, the enthusiasm of political liberals for raising taxes has been constrained by economic reality. They have succeeded, however, in complicating the tax code and reinvigorating that part of the lobbying industry that focuses on obtaining tax breaks. Other nations have emulated the Reagan approach to taxation—even going beyond it in some instances—with great (though largely unpublicized) success; and even though liberals commonly refer to supply-side economics and the Laffer curve in contemptuous terms, one has to assume that they are not as ignorant as they appear to be on the surface.
Meanwhile, as is exhibited particularly by the career of George Soros, the financial world has spawned a class of master market manipulators who have successfully speculated in the foreign exchange and other markets by being able to predict and influence the behavior of governments. In the world of Soros and the people who think like he does, the concept of a benevolent deity who oversees the unfolding of history has no relevance. If the descent of the world into utter chaos is to be avoided, government must exert sufficient control over human affairs so that disintegration can be prevented. But since history demonstrates convincingly that Soviet-style approaches to economic management do not work, ways must be found to allow market forces to operate subject to the careful supervision of the duly enlightened.
In light of the unpleasant reality that efforts to increase taxes generate enormous resistance and that there may, after all, be a natural limit to the expansion of government revenues through tax increases, political liberals have responded by “encouraging” private financial institutions and other private firms to behave in what they regard as socially responsible ways. This approach has been particularly pronounced in the field of housing, where the emphasis on massive public housing projects financed by the federal government has given way to the coercion of the private sector of the economy to compel it to make “affordable housing” available to people whose household incomes fall below a certain level. Most of these people, of course, “coincidentally,” are likely to vote reliably for the candidates of the Democratic Party.
Unfortunately for the nation and the world, the parasitical aspect of big government has now manifested itself in a spectacular form that promises to inflict great punishment on the world before adequate steps are taken to prune it. On the other hand, notwithstanding the massive quantity of cognitive dissonance exhibited on the political left, the causal relationship between the current financial crisis and the housing policies that it helped to bring about is too clear-cut to be plausibly denied. In due course, though it may take a while, the responsibility for the debacle is likely to be placed where it belongs.
In closing this article, I feel obligated to stress that although I regard the federal government’s housing policy as the proximate cause of the current financial debacle; i.e. the proverbial last straw—or bale of feed—that broke the camel’s back, the fact that the crisis is worldwide in scope is traceable to the way the international monetary system evolved after the collapse of the Bretton Woods System in 1971. Well before that event, the U.S. dollar had become the world’s primary reserve asset; i.e. the type of money that governments and their central banks use for international settlements and for intervention purposes in the foreign exchange market. Before August 1971, however, the U.S. Government sought to maintain confidence in the dollar by committing itself to redeem dollars held by foreign governments in gold at the fixed price of $35 per ounce. Over time, however, the persistent inflation in the world undermined the fixed exchange rates of the Bretton Woods System and ultimately led to its demise and the repudiation by the U.S. government of its commitment to sell gold.
The dollar continued to serve as the world’s primary reserve currency after 1971. This meant that the demand for dollars by foreign governments and other participants in the international monetary system continued to grow. The United States supplied the international demand for dollars by developing a persistently large excess of imports of goods and services over exports of goods and services; i.e. what economists call a current account deficit. All that was required for the United States to accomplish this task was that it maintain a low personal saving rate as its economy grew. Some of the impact of increased consumption spending that accompanied the process of economic growth thus spilled over into imports, thereby allowing the nation to develop the required current account deficit. The dollars that financed this deficit were then used to increase the quantity of dollar assets held by foreign governments, banks, and investors.
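The accounting behind this mechanism can be sketched in a few lines. The figures below are entirely hypothetical; the point is only the identity that an import surplus must be financed by an equal accumulation of dollar claims abroad.

```python
# Stylized balance-of-payments arithmetic (all figures hypothetical):
# a nation that imports more than it exports runs a current account
# deficit, and the dollars paying for the gap end up as
# dollar-denominated assets in foreign hands.

exports = 1_500.0  # goods and services sold abroad, $bn (made up)
imports = 2_200.0  # goods and services bought abroad, $bn (made up)

current_account = exports - imports              # negative = deficit
foreign_dollar_accumulation = -current_account   # the financing counterpart

print(f"current account balance: {current_account:+.0f} $bn")
print(f"new dollar assets accumulated abroad: {foreign_dollar_accumulation:.0f} $bn")
```

Every dollar of the deficit reappears as a dollar claim held by some foreign government, bank, or investor, which is how a reserve-currency country can supply the world with its money by running persistent external deficits.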
In due course, the United States became the world’s largest debtor nation by a huge margin. In order to keep the foreign holders of dollar assets reasonably happy, it was sometimes necessary for the U.S. government to maintain interest rates at higher levels than it might have preferred, but it was also put under pressure to maintain a high rate of real economic growth in order to inspire investor confidence in the U.S. economy. Fortunately for the United States, the changing age structure of the populations of the Western European nations and some other nations and the slowing of their economic growth rates increased the attractiveness of holding dollar-denominated assets. In effect, the pensions of Europe’s welfare state retirees became increasingly dependent upon the returns on such assets.
The problem of the growing international indebtedness of the United States merged with the political emphasis on the necessity of providing affordable housing when its government opted to engage in massive intervention in the private mortgage market. As I shall demonstrate in the second article in this series, the government intervened in ways that were designed to create market incentives for offering home mortgages that would have been deemed unduly risky by previous standards, and it also took action to promote the creation of financial assets secured by mortgage loans that would offer attractive returns for foreign holders of dollar assets.
The colossal financial house of cards that our government built with help from the rest of the world has now collapsed. I do not claim to possess a crystal ball that allows me to foretell the future, but in my follow-up article I shall provide a little additional detail on how politicians led us down the road to disaster and offer some comments about where I think we are going. I think the road to the future is going to be quite rough.
 Richard Berry, “The Great Boomer Comeuppance,” americanthinker.com, October 5, 2008.