The Late Great United States: The Decline and Fall of the United States of America

Joseph George Caldwell

23 March 2008

Revised 17 December 2008

Part Two of Two Parts: Appendices

(Part One is the main text.)

Copyright © 2008 Joseph George Caldwell.  All rights reserved.  Posted at Internet website http://www.foundationwebsite.org.  May be copied or reposted for non-commercial use, with attribution to author and website. 


Contents

The Late Great United States: The Decline and Fall of the United States of America

Appendix A. 184 Reasons Why the US Is Dying (or Already Dead)

Destruction of the Biosphere

Peak Oil

Overpopulation

Fractionated Culture

Decline in US Culture

Loss of Spirituality and Manifest Destiny

Globalization

Low Security

The Politics of Envy

Oppression

Decline in Freedom

Alienation of the US People from the Government

Quality of Life Is Declining for the US Middle Class, and Will Decline Rapidly with the Passage of Peak Oil

Increasing Income Gap

Political Incompetence and Corruption

Miscellaneous Technical Reasons

Appendix B. Excerpts from Patrick Buchanan’s Day of Reckoning (Thomas Dunne Books / St. Martin’s Press, 2007)

Appendix C. Excerpts from Michael Neumann’s The Case Against Israel (CounterPunch and AK Press, 2005)

Appendix D. Excerpts from Ilan Pappe’s The Ethnic Cleansing of Palestine (Oneworld Publications, 2006)

Appendix E. Excerpt from David Livingstone’s Terrorism and the Illuminati (BookSurge / Amazon.com, 2007)

Appendix F. Excerpts from Noam Chomsky’s The Prosperous Few and the Restless Many (Odonian Press, 1993)

Appendix G. Excerpts from Kevin Danaher’s 10 Reasons to Abolish the IMF and World Bank (Seven Stories Press, 2001)

Appendix H. Excerpts from Lori Wallach and Michelle Sforza’s The WTO: Five Years of Reasons to Resist Corporate Globalization – Introduction by Ralph Nader (Seven Stories Press, 1999)

Appendix I. Excerpts from Pat Choate’s Dangerous Business (Alfred A. Knopf, 2008)

Appendix J. Excerpt from Jerome R. Corsi’s The Late Great USA (WND Books, 2007)

Appendix K. Excerpt from James Howard Kunstler’s The City in Mind (The Free Press, 2001)

Appendix L. Excerpts from James Howard Kunstler’s The Geography of Nowhere (Touchstone, 1993)

Appendix M. Excerpts from James Howard Kunstler’s Home from Nowhere (Touchstone, 1996)

Appendix N. Excerpts from James Howard Kunstler’s The Long Emergency (Grove Press, 2005)

Appendix O. Excerpts from Ellen Hodgson Brown’s The Web of Debt, 2nd revised edition (Third Millennium Press, 2007, 2008)

Appendix P. Excerpts from John Robb’s Brave New War (John Wiley & Sons, 2007)

Appendix Q. Excerpts from James Fallows’ article, “Declaring Victory,” in the September 2006 issue of Atlantic Monthly

Appendix R. Excerpts from Thomas Hammes’ The Sling and the Stone (Zenith Press, 2006)

Appendix S. Excerpts from Martin van Creveld’s The Changing Face of War (Ballantine Books, 2006)

Appendix T. Excerpts from Rupert Smith’s The Utility of Force (Vintage Books / Random House, 2005, 2007)

Appendix U. Excerpts from the US Army / Marine Corps Counterinsurgency Field Manual (University of Chicago Press, 2007)

Appendix V. Excerpts from the Tamil Nation Website

Appendix W. Excerpt from Sheldon Richman’s War Is a Government Program


The Late Great United States: The Decline and Fall of the United States of America, Appendices

Note on Appendices.

Appendix A is a list of 184 reasons why the US is dying.  This list formed the basis for the book, which essentially arranges the reasons into categories and discusses each category.

The remaining appendices are extracts from books that support the theses of this work.  As mentioned in the text, the reason for citing references is to show that the ideas presented in the book are not unique, or exclusive, or original to me.  I consider this to be important because in many casual conversations, people express disbelief at the assertions that I make about the state of the planet’s environment, or the global economy, or US culture.

Another reason for citing extracts of references is that many of the books dealing with topics related to the subject of this book do not have a wide circulation.  Most of them are in print, but not available in public libraries or from the Internet.  Also, many of my readers live in foreign countries.  In many instances they would have no access to these books at all, without purchasing them from a US or UK publisher, at substantial cost.  Because of the importance of the subject, I consider it important to provide these readers with a sampling of extracts from original sources.

The extracts presented here should not be considered a “representative sample” of the works – the selections relate to points of interest to me, and are not a comprehensive or balanced sampling.  (When I attended college in the 1950s, long before the age of the personal computer, social-studies courses such as Literature and History placed file folders containing copies of background reading materials in the library.  This reduced the cost associated with acquiring original source documents, and made the materials available to everyone, not just the first person to check the source document out of the library.  The appendices that follow are a modern version of these “reading files.”)

I have extracted extensively from two sources: James Howard Kunstler’s The Long Emergency (Grove Press, 2005) and Ellen Hodgson Brown’s The Web of Debt (2nd edition, Third Millennium Press, 2007, 2008).  Kunstler’s book provides an excellent discussion of the social and environmental ills that have been caused by “cheap oil,” and of what we may expect in the way of changes over the next few years as “Peak Oil” passes.  Brown’s book provides an excellent discussion of the nature of the US banking and finance system, including its history and suggestions for improvements.  While the extracts from these two works are extensive, they are not excessive relative to the purpose, and are in full compliance with the letter and the spirit of the “fair use” doctrine.  Both books are available from standard sources (e.g., Amazon, Barnes and Noble, Borders, and other sources cited on their websites, http://www.kunstler.com, http://www.webofdebt.com and http://www.ellenbrown.com).  I recommend these two books highly for comprehensive and detailed discussion of the global environmental and financial crises.  The Long Emergency costs $14.00; The Web of Debt costs $25.00 (softcover editions).


Appendix A. 184 Reasons Why the US Is Dying (or Already Dead)

Here follows a list of specific indicators that show why the US is dying.  The indicators are grouped into the same sixteen categories as were used in the text.  Within each category, items are listed in no particular order.

There is nothing in this appendix that is not addressed in the main text of the book.  This appendix is included simply for “historical interest,” to show how the book got started.

Destruction of the Biosphere

1.  The planet’s biosphere is being destroyed: deforestation, pollution, mass species extinction, and global warming, all caused by large-scale global industrialization.  All industrial nations will perish, including the US.  The only significant issues are how and when, and what system of planetary management will replace the current system of global industrialization.

2.  Pollution and species loss.  Many fishing areas have been destroyed by industrial-scale fishing, overfishing and pollution.  Fertilizer runoff has killed a large portion of the Gulf of Mexico.  Acid rain has sterilized many lakes and killed many forests.  Pesticides and herbicides have killed many birds and other species.  Our land is now littered with nuclear waste dumps and urban-waste landfills.  Our air is polluted.  Pollution of the upper atmosphere by jet exhaust is considered to be a major contributor to global warming, but the government is committed to increasing air travel.

3.  Industrial society generates much waste that it does not reprocess and that the biosphere cannot reprocess.  No system that continually generates unreprocessed waste can continue for very long.  Our system is not designed for long-term survival.  It will quickly pass.  Industrial society is “spoiling its own nest.”

4.  The government is interested only in growth (population and economic), without consideration of the consequences, or for the long-term survival of the biosphere or the human species.  Growth cannot continue indefinitely within any finite system.  Any system based on long-term growth must fail.  Any system based on exponential growth (a percentage increase every year, such as our government promotes) will fail quickly (exponential growth is explosive, and explosions do not last very long).  The government is leading the country down the path to total destruction.  It is continually seeking ways to increase or maximize growth, ignoring the consequences.  It has no plans for long-term survival for itself, for the US population, for the global population, or for the biosphere.
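The arithmetic behind the claim that exponential growth fails quickly can be sketched in a few lines of Python.  The 3% annual rate is an illustrative assumption, not a figure from the text; the point is that doubling time depends only on the growth rate:

```python
import math

def doubling_time(annual_growth_rate):
    """Years for a quantity growing at a fixed annual rate to double."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# An apparently modest 3% annual growth rate doubles the economy's
# size (and its resource draw) roughly every 23 years; ten such
# doublings -- under 250 years -- mean a roughly thousandfold increase.
print(round(doubling_time(0.03)))  # roughly 23 years
```

This is the sense in which a "percentage increase every year" is explosive: the absolute increase itself grows every year, so each doubling takes no longer than the last.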

Peak Oil

5.  Hubbert’s Curve and Peak Oil.  Hubbert’s Curve is an estimate of global oil production, plotted as a graph over time.  For global oil production, the curve is bell-shaped, low in 1950 and 2050, with the maximum (the Peak) occurring this decade.  Our system is oil-based, and oil is disappearing.  Our system is characterized by urban sprawl – the destruction of much natural land and the use of much energy (oil) for commuting, especially by personal automobiles.  It is also based on energy-intensive industrial production of food.  As soon as global production of oil starts to decline, this unsustainable system will quickly disintegrate, and society will eventually move to a current-solar-energy-based system.  Now that Peak Oil is occurring, the global industrial world will be starved of its primary energy source.  Our way of life and security are almost totally dependent on oil; as global oil production declines, this society will fall apart.
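Hubbert’s bell curve is commonly modeled as the derivative of a logistic function.  A minimal sketch in Python follows; the parameter values (a 2005 peak, a 15-year width, and a total-recoverable figure of 2000 units) are purely illustrative assumptions, not values from the text or from Hubbert’s fitted estimates:

```python
import math

def hubbert_production(year, peak_year=2005.0, width=15.0, total=2000.0):
    """Annual production under a logistic (Hubbert) model.

    `total` is the ultimate recoverable resource (the area under the
    curve); `width` controls how sharply production rises and falls.
    All parameter values here are illustrative assumptions.
    """
    x = math.exp(-(year - peak_year) / width)
    return (total / width) * x / (1.0 + x) ** 2

# The curve is low at the ends, maximal at peak_year, and symmetric:
peak = max(range(1950, 2051), key=hubbert_production)
```

Under this model, production in 1990 equals production in 2020: the curve is symmetric about the peak, so the second half of the resource is consumed on the way down.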

6.  The US government operates a greed-based system founded on growth-based economics.  But no system based on continued growth can endure.  The US government had a chance, when US birth rates dropped to replacement levels by 1972, to transition to a long-term sustainable society that could function on the “renewable” energy available from current solar energy.  Instead, it opted for massive, unsustainable growth, instability and eventual collapse.  Fossil fuels provide the energy for this system, and they will be available in ever-shrinking amounts.  Instead of transitioning to a current-solar-energy-based society, the government moved to a high-population, high-energy-consuming society that cannot continue without the availability of massive amounts of fossil fuel.  The US population is now several times larger than can be supported by current solar energy.  As global fossil fuel production declines, the country will be consumed in civil war, revolution and a mass population die-off.

7.  The population that can be supported in the US at a low level of living by solar energy is 63 million.  (The statistics on populations supportable by solar energy are taken from Can America Survive? and related documents at http://www.foundationwebsite.org/canam4x.htm, http://www.foundationwebsite.org/PopAnalysisAllCountries.txt and http://www.foundationwebsite.org/PopProfileAllCountries.txt.)  The current population is over three hundred million.  As fossil fuels decline, the world will return to current-solar-energy-based systems and population levels.  Oil will be gone within 40 years.  Coal could last several hundred years, but not if converted to oil (which requires much energy).  Even if the government decided today to reduce the US population to a level that could be supported by solar energy, it could not possibly accomplish this goal before fossil fuels run out.  It is no longer possible to transition to a stable, current-solar-energy-based system before fossil fuels run out.  The forced transition to a current-solar-energy-based system will be sudden, chaotic and catastrophic.  This convulsive return to current solar energy is now inevitable.  Compare the current-solar-energy-based populations of the US and Russia.
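The magnitude of the claimed overshoot follows directly from the two figures just quoted (63 million supportable versus roughly 300 million actual, both taken from the text):

```python
# Both figures are the author's, from the documents cited above.
solar_supportable = 63_000_000     # US population supportable by current solar energy
current_population = 300_000_000   # "over three hundred million"

# Fraction of the current population in excess of the solar-supportable level:
excess = 1 - solar_supportable / current_population
print(f"{excess:.0%}")  # prints 79%
```

That is, on the author’s figures, current solar energy could support only about one American in five.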

8.  As the world becomes more crowded, as the environment is destroyed, and as global oil production starts to decline, global resource wars will occur.  The demand for oil will be matched to supply by wars that destroy large numbers of people.  Global nuclear war is increasingly likely as the quality of life declines.  The US is vulnerable to destruction in global nuclear war.

Overpopulation

9.  Global population far exceeds the carrying capacity of current solar energy.  Global oil production, which supports it, is about to decline (Peak Oil), therefore crash is inevitable.

10.  There is overcrowding; therefore people can’t enjoy nature and can’t be free.

11.  Long commutes.

12.  High housing prices.

13.  Malthus, Catton.

Fractionated Culture

14.  The US government has promoted mass immigration for many years, but the US no longer assimilates its immigrants.  It is now highly fractionated (racially, linguistically, religiously, and culturally / ethnically) and held together only by material wealth (“good times”).  As the good times disappear (with the passing of Peak Oil), so will the country.  There are no longer strong ethnic bonds holding the country together.  See Garreau’s Nine Nations of North America.

15.              Strong countries are made and kept strong by strong ethnic bonds (race, religion, language, culture).  The US is now sharing world power with several such nations, whose power is growing to rival the US (Russia, Communist China, India).  Those nations will soon have America’s economic power but not its cultural weakness (racial and ethnic fragmentation, lack of Machiavellian will).  They will destroy the US, as resource wars erupt over declining oil production.

16.  A large country that is diverse with respect to race, language, religion and ethnicity needs something other than these things to bind it together as a strong, cohesive nation.  In the US of today, that common interest is a high materialistic standard of living.  America had a strong ethnic / cultural cohesiveness prior to passage of the Immigration Act of 1965, but after forty years of mass immigration from diverse cultures, some of which are inimical to historical US culture, this Act has destroyed the cultural world-view and cohesion of the country.  The Act has imported corruption and contempt for the individual into the US, and transformed it culturally into a third-world nation.

17.              The decline of nationalism.  A strong nation is a group of people with common bonds.  They are similar with respect to significant human attributes, such as race, language, religion, ethnicity and culture.  With the rise of global industrialization, nationalism has been substantially weakened.  In previous times, a person was born into a nation, and took his nationality for granted.  Now, in an era of mass migration, many people feel little loyalty to their nation.  They are concerned only with what it can do for them.  Many wealthy people now consider themselves citizens of the world, with no particular allegiance to any nation.  For many people, there is no longer a blind allegiance to one’s nation.  This is increasingly true in the US, where the government has declared war on the middle class and alienated its traditional population.  The US government, headed by the President, has committed high treason by allowing mass invasion of the country by illegal aliens.  America’s leaders – the wealthy elite and its political leaders (the government) – no longer embrace the concept of loyalty to the principles on which the country was founded.  The concept of nationalism is fading in the increasingly globalized world.  The glue that holds nations together is dissolving.  This is true not only in the US, but in all of the world, especially in Europe (with the rise of the European Union).

18.              There are few neighborhood schools.  The local community hardly exists any longer, in many parts of the US.  Everything is industrialized, denaturalized, dehumanized.

Decline in US Culture

19.              Ariel Durant’s observation that civilizations are destroyed from within before they are destroyed from without.

20.              General decline in US culture.  Bork, Schlesinger, Geyer, Bloom.  The US people are now soft, with nothing binding them together.  Advice of Artembares to Cyrus.

21.              Radical egalitarianism and multiculturalism are more important to the US than order, stability, and long-term survival.

22.              The cultural core of the US has lost its spirituality, and the essence that formed the nation.  Of the factors that make a nation strong – race, religion, language, culture – the country’s leaders since (following) Dwight Eisenhower have worked hard to destroy all of them.

23.  Stupidity and ignorance.  You often hear the charge that being against mass immigration is racist, or that “we are a nation of immigrants” and therefore should continue to be.  Just because we have been something in the past is no reason to continue to be so.  A person who was poor as a child has no rational reason to want to remain poor.  After the US Revolution and secession from Great Britain, the latter lost the ability to exile criminals to North America.  It then used Australia as its land of exile.  Australia is a land of “criminal exiles,” but there is no rational reason why it should continue to be.  All that matters is the future, what you desire it to be, and what actions you take to realize your desires.

24.              The current Chase Bank commercial, “I want it all, I want it all, I want it all, and I want it now” illustrates the essence of American culture.

Loss of Spirituality and Manifest Destiny

25.              The size of the US prison population is very high (one percent of the adult population) because many people no longer fit in the society that has evolved.  There is no place for many people.  There is no way out, no future.  Overcrowding causes social breakdown, the same as with rats and lemmings.  [On clearing out the prisons and exile:  Exile prisoners to Mexico, by adopting a reciprocal version of Mexican President Felipe Calderón’s policy that “Wherever Mexicans are, there is Mexico.”]

26.              Decline in religion.  Religion has failed to nurture and maintain spirituality.  There is nothing about the current system to believe in any longer.

27.              Loss of Manifest Destiny.

28.              Loss of hope for the future.

Globalization

29.  Species loss, deforestation.

30.  Pollution, global warming.

31.  Spread of disease and alien species (e.g., fire ants, starlings, tumbleweed, kudzu, lamprey eels, Japanese beetles, HIV, syphilis).

32.  Loss of stability, loss of the comfort of one’s own culture.  Loss of link to one’s own land.

33.  Decline in nationalism.

34.  Marx, Huntington’s Clash of Civilizations, Kaplan’s The Coming Anarchy, Plato’s Republic, Keynes’ remark to grandchildren.

Low Security

35.              Because of mass immigration from diverse cultures, massive international free trade, and open borders, national security has been severely degraded.  High-tech weapons are now in the hands of the third world.  Great damage can be done to the industrialized world at very little cost by its enemies.  “9/11” is a prelude (in miniature) of things to come.

36.              Individual security has fallen to low levels.  Houses and cars must be kept locked at all times.  Identity theft is rampant.  Economic security is low.  Increasing use of Spanish makes it very difficult for native English speakers to obtain jobs in Spanish-speaking areas.

37.  Membership in violent gangs is exploding, and it is no longer safe to walk in many parts of most cities.  The president of the student body of the University of North Carolina at Chapel Hill (from which I received a PhD in statistics) was recently executed by a gang member, with five shots to the face.  The killer had earlier murdered a graduate student from Duke University (an incident downplayed by the press).  Gangs are seeking “high profile” and brutal murders, not just random violence.  A local young mother had her hands and feet cut off (Spartanburg Herald Journal, 9 April 2008, “Authorities identify woman whose body parts were found”).

38.  There is much resentment that Durham, NC, District Attorney Mike Nifong brought false charges of rape against three members of the Duke University lacrosse team.  The charges were made by a black woman who worked as an escort, prostitute and stripper, and were blindly and aggressively prosecuted by Nifong, evidently to enhance his chances for reelection in a city with a sizable black population.  The men were later exonerated.  The case damaged the three young men’s lives so badly that the term “Nifong” now means to assert false charges or to totally ruin someone’s life.  The case cost local taxpayers millions of dollars.  There is resentment in the local community that no charges have been levied against the woman for her crimes.

The Politics of Envy

39.  The “politics of envy” motivates many to seek the destruction of the US.  This factor is at work both internally (because of the increasing income gap in the US and the flaunting of wealth by the very rich) and externally (because the quality of life in the third world is extremely low and its people have no hope of achieving our standard of living).  (The “politics of greed” is the practice of using political power to amass wealth.  The “politics of envy” is the motivation for the “have-nots” to seek to destroy the “haves,” after they realize that the “haves” will never share their wealth.  The politics of envy will be clothed in religion, i.e., as Islam (representing antimaterialism) vs. Christianity (representing materialism / capitalism).)

Oppression

40.              The US imprisons one percent of adult population, including one in seven black men.

41.              The US imprisons border guards who are doing their jobs.

42.              The US income tax system is very oppressive (lack of freedom to be anonymous; guilty until proved innocent; expensive to administer; intrusive; subject to abuse (Hansen’s To Harass Our People); illegal tribunals (tax courts)).  (The Constitution proscribes courts outside the Judicial Department: the tax court is in direct violation of this requirement.)

43.  The US is a fascist dictatorship.  Refer to “Fascism Anyone?” by Laurence W. Britt, Free Inquiry Magazine, Vol. 23, No. 2, Council for Secular Humanism, at http://www.secularhumanism.org/library/fi/britt_23_2.htm; and Thom Hartmann’s “Dismantling Democracy: What’s Behind the Magic Trick of War,” published on Sunday, September 22, 2002 by CommonDreams, at http://www.thomhartmann.com/index.php?option=com_content&task=view&id=23&Itemid=120 or http://www.commondreams.org/cgi-bin/print.cgi?file=/views02/0922-06.htm.

Decline in Freedom

44.              Loss of security, hassle of security checks.

45.              Identity theft.

46.              Loss of land, much open space gone.

47.              With income tax and SSN, loss of anonymity.

48.              Antidiscrimination laws; loss of ability to associate with whom one pleases, diminished ability to do as one pleases.

Alienation of the US People from the Government

The US government no longer obeys its own laws

49.              Under the US Constitution, the US President and other leaders are required to protect the country from invasion.  The President and other leaders take a solemn oath to uphold the Constitution.  By allowing the invasion of the country by 12-20 million illegal aliens, and doing nothing to rectify this situation, the President and other leaders who vow to uphold the Constitution are guilty of high treason.

Civil rights and affirmative action

50.              There is much resentment against civil rights legislation.  In 1964 the government passed a comprehensive civil rights law banning many forms of social and economic discrimination, and then proceeded immediately to pass comprehensive and oppressive “equal opportunity” and “affirmative action” laws to enforce its desired “reverse” discrimination (euphemistically referred to as “positive” discrimination).

51.              Political correctness has reached obscene levels.  With mass immigration, little assimilation has occurred.  There is no longer a strong cultural core, no shibboleths.

The US government punishes native US citizens and lets illegal aliens flout the law

52.  By its actions, the US government appears to care more for illegal aliens and nonproducers than for the middle-class worker and traditional US culture.  In the Ramos/Compean case, for example, the government chose to give immunity from prosecution to an illegal-alien drug smuggler rather than to citizen Border Patrol agents.

The US government has abolished security, to generate more wealth for the wealthy elite

53.              Through its policies, the US government has caused a decrease in security at both the national and individual levels.  The country is vulnerable to attack (e.g., 9/11).  Crime is rampant at the local level.

Illegal aliens are killing, maiming and murdering US citizens on a massive scale

54.  Each year, illegal aliens kill and maim more US citizens than the number of US soldiers who die in the war in Iraq.

The US is now a fascist dictatorship

55.  The US has become a fascist dictatorship.  If you look up the definition of “fascist,” you will see that it means an oppressive system of government in which government is in a strong alliance with business: a system of government that exercises a dictatorship of the extreme right, typically through the merging of state and business leadership, together with belligerent nationalism (American Heritage Dictionary, 1983).  Refer to the articles by Hartmann and Britt.  It now takes on the order of one hundred million dollars to run for president or for senator of a major state.  The only way that most people can get elected to high office in the US is to accept tens of millions of dollars from the wealthy elite who own and run the country.  The governing class of the US is in hock and in thrall to the wealthy oligarchs.  The US government now serves the wealthy, not the people.  The US government imprisons one percent of its adult population, including one in every seven adult black men.  Even the major organized religions have joined the government in promoting mass immigration and the destruction of the US middle class.  The government is waging war on the middle class.  The US middle class does not want mass immigration, massive international free trade, or open borders, but the US government insists on forcing these things on it, against the will of the people.  Through its policies of mass immigration, massive international free trade and open borders, the government is destroying the country’s environment, culture and quality of life for the middle class.  The US middle class does not want Border Patrol agents Ramos and Compean held in prison for their “crime” of fighting illegal aliens, but the US Justice Department has imprisoned them, the US President adamantly refuses to pardon them, and the US Congress does nothing.  The US is a republic, under which the citizens elect representatives to make decisions for them.  But these representatives now represent only the wealthy, not the middle class as when the country was founded.  The US is now a fascist dictatorship (where the dictator is the global military-industrial complex) – it is no longer a government “of, by and for the people.”

56.              There is resentment that under US law, corporations enjoy the same rights as natural persons (see Thom Hartmann on Dismantling Democracy).

57.              The US imprisons a large portion of its population (about 1 percent of its adult population), many for crimes for which many people do not consider prison appropriate (e.g., marijuana).  This affects many families, and has generated widespread resentment and hatred of the government.

58.  There is strong resentment over the income tax and the IRS.  The original Constitution required that such direct taxes be apportioned among the states, which effectively barred a federal income tax.  When the income tax act was passed in 1913, it was to be a tax of about one percent of the income of the richest one percent.  Income taxes (federal, state, and local) and other income-related taxes (Social Security, Medicare) now consume a substantial portion of most citizens’ income.

59.              The income tax, originally intended as a tax on the extremely rich, is now a major tool of repression against the middle class.

60.  There is resentment over having to work for a substantial portion of the year to pay taxes that have little to do with quality of life or security (such as transfer payments, bloated federal programs, political wars, welfare programs, corporate bail-outs).

61.  There is resentment over the Tax Court.  Under the original Constitution, all tribunals had to be part of the judicial branch.  The Tax Court is an illegal tribunal, in which the burden of proof rests on the individual – he is guilty until proven innocent.

62.              There is resentment over “political” wars, such as in Vietnam and Iraq, in contrast to “survival” wars, such as the Second World War.  There is resentment that the US soldier is required to fight impossible-to-win insurgencies in foreign lands (the enemy cannot be distinguished from the general population; this problem is a political one, not a military one; the soldier cannot do battle with the enemy, but is instead assassinated by an invisible foe).

63.              There is resentment over the “war on drugs,” which criminalizes many, generates much crime, and imposes a heavy tax burden on the middle class.

64.  There is resentment over our system of justice, which is adversarial.  Trials are about the law, not about justice.  There is no desire to search for the truth, just to win the case (and earn large fees).  Mandatory sentencing is blind and unreasoning, and a strong departure from English common law.  There is resentment that legislation such as the Racketeer Influenced and Corrupt Organizations (RICO) Act is used against individuals, in complete violation of the law.  There is resentment that “money laundering” laws, which apply only to profits from an illegal gambling ring, are used to prosecute individuals in other circumstances.

65.  America’s health care system is a disaster.  One in four teenage girls in the US has a sexually-transmitted disease.  Because of US government policies designed to make the medical establishment very wealthy, the cost of medical care has been pushed to extremely high levels, to the point where the average family cannot afford even basic care and is in danger of being destroyed financially by the next even modest illness or accident.  The system has been designed so that no alternative low-cost care is available.  The system serves mainly the wealthy, who produce, own and operate the medical facilities and equipment and provide the inflated-cost monopolized services.  Forty percent of the population have no health insurance, because they cannot afford it.  The “solution” being proposed by the current presidential candidates is to force everyone to purchase high-cost medical insurance; for those who cannot afford it, the insurance will be paid for by tax dollars (thereby forcibly transferring massive amounts of money from the tax-paying middle class to the medical establishment, in support of its government-supported bloated costs).  No consideration is being given to low-cost options, such as local basic-care clinics staffed by the Public Health Service.  Through the use of high-pressure television ads that promise good health and longevity, consumers are exhorted to ask their doctors to prescribe expensive drugs.  Physicians regularly prescribe very expensive drugs instead of low-cost alternatives (e.g., prescription of a $5 Plavix pill instead of a $0.01 aspirin tablet, which is about as effective and has less-severe side effects).
The US health care system has been deliberately designed so that, for a family of average means without medical insurance (costing the family about a thousand dollars a month, if they can get it at all), the occurrence of the first significant (moderate) illness or accident of a family member will probably result in all of the family’s assets being transferred to the medical establishment.  If ever there was an example of an “evil” system, this is one.

66.              There is resentment over the takeover of the medical profession by foreigners.

67.              There is resentment that the government has promoted policies to inflate the cost of medical care to high levels, such that many families are wiped out financially by the first modest medical incident (accident, illness).

68.              There is resentment over the revision of the bankruptcy laws, making it much harder for individuals to declare bankruptcy.  There is resentment that much bankruptcy is caused by the government’s policy of pushing medical costs to astronomical limits (to transfer wealth to the medical establishment), so that the first modest medical incident will ruin most families financially.  There is resentment that the government allows banks and credit-card firms to charge usurious credit-card interest rates and fees, while at the same time making it harder for individuals to declare bankruptcy.

69.              There is intense resentment that in the mortgage crisis the government seeks to bail out large corporate lenders rather than individuals, who are losing their homes.  It was announced on March 16, 2008, that the US government would guarantee almost 200 million dollars of J. P. Morgan’s buyout of investment banking firm Bear Stearns, after the collapse of that firm due to bad mortgage loans.  All that the government has offered to individuals, however, is advice on “restructuring” their loans.  The government is willing to spend up to 200 million dollars to save one mortgage firm, yet it will not save one individual mortgagor.  The terrible irony of this is that it is using taxpayer money – paid by the middle class – to save the banking firm but not the taxpaying citizen.  The government caused this crisis by allowing and promoting the giving of zero-down-payment loans.  After it got the “hook” in, with homeowners now liable for thirty years of high payments, the mortgage market collapsed, and the government will do nothing to help the individual – only the wealthy-elite mortgage company.  (It is under no obligation to bail out either banks or individuals, but if it is going to help in some way, why did it choose to help the banks?)  People are now waiting to see whether the leaders of Bear Stearns will be rewarded for losing their stockholders’ money (the stock was sold to J. P. Morgan for $2 per share, down from a high of $171, and from $57 just the preceding week), or go to jail for their scandalous economic crime.  Of course they won’t.  The government regulators and the mortgage bankers will get off scot-free and pronounce each other blameless – the middle-class taxpayer will be left holding the bag – and the mortgage.  
(Just a couple of weeks after the agreement to purchase Bear Stearns for $2 a share, J. P. Morgan, the government’s intermediary bank in the Bear Stearns bailout, announced that it planned to “renegotiate” the purchase price from $2 per share to $10 per share.  This transfers massive wealth to Bear Stearns’ stockholders, all guaranteed by the US government and, ultimately, the US taxpayer.  For weeks preceding the collapse of Bear Stearns, economists were declaring that the US housing market was such a small part of the economy that its substantial decline would not bring about a severe downturn in the economy.  Once Bear Stearns collapsed, the story line was that if it were not saved by a government bailout, then the entire financial system might collapse, since (1) housing mortgage debt had been distributed into many derivative financial instruments and (2) these instruments were not backed by real assets, but simply represented “securitized debt.”)  President Bush has suggested that “market forces” be allowed to resolve the crisis, leaving the individual mortgagor alone to sort things out.  Why does he not want market forces to work on the mortgage banking firm?  The government acted in the same fashion in the bailout of the savings and loan industry and in the collapse of the Long-Term Capital Management hedge fund.  Using taxes from the middle class, the government bails out only the wealthy, not the middle class.  See “The Pendulum Swings” in the March 8, 2008, issue of The Economist.

70.              When I see US passports now, they are often held by members of an alien race or culture.  Being reminded that our government has given our country away to foreigners causes profound resentment and anger.  The government is in the process of giving away the country that our forefathers fought and died for.  (My forebears have lived in North America since the 1700s – before the United States was a country.  Some of them objected to secession of the colonies from Great Britain and moved to Canada from the new United States after the US Revolution (i.e., they were “United Empire Loyalists”).  Many have lived in both Canada and the US since then.  They fought hard to establish both Canada and the USA.  Many of my relatives have served in the armed forces.  I am sure that they and their contemporaries would “roll over in their graves” if they now saw the current leaders of the US and Canada giving away the land that they fought and killed and died for.)

71.              I was in a department store a few days ago, purchasing a floor mop.  Ahead of me were three groups of people: an Hispanic family of four, two Hispanic girls, and a Hmong family of four.  I was the only Caucasian in the line.  This reminds me of the day when I was in an elevator in the Canada Trust building in Toronto: the elevator contained 11 people, and I was the only Caucasian – all of the others were Asian.  The US and Canada are giving their countries away.

72.              Current political leaders sense the unhappiness of the citizens, and they promise change.  But this is an empty promise.  They have no intention of changing the system in any way from the essence of what it is – a wealth-creation tool for the rich.  What they mean by “change” is that they will replace the incumbents with themselves, and continue “business as usual” in serving the wealthy elite.  The electorate is gradually realizing that the system is self-perpetuating, that they have no power, and that the elected politicians work only to preserve it and serve the wealthy controllers.  The people will see again and again (after each succeeding election brings no real change) that the system cannot possibly be changed (for the better) from within, and they will eventually realize en masse that the only possible means of substantive change is revolution, rather than evolution.  At that point, the random violence that we have seen in places such as Northern Ireland, Palestine and Iraq will spread to the US, perpetrated by its own disenchanted citizens.  To paraphrase Malcolm X, “The chickens will have come home to roost.”  The violence will escalate rapidly.  It will be embraced by the “core culture” of America, who will vent their anger on their leaders, who have given away the country’s land, despoiled its environment, decimated its cultural integrity, and destroyed the quality of life for the middle class (from the overcrowding wrought by the decades-long policies of mass immigration, massive international free trade and open borders).

73.              The US government is now the enemy of the people.

Many US jobs are being exported, and many US-based jobs are being given to foreigners

74.              Massive international free trade (NAFTA, CAFTA, GATT, WTO); outsourcing; H-1B visa program; mass immigration.

Discrimination against the capable

75.              Affirmative action.  Grade-restricted promotions.  Promotion of teachers based on seniority, not capability.

The US government sues the Salvation Army over its use of English

76.              Loss of jobs by English-speaking citizens to bilingual speakers in areas where large Hispanic populations have migrated.

The benefit of productivity goes to the wealthy elite

77.              There have been massive productivity increases in the US over the past half century, but the benefits of this have gone to the wealthy elite, not to the worker.  It now takes two incomes to support a family, whereas 50 years ago it took only one.  The US middle class family must now provide two full-time workers to the formal labor market, whereas 50 years ago it provided only one.

The US government promotes the dollar drain and dollar weakening

78.              By its profligate spending in the war in Iraq, following a no-win strategy, the US government has generated a massive debt, foreign trade deficit, and much weakening of the US dollar.  This benefits the wealthy, who see substantially increased demand for their exports.  Now that the US imports much of its food and consumer goods, the standard of living of US consumers is falling rapidly, since they now have to pay much more for foreign imports (food, oil, cars).  This did not need to happen, since the US was once self-sufficient.

Mexican truckers get a free ride on US highways

79.              As of last year, Mexican truckers may use US highways free of cost.  US truckers must pay taxes to build these highways.  This puts the US trucker at a tremendous disadvantage.

The US government is selling US infrastructure to foreign entities

80.              The US government is selling highways to foreign firms.  This infrastructure was paid for by the US taxpayer, and should not be sold to foreigners.

The “Dream Act” is an affront to the US middle class

81.              The US government has been allowing massive illegal immigration for many years.  The US government granted amnesty to millions of aliens in 1986, and allowed many of them to become citizens.  It is now doing this again.  The US middle class do not want mass immigration, they do not want illegal immigration, and they do not want to grant illegal-alien invaders a path to citizenship (“amnesty”).

There is strong resentment that US government policies have caused the decline in the quality of life of the US middle class

82.              There is resentment over the destruction of the planet by the system of global industrialization that the government has promoted.

83.              There is resentment over the destruction of US wildlife and natural beauty from mass immigration.

84.              There is resentment over the damming of the Colorado River and the flooding of its canyons to provide power and water to an ever-increasing population, and to provide irrigation to produce agricultural exports that are not needed by our people, but are produced only to generate wealth for the wealthy elite.

85.              There is resentment at the land and water pollution.  Mercury poisoning of the Great Lakes.  Acid rain sterilization of lakes and killing of forests.  Love Canal.

86.              There is resentment that the government allowed the Dust Bowl, the depletion of the Ogallala Aquifer, the damming and drying of the Colorado River, the destruction of the Everglades by the US Army Corps of Engineers’ “channelization” program, the transformation of California and Florida from beautiful agricultural states to overcrowded urban “hells,” and the disappearance of US topsoil.

87.              There is resentment that, because of the government’s policy of mass immigration, natural land is being destroyed, and that most people who return to their birthplace after a few years see the neighborhood woods and natural places destroyed to make homes and infrastructure for the exploding population.

88.              There is resentment that our cities have few “green” zones, such as parks.

89.              Because of overcrowding, low-cost recreation is now a thing of the past.   The middle class are beginning to realize that it is the overcrowding that is the essential problem, and the only solution to this government policy is termination of the government and reduction of the population – the same “ethnic cleansing” that has occurred in other places where the population becomes too large and fractionated (e.g., Rwanda, Yugoslavia, Germany).  Because of overcrowding, the only way to enjoy exquisite natural beauty now is to pay very large amounts of money, i.e., to operate within the system.  Natural beauty that could once be enjoyed by anyone is now available only to the wealthy, or at high cost.  Beach houses on Fripp Island, SC, now sell for over a million dollars apiece.  As in the cult movie Metropolis, all must work at industrial jobs.

90.              There is resentment that the cost of many things has been pushed to high levels because of the government’s policy of massive population growth (which increases demand for just about everything).

91.              There is resentment that the government’s monetary policy has caused severe inflation.  When I left for Zambia in 2002, a glass of wine cost about $2.50 in many restaurants.  Now, the price ranges from $4.50 to $10.00, with an average of about $7.50.  (Over the long term, inflation is really high.  Yesterday (March 10, 2008), New York Governor Eliot Spitzer (yes, the same fellow who attempted to give New York driver’s licenses to illegal aliens) admitted to consorting with a prostitute.  Her fee for the evening was $4,300.  When I was a boy in Spartanburg in the 1950s, the going rate for a prostitute was $20 for a white girl and $5 for a black girl.  In Las Vegas in 1963, the rate was $100 for a beautiful woman.  Now, it seems, it is several thousand dollars in Washington, DC.  Talk about inflation!)

92.              There is resentment that the government has debased the currency.  Gold, which sold for $42 per ounce when I was young, now costs about $1,000 per ounce.  Paper money is no longer backed by precious metals – silver certificates were withdrawn in the 1960s and replaced with worthless paper.  The Federal Reserve System (established in 1913) is in violation of the original Constitution.
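The debasement described above can be expressed as an implied annual rate.  A minimal sketch in Python; the prices are those given in the text, but the 37-year span is an assumption, since the text gives only the two prices:

```python
def implied_annual_rate(start_price, end_price, years):
    """Compound annual growth rate implied by a price change."""
    return (end_price / start_price) ** (1.0 / years) - 1.0

# Gold at $42/oz rising to about $1,000/oz over an assumed 37 years
# implies roughly a 9% annual decline of the dollar against gold.
print(f"{implied_annual_rate(42.0, 1000.0, 37):.1%}")
```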

93.              There is resentment that banks and credit-card companies use predatory practices to induce young people to obtain credit that they cannot afford, that they are charged exorbitant fees and usurious interest rates, and that the government has passed draconian laws to support their practices (difficult to declare bankruptcy; use of courts for collection).

94.              There is resentment that banks can loan money that they do not have.  Debt-based money.  Fractional reserve requirements (now based on debt).
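How banks “loan money that they do not have” can be illustrated with the textbook money-multiplier simulation below.  This is a simplified sketch, not a description of actual Federal Reserve accounting; the deposit amount and reserve ratio are illustrative:

```python
def money_created(initial_deposit, reserve_ratio, rounds=1000):
    """Simulate re-lending under fractional reserves: each bank keeps
    reserve_ratio of every deposit and lends out the rest, which is
    redeposited elsewhere and lent out again, round after round."""
    total, deposit = 0.0, initial_deposit
    for _ in range(rounds):
        total += deposit                 # each deposit counts as money in the system
        deposit *= 1.0 - reserve_ratio   # the lendable portion becomes a new deposit
    return total

# With a 10% reserve requirement, a $100 deposit supports close to
# $1,000 of total deposits (the 1/reserve_ratio "money multiplier").
print(round(money_created(100.0, 0.10), 2))
```

Raising the reserve ratio shrinks the multiplier; at a 100% reserve requirement no new money is created at all.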

95.              There is resentment that usurious interest can be charged on bank loans.  Compound interest.
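The effect of compound interest on an unpaid balance can be made concrete.  The principal and rate below are illustrative assumptions, not figures from the text:

```python
def balance_after(principal, annual_rate, months):
    """Balance under monthly compounding with no payments made."""
    monthly_rate = annual_rate / 12.0
    return principal * (1.0 + monthly_rate) ** months

# A $5,000 balance at a 24% APR roughly doubles in three years if unpaid;
# the same sum at simple interest would grow by only 72%.
compounded = balance_after(5000.0, 0.24, 36)
simple = 5000.0 * (1.0 + 0.24 * 3)
print(round(compounded, 2), round(simple, 2))
```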

96.              There is resentment over the government’s policies of radical egalitarianism, affirmative action, and political correctness.

97.              There is resentment that Washington’s Birthday, which honored the “father of the country,” has been replaced by Martin Luther King Jr.’s Birthday, which honors a rake who represented a small minority of the country.  There is resentment that whereas there were two separate days honoring Presidents Washington and Lincoln, there is now just a single “unnamed” Presidents’ Day, and a “named” day for King.

98.              There is resentment over our leaders giving our country away to foreigners.

99.              There is resentment that mothers have to work, and that children must be in day care from the time that they are born.

100.          There is resentment at being subjected to intensive security measures at airports.  This is a direct result of the government’s policy of promoting mass immigration from alien cultures and open borders.

101.          There is resentment that minority groups such as blacks and Hispanics receive Social Security Disability benefits with ease, while white applicants must hire a lawyer, sue the government, and give a third of their recovered benefits to the lawyer (about 10,000 Social Security disability cases currently pending in North Carolina).  There is resentment that it is necessary to hire a lawyer at very high cost (e.g., a contingency fee equal to one-third of the settlement, after expenses) to obtain entitlements such as workers’ compensation or disability payments.

102.          There is resentment that recent immigrants receive the same economic, social and educational benefits (e.g., Social Security, Medicare) as natural-born citizens who have been working and paying taxes all of their lives.  There is resentment that the government may effect a Social Security “totalization” agreement with Mexico, resulting in the payment of billions of dollars of retirement funds to Mexicans, when the financial soundness of the system is already in serious doubt.

103.          There is resentment over the collapse of the housing market, which was caused solely by the government’s refusal to regulate this financial market.

104.          There is resentment at the long commutes caused by mass immigration / overpopulation.

105.          There is resentment at the loss of jobs by native English speakers, who lose out to bilingual immigrants for positions as teachers, sales clerks, bank tellers, and in any other job dealing with the public, in areas in which there are many Spanish-speaking residents.  The government should require the use of English as the national language, and should not allow Spanish-language ability to be a requirement for any job for which Spanish was not traditionally required (e.g., teacher, bank teller, public official).

106.          There is resentment that the government’s policies (mass immigration, massive population growth, urban sprawl, privately owned cars) have pushed the price of oil to very high levels, so that home heating and commuting to work are now very difficult for many people.

107.          There is resentment that the government does not regulate, does not inspect, and does not even require labeling of, food imported from other countries.

108.          There is resentment that the government does not enforce food labeling laws, which would reveal which food was imported.

109.          There is resentment over poisoned food imported from Communist China.

110.          There is resentment that our drinking water is now laced with pharmaceuticals.

111.          There is resentment that the government does not inspect toys, and has allowed millions of toys painted with toxic lead paint to be imported into our country from Communist China.

112.          There is resentment that, because of mass immigration and overcrowding, water bills are skyrocketing and water shortages are becoming common.

113.          There is resentment that the US government destroyed the telephone monopoly (AT&T), causing telephone bills to skyrocket (for many years, the cost of basic service was about $10 per month – it is now many times this amount, whereas, because of improved technology, it should be much less expensive).

114.          There is resentment that the US government does not regulate television, which was once free and now costs $40 per month or more.  When cable TV was introduced, it cost $11 per month, and we were told that there would be no commercials – what a lie!

115.          There is resentment that the passenger train system has been destroyed, because of massive subsidies paid for highway trucking.

116.          There is resentment that the US subsidizes the production of ethanol for motor fuel (especially since, by some estimates, more than a gallon of gasoline-equivalent energy is needed to grow and distill the corn required to produce one gallon of ethanol!).

117.          There is resentment that the US government subsidizes tobacco production.

118.          There is resentment that the US government forbids the use of psychotropic drugs (without a prescription).

119.          There is resentment that the US government forbids the use of marijuana to suppress nausea in serious illness.

120.          There is resentment that the government allowed General Motors to buy up electric trolley-car systems around the country and replace them with noisy, foul-smelling buses.  There is resentment that the US government killed the electric car (see the 2006 documentary “Who Killed the Electric Car?”, released by Sony Pictures Classics).

121.          There is resentment that the government promoted mass immigration, making it difficult to maintain electric trolley car systems.

122.          There is resentment that the government is selling the country’s land and infrastructure to foreigners.

123.          There is resentment that the government would have allowed an entity of the United Arab Emirates (Dubai Ports World) to operate our ports.

124.          There is resentment that the government allows and even promotes the sending of remittances by immigrants – legal and illegal – back to their home countries.

125.          There is resentment that the US government, led by the US president, does not protect the country from invasion, and instead aids and abets the occupation of our country by 12-20 million illegal-alien invaders.

126.          There is resentment that the government gives immunity from prosecution to an illegal-alien drug smuggler instead of to US Border Patrol agents, and sentences the Border Patrol agents to ten years in prison on the basis of the smuggler’s testimony.

127.          There is resentment that the government imprisons one percent of its adult population.  There is resentment that the government imprisons one in seven black men.

128.          There is resentment at having to compete with foreigners for jobs in our own country, at all levels.

129.          There is resentment at being prohibited from doing “profiling” (e.g., of rapists, drug dealers and terrorists) when this is a logically sound method for making decisions (Bayes’ Rule).
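Bayes’ Rule, as invoked above, can be computed directly.  The sketch below uses hypothetical numbers; it also shows the standard caveat that when the base rate of the targeted behavior is very low, most people flagged by even an accurate screen are not actually offenders:

```python
def posterior(prior, true_positive_rate, false_positive_rate):
    """P(target | flagged), by Bayes' Rule."""
    p_flag = true_positive_rate * prior + false_positive_rate * (1.0 - prior)
    return true_positive_rate * prior / p_flag

# Hypothetical: a trait present in 90% of offenders and 5% of the
# innocent population, with offenders making up 0.1% of everyone.
# The posterior is only about 1.8%; base rates dominate.
print(f"{posterior(0.001, 0.90, 0.05):.3f}")
```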

130.          There is resentment that the US government has allowed the Jewish culture (the “Israel Lobby,” or “Jewish Lobby”) to take over the US government, and to make the US a client state of Israel.  There is resentment that the US government allowed the Jewish Anti-Defamation League and Senator Edward Kennedy to pass the Immigration Act of 1965, and destroy US culture, environment, and quality of life for the US middle class (from overcrowding).  There is frustration that you cannot criticize mass immigration without being accused of being a bigot and a racist, just as you cannot criticize Israel without being accused of anti-Semitism.  (The word anti-Semitic is a strange word, in view of the fact that most Jews and most Arabs are Semites.)

131.          There is resentment that the US government lied to the US people about invading Iraq, and invaded on “trumped up” charges.  It wanted control of Iraqi oil, but would not admit this.  It claimed that Saddam Hussein was developing or harboring weapons of mass destruction.  This was proved false.  It then claimed that it was freeing the Iraqi people from a cruel dictator, but the quality of life for Iraqis is lower under American domination than under Hussein.  It claimed to be bringing democracy to Iraq, when their culture has never accepted this system.

132.          The US population no longer trusts the government.  It has seen too many lies, such as invading Iraq for the oil but pretending it was to remove weapons of mass destruction, remove Saddam Hussein, or plant democracy.  It is seeing through the hypocrisy, such as Hillary Clinton’s and Barack Obama’s calls for forced health insurance while calling it universal health care.  It is seeing through the empty promises of the political candidates who, as Plato asserted, will pander to the masses and promise them anything.  It is seeing that calls for change (e.g., replace Republicans with Democrats) are a complete waste of time, since the replacements stand for the same things (promotion of the wealthy elite) as the incumbents before them.

133.          There is resentment that the US government now gives US citizenship away in an annual lottery in which 50,000 permanent-resident visas are awarded randomly to people around the globe.

134.          There is resentment that each year the US government awards resident visas willy-nilly to a million people, who are then awarded citizenship after five years.

135.          There is resentment that it awards visas for refuge for trivial reasons, such as the “wet-foot/dry-foot” practice of awarding a visa to Haitians and Cubans who are able to set foot on US soil.

136.          There is resentment that it awards 65,000 work visas every year under the H-1B program, when all of these jobs could easily be filled from our population of 300 million people in the most technically advanced country in the world.

137.          There is resentment that birthright citizenship is awarded to any child born to anyone physically present in the US, even if they are here illegally.

138.          There is resentment that a Mexican whore can slip across the border into the US, have a child which is automatically granted US citizenship, and then herself become a US citizen because she is now the mother of a US citizen.

139.          There is resentment over “chain migration,” under which citizenship is awarded to extended family members for no other reason than they have a relative here (“family reunification”).

140.          Because of overcrowding, it is no longer possible for most families to engage directly in major spectator sporting events, such as baseball or football.  These sports are now available to the individual only via a television set.

141.          Because of the government’s policy of massive population growth and overcrowding (which generate much wealth), many people have very long commutes and must pay much of their income for commuting and housing.  There is no alternative.  All alternatives have been removed, and every person is forced to join the rat race, to work long hours just to get by.  Because of the government’s policy of high population increase from immigration, the quality of life has declined substantially over the past several decades.

142.          Man likes to build, to explore, to wage war, not to serve.  But the US government wants mainly service workers, and has shipped most manufacturing jobs overseas.

143.          Most food consumed in the US is industrial produce, laced with dangerous chemicals.  These chemicals have led to an epidemic of disease, including diabetes, allergies, cancer and obesity.  Because the country’s population now grossly exceeds the levels that can be supported by current solar energy, we are now dependent on this industrial food production, with its heavy use of chemicals such as pesticides and preservatives.  We have pickled our bodies, and are now paying for it.  The industrial system causes mass disease, and forces people to pay extreme prices for high-tech medicines and procedures to alleviate it.  Everything is industrialized – agriculture, health, education.  Cite Robbins, Hartmann, Time.

144.          Massive population increase from immigration has driven the price of many precious things out of reach of the common man.  Even land is now prohibitively expensive.  After twenty years of operation, a local Chinese restaurant closed its doors upon the retirement of the father who founded it.  The restaurant, on a quarter acre of land on a street that a few decades ago was a rural lane at the edge of town, sold for $780,000.  I assumed that this was the price for the business.  But I was wrong.  As soon as the deal was completed, the new owner of the land demolished the restaurant and erected a new building.  The quarter acre of land alone sold for $780,000.  Local land has been “commoditized” and put out of reach of most people.

145.          Natural areas near cities and towns are being destroyed by immigration.  South Carolina has more Mexican immigrants than New York, Pennsylvania and New Jersey.  At last count, there were 400,000 illegal aliens in South Carolina.  The woods where I used to hike as a boy – formerly called Camp Wadsworth (a World War I training camp) – have been totally replaced by housing developments.  Most of Camp Croft (a World War II training camp) has fallen to the same fate – and to invasion by Hmong.  My home state of South Carolina is being destroyed by the US government’s policy of mass immigration, open borders, and adamant refusal to enforce immigration laws.  When I lived in Florida in 1953, it was a beautiful rural state with a population of about two million.  Now, mainly because of mass immigration to the US, the population of Florida is 18.2 million (2007 estimate).  The quality of life is so bad that people who moved from the North to Florida are now moving back north.  But they do not want to move all the way back, so they are going “half-way” back, to South Carolina.  There are so many of them that they now have a name, “half-backs.”  The population of South Carolina has skyrocketed from two million when I lived here in the 1950s to about 4.5 million now.  (See the article “Spartanburg grows by 8.6 percent” (from 2000 to 2007), Spartanburg Herald-Journal, 20 March 2008.)

146.          There is resentment that children are no longer raised by their parents or families, but in industrial day-care centers. 

147.          Previously, in a less densely populated country, the “average Joe” had easy access to nature.  With the overcrowding that has resulted from mass immigration, this is no longer possible.  Helplessness and hopelessness prevail.  It is the entire planet that is self-destructing, not just the US.  The system just gets worse, generating economic activity and industrial waste and destruction at a faster and faster rate.  There are more people and less nature (species, forests, natural land) every year.  The sooner that this pernicious system is destroyed, the better it will be for the planet’s biosphere and the few human survivors of the age of global industrialization.

There is no way out for the middle class

148.          Organized religion, which was founded by government to serve government, traditionally offered spiritual support for people (as part of the program).  It has now essentially abandoned them: it offers little spiritual support, and functions almost exclusively to serve the government and the wealthy elite.

149.          Many laws are in direct opposition to the welfare of the individual, such as the income tax and the recent changes to the personal-bankruptcy laws.  Many government policies, such as mass immigration, massive international free trade and open borders are detrimental to the welfare and security of the US middle class.

150.          Many Americans feel alienated and marginalized.  The government, through its policy of massive international free trade, sent much of our manufacturing capacity overseas.  Through its policy of mass immigration, it brought in millions of people to overcrowd the country, caused natural places to become terminally developed and prohibitively expensive for many people, and forced Americans to compete for jobs and space in their own country.  The government is changing the US economy from a manufacturing economy to a service economy.  This policy creates dependence on massive international free trade, since the manufactured items that our society requires are to a large extent no longer made in the US.  Massive international free trade destroys our security.  Many workers, particularly men, are not well suited to service jobs – they prefer to grow things and to build things and to operate things, not to serve others.  To be happy, people need meaningful work.  The US government has robbed many US citizens of meaningful livelihoods, their natural resources and the quality of their lives.  Many US citizens are now very unhappy with their lives, and are seeing no meaningful future for their children.  The government has brought in foreigners to crowd them out of their own living space and culture.  The government has betrayed its own citizens.

151.          An individual can no longer survive on his own.  He is forced to live and to work within the system, since the land has become overcrowded.  This overcrowding causes much frustration and has substantially diminished personal freedom and quality of life.  The overcrowding has led to much violence and crime, so that millions of Americans are now imprisoned.  Because of massive international free trade, many of the jobs that society now offers are not meaningful or agreeable (one of the reasons often cited why we “need” illegal aliens – “to do the jobs Americans are unwilling to do”).  The only solution to the problem of overcrowding is a massive reduction in the population, which the government strives to avoid.  The only solution to the problem is therefore the destruction of the system that has spawned it and seeks to maintain it (and make it worse).  In its early days, America represented an escape route from overcrowded Europe.  Because of the government’s population policy, America is now crowded, low-cost land is no longer available in quantities sufficient to support an individual and his family, and there is nowhere else to go.

152.          In light of recent “midterm” (2006) US elections, where candidates promised change but no change occurred, Americans are beginning to realize that there is no way to cause change from within the system, but that the only way to accomplish meaningful change is to change the system itself.  The individual is helpless to change the system.  There is no way out.  There is no longer meaningful work available for a large proportion of the population.  While there are many jobs in a modern service-oriented society for women, men have been marginalized and emasculated.

153.          There are no longer any statesmen, any leaders, who represent the people.  Today’s leaders represent only the wealthy elite.  It takes on the order of $100 million to run for president.  The only candidates are extremely wealthy or must accept massive amounts of money from the system that generates this wealth.  They will serve it, not seek to change it, since it enabled their election.  No one represents the common man any longer.  The system will continue to destroy his quality of life as long as it continues.

154.          There is strong resentment over the government’s allowing mass immigration from alien cultures, thereby handing the country – its land, natural beauty, and environment – over to others and diminishing the quality of all of these.

155.          Massive international free trade, mass immigration and open borders are destroying the US middle class (US government trade policies are destructive of the US middle class, not protective or supportive of it).  The government now exists to serve the wealthy elite, not the people.  When times get rough, after the passage of Peak Oil, the people will rise against the government for two reasons: (1) the mentioned government policies will have worsened the problem; and (2) the government abandoned them to serve the wealthy elite, and committed treason against them by allowing a mass alien invasion.


Quality of Life Is Declining for the US Middle Class, and Will Decline Rapidly with the Passage of Peak Oil

157.          The wealthy elite are able to control the vastly larger non-wealthy population primarily through the fear that the latter might lose their material comforts.  As Peak Oil passes, the decline in quality of life for the masses will become noticeable, then severe, then profound.  At some point, the masses will realize that they have no hope for a better life, and that a miserable life for them and their children is inevitable.  At that point, the politics of envy will “kick in,” and the masses will turn against the wealthy elite.  As Plato observed, democracy cannot last because the leaders have to pander to the base desires of the masses.  When the point is reached that they can no longer pander (i.e., deliver on their promises, or maintain the status-quo quality of life), the masses will turn against them, since they have nothing to lose materially and they have something to gain emotionally (i.e., the destruction of the wealthy elite, especially those who flaunt wealth, per the politics of envy).  Cf. Luddites.

158.          The American Dream is fading fast.  Fifty years ago, families could aspire to a high quality of life, with home ownership and a mother at home with the children.  Then, a family could be supported by a single person’s income.  Now, both parents must work in the competitive labor market, children are in day care their entire lives, commute times are unbearable, the cost of housing is so high that many cannot hope to own their homes upon retirement, the US environment has been substantially damaged and diminished by mass immigration, the high income of the middle class has been destroyed by decades of massive international free trade, and parents of today now realize that the quality of life will be very poor for their children and there is nothing that anyone can do about it (since Peak Oil is occurring, our energy source is evaporating, and there is nothing comparable to replace it).  Free time is in short supply.  Once hope is gone, the end is near.

159.          Man was born to work, to build, to create.  Basic human needs include close social bonds of family and community.  A sense of purpose and destiny is essential.  To be satisfied, man has to be working toward a desired and meaningful goal.  In today’s United States, these things are in short supply.  As the petroleum age draws to a close, global industrialization will come to an end.  The planet’s biosphere is being destroyed (species loss, global warming, deforestation) by large human numbers and industrialization.  People see that the current way of life is coming to an end, very likely for them and surely for their children.  There is no longer any point to building anything, since it will soon all be gone.

160.          Divorce rates are high, and the nuclear family has been decimated.  Most young couples cannot now aspire to owning their own home, or to having the mother stay at home with the children.  They can see that the quality of life is declining rapidly, and that the future is bleak.  Hopelessness and despair are increasing.  Drug addiction is increasing.  Social problems are increasing.  Imprisonment is increasing.  There has been a death of the spirit.  The cultural glue that held the fabric of the country together has dissolved.

161.          Natural products and nature-based processes are mostly gone.  Wood, leather, cotton, wool, silk have been largely replaced by plastics and metal.  Low-technology farming, fishing and hunting have been replaced by industrial farming and fishing, under obscenely cruel conditions (e.g., battery production of poultry, pig farms, beef / veal).  The system is cruel, wasteful and destructive in the extreme.  Natural lifestyles, such as family farming, have been destroyed.  It is no longer possible to feed our bloated population with natural, solar-based agriculture or naturally benign hunter-gatherer systems.  Because of overcrowding, the US government is promoting the replacement of incandescent lamps with fluorescent lamps (as has been done in Australia).  The “strobe” effect (intermittent flashing on and off) of fluorescent lamps is unpleasant and believed to be unhealthy (linked to eye problems and possibly to epilepsy).

162.          Stability is gone.  Rapid change is now the nature of things.  Growth is explosive.  The problem with exponential growth (a percentage increase in everything every year) is that explosions do not last very long.
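The arithmetic behind this observation can be made explicit with the standard doubling-time approximation (the “rule of 70”), which is not in the original text and is added here only as an illustration: a quantity growing at a fixed annual percentage rate doubles in a fixed number of years, so the growth compounds far faster than intuition suggests.

```latex
% Doubling time T_2 for a quantity growing at annual rate r (as a fraction):
T_2 = \frac{\ln 2}{\ln(1+r)} \approx \frac{70}{100\,r}\ \text{years (rule of 70)}
% Example: at r = 0.035 (3.5% per year), T_2 \approx 20 years,
% so in 200 years the quantity grows by a factor of about 2^{10} = 1024.
```

On this arithmetic, even a seemingly modest growth rate cannot be sustained for long, which is the sense in which “explosions do not last very long.”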

163.          Civility is gone from US society.

164.          The drug problem.  The problem is largely a creation of the government, whose policies have led to much crime and violence, the growth of a massive drug underworld, the destruction of some countries, and the criminalization and imprisonment of many citizens.  The US government puts criminals in charge of running local drug programs.  It substitutes a legal drug (methadone) for an illegal one (heroin), to serve the wealthy elite (drug program owners, pharmaceutical manufacturers).  The government has no will to change this system, since it generates much wealth for the wealthy elite.  All that matters is the money.  See “Saving Cities and Souls” in the March 17, 2008, issue of Time magazine.

165.          The America we knew a few generations ago is gone forever.  There can be no “graceful” return to a low population and a high quality of life.  The present America is not human-friendly or eco-friendly, and, based on perpetual growth and waste production, is doomed.  It is obscenely inefficient, requiring the expenditure of massive amounts of energy to produce small amounts of food.  It is not stable, and the ideas and ideals on which it was founded (pursuit of happiness; government of, by and for the people) have been abandoned.  It will quickly perish, because it is an empty shell, whose spirit has left it.  Originally, the country’s reality matched its ideal; the two were synchronized.  The original spirit and the reality of today no longer match.

166.          Our food is poisoned with chemicals that are destroying our health.

167.          Our medical system has been designed by the government to transfer a massive portion of people’s incomes to the medical establishment.  The US government has caused much of the country’s health problem, and created a parasitic medical system to feed off it.  The US medical system is the major drug of today.  The government’s policy is to promote the prolonging of life at any cost, regardless of quality.  The medical establishment deliberately attempts to inflate costs (e.g., for prevention of stroke it promotes Plavix at $5.00 per pill instead of aspirin, which is almost as effective, has far fewer undesirable side effects, and costs a penny a pill).  (Dr. Jack Kevorkian was just released from prison, after serving eight years for assisted suicide.)  Much of the cost of medical care is covered by tax dollars (total cost, including research and government medical programs).

Increasing Income Gap

168.          The exploding income gap.  When the country was founded, most people were poor, with a few wealthy landowners.  Wealth was in land, not in manufacturing assets or financial assets.  Wealthy people lived lives very similar to poor people.  Now, a small proportion of the population has fabulous wealth, and a small but noticeable proportion has great wealth.  As that wealth mushrooms and the quality of life starts to decline for most Americans, the resentment between the poor and the rich will intensify.  At some point, when it is clear that all that lies ahead is increasing poverty, the poor will turn on the rich in revolution, since they have nothing to lose materially, and emotionally they have the satisfaction of bringing down the rich.

169.          Conspicuous consumption and the flaunting of extreme wealth.  A small proportion of the population is extremely wealthy, and flaunts this wealth.  The media promote this spectacle.  A very large proportion of the population is barely able to make ends meet, and their prospects for improving are nil.  This situation is very unstable. The politics of envy will cause the burdened masses to turn on the wealthy.

170.          The relative position of the middle class to the wealthy is falling.  Kevin Danaher’s book 10 Reasons to Abolish the IMF & World Bank includes a graph (p. 31) showing the ratio of the pay of top corporate executives to average worker pay.  The ratio was 42 in 1980, and it had risen to 475 by 1999.

Political Incompetence and Corruption

171.          Tyranny of the minority.  As the country’s culture fragmented due to the mass immigration from alien cultures spurred by the Immigration Act of 1965, the US population shifted from a population that was 90 percent white (Northern European) and 10 percent black (Negro, African) to a population that is now about 60 percent white and 40 percent other (many others, not just African).  As current immigration trends continue, the proportion white will soon be less than 50 percent.  There is no longer a dominant American culture and before long there will not even be a majority traditional American culture.  With the fragmentation of the country, it is now often the case that neither major political party can win an election without the support of a large number of minority groups.  This situation, in which the major parties must pander to minority groups, is referred to as tyranny of the minority.  If either major party offends any of a large number of minority groups, it will lose the election.  Any candidate hoping to win must pander to all minority groups.  Under the current two-party system (and unlike a parliamentary system), many people are denied a voice in government – the situation intensifies as the cultural homogeneity of the country is destroyed by mass immigration.  The country is seeing the rise of anarchy.  Our governmental system has been paralyzed.  A democratic system functioned well only when the country had a homogeneous population, when both sides were in reality simply slightly different segments of the same ethnic group, and it did not matter very much which party was in power.  The phenomenon of tyranny of the minority will generate much frustration and anger in the population, as it becomes clear that the system of government cannot function, and no longer represents the interests of the culture that founded the country (government of, by and for the people).  
CNN’s Jack Cafferty once asked (August 2008) why no one votes for third-party candidates, when 80 percent of the people hate the Democrats and Republicans.  The common belief is that no third-party candidate can win.  Utter hopelessness: the only choice is between the two major parties, which both represent the same system.

172.          Because of US government fiscal and monetary policies, inflation is very high and the US debt has skyrocketed.  Rather than address this issue to protect the middle class, the US government chooses to allow assets such as US firms and US infrastructure and US land to be sold to foreigners (e.g., sovereign wealth funds).

173.          Lies, deception, and treason of leaders.

174.          Government no longer obeys its own laws.  Is above the people.  Serves only the wealthy.  Arrogance.  Hubris.

175.          Rise of Communist China and India (massive increase in use of oil, in their attempt to substantially raise the standard of living of their very large and very poor populations).

176.          The US government has no strategy for winning asymmetric warfare.  Terrorists can easily outspend us.  The war in Iraq is bankrupting us. The government continues to wage a no-win war in Iraq to keep the US middle class in bondage.

177.          The case of Israel/Palestine – how to take over a country.  Britain and US in North America.  Communist China in Tibet.

Miscellaneous Technical Reasons

178.          Fates of societies.  Jared Diamond’s Collapse, Tainter’s Collapse of Complex Societies

179.          History: Toynbee, Spengler

180.          Carrying capacity: Catton’s Overshoot; Jay Hanson’s Die Off website.

181.          Ponerology

182.          Economics

183.          Klare’s Resource Wars, Homer-Dixon’s Environment, Scarcity and Violence

184.          Dynamic systems, chaos theory, catastrophe theory, Gladwell’s Tipping Point

185.          Debt-based money and compound interest.


Appendix B. Excerpts from Patrick Buchanan’s Day of Reckoning (Thomas Dunne Books / St. Martin’s Press, 2007)

[In taking issue with Queen Elizabeth’s “revisionist-history” remarks made in 2007 at the 400th anniversary of the founding of Jamestown.]  For the Jamestown settlers were not Western Europeans but English Christians….  A great nation did indeed arise from Jamestown, but, intending no disrespect to Her Majesty, democracy and equality had nothing to do with it.  The House of Burgesses, formed in 1619, was restricted to white males, men of property.  The American Revolution was not fought for equality, but to be rid of British rule.  Four of the first five presidents – Washington, Jefferson, Madison and Monroe – were Virginia slaveholders.  Exactly two and a half centuries after Jamestown, in 1857, came Chief Justice Roger B. Taney’s Dred Scott decision declaring that slaves were not Americans and that none of them had any of the rights of American citizens.  Few Americans then, certainly not Abe Lincoln, believed in social or political equality.

If Jamestown and Virginia were not about democracy, equality and diversity for the 350 years between 1607 and 1957, who invented this myth that America was always about democracy, equality, and diversity?  And what was their motive?

The point here is unpleasant to modernity but critical to recognize: The United States, the greatest republic since Rome, and the British Empire, the greatest empire since Rome, may be said to have arisen from that three-cornered fort the Jamestown settlers began to build the day they arrived. But that republic and that empire did not rise because the settlers and those who followed believed in diversity, equality, and democracy, but because they rejected diversity, equality, and democracy. The English, the Virginians, the Americans were all "us-or-them" people.

They believed in the superiority of their Christian faith and English culture and civilization. And they transplanted that unique faith, culture, and civilization to America's fertile soil. Other faiths, cultures, and civilizations – like the ones the Indians had here, or the Africans brought, or the French had planted in Quebec, or the Spanish in Mexico – they rejected and resisted with cannon, musket, and sword. This was our land, not anybody else's.

But today America and Britain have embraced ideas about the innate equality of all cultures, civilizations, languages, and faiths, and about the mixing of all tribes, races, and peoples, that are not only ahistorical, they are suicidal for America and the West. For all over the world, rising faiths like Islam, rising like the indigenous peoples' movement rolling out of Latin America to Los Angeles, rising powers like China reaching for Asian and world hegemony – ignore the kumbaya we preach, and look to what our fathers practiced when they conquered the world.

What the queen said at Jamestown 2007 was that we are not the same people we were in 1957. She is right. For we now reject as repellent and ethnocentric the idea that the British who founded our republic and created the British Empire were not only unique but superior to other peoples and civilizations. And to show the world how resolutely we reject those old ideas, we threw open our borders in the last forty years to peoples of all creeds, cultures, countries, and civilizations, inviting them to come and convert the old America into the most multicultural, multilingual, multiethnic, multiracial nation in history – “The First Universal Nation” of Ben Wattenberg’s warblings. But if the Jamestown settlers had believed in equality and diversity and had shared their fort with the Indians, the settlers would never have been heard from again.

No matter the lies we tell ourselves and teach our children, no great republic or empire – not Persia, Rome, Islam, Spain, France, Britain, Russia, China, the United States – ever arose because it embraced democracy, diversity, and equality. None. The real question is not whether the values the queen celebrated at Jamestown created America – they had nothing to do with it – but whether America can survive having embraced them. In his farewell address, President Reagan warned, "We've got to teach history based not on what's in fashion but what's important.... If we forget what we did, we won't know who we are. I'm warning of an eradication ... of the American memory that could result, ultimately, in an erosion of the American spirit."

Reagan's fear on leaving office, that forgetting the great things we have done in the past could lead to an erosion of the national spirit, was echoed by the incoming president of France, Nicolas Sarkozy, who said in May 2007: "I'm going to make the French proud of France again. I am going to bring an end to repentance, which is a form of self-hatred."

If France was ever to be great again, Sarkozy was saying, France must cease to grovel and apologize for sins committed in the days when she was great. And it is true of us. The truth about Jamestown, Her Majesty's syrupy recital of history notwithstanding, is that a great and brave people with a superior faith, culture, and civilization conquered this continent and created something historic and wonderful. Others did not do it; others could not have done it. And if we lose that unique culture and civilization, we will cease to be what we were – a great people and a great nation.

[End of Buchanan’s comments on the Queen’s remark.]

To understand what must be done to preserve our country, consider how a smaller country fights to preserve its national identity. How does Israel resist the centrifugal forces of multiculturalism? It is a nation 20 percent Arab and Druze, with the Jewish population tracing its roots to every continent and most especially the West and the old Ottoman world. How does Israel meet the threat of deconstruction? How does Israel prevent herself from being inundated by the mass migrations of modernity?

Israel fights ferociously to preserve her religious and ethnic identity. Immigration is restricted to those who are Jewish by birth or faith. While Jews from all over the world are urged to settle in Israel or on the West Bank, no Palestinian is permitted to return to the home of his father or grandfather. The rights of land ownership extended to Jews are not extended to non-Jews. Jewish history is taught in the schools. The Hebrew language has been revived. A Jewish currency, the shekel, has been created. There is talk of annexing all major Jewish settlements on the West Bank, and, in exchange, giving up Israeli land contiguous to the Bank where Arabs reside. Many Israelis say openly that while they wish to keep their Jewish population they would let the Arabs go. They seek an overwhelming Jewish majority in a Jewish state. Israelis understand it is not ideology that makes a nation. It is not democracy. Jews are a people.  And Israel is unapologetic about preserving its ethnic and religious character.

Millions of Americans feel the same way. Yet were an American to propose an immigration policy to keep the United States predominantly Christian and European – the rationale behind the Immigration Act of 1924 – he or she would be denounced as a racist, a xenophobe, and un-American.

The normal and natural instincts of our people have been demonized and the nation cowed by its ruling class – until the uprising of 2007 and the crushing defeat of an establishment that tried to ram a massive amnesty of illegal aliens and their corporate collaborators down America's throat.

To halt the Third World invasion we need moral fortitude and political will. And if we do not find these virtues soon in our leaders, we will lose the country. For what is happening on our Mexican border is a graver threat to our survival than anything happening in Iraq. If we do not wish to become that "tangle of squabbling nationalities" of which TR [Theodore Roosevelt] warned, not really a nation at all, ten tough but simple steps are necessary.

1. No amnesty for the 12 to 20 million illegal aliens.

2. A security fence from San Diego to Brownsville.

3. Rigorous enforcement of immigration laws against employers.

4. A federal requirement that all employers verify the identity and Social Security number of all workers, through a toll-free call.

5. A cutoff of all federal and state benefits, except emergency, to those who cannot prove they are in the country legally.

6. Justice Department support for states like California and Arizona and towns like Hazleton, Pennsylvania, and Farmers Branch, Texas, that seek to help enforce immigration laws by punishing landlords and businesses who flout federal laws by renting to or hiring illegal aliens.

7. A congressional declaration that children born to illegal aliens are not automatic citizens. The Fourteenth Amendment never intended that they be so.

8. An end to "chain migration" by telling legal immigrants that while they may bring wives and minor children with them, adult children, siblings, and parents must get in line like everyone else.

9. Declare English the official language of the United States, and strip the Supreme Court of any right of review of the law.

10. A time-out on legal immigration, such as the one from 1924 to 1965, and the annual admission of only the number urged by John Kennedy when he endorsed reform, fifty years ago: 150,000 to 250,000. For it is among the tens of millions of legal immigrants that the illegals find sanctuary.

If these measures are enacted, the invasion can be halted, millions of illegals will go home quietly as they did in 1954, and the nation can begin to assimilate and Americanize the scores of millions who have come legally in the last thirty-five years. Within ten years our national nightmare will be over.

Without a program like this, we lose America, for we are reaching a point of no return. According to the Census Bureau, if immigration continues at present rates, the U.S. population in 2060 will be nearing half a billion people. To today's 301 million will have been added 167 million, to reach 468 million living in the USA. The increase alone will equal the entire population of the United States when Kennedy took the oath. Of that 167 million, 105 million will be immigrants, almost all of them from the Third World, which will be like having the entire population of Mexico today move into the United States.

How America's unique character and national identity can survive this invasion, unprecedented in history, is impossible to see. Americans never voted for this invasion, never wanted it. Yet it is being done to us. Why?

…America is indeed coming apart, decomposing, and … the likelihood of her survival as one nation through mid-century is improbable – and impossible if America continues on her current course.  For we are on a path to national suicide.

Why did America not secure her borders, enforce her laws, repel the invasion, expel the intruders?  Because our leaders are terrified of charges of racism and lack moral courage, and because the United States has ceased to be a democratic republic.  The will of the majority is no longer reflected in public policy.  State and local referenda to deal with the illegal alien crisis are routinely invalidated by federal judges, as immigration laws go unenforced by federal officials.


Appendix C. Excerpts from Michael Neumann’s The Case Against Israel (CounterPunch and AK Press, 2005)

In his book, The Case Against Israel (2005), Michael Neumann discusses the right of a “people” to a homeland, or to self-determination.  Here follows an excerpt:

Zionism is, in important respects, far from unique, and its defenders use this. Why, they say, shouldn't the Jews have a right to a homeland? Doesn't every people have this right?

Self-Determination

The question cuts both ways. If no people have this right, neither do the Jews. (They might have a right to Palestine for some other reason, but not this one.) I will argue that, in the sense relevant to the Israel/Palestine conflict, no people have this right. Each individual person certainly has a right to live somewhere, but an ethnic group has no right to live somewhere together. From a narrow point of view, this should not be very controversial: no one seems to think that all Italians, all over the world, have a right to live in one huge nation, excluding all others. If they all want to go back to Italy, fine, but they certainly cannot expel others to make room in their "homeland," or expand that homeland to fulfill the project of reversing their Diaspora. And if all Italians were to travel to some distant planet, they might have a right to it. But that right would rest on the fact that they got there first, not on some right to all live in the same place, just because they were all Italian. If twenty Italians got there first, they might own the whole planet and could determine who lived there. They would not have to admit all Italians on the basis of some right of all Italians to live together in one place.

But these objections take the right of a people to its homeland too literally. What is really meant, when such a right is proposed, is that peoples have a right to determine their own future, their own affairs. This alleged right was given international standing at the Versailles peace conference in 1919. Woodrow Wilson, speaking in support of this agreement, referred to

“the sacredness of the right of self determination, the sacredness of the right of any body of people to say that they would not continue to live under the Government they were then living under, and under article eleven of the Covenant they are given a place to say whether they will live under it or not.”

and in his famous Fourteen Points he had this:

“An evident principle runs through the whole program I have outlined. It is the principle of justice to all peoples and nationalities, and their right to live on equal terms of liberty and safety with one another, whether they be strong or weak.”

Though his statements are less than entirely explicit, it is generally supposed that [quoting Michael Hirst] "Woodrow Wilson made self-determination an inalienable right for disenfranchised peoples around the world." The UN Charter states that one of the purposes of the United Nations is "To develop friendly relations among nations based on respect for the principle of equal rights and self-determination of peoples."

But neither international approval nor the United Nations' Charter is sufficient to bring rights into existence: If the UN said that people had a right to eat their children, would that make it so? There is no right of self-determination of peoples. The whole idea is a bad one.

Before leaving the general area of self-determination, it is necessary to look at a question that continually surfaces in the Israel/Palestine conflict – to what end, I am unsure. It is often said that the Palestinians were not a people. The most notorious example of this vague ploy is a 1969 remark by the then Prime Minister of Israel, Golda Meir:

It was not as though there was a Palestinian people and we came and threw them out and took their country away from them. They did not exist.


Appendix D. Excerpts from Ilan Pappe’s The Ethnic Cleansing of Palestine (Oneworld Publications, 2006)

In this building, on a cold Wednesday afternoon, 10 March 1948, a group of eleven men, veteran Zionist leaders together with young military Jewish officers, put the final touches to a plan for the ethnic cleansing of Palestine. That same evening, military orders were dispatched to the units on the ground to prepare for the systematic expulsion of the Palestinians from vast areas of the country.  The orders came with a detailed description of the methods to be employed to forcibly evict the people: large-scale intimidation; laying siege to and bombarding villages and population centres; setting fire to homes, properties and goods; expulsion; demolition; and, finally, planting mines among the rubble to prevent any of the expelled inhabitants from returning. Each unit was issued with its own list of villages and neighbourhoods as the targets of this master plan. Codenamed Plan D (Dalet in Hebrew), this was the fourth and final version of less substantial plans that outlined the fate the Zionists had in store for Palestine and consequently for its native population. The previous three schemes had articulated only obscurely how the Zionist leadership contemplated dealing with the presence of so many Palestinians living in the land the Jewish national movement coveted as its own. This fourth and last blueprint spelled it out clearly and unambiguously: the Palestinians had to go. In the words of one of the first historians to note the significance of that plan, Simcha Flapan, 'The military campaign against the Arabs, including the "conquest and destruction of the rural areas" was set forth in the Hagana's Plan Dalet'. The aim of the plan was in fact the destruction of both the rural and urban areas of Palestine.

As the first chapters of this book will attempt to show, this plan was both the inevitable product of the Zionist ideological impulse to have an exclusively Jewish presence in Palestine, and a response to developments on the ground once the British cabinet had decided to end the mandate. Clashes with local Palestinian militias provided the perfect context and pretext for implementing the ideological vision of an ethnically cleansed Palestine. The Zionist policy was first based on retaliation against Palestinian attacks in February 1947, and it transformed into an initiative to ethnically cleanse the country as a whole in March 1948.

Once the decision was taken, it took six months to complete the mission. When it was over, more than half of Palestine's native population, close to 800,000 people, had been uprooted, 531 villages had been destroyed, and eleven urban neighbourhoods emptied of their inhabitants. The plan decided upon on 10 March 1948, and above all its systematic implementation in the following months, was a clear-cut case of an ethnic cleansing operation, regarded under international law today as a crime against humanity.

After the Holocaust, it has become almost impossible to conceal large-scale crimes against humanity. Our modern communication-driven world, especially since the upsurge of electronic media, no longer allows human-made catastrophes to remain hidden from the public eye or to be denied. And yet, one such crime has been erased almost totally from the global public memory: the dispossession of the Palestinians in 1948 by Israel. This, the most formative event in the modern history of the land of Palestine, has ever since been systematically denied, and is still today not recognised as an historical fact, let alone acknowledged as a crime that needs to be confronted politically as well as morally….

One might suggest that the history already exposed should have been enough to raise troubling questions. Yet, the 'new history' narrative and recent Palestinian historiographical inputs somehow failed to enter the public realm of moral conscience and action. In this book, I want to explore both the mechanism of the 1948 ethnic cleansing, and the cognitive system that allowed the world to forget, and enabled the perpetrators to deny, the crime the Zionist movement committed against the Palestinian people in 1948.

In other words, I want to make the case for the paradigm of ethnic cleansing and use it to replace the paradigm of war as the basis for the scholarly research of, and the public debate about, 1948. I have no doubt that the absence so far of the paradigm of ethnic cleansing is part of the reason why the denial of the catastrophe has been able to go on for so long. When it created its nation-state, the Zionist movement did not wage a war that 'tragically but inevitably' led to the expulsion of 'parts of' the indigenous population, but the other way round: the main goal was the ethnic cleansing of all of Palestine, which the movement coveted for its new state. A few weeks after the ethnic cleansing operations began, the neighbouring Arab states sent a small army – small in comparison to their overall military might – to try, in vain, to prevent the ethnic cleansing. The war with the regular Arab armies did not bring the ethnic cleansing operations to a halt until their successful completion in the autumn of 1948….

Though as yet without a state, Ben-Gurion already now functioned as defence minister and as a prime minister of sorts (given his authority to pass resolutions within a government). In many aspects he shared responsibility, and most issues on the agenda of the Jewish community were discussed in a democratic way within institutions that represented the composition of the major political groups among the Jews in Palestine. But as the time came nearer when crucial decisions needed to be made with regards to the fate of the Palestinians, Ben-Gurion began to ignore the official structure and started relying on more clandestine formations.

The major topic on the Zionist agenda in 1946 and 1947, the struggle against the British, resolved itself with the British decision, in February 1947, to quit Palestine and to transfer the Palestine question to the UN. In fact, the British had little choice: after the Holocaust they would never be able to deal with the looming Jewish rebellion as they had with the Arab one in the 1930s and, as the Labour party made up its mind to leave India, Palestine lost much of its attraction. A particularly cold winter in 1947 drove the message home to London that the Empire was on its way to become a second-rate power, its global influence dwarfed by the two new super-powers and its economy crippled by a capitalist system that caused Sterling to drop precipitously. Rather than hold on to remote places such as Palestine, the Labour party saw as its priority the building of a welfare state at home. In the end, Britain left in a hurry and with no regrets.

Ben-Gurion had already realised by the end of 1946 that the British were on their way out, and with his aides began working on a general strategy that could be implemented against the Palestinian population the moment the British were gone. This strategy became Plan C, or Gimel in Hebrew.

Plan C was a revised version of two earlier plans, A and B. Plan A was also named the 'Elimelech plan', after Elimelech Avnir, the Hagana commander in Tel-Aviv who in 1937, at Ben-Gurion's request, had already set out possible guidelines for the takeover of Palestine in the event of a British withdrawal. Plan B had been devised in 1946 and both plans were now fused into one to form Plan C.

Like Plans A and B, Plan C aimed to prepare the military forces of the Jewish community in Palestine for the offensive campaigns they would be engaged in against rural and urban Palestine the moment the British were gone. The purpose of such actions would be to 'deter' the Palestinian population from attacking Jewish settlements, and to retaliate for assaults on Jewish houses, roads and traffic. Plan C spelled out clearly what punitive actions of this kind would entail:

Killing the Palestinian political leadership.

Killing Palestinian inciters and their financial supporters.

Killing Palestinians who acted against Jews.

Killing senior Palestinian officers and officials [in the Mandatory system].

Damaging Palestinian transportation.

Damaging the sources of Palestinian livelihoods: water wells, mills, etc.

Attacking nearby Palestinian villages likely to assist in future attacks.

Attacking Palestinian clubs, coffee houses, meeting places, etc.

Plan C added that all data required for the performance of these actions could be found in the village files: lists of leaders, activists, 'potential human targets', the precise layout of villages, and so on.

However, within a few months, yet another plan was drawn up: Plan D (Dalet). It was this plan that sealed the fate of the Palestinians within the territory the Zionist leaders had set their eyes on for their future Jewish State. Indifferent as to whether these Palestinians might decide to collaborate with or oppose their Jewish State, Plan Dalet called for their systematic and total expulsion from their homeland….

The Arab decision as to how much to intervene and assist was directly affected by developments on the ground. And on the ground they watched – politicians with growing dismay, intellectuals and journalists with horror – the beginning of a depopulation process unfolding in front of their eyes.  They had enough representatives in the area to be fully aware of the intent and scope of the Jewish operations. Few of them were in any doubt at that early stage, in the beginning of 1948, of the potential disaster awaiting the Palestinian people. But they procrastinated, and postponed, for as long as they could, the inevitable military intervention, and then were only too happy to terminate it sooner rather than later: they knew full well not only that the Palestinians were defeated, but also that their armies stood no chance against the superior Jewish forces. In fact, they sent troops into a war they knew they had little or no chance of winning.


Appendix E. Excerpt from David Livingstone’s Terrorism and the Illuminati (BookSurge / Amazon.com, 2007)

Arthur Koestler, in The Thirteenth Tribe, popularized the theory that the majority of European Ashkenazi Jews are in fact not descended from the ancient inhabitants of Israel, but from Khazarian converts to Judaism. The term “Ashkenaz” describes a relatively compact area of Jewish settlement in northwestern Europe, including northeastern France and northern Germany, where Jewish settlement is documented dating back to at least the sixth century AD. The traditional explanation of East European Jewish origins was that most Ashkenazi Jews reached Poland and Russia from Germany, and Germany from France.

Modern genetic studies, however, have proven Koestler's theory incorrect. Studies of mitochondrial DNA have demonstrated that Ashkenazi Jewish communities in Europe were composed mostly through intermarriage of men with women of European descent. The reason is that the Radhanites, Persian Jewish merchants, had migrated to Poland, Germany and France since the fifth century AD, where they mostly married into those communities for hundreds of years. The Proceedings of the National Academy of Sciences report appears to bear out that Ashkenazi Jews must have arrived in Eastern Europe, not from the west and southwest, but from the south and east, that is, via northern Italy and the Balkans, Asia Minor and the Greek Byzantine empire, the Volga kingdom of the Khazars, or a combination of all three.

The non-Israelite haplogroups found in Ashkenazi samples include Q, which is typically Central Asian, and R1a1, which is typically Eastern. Q is considered by researcher Doron Behar to constitute a minor founding lineage among Jewish populations. Approximately five to ten percent of Ashkenazi Jews today are in this haplogroup, which originated in Central Asia. It is an extremely rare haplogroup in both Europe and the Middle East, found only in Scandinavia and the few countries that Khazars were known to have migrated to, like Poland, Hungary and Lithuania.

It has also been found that about half of Ashkenazi Levites possess Eastern European non-Israelite haplotypes belonging to the R1a1 haplogroup, which is typically Eastern European. The Levites are particularly interesting, because, among them, it is the Cohens, or Kohanim, for whom the office of priest has traditionally been reserved. Levitical status is generally determined by oral tradition, passed from father to son, with children being Levites if their father and grandfather was [sic]. Until the eighteenth century in Europe, many Cohens could accurately trace their lineage back to a verifiable Kohanim such as Ezra. Today, families may verify their priestly lineage via the tombstones of deceased ancestors, which bear the universal symbol of the hands arranged for the Priestly Blessing. This is the hand gesture popularized as Spock's Vulcan salute in Star Trek. Some scholars maintain, however, that because of the destruction of Jerusalem's temple and the unavailability of lineage records, there is now no way to establish who is a Levite reliably.

Levites in Orthodox Judaism continue to have additional rights and obligations compared to lay people, although these responsibilities have diminished with the destruction of the Temple. Orthodox Judaism maintains a belief in and hope for a restoration of a Third Temple in Jerusalem, and Kohanim are regarded as retaining their original sanctity, and some elements of their original roles and responsibilities, and having a status of waiting in readiness for future service in a restored Temple. Some Orthodox Jews have founded schools to train priests and Levites in their respective roles.

The R1a1 haplogroup is almost never found among Sephardic Levites, and may have been introduced into the Ashkenazi Levite lines by Slavs, or Khazars who converted to Judaism. R1a1, rather, is found all over Armenia, Georgia, and Eastern Europe in general, including the Sorbs, the Poles, and many people of central Europe. It is also found in Finland, and many R1a1 people went west to Scotland and Scandinavia. Interestingly, the R1a1 was introduced only 900-1000 years ago into only the Ashkenazi Levite male population.

The irony, of course, is that the R1a1 Kurgans, who are the founders of this haplogroup, are considered the epitome of Indo-Europeanism. The homeland of the Indo-Europeans is the steppes north of the Black Sea, right where the Khazarian Empire was located. But the problem is that not only were Khazars most likely significantly R1a1 in their ancestry, but most Eastern Europeans are also R1a1.

The finding raises the question of how the signature became so widespread among the Levites. The foreign genetic signature found among Levites occurs on the male or Y chromosome and comes from a few men, or perhaps a single ancestor, who lived about 1,000 years ago, just as the Ashkenazim were beginning to be established in Europe. It has been proposed that the ancestor who introduced it into the Ashkenazi Levites could perhaps have been a Khazar.

Ultimately, it was through the infiltration of Armenian Jews that the double-headed eagle of the Mamikonians became the heraldic symbol of the Khazars. The striking or rising eagle, Togrul or Togarmah, meaning "the powerful eagle", represents for the Khazars the messenger and mediator of Tängri, meaning "The Lord-God-The Sun". It has also represented, for more than three thousand years, the sacred royal imperial power, in Hebrew Malchut Ha-Shmayim, and is the heraldic symbol of the two merged royal clans, in Hebrew Ha-Shechina, and Turkic Ashina. Thus it is the very emblem of any Khagan, meaning "King of Kings, Emperor", of the Khazars.


Appendix F. Excerpts from Noam Chomsky’s The Prosperous Few and the Restless Many (Odonian Press, 1993)

“The new global economy

“[David Barsamian, interviewer: I was on Brattle Street (in Cambridge) just last night. There were panhandlers, people asking for money, people sleeping in the doorways of buildings. This morning, in the subway station at Harvard Square, there was more of the same.  The spectre of poverty and despair has become increasingly obvious to the middle and upper class. You just can't avoid it as you could years ago, when it was limited to a certain section of town. This has a lot to do with the pauperization (the internal Third Worldization, I think you call it) of the United States.]  There are several factors involved. About twenty years ago there was a big change in the world order, partly symbolized by Richard Nixon's dismantling of the postwar economic system. He recognized that US dominance of the global system had declined, and that in the new ‘tripolar’ world order (with Japan and German-based Europe playing a larger role), the US could no longer serve in effect as the world's banker.

“That led to a lot more pressure on corporate profits in the US and, consequently, to a big attack on social welfare gains. The crumbs that were permitted to ordinary people had to be taken away. Everything had to go to the rich.

“There was also a tremendous expansion of unregulated capital in the world. In 1971, Nixon dismantled the Bretton Woods system, thereby deregulating currencies. That, and a number of other changes, tremendously expanded the amount of unregulated capital in the world, and accelerated what's called the globalization (or the internationalization) of the economy.

“That's a fancy way of saying that you export jobs to high-repression, low-wage areas – which undercuts the opportunities for productive labor at home. It's a way of increasing corporate profits, of course. And it's much easier to do with a free flow of capital, advances in telecommunications, etc.

“There are two important consequences of globalization. First, it extends the Third World model to industrial countries. In the Third World, there's a two-tiered society – a sector of extreme wealth and privilege, and a sector of huge misery and despair among useless, superfluous people.

"That division is deepened by the policies dictated by the West. It imposes a neoliberal ‘free market’ system that directs resources to the wealthy and to foreign investors, with the idea that something will trickle down by magic, some time after the Messiah comes.

“You can see this happening everywhere in the industrial world, but most strikingly in the three English-speaking countries. In the 1980s, England under Thatcher, the United States under the Reaganites and Australia under a Labor government adopted some of the doctrines they preached for the Third World.

“Of course, they would never really play this game completely. It would be too harmful to the rich. But they flirted with it. And they suffered. That is, the general population suffered.

“Take, for example, South Central Los Angeles. It had factories once. They moved to Eastern Europe, Mexico, Indonesia – where you can get peasant women flocking off the land. But the rich did fine, just like they do in the Third World.

“The second consequence, which is also important, has to do with governing structures. Throughout history, the structures of government have tended to coalesce around other forms of power – in modern times, primarily around economic power. So, when you have national economies, you get national states. We now have an international economy and we're moving towards an international state – which means, finally, an international executive.

“To quote the business press, we're creating ‘a new imperial age’ with a ‘de facto world government.’ It has its own institutions – like the International Monetary Fund (IMF) and the World Bank, trading structures like NAFTA and GATT (the North American Free Trade Agreement and the General Agreement on Tariffs and Trade), executive meetings like the G-7 (the seven richest industrial countries – the US, Canada, Japan, Germany, Britain, France and Italy – who meet regularly to discuss economic policy) and the European Community bureaucracy.

“As you'd expect, this whole structure of decision making answers basically to the transnational corporations, international banks, etc. It's also an effective blow against democracy. All these structures raise decision making to the executive level, leaving what's called a ‘democratic deficit’ – parliaments and populations with less influence.

“Not only that, but the general population doesn't know what's happening, and it doesn't even know that it doesn't know. One result is a kind of alienation from institutions. People feel that nothing works for them.

“Sure it doesn't. They don't even know what's going on at that remote and secret level of decision making. That's a real success in the long-term task of depriving formal democratic structures of any substance.”

“There are serious issues here. First of all, we have to be careful in the use of terms. When someone says America is in for a long period of decline, we have to decide what we mean by ‘America.’ If we mean the geographical area of the United States, I'm sure that's right. The policies now being discussed will have only a cosmetic effect. There has been decline and there will be further decline. The country is acquiring many of the characteristics of a Third World society.

“But if we're talking about US-based corporations, then it's probably not right. In fact, the indications are to the contrary – their share in manufacturing production, for example, has been stable or is probably even increasing, while the share of the US itself has declined. That's an automatic consequence of sending productive labor elsewhere.

“General Motors, as the press constantly reports, is closing some 24 factories in North America. But in the small print you read that it's opening new factories – including, for example, a $700 million high-tech factory in East Germany. That's an area of huge unemployment where GM can pay 40% of the wages of Western Europe and none of the benefits.

“There was a nice story on the front page of the Financial Times, in which they described what a great idea this was. As they put it, GM doesn't have to worry about the ‘pampered’ West European workers any longer – they can just get highly exploited workers now that East Germany is being pushed back to its traditional Third World status. It's the same in Mexico, Thailand, etc.

“[Barsamian: The prescription for our economic problems is more of the same – ‘leave it to the market.’ There's such endless trumpeting of the free market that it assumes almost a myth-like quality. ‘It'll correct the problems.’ Are there any alternatives?]

“We have to first separate ideology from practice, because to talk about a free market at this point is something of a joke. Outside the academy and the press, no one thinks that capitalism is a viable system, and nobody has thought that for sixty or seventy years – if ever.”


Appendix G. Excerpts from Kevin Danaher’s 10 Reasons to Abolish the IMF and World Bank (Seven Stories Press, 2001)

[From the introduction by Anuradha Mittal] “Almost invariably, structural adjustment programs have the following elements:

·       Radically reduce government spending on health, education and welfare

·       Privatize and deregulate state enterprises

·       Devalue the currency

·       Liberalize imports and remove restrictions on foreign investment

·       Cut or constrain wages and eliminate or weaken mechanisms protecting labor.

“Two of the most powerful institutions that have promoted the ‘free market’ agenda of the large corporations are the World Bank and the IMF.  By lending hundreds of billions of dollars to third world elites, the World Bank and the IMF exert significant control over the economic strategy of most countries.  They have promoted a set of policies (‘structural adjustment’) and an economic model that greatly benefit a minority and harm the majority.”

In the chapter entitled, “Growth: The Ideology of the Cancer Cell,” Danaher writes:

“Contrary to what the economists at the World Bank and the IMF have been preaching, no amount of market-driven ‘growth’ will solve the key problems we face.

“Just think of how often you have heard people justify the ideology of economic growth by saying, ‘A rising tide floats all boats.’ But for those who don't own boats or those whose boats have holes in them (the global majority), a rising tide only increases the gap between them and the wealthy minority.

“In fact, market-driven ‘growth’ is making things worse. Look at a period of rapid economic growth such as 1960 to the present. During that period the global economy experienced rapid growth in all the major indicators: production, foreign direct investment, international trade, international debt. Did inequality in the world get better or worse during that period? It got far worse. Did environmental destruction get better or worse during that period? It got far worse. Did our sense of community and spirituality get better or worse? Most would agree that these key indicators of quality of life have gotten worse.

“‘Gross’ National Product?

“Think about the way we measure economic growth: the annual percentage increase in Gross National Product [GNP].  GNP counts all goods and services – no matter how destructive – as positive numbers.

“For example, let’s say I go into a bar and drink ten beers.  All the money I spend on that beer is a positive contribution to GNP.  Now I’m drunk.  I drive away in my car, and I crash into a family in their car.  They’re all maimed, and require intensive medical care for the rest of their lives.  The tow truck, the emergency crews, the court costs, any jail time I get sentenced to, and the lifetime of medical care for the victims are all positive additions to GNP.

“If we were using sane social and environmental criteria, the effects of behavior such as drunk driving, cigarette smoking, toxic waste dumping, and pollution's impact on public health would be negative numbers subtracted from GNP, not positive numbers as they are now.

“The nonprofit group Redefining Progress has developed a statistical indicator called the Genuine Progress Indicator (GPI) that does precisely this. By subtracting from the value of production the costs of cleaning up toxic waste dumps, reclaiming polluted rivers, and mending people maimed by the industrial system, the GPI better reflects the real, sustainable growth of the economy. Graph 4 shows that when they calculate the entire U.S. economy using this more sensible measure – subtracting destruction from production – they find that the U.S. economy stopped growing in the 1970s and has been steadily declining since then.

“By relying on the narrow economic criteria of the market to measure growth, we are systematically deceived about the underlying destruction being wrought by the global market economy.

“Look at how we have been deceived about the ‘efficiency’ of the market economy. If you ask Americans if U.S. agriculture is efficient, the majority will give a resounding yes as their answer. But consider the way that alleged efficiency is measured. The social and environmental costs of U.S. agriculture are excluded from the calculation of efficiency. The fact that agriculture is the primary source of water pollution in the United States does not get factored into the equation.

“The destruction of beneficial insects by chemical-intensive farming is not factored into the equation. The value of the billions of tons of topsoil lost from U.S. farms every year is not factored into the equation. The bankrupting of family farmers whose land gets gobbled up by corporate agribusiness is not factored into the equation. In sum, if we measured our food system's productivity broadly – including all social and environmental costs – rather than just in narrow money terms, we would see that our food system is highly inefficient.

“So the growth ideology must be challenged. The one thing in nature that has an ideology of unregulated growth is the cancer cell that propels a malignant tumor. The uniqueness of the cancer cell is that it has an ‘on’ switch with no ‘off’ switch. What does a malignant tumor do to its biological host? It kills it. What is the global market economy doing to its biological host – the earth’s air, water, and soil?  It is killing them. The ‘off’ switch, or social immune system, for the cancer of global capitalism is we the people.

“The World Bank and the IMF are the two most powerful enforcers of the growth ideology and a system of measurement that hides the social and environmental costs of market-led growth. Without major changes in these institutions, there is little hope that we will be able to convert to a more sane way of measuring economic progress.”

Danaher heads this chapter with a quotation from John Maynard Keynes: “Capitalism is the extraordinary belief that the nastiest of men, for the nastiest of reasons, will somehow work for the benefit of all.”

It may be argued that since virtually all economic (industrial) activity is destructive of nature, the best measure of this destructive impact is simply the unadjusted GNP, and that use of the GPI serves to downplay the destruction.  It focuses on direct “positive” benefits to mankind, and it attempts to place a monetary value on things such as deforestation and species loss.  By doing this, it equates a certain amount of species loss with a certain amount of money.  This is terribly wrong.  While the GPI may be a better measure of the direct effects of economic activity on generating pleasure for human beings, it is not at all a good measure of the damaging effect of economic activity on the biosphere (and hence on the long-term welfare of mankind).

All national leaders call for more economic growth.  All of the presidential candidates call for more economic growth.  All that matters to them is more jobs, more new housing starts, and increased gross national product.  None of them ever talks about a stable society in which resources are put into improving the quality of life for a stable population.  Of course not.  How can a rabid capitalist make a quick fortune in a stable society?


Appendix H. Excerpts from Lori Wallach and Michelle Sforza’s The WTO: Five Years of Reasons to Resist Corporate Globalization – Introduction by Ralph Nader (Seven Stories Press, 1999)

“In approving the far-reaching, powerful World Trade Organization and other international trade agreements, such as the North American Free Trade Agreement, the U.S. government, like those of other nations, has ceded much of its flexibility to independently advance health and safety standards that protect citizens. Instead, the U.S. has accepted harsh legal limitations on what domestic policies it may pursue. Approval of these agreements has institutionalized a global economic and political structure that makes every government increasingly hostage to an unaccountable system of transnational governance designed to increase corporate profit, often with complete disregard for social and ecological consequences.

“This new governing regime will increasingly provide major generic control over the minute details of the lives of the majority of the world's people. It is not based on the health and economic well-being of people, but rather on the enhancement of the power and wealth of the world's largest corporations and financial institutions.

“Under this new system, many decisions affecting people's daily lives are being shifted away from our local and national governments and being placed increasingly in the hands of unelected trade bureaucrats sitting behind closed doors in Geneva, Switzerland. These bureaucrats, for example, are now empowered to dictate whether people in California can pursue certain actions to prevent the destruction of their last virgin forests or determine if carcinogenic pesticides can be banned from their food, or whether the European countries have the right to ban the use of risky biotech materials in their food. Moreover, once the WTO's secret tribunals issue their edicts, no independent appeals are possible. Worldwide conformity or continued payment of fines is required.

“At stake is the very basis of democracy and accountable decision-making that is the necessary foundation of any citizen struggle for just distribution of wealth and adequate health, safety, human rights, and environmental protections. An erosion of democratic accountability, and the local, state and national sovereignty that is its embodiment, has taken place over the past several decades.

“Multinational companies have shaped the globalization of commerce and finance. The establishment of the WTO marks a landmark formalization and strengthening of their power. In this way, corporate globalization establishes supranational limitations and impinges deeply on the ability of any nation to control commercial activity with democratically enacted laws. Globalization's tactic is to eliminate democratic decision-making and accountability over matters as intimate as the safety of food, pharmaceuticals and motor vehicles, or the way in which a country may use or conserve its land, water, minerals and other resources. What we have now in this type of globalization is a slow motion coup d'état, a low intensity war waged to redefine free society – democracy and its non-commercial health, safety and other protections – as subordinate to the dictates of international trade – i.e., big business über alles.

“One cannot open a newspaper today without reading about myriad examples of the problems that concentrated power spawns: reduced standards of living for most people in the developed and developing world; growing unemployment worldwide; deadly infectious diseases; massive environmental degradation and natural resources shortages; growing political chaos; and a global sense of despair, not hope and optimism, for the future.  Conspiratorial meetings have not been necessary to fuel the push for globalization. Many corporate officials share a common, perverse outlook. To them, the globe is viewed primarily as a common market and capital source. Governments, laws and democracy are inconvenient factors that restrict their exploitation and limit their profit. From their perspective, the goal is to eliminate market barriers on a global scale. From any other humane perspective, such barriers are seen as valued safeguards established to protect a nation's population – that is, every nation's laws that foster their economies, their citizens' health and safety, the sustainable use of their land and resources, and so on. In stark contrast, for multinational corporations, the diversity that is a blessing of democracy and that results from diffuse decision-making is itself the major barrier to be bypassed or removed. On rare occasions, promoters of the economic globalization agenda have been frank about their intentions. ‘Governments should interfere in the conduct of trade as little as possible,’ said GATT (General Agreement on Tariffs and Trade) Director General Peter Sutherland, in a March 3, 1994, speech in New York City where he promoted U.S. approval of the WTO.

“Even more alarming is the definition of ‘trade’ these days, which is used increasingly to describe a large portion of each nation's economic and political structures. The WTO and other trade agreements have moved way beyond their traditional rules of setting quotas and tariffs. Now these agreements institute new and unprecedented controls over democratic governance. Erasing national laws and economic boundaries to foster capital mobility and ‘free trade,’ a term that ought properly to be called corporate-managed trade (since it produces constraints, not freedom, for the rest of us), has led the likes of American Express, Cargill, General Motors, Monsanto, Union Carbide, Shell, Citigroup, Pfizer and other mega-corporations to rejoice. However, the prospect of global commerce without democratic controls is brewing a disaster for the rest of the world, left uniquely vulnerable to unrestrained corporate activity amid declining living, health and environmental standards.

“Economist Herman Daly issued an important warning in his January 1994 ‘Farewell Lecture to the World Bank.’ The push to eliminate the nation-state's capacity to regulate commerce, he said, ‘is to wound fatally the major unit of community capable of carrying out any policies for the common good. ... Cosmopolitan globalism weakens national boundaries and the power of national and subnational communities, while strengthening the relative power of transnational corporations.’

“The philosophy allegedly behind the globalization agenda is that maximizing global economic deregulation will in itself result in broad economic and social benefits. However, anyone who believes this philosophy or that corporate economic globalization has any underpinnings except maximizing short-term profit, need only consider the case of U.S.-China economic relations. When only human rights were at issue in 1994, the Clinton administration ended the historical linkage between favorable trade status and a country's human rights record. Instead it supported renewal of China's Most Favored Nation (MFN) status. However, in early 1995, when property rights were in question, McDonald's lease and Mickey Mouse's royalties were cause for $1 billion in threatened U.S. trade restrictions against China. This threat resulted in Chinese government policy changes to enforce intellectual property restrictions.

“Similarly, economic globalization’s primary mechanisms – the WTO and NAFTA – do not target all ‘fetters’ on commerce for elimination. Rather, the agreements promote elimination of restrictions that protect people, while increasing protection for corporate interests. Regulation of commerce for environmental, health or other social goals is strictly limited or challenged. For example, selling products internationally made with child labor is WTO-legal.”


Appendix I. Excerpts from Pat Choate’s Dangerous Business (Alfred A. Knopf, 2008)

The globalization policies of Presidents George H. W. Bush, Bill Clinton, and George W. Bush collectively constitute the worst economic policy mistake in American history. Their policies have enabled leaders of transnational companies and global finance to enrich themselves and advance their interests at the expense of the larger society. These companies and financiers and the campaign fund-driven politicians they support are transforming the United States into a corporately governed nation, the mass of whose citizens face an increasingly bleak future.

Inexplicably, the U.S. government sends contaminated foods back to the shippers rather than destroying them. Thus, these foreign exporters often resend the same putrid shipments back to the United States, knowing that the odds are 99 to 1 that they will succeed the next time. As one FDA official explained, ‘If you send a problem shipment to the United States, it is going to get in and you won't get caught, and you won't have your food returned to you, let alone get arrested or imprisoned!'

As matters stand in early 2008, consumers are allowed to know the source of the fish and shellfish they buy, but they are not allowed to know the country of origin of their beef, pork, lamb, perishable agricultural commodities, and peanut purchases. In August 2007, the Zogby polling firm released survey results that revealed that 95 percent of respondents feel that consumers have a right to know the country of origin of the foods they buy and 88 percent said they want all retail food labeled with country-of-origin information.

The fact that American consumers are not allowed to know the country of origin for many of the foods they eat reflects the money-driven political collusion between the transnational food corporations and the elected officials who control our government.

The idea is so basic that Adam Smith noted in The Wealth of Nations, ‘People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices.’ Stated in modern terms, collusion, not competition, is the default strategy in concentrated industries.

Mercantilism

In the sixteenth through eighteenth centuries, the dominant economic theory in Europe, mercantilism, held that national prosperity depended on a nation's supply of capital, which at the time was gold. The fastest way to acquire capital was to run a trade surplus with other nations by exporting more goods than were imported. To facilitate this economic dynamic, governments subsidized production and exports, at the same time imposing barriers on foreign imports. The idea was simple: to make exports so inexpensive that foreign consumers would prefer them over domestically made goods and conversely make imports so expensive that local consumers would prefer domestic goods over foreign-made ones.

England, under the intellectual influence of the economists Adam Smith and David Ricardo, gradually abandoned mercantilist policies in the nineteenth century and replaced them with an ideology of free trade and comparative advantage. The United States continued to pursue a mercantilist economic agenda until the early 1930s, when Roosevelt's secretary of state, Cordell Hull, began a move to a free-trade regime, which also promoted FDR's foreign policy objectives.

China is taking the mercantilist path abandoned by England and once used by the United States. The principal differences are that the United States did not own the major enterprises on which its economy depended, as the Chinese government does, and the U.S. government operated under a two-party, democratic system, while China is governed by one-party rule. The ownership and democratic differences between the United States and China are fundamental. The Chinese government announced in December 2006 that seven industries were critical to the nation's economic security and would remain under strict government control. They are defense, power, oil, telecommunications, mineral resources, civil aviation, and shipping industries. This list, moreover, is not exclusive; Chinese officials noted that the government intended to expand the volume and structure of those industries so that they would become leading world businesses.

The Chinese have established unique political and economic relationships with ex-presidents of the United States. Former Presidents George H. W. Bush and Bill Clinton have both been paid hundreds of thousands of dollars to make speeches in China. Prescott Bush, Jr., the brother of George H. W. Bush and uncle of George W. Bush, is a partner in the largest foreign-owned golf course in China. Neil Bush, the current president's brother, is copartner in a software business with the son of a member of China's Politburo Standing Committee. After George W. Bush leaves office, he will have many opportunities to speak in China and throughout Asia and the Middle East for six-figure fees.

Since 2001, the Bush administration has sponsored three initiatives to shift work traditionally performed by government to private corporations. Privatizing Social Security was one, and outsourcing work at the Department of Defense, which I have discussed, is another. Both those efforts have received massive attention from the media and other interested parties.

By contrast, the third corporatism initiative has been virtually unnoticed by the national media. The U.S. Department of Transportation is aggressively trying to shift the financing, construction, and operation of America's busiest highways, including the interstate highways, to privately operated toll-road corporations. This is a historic shift of public policy.

The two principal corporations involved in this conversion of freeways to toll roads are foreign-based companies. One is Cintra, a Spanish corporation; the other is Macquarie Infrastructure Group, an Australian company. Often working together on joint projects, these two transnationals are trying to change the way America finances its highways and do so in a way that will yield massive profits for both them and their U.S. partners. The business model used by both corporations is to offer state and local elected officials, who are averse to enacting new taxes, large fees, often in the billions of dollars, in exchange for 50- to 100-year leases on prime public facilities that they can convert into toll operations. Literally hundreds of billions of dollars of public assets are involved. The properties these two companies want most are the high-traffic portions of the Interstate Highway System, which they want to convert to toll roads. As the following two stories illustrate, the globalization of America's roads and bridges is well under way.

The 46,000-mile Interstate Highway System is the largest public works project ever constructed. It is part of a national system of more than 4 million miles of roads, fewer than 6,000 miles of which require tolls.

The interstate system was conceived in 1938, during the second Franklin D. Roosevelt administration, authorized by Congress in 1944, and built beginning in President Dwight D. Eisenhower's first term (1955). One of the principal stumbling blocks faced by Eisenhower was figuring out how to pay for this system.

Interestingly, Eisenhower's strongest ally on the highway project in the U.S. Senate and one of the principal architects of the pay-as-you-go system to build it was a former investment banker named Prescott Bush (R-Conn.), a grandfather of President George W. Bush. Largely completed in less than fifteen years, the interstate system was paid for as it was built and thus is owned debt-free by the American people.

But as with many legacies, monies are constantly required to maintain and refresh the existing system and expand it for a growing population. The Bush administration's remedy for financing these ongoing needs is not to increase federal and state gas taxes, the traditional method of funding road construction, which have not been adjusted for inflation since 1982, but to lease these public facilities to private corporations for a period of a half century or more.

The name for the administration's program is public-private partnerships (PPP). Under PPP, states and cities are now being urged by the Department of Transportation (DOT) to convert existing highways to privately owned and operated toll roads, including the interstates, and to allow private corporations to plan, finance, build, and operate most new roads the United States will need.

The shift from pay-as-you-go public roads to private toll highways is a nondebated, unexamined historic change of national policy that has multiple ramifications. For generations, the policy of the national and state governments has been to build a first-rate road system that keeps the cost of highway transportation as low as possible. America is a big nation, and holding down the costs of transport makes it more economically competitive and gives taxpayers and their families an inexpensive way to travel for work and pleasure.

Under PPP, just the opposite philosophy exists. Road tolls are set at market rates – that is, to maximize profits for the private operator. In many cities and regions where public highways are the only means of transport, toll rates could be raised to very high levels. With such rates possible, and with rates set by the private company, many corporations are willing to pay massive up-front concession fees to control and operate these transport monopolies for fifty to a hundred years.

These initial fees, which are often in the billions of dollars, allow today's antitax politicians to finance other public programs, such as schools, without having to ask voters for tax increases. They are able to keep their ‘no-new-tax’ pledges by effectively selling debt-free public assets. In many of these deals, a part of the toll profits is to be shared with the state government concerned, sometimes outside the normal budget and legislative processes. Often the money goes into a governor's discretionary fund.

Imagine having your credit card charged a new fee of $250 to $300 per month for tolls to commute to work by car. The Department of Transportation is making this a reality that millions of American commuters will soon have to confront. Ironically, commuters in Washington, D.C., and the surrounding region are already experiencing this form of ‘highway corporatism,’ with several toll roads now proposed or operating in the region.

The political lexicon of such road financing was changed in 1982, when the Reagan administration proposed raising the federal gas tax by five cents. The principal opposition was from Republican senators, who were divided as to whether the nickel add-on was a tax increase or a user fee. In the 1982 lame-duck session of Congress, GOP Senate leaders decided it was a user fee and not a tax. Only then did they approve it.

Today, raising a user fee is considered a tax increase, which it really always has been. For those officials who are ideologically opposed to tax raises, the innocuous-sounding ‘public-private partnership’ is a perfect solution. It provides politicians a new way of raising massive amounts of money without the necessity of their raising taxes, and when toll rates must be raised the private contractor does it, thereby allowing politicians to remain tax ‘pure’ with their constituents. They also receive the upfront fees to dispense, which to many officials is like ‘free’ money.

[Choate discusses in some detail the efforts to convert public roads in Indiana and Texas to toll roads.]

The PPP is a sly form of taxation for which politicians need not take responsibility but that can generate tens of billions of dollars in revenues, including campaign contributions. Maybe the existing U.S. model of highway financing is obsolete and the nation no longer wants good roads at the lowest price. Perhaps the United States would be better served by the toll policies used in Europe and Japan. Maybe it does not matter if our governors put our roads and bridges under control of foreign companies for a half century or longer and leave massive unfunded liabilities for their successors.

One thing is certain, however: this is corporatism in its most basic form. How America should deal with it is one of the thorniest issues the Bush administration will bestow upon the next president and Congress.

At noon on January 20, 1993, Bill Clinton became president of the United States. After the ceremony, Clinton and congressional leaders went into the Capitol for the traditional reception and luncheon. Before the event began, however, the new president was guided to a side room, where aides presented several important documents for his signature. One of those was Executive Order 12834, titled ‘Ethics Commitments by Executive Branch Appointees,' which Clinton signed into law.

This Executive Order required every senior appointee in every executive agency of the U.S. government appointed on or after January 20, 1993, to sign an ethics contract that read:

1. I will not, within five years after the termination of my employment as a senior appointee in any executive agency in which I am appointed to serve, lobby any officer or employee of that agency.

2. I will not, at any time after the termination of my employment in the United States Government, engage in any activity on behalf of any foreign government or foreign political party which, if undertaken on January 20, 1993, would require me to register under the Foreign Agents Registration Act of 1938, as amended.

3. I will not, within five years after termination of my personal and substantial participation in a trade negotiation, represent, aid or advise any foreign government, foreign political party or foreign business entity with the intent to influence a decision of any officer or employee of any executive agency, in carrying out his or her official duties.

With a stroke of his pen, Clinton put into U.S. law an ethics code for the executive branch that was by far the strictest, most comprehensive, most inclusive, and most detailed that any president had ever adopted.

Once the details of the Executive Order became known, the controversy over Clinton's appointments died down almost instantly since, once out of office, the appointees were legally bound to wait five years before lobbying their former agencies or becoming agents of influence. Over the next eight years, every policy-level appointee of the Clinton administration signed the federal ethics contract.

On the morning of January 20, 2001, just before the inauguration of President George W. Bush, the White House media director issued a press alert announcing that Clinton had pardoned 140 people, including his brother; drug dealers; stock swindlers; two brothers from Arkansas imprisoned for rolling back odometers on used cars; a fugitive commodities broker named Marc Rich, whose ex-wife had contributed hundreds of thousands of dollars to the political campaigns of both Bill and Hillary Clinton; and a middleman in Saddam Hussein's Oil-for-Food scandal.

Unnoticed in the media's coverage of the pardons was Clinton's issuance of Executive Order 13184, which provided, in its entirety: Executive Order 12834 of January 20, 1993, ‘Ethics Commitments by Executive Branch Appointees,’ is hereby revoked, effective at noon January 20, 2001. Employees and former employees subject to the commitments in Executive Order 12834 will not be subject to those commitments after the effective date of this order.

With a second stroke of his pen, Clinton canceled the thousands of ethics contracts signed during his administration, thereby allowing all of his appointees to rejoin Washington's lobbying industry instantly.

Absent his revocation, Clinton's ethics code would automatically have been law for Bush's appointees. Revoking the order would have been a most difficult step for the new president, given the negative publicity such an action surely would have created. Bush never reinstated the 1993 Executive Order, and the postgovernment employment of both Clinton and Bush administration appointees has been regulated by the loophole-ridden ethics laws of the past, which in effect means that there has been no regulation.

Today, America’s financial leaders, as they always have, sit at the top of our national elites.  The newest variation in their changing role in the economy is the rise of hedge and equity funds.  They trade stocks and take large positions in corporations, where they often mount takeovers and conduct restructuring.  As of 2008, hedge fund managers had almost $3 trillion under management, most of which comes from pension funds, insurance companies, and wealthy investors.

Such investment funds are organizations, such as the Blackstone Group and Cerberus, that raise massive amounts of equity from institutions and individuals, borrow even greater sums from banks and other lending institutions, and then buy corporations and operate them as private entities.

The political power of these elites is reflected in their ability to persuade Congress and a succession of presidents to leave untouched a loophole that taxes their multimillion-dollar earnings at a much lower rate than what most Americans pay. These managers charge a base fee of 2 percent of the value of the investment, plus 20 to 50 percent of profits. Their profits can be enormous. In 2005, the top ten hedge and investment fund managers were paid more than $7 billion. The top performer was a Texan, T. Boone Pickens, who was paid $1.5 billion for a year's work. In 2006, James Simons of East Setauket, New York, took the top spot with a paycheck of $1.8 billion.

But unlike other service providers, such as high-earning accountants and lawyers, who are taxed at a 35 percent rate, hedge fund managers pay a tax of only 15 percent on what they earn. When the Blackstone Group sold equity to the public in 2007, the partners were able to structure the deal so they were taxed less than 5 percent on $3.7 billion of the stock they sold.
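The fee and tax arithmetic described above can be sketched numerically. The fund size and annual return below are hypothetical illustrations, not figures from the book; only the fee structure (a 2 percent base fee plus a share of profits, here taken at the low end of 20 percent) and the 15 percent versus 35 percent tax rates come from the passage, which simplifies by applying the 15 percent rate to the manager's whole income.

```python
# Illustrative sketch of the "2 and 20" fee structure and the carried-
# interest tax gap described in the passage. The fund size and return
# are hypothetical; only the rates (2% base fee, 20% of profits,
# 15% vs. 35% tax) come from the text.

fund_size = 10_000_000_000   # $10 billion under management (hypothetical)
annual_return = 0.15         # 15% gross return for the year (hypothetical)

profits = fund_size * annual_return   # $1.5 billion in profits
base_fee = 0.02 * fund_size           # 2% of assets under management
carry = 0.20 * profits                # 20% of profits
manager_income = base_fee + carry     # total manager compensation

# Taxed at the 15% rate the passage attributes to hedge fund managers...
tax_at_15 = 0.15 * manager_income
# ...versus the 35% rate paid by high-earning accountants and lawyers.
tax_at_35 = 0.35 * manager_income

print(f"Manager income:       ${manager_income:,.0f}")
print(f"Tax at 15% rate:      ${tax_at_15:,.0f}")
print(f"Tax at 35% rate:      ${tax_at_35:,.0f}")
print(f"Annual tax advantage: ${tax_at_35 - tax_at_15:,.0f}")
```

On these assumed numbers the manager earns $500 million and the preferential rate saves roughly $100 million in tax in a single year, which gives a sense of the scale of the loophole the author describes.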

The United States' protectionism was not some fluke or accident of national policy; it was the core foundation of development and defense strategies for almost a century and a half. Its first champion was President George Washington, who, having fought a war that he almost lost because his nation lacked the manufacturing capacity needed to supply his armies, told Congress in his first State of the Union message that the safety and the interests of a free people ‘require that they should promote such manufactories as tend to render them independent of others for essential, particularly military supplies.’

His idealistic successors, Thomas Jefferson and James Madison, read Adam Smith's The Wealth of Nations and were intellectually enamored with the theory of free trade. They tinkered with the idea of imposing trade sanctions against Britain in order to weaken its mercantilist discrimination and to get the British to accept a bilateral free-trade regime. The War of 1812 shattered that fantasy.

The experience quickly converted them and their contemporaries into economic realists who finally came to understand that as long as the United States was without the means of producing the goods and weapons it needed, it was a vulnerable nation. Nothing made that more immediate than British troops' burning the White House during the War of 1812.

The policy that emerged after that war became known as the ‘American System’ of trade. Congress imposed high protective tariffs on foreign imports, not to tax consumers but to stimulate investment in the United States and thereby enable the country to produce its own goods and provide jobs for its own workers. Under the system, the new nation grew and prospered over the next half century.

The American System was controversial. Unfettered capitalism brought with it great income inequalities and recurring financial panics. Labor was abused, and the state of the environment was irrelevant. Though Republicans unabashedly supported a nationalistic trade policy in the period 1860-1930, the Democrats vacillated between approval of free trade and wanting to provide low-cost imports to working people. The contradiction the Democrats faced was that while free trade could break the power of the protected monopolies that financed the Republicans and furnish inexpensive consumer goods, the inflow of cheaper foreign items could also devastate their own working-class constituents. For almost three quarters of a century, therefore, the Republican position prevailed among voters and in Congress.

Did the American System work? Beyond George Washington's wildest hopes. Under protectionist policies the United States evolved from a small colony with limited manufacturing capacity to the mightiest industrial power in the world.

The United States' trade isolationism, of course, was not the only reason. Other contributory factors included the development of a mass market, technology transfer, high rates of domestic capital formation, foreign investment, and a legal system that protected private property, including intellectual property.

The beginning of the end of the American System came with the election of Woodrow Wilson, who ran for office in 1912 on a platform of low tariffs and free trade. Only days after being inaugurated, Wilson summoned Congress into a special session to enact tariff reduction legislation. The result was the Underwood-Simmons Tariff Act of 1913. The problem with this legislation was that it drastically reduced U.S. tariffs unilaterally without getting, or even asking for, the lowering of foreign barriers to U.S. exports. Ironically, the bill had little effect because soon after it was enacted, World War I shut down global trade.

As soon as Warren Harding succeeded Wilson in March 1921, the new president called upon Congress to pass an emergency tariff bill, which became law barely sixty days later. The Harding legislation required reciprocal concessions as the price for reducing U.S. trade barriers, a stance that spurred two domestic political movements. One was the creation of a coalition of Republicans and industrialists who wanted the reestablishment of the nineteenth-century protectionist tariff patterns of the American System. The other group, led by Representative Cordell Hull (D-Tenn.), advocated a new system of reciprocal tariff cuts, an approach that emphasized exports.

Eventually, America got both. In 1930, Senator Reed Smoot (R-Utah) and Representative Willis Hawley (R-Oreg.) shepherded through Congress the Tariff Act of 1930, commonly known as Smoot-Hawley, which raised tariffs. Contrary to the enduring political myth, the Smoot-Hawley Tariff was not the cause of the Great Depression, which actually began nine months prior to its enactment. Neither did the law induce fierce foreign retaliation and trade wars or a spiraling decline in U.S. trade, which subsequent data have confirmed.

Readers quickly realized that the NAFTA agreement was a sharp departure from prior U.S. trade pacts, which dealt primarily with tariffs and quotas.  Indeed, this was the world’s first ‘globalism’ trade treaty.  Among other provisions, the three governments agreed to… Allow Mexican nationals to own up to 100 percent of farms, forests, and other real estate in the United States, while restricting U.S. citizens’ ownership to 49 percent of such properties in Mexico.

Before NAFTA could become law, Congress had to ratify it.  During the presidential campaign of 1992, George Bush and Bill Clinton supported NAFTA, while Ross Perot, the independent candidate, opposed it.  During the third presidential debate, Perot famously told his two opponents:  You implement that NAFTA – the Mexican trade agreement where you pay people one dollar an hour, have no health care, no retirement, no pollution controls – and you are going to hear a giant sucking sound of jobs being pulled out of this country.

NAFTA successfully did what its creators designed it to do: facilitate increased U.S. foreign direct investment (FDI) in Mexico and Canada, allowing those nations to become manufacturing platforms of goods that can be imported into the United States duty- and quota-free.

Robert E. Scott, director of international studies at the Washington-based Economic Policy Institute, calculates that NAFTA cost America 879,000 jobs between 1994 and 2002. He also concludes that the trade pact indirectly diminished many other manufacturing positions that remained in the United States. Scott notes that many companies openly threatened to move their factories to Mexico if employees were not compliant about wages, benefits, and working conditions. Just the threat of relocation, he says, ‘suppressed real wages for production workers, weakened workers' collective bargaining powers and ability to organize unions, and reduced fringe benefits.'

Yet the total effect of NAFTA goes far beyond trade with Canada and Mexico. As a template, its provisions are found in more than a dozen similar pacts, leading to staggering trade losses. Again, to provide a perspective, the United States ran a trade surplus every year during the 1960s, creating a decade-long surplus of almost $32 billion. Largely because of the oil shortages of the 1970s and rising petroleum prices, that surplus became a decade-long deficit of $87 billion in the 1970s. As the new laissez-faire policies took hold in the 1980s, the collective decade-long deficit jumped to $864 billion, and in the 1990s it exceeded $1 trillion.

In the first years of the twenty-first century, the U.S. trade deficit has soared to stratospheric levels. For the first seven years of the 2000s, the cumulative total is $4.5 trillion – the largest, fastest unilateral transfer of wealth from one nation to others in the history of the world.

When Ricardo conceived the comparative advantage theory, agricultural production, based on climate and geography, was then the major component of the gross domestic product of most nations. Countries grew and exported, if they could, what they grew best. Manufacturing for export was just developing. In those days, capital, technology, and labor were immobile. Indeed, the idea of an Englishman building a factory elsewhere, such as France, or sending technology there was deemed too risky.

The economist Paul Craig Roberts, a key economic adviser to President Ronald Reagan, disputes the idea that Ricardo's theory can work in today's global economy. Advantage today, he argues, is created not by a nation's stock of natural resources but by the availability and mobility of capital, technology, and workers. The heart of modern trade agreements is a global guarantee that foreign capital and technology will not be expropriated by the host nation, eliminating the most important obstacle to direct foreign investment.

Unlike in Ricardo's era, capital and technologies are now totally mobile, as increasingly are high-knowledge workers. The result is a historically unique economic dynamic.

Today, dozens of nations have essentially limitless pools of labor, collectively hundreds of millions of potential workers, most of whom can be made as productive as workers in the United States or any other industrial country. When combined with imported capital and technology, low-wage labor with few benefits enables those nations to create an unbeatable ‘absolute advantage’ based on inexpensive workers that the United States and other developed countries cannot overcome.

Even if the United States, Japan, or Europe confronts the arbitrage differential by developing even more advanced technologies, greater laborsaving approaches, or even entirely new industries, all of those advances can be transferred to the low-wage countries, wiping out any temporary gains in competitiveness.

The absolute advantage created by labor arbitrage is strengthened by the lower costs created with a lack of environmental restrictions, worker safety, and other employment benefits found in the developed world.

The United States has created these imbalances by borrowing from foreign lenders on a massive scale and liquidating national assets to an equally vast degree. By 2009, foreign interests will own more than half of the federal Treasury debt, a third of corporate bonds, and a sixth of corporate assets. Professional economists across most ideological spectrums, including Alan Greenspan, Paul Volcker, Paul Krugman, and Paul Craig Roberts, agree that this imbalance foreshadows a major global economic restructuring and if handled improperly could trigger a worldwide depression. As for the ‘serpents’ that John Maynard Keynes referred to in The Economic Consequences of the Peace – militarism, imperialism, racism, cultural rivalries, monopolies, restrictions, and exclusions – they are the same ones that threaten the economic and political stability of our time. They set the boundaries (visible and invisible) that define the limits of progress, change, and global economic integration.

John Maynard Keynes wrote in The General Theory… ‘Practical men, who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist.’ [Contrary to popular belief, Keynes’s own academic training was in mathematics, not economics.]

The policy debate over which global trade approach to take – to put rules into place or to aim for results – has raged for more than twenty-five years in the United States. The rules-driven free-trade advocates have won every political trade battle in Congress during that time and enacted every trade agreement proposed by the federal government. The fundamental flaw in their approach to policy is that they assume that globalization is free trade. It is not.

Ralph E. Gomory, president of the Alfred P. Sloan Foundation and former head of research at IBM, emphatically made that point in testimony before the House Committee on Science and Technology on June 12, 2007. He told Congress: What is good for America's global corporations is no longer necessarily good for the American economy... When U.S. companies build semiconductor plants and R&D facilities in Asia rather than in the U.S., then that is a shift in productive capability, and neither economic theory nor common sense assert that shift is automatically good for the U.S. even in the long run.... [G]lobalism is thriving today at least partly because it supports and gains support from a group that is very powerful today, the multinational corporations.

U.S. industries with a positive trade balance are mainly commodities such as cotton, soybeans, corn, scrap paper, rice, coal, tobacco, hides, meat, and wheat.

The profile of the U.S. trade position today is that of a nation being economically colonized – one that is purchasing high-value-added commodities and manufactured goods from abroad and paying for them with the export of agricultural commodities, massive foreign borrowing, and the liquidation of its own national assets.

Although many advocates of unfettered globalization argue that trade in services – such as architecture, software, engineering, research and development, and finance – will provide a substitute for America's loss of its manufacturing industries, Commerce Department trade data refute that hope. Eighty percent of the U.S. workforce is employed in service work, but trade in services in 2007 produced a net surplus of only $104 billion, up from $40 billion in 2000, offsetting barely 12 percent of the overall $815 billion deficit in goods.

After World War II, the United States agreed, as part of a global trade agreement, to accept a provision that allowed other governments to rebate to their producers any indirect taxes paid on exported goods and impose an equal tax on imports from the United States into their countries. In that pact, the United States also agreed not to rebate any direct taxes on exports or to impose their equivalent on foreign imports.

This was a post-World War II U.S. tax loophole designed to speed Europe's recovery. Europe recuperated from that war decades ago, but the loophole has not only remained, it has been expanded and applied by 148 other nations.

In practical terms this means, for example, that the German manufacturer of a car, or any other product, exported into the United States gets a rebate from the German government equal to the indirect taxes paid on that product in Germany – the German value-added tax (VAT). The VAT rate in Germany is 19 percent, so German carmakers get a 19 percent tax rebate from their government on every vehicle exported to the United States. This is a big subsidy by any measure to German automakers.

Conversely, any U.S. carmaker exporting a vehicle to Germany must pay that government a VAT-equivalent tax of 19 percent of the price of the car, plus another 19 percent tax on the costs of all transport, insurance, docking, and duties involved in getting the car into Germany. Worse, the American company gets no credit in Germany for the corporate taxes it pays in the United States. This is a giant de facto tariff on imports.
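[The border-tax mechanism Choate describes can be sketched numerically. The car price and landing costs below are hypothetical illustration values, not figures from Choate’s book; only the 19 percent German VAT rate and the rule that it applies both to the car’s price and to transport, insurance, docking, and duties come from the excerpt above.]

```python
# Sketch (with hypothetical numbers) of the de facto tariff a U.S.
# carmaker faces when exporting to a VAT country, per the mechanism
# described above: 19% on the car's price, plus 19% on the landing
# costs (transport, insurance, docking, and duties).

vat_rate = 0.19            # German VAT rate, per the excerpt
car_price = 30_000.00      # hypothetical U.S. export price
landing_costs = 2_000.00   # hypothetical transport/insurance/duties

# The VAT-equivalent tax is assessed on the price plus landing costs.
border_tax = vat_rate * (car_price + landing_costs)

# Expressed as a share of the car's price, the effective tariff
# exceeds the nominal VAT rate because the base is larger.
effective_tariff_pct = 100 * border_tax / car_price

print(f"Border tax: ${border_tax:,.2f}")
print(f"Effective tariff: {effective_tariff_pct:.1f}% of the car's price")
```

On these assumed numbers the border tax is $6,080, or roughly 20.3 percent of the car’s price – more than the nominal 19 percent rate, because the tax base includes the landing costs as well.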

The competitive impediments to American producers created by this tax discrimination and the use of the VAT as a trade barrier are enormous. In 2005, a foreign VAT was applied to 94 percent of all U.S. exports and imports. Foreign governments paid their domestic companies $239 billion of VAT rebates on goods and services exported to the United States. These governments also collected from U.S. producers $131 billion of VAT-equivalent taxes on goods and services imported from the United States, creating a $370 billion distortion in the international trade figures with the United States. By 2006, that distortion had grown to $428 billion per year.

Can anyone be surprised that U.S. exports are increasingly uncompetitive when they must overcome such enormous foreign subsidies and import taxes?

The WTO rules on the VAT, moreover, provide a powerful incentive for U.S.-headquartered companies to shift production and jobs from America to nations that use a VAT. Indeed, thousands of U.S.-headquartered companies have outsourced their production to foreign countries to get the VAT advantage in global trade.

Congress has repeatedly tried to eliminate the VAT disadvantage. In 1974, for instance, it directed the Nixon administration to negotiate America's VAT handicap away in the Tokyo Round of global trade talks. The other nations, however, ignored the United States' demand and refused to deal with the issue. The same thing happened in subsequent trade negotiations initiated in 1986 and again in 2002. Even today, foreign governments will not consider the issue.


Appendix J. Excerpt from Jerome R. Corsi’s The Late Great USA (WND Books, 2007)

[Corsi writes on the VAT.]

Advocates of "free trade" have hailed treaties such as NAFTA and the WTO. Yet, because most of our "free trade" partners use a value added tax (VAT), America has been placed at a severe disadvantage. The United States, virtually alone among the world's major international trading countries, does not use a VAT. "Free trade" is certainly not "fair trade" when VAT structural advantages stack the deck against the United States before any cards are dealt.

The impact of the value added tax (VAT) on international trade is complicated. Yet it may be the most important variable in explaining our expanding trade deficits. The average U.S. citizen can no longer afford not to understand the VAT, especially in an era where free trade agreements dominate our international trade agenda.

The modern VAT was created by French economist Maurice Lauré in the 1950s. The basic concept is that a "value added" tax is imposed at each stage in the chain of production of a good or service.

In a sense, this hides the tax from the consumer. The producer builds the price of the tax into his selling price, which passes the tax along to the consumer at the point of sale. The amount of the VAT, then, is included as a percentage of the final value of the good or service. The VAT is not reimbursed to the consumer, so at the final point of purchase, the government gets to keep the VAT once and for all.

A VAT and a sales tax are both "indirect" taxes, in that the consumer, rather than the producer, pays them. Income taxes, in contrast, are "direct" taxes in that the tax cannot be shifted to someone else other than the person producing the income. The main difference between a VAT and a sales tax is that the VAT is applied at each stage of production, whereas sales taxes are usually imposed once, at the final point of sale.

In international trade, countries do not treat indirect taxes and direct taxes the same, and that differential puts the United States at a decided disadvantage. The United States does not use a VAT system. Some 137 countries, including the EU countries, China, Canada, and Mexico, have VAT systems. A simplified example may clarify how VAT systems disadvantage American goods.

On the one hand, an American made car that sells for $23,000 in the United States includes profit for the company and covers the various tax obligations and expenses for the company. When the manufacturer exports that car to Germany, the German government adds 16 percent VAT to the $23,000 price, meaning that the car will be sold in Germany for $26,680.

On the other hand, consider a German car that is sold in Germany for $23,000 after the 16 percent VAT is imposed. When the German manufacturer exports that car to the United States, Germany rebates the 16 percent VAT to the manufacturer, allowing the export value of the car to be $19,827.59. Moreover, when the German car is imported to the United States, the U.S. government does not assess any comparable tax, so the car is allowed to enter the U.S. market at a price under $20,000.

The system disadvantages U.S. producers two ways. When exported, the U.S. car starts off with a disadvantage of $3,680 in Germany because of the VAT. At the same time, the German car, which sells at home for the same price as the American car sells in America, sells in America for $3,172.41 less than the U.S. car. In this example, the total disadvantage American car companies face is $6,852.41.
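[Corsi’s arithmetic can be verified with a short script. This is an editorial check, not part of the book; every figure in it comes from the example in the excerpt above.]

```python
# Verify the VAT arithmetic in Corsi's two-way car example:
# a $23,000 car crossing the U.S.-German border in each direction,
# with Germany's (then) 16% VAT. All figures are from the excerpt.

base_price = 23_000.00   # sticker price in both countries
vat_rate = 0.16          # German VAT rate used in the example

# U.S. car exported to Germany: the VAT is added on top of the price.
us_car_in_germany = round(base_price * (1 + vat_rate), 2)   # 26,680.00

# German car exported to the U.S.: the VAT embedded in the home price
# is rebated, so the export price is the pre-VAT price.
german_car_in_us = round(base_price / (1 + vat_rate), 2)    # 19,827.59

export_penalty = round(us_car_in_germany - base_price, 2)   # 3,680.00
import_advantage = round(base_price - german_car_in_us, 2)  # 3,172.41
total_disadvantage = round(export_penalty + import_advantage, 2)

print(f"U.S. car in Germany:  ${us_car_in_germany:,.2f}")
print(f"German car in U.S.:   ${german_car_in_us:,.2f}")
print(f"Total disadvantage:   ${total_disadvantage:,.2f}")
```

The script reproduces Corsi’s figures: a $3,680 penalty on the exported U.S. car, a $3,172.41 advantage for the imported German car, and a combined $6,852.41 disadvantage.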

In effect, the rebate of the VAT to German exporters serves as a German subsidy for exports, while the imposition of a VAT on American imports serves as a German tariff. Still, free trade agreements do not define the VAT as either a subsidy or a tariff, even though the system demonstrably disadvantages U.S. manufacturers both in exporting to VAT countries and in competing with other countries' exports in the U.S. market.

Here are the crucial points:

Companies operating in VAT countries enjoy rebates of VAT taxes on the goods they export. Companies that manufacture goods in the United States get no refunds of the state and federal taxes they pay on the goods they export.

Imports into VAT countries are subjected to VAT at the border, while imports into the United States are not taxed at the border.

As a result, U.S. exports are taxed twice, while exports from VAT countries are traded free of certain types of taxes.

[End of Corsi segment on the VAT.]


Appendix K. Excerpt from James Howard Kunstler’s The City in Mind (The Free Press, 2001)

In this book, Kunstler describes the evolution of some of the world’s major cities.  A separate chapter is devoted to each city.  Kunstler describes the development of each city and identifies the reasons for its current condition.  The cities that he examines are Paris, Atlanta, Mexico City, Berlin, Las Vegas, Rome, Boston, and London.  Here follows a summary of the book, taken from the book’s preface:

The idea that city-making is an art rather than a product of statistical analysis or social science casework is largely the point of my opening chapter on Georges Eugène Haussmann's heroic renovation of Paris in the mid-1800s. Under the emperor Louis-Napoleon (an improved version of the original Bonaparte, his uncle), Haussmann made over Paris from a stinking and decrepitating rat-maze of slums into the epitome of everything we value about city life.

In the second chapter, about Atlanta, I try to demonstrate the folly of Edge City (so-called) as both a design model and a way of living. Edge City, a term coined by the writer Joel Garreau, was supposed to represent everything cutting-edge and ultramodern in the postindustrial evolution of cities. I essay to show how Atlanta took the urban model of car-crazy Los Angeles to its most ludicrous and, in my view, terminal stage. With Atlanta, you can forgo agonizing over the future, because the present doesn't even work there.

The third chapter takes us back roughly five hundred years to a unique event in history: the collision of two very strange but well-developed and dominant cultures so vastly different that they might have come from two separate planets. In 1519, a tiny Spanish expeditionary force under a brilliant rogue commander, Hernán Cortés, made contact with the death-enthralled empire of the Aztecs and conquered their gigantic, beautiful, sinister capital city, Tenochtitlán. The spirit of the Spanish Inquisition meets its match in Huitzilopochtli, voracious eater of still-beating Aztec hearts. I attempt to show how this astonishing chain of incidents resonates still in the culture of contemporary Mexico City, a prototype of hypertrophic "third-world" urbanism, plagued by a failed social contract, lawlessness, economic disorder, and a wrecked ecology.

Next I reflect upon the strange destiny of Berlin, a city whipsawed by the tragic enormities of twentieth-century politics. Above all, Berlin expresses the paradoxes of history: how Europe's best-educated people could succumb to political mania, moral suicide, and mass murder; how an urban organism can survive nearly total destruction and find itself fifty years later in better condition than the cities of its chief destroyer; how the politics of freedom and openness produced an architecture of despotism, and vice versa. And how the result of all these vicissitudes is a search for nothing grander than normality.

We turn next to Las Vegas, America's leading boom town at the turn of the millennium, a city built by gangsters for gangsters, based on the tragically foolish idea that it is possible to get something for nothing, and now weirdly mutating into a family vacation destination. I discuss the strange physical form of the city, an evolution of the most extreme cultural and technological developments in the past century, and argue that Las Vegas has reached the limits of its hypertrophic growth. Las Vegas may also reflect a condition more and more common throughout America as a whole: that ridicule is the unfortunate destiny of the ridiculous, trumping even the tragic view of history.

Rome is the backdrop for tracing the meaning of classicism as a set of ideas necessary for the continuing project of civilization. This chapter takes a long historical view, tracing the sources of the classical in Greece and Italy, its full flowering in the Roman Empire, the long and gruesome unlearning in the millennium following the fall of Rome, and the rescue of classicism in the Renaissance. Classicism was thrown away once again by the forces of modernism during the nervous breakdown of culture that the twentieth century represented, so the question is posed, can classicism now rescue us?

I chose to write about Boston because I think it has done more to prepare for the twenty-first century than most other American cities, and indeed it may be one of the few habitable cities left in America when the orgy of cheap oil draws to a close and it becomes necessary to conduct normal life and work in walkable neighborhoods connected by decent public transit. Boston had a hard time of it in the twentieth century, and the political legacy of that period still exerts a baneful influence. But the city is in the process of overcoming those other common disasters of the recent urban scene in this country: the tyranny of the automobile and the flight of the prospering classes. In the years ahead, I argue, Boston will demonstrate the value of city life to a culture that all but gave up on the idea.

Finally, I look to London as the origin in Anglo-American culture for the idea that country life is the antidote to the hopelessness of industrial urbanism. This idea, which has reached its fullest expression in contemporary America, begins with the English Landscape movement, and leads directly to the circumstance of London becoming the world's first great industrial city – and therefore the first major world city to suffer the unanticipated consequences of advanced technological progress. In America, where we have inherited so many English ideas about landscape and place, the result in our time has been the notion that city life can be dispensed with altogether for a simulacrum of the rural. The idea culminates in the absurdities of our contemporary battles over "green space" and "open space," while our human ecologies – namely our towns and cities – remain devalued, depopulated, and decivilized.

Under the most favorable circumstances, it is apt to take at least a hundred years to clean up the mess we made of our nation, if we can do it at all – and I stick to a point made in my previous book, Home from Nowhere, that life is tragic and there are no guaranteed rescues from the great blunders of history.

Don't get me wrong. I hope we do recover. I believe we have the knowledge and the resources to reorganize the physical arrangement of American life from a national automobile slum to a land full of places that are truly worth living in. Therefore, I stick to another central point of my previous book: that a land made up of places not worth caring about will sooner or later become a nation not worth defending (or a way of life not worth carrying on). All this begs the question of whether we have the will to reorganize our everyday environment. Personally, I believe the future will compel us to change our way of life, to give up the fiasco of suburbia and all its revolting accessories and recondense our living and working places into the traditional human habitats called cities, towns, and neighborhoods.

In the past eight years I traveled all over the United States (except Alaska) and got to see almost every city of any consequence in the lower forty-eight states. It was a shock to discover how far gone most of them are. Since I wrote about Detroit in The Geography of Nowhere (1993), wildflower meadows have sprouted where miles of slum row houses stood – and I don't mean to say that this is necessarily an improvement, because it only means the hole at the center of Detroit's metropolitan doughnut has gotten larger and emptier instead of redeveloping. St. Louis is a virtual mummy's tomb between its empty downtown and the West End. Baltimore has become a flyblown carcass. Buffalo looks as if it suffered a prolonged aerial bombardment. A giant vacuum cleaner seems to have sucked the populations out of Memphis, Nashville, and Little Rock. Small towns in the Midwest are perhaps the most heartbreaking to see. I remember a spring afternoon I spent as the sole pedestrian in downtown Appleton, Wisconsin – its commercial activity had all been shifted to an asteroid belt of highway strips and architectural garbage five miles outside town. Ditto Louisville; Dayton; Meridian, Mississippi; Billings, Montana; Macon, Georgia. And so on. The list is long and dreary, and it certainly prompts the casual observer to wonder if our future holds a civilized existence.

The concern about what happens to my own country underlies all the chapters in this book. Will it take an autocrat to repair American cities, as in the case of Napoleon III and Paris? How do culture and history support the social contract? Now that we've created our national automobile slum, what are its possible destinies? Can we find a way to reestablish a meaningful distinction between the urban and the rural? Can we make Beauty (capital B) matter again in our everyday world?


Appendix L. Excerpts from James Howard Kunstler’s The Geography of Nowhere (Touchstone, 1993)

Thirty years ago, Lewis Mumford said of post-World War II development, "the end product is an encapsulated life, spent more and more either in a motor car or within the cabin of darkness before a television set." The whole wicked, sprawling, megalopolitan mess, he gloomily predicted, would completely demoralize mankind and lead to nuclear holocaust.

American land law was predicated on the paramount principle that land was first and foremost a commodity for capital gain. Speculation became the primary basis for land distribution – indeed the commercial transfer of property would become the basis of American land-use planning, which is to say hardly any planning at all. Somebody would buy a large tract of land and subdivide it into smaller parcels at a profit – a process that continues in our time.

Other Old World values toppled before this novel system – for example, the idea of land as the physical container for community values. Nearly eradicated in the rush to profit was the concept of stewardship, of land as a public trust: that we who are alive now are responsible for taking proper care of the landscape so that future generations can dwell in it in safety and happiness. As historian Sam Bass Warner put it, the genius of American land law and the fanatical support it engendered "lay in its identification of land as a civil liberty instead of as a social resource."

This is embodied today in the popular phrase, "You can't tell me what to do with my land." The "you" here might be a neighbor, the community, or the government. The government's power to regulate land use was limited under the Fifth and Fourteenth Amendments to the Constitution. The Fifth states that private property cannot be taken for public use without due process of law and just compensation – the right to public hearings and payment at market value – and the Fourteenth reiterates the due process clause. All subsequent land-use law in America has hinged on whether it might deprive somebody of the economic value of their land.

America's were the most liberal property laws on earth when they were established. The chief benefits were rapid development of the wilderness, equal opportunity for those with cash and/or ambition, simplicity of acquisition, and the right to exploitation – such as chopping down all the virgin white pine forests of Michigan (they called it "mining trees"). Our laws gave the individual clear title to make his own decisions, but they also deprived him of the support of community and custom and of the presence of sacred places.

The identification of this extreme individualism of property ownership with all that is sacred in American life has been the source of many of the problems I shall describe in the pages that follow. Above all, it tends to degrade the idea of the public realm, and hence of the landscape tissue that ties together the thousands of pieces of private property that make up a town, a suburb, a state. It also degrades the notion that the private individual has a responsibility to this public realm – or, to put it another way, that the public realm is the physical manifestation of the common good.

Tocqueville observed this when he toured America in 1831. "Individualism," he wrote, "at first, only saps the virtues of public life; but in the long run it attacks and destroys all others and is at length absorbed in selfishness."

The automobile rapidly reshaped the nation's economy in ways that had strange and unforeseen repercussions. Certainly, the car was the main force behind the economic boom of the 1920s. Refitting the human habitat to accommodate the car required vast capital expenditures that translated into jobs with rising wages and business activity with soaring profits. Real estate and construction boomed as the urban outlands were carved into homesites for the new automobile commuters. Small businesses sprouted to serve the new auto culture. Even more important, the techniques of the assembly line pioneered by Henry Ford quickly spread throughout American industry, resulting in a deluge of consumer goods, everything from toasters to radios, which were bought by those workers with their rising wages and businessmen with their soaring profits. But the system was booby-trapped and it would blow up in 1929. The first signs of trouble appeared out on the farms.

The automobile revolutionized farm life, in many ways for the better. But it also destroyed farming as a culture (agriculture) – that is, as a body of knowledge and traditional practices – and turned it into another form of industrial production, with ruinous consequences.

Around 1900, fully one third of the U.S. population lived on small family farms. They had grown increasingly isolated from the benefits of industrial civilization. Medical care often lay beyond their reach. One-room schools could be dismally inadequate and hard to get to. A simple trip to the nearest small town might be an ordeal. Henry Ford, himself a Michigan farmboy, had farmers very much in mind when he developed the Model T, and they were among his most avid early customers. They emptied their savings accounts and forsook indoor plumbing to buy cars. The Saturday trip to town became an overnight institution all over rural America. Farmers now had regular access to a society beyond their little hill or hollow, to libraries, to popular culture, to ideas besides those in the family Bible. The car also lightened work by serving as a mobile power plant around the farm. With its driving wheel jacked up, its engine could run other machines, saw firewood, pump water. The improvements in rural life even spawned the expectation that urban working folk, oppressed by dreary factory labor and fractious tenement living, would flock to the bucolic hinterlands. In fact, the opposite happened. The mechanization of the American farm disrupted the rural economy so badly that more farmboys fled to the cities than ever before and the farm population plummeted.

In 1918, Henry Ford sold his first Fordson tractor, superior to all other early makes and models of its day, and, like the Model T, the first tractor mass-produced for "the great multitudes" of farmers. By 1929, the number of tractors in use on American farms had leaped to nearly one million. The tractor was soon hitched up to an array of new and expensive accessory machines – reapers, seed drills, threshers, diskers, mowers. At the same time, having given up their horses, farmers suddenly came to depend on artificial fertilizers. What had formerly cost them a great deal of labor, mucking out stalls, collecting manure, now cost them a great deal of cash money.

What followed is a prime illustration of the principle that more is not necessarily better. Mechanized farming allowed farmers to boost their output tremendously in a very short period of time. But the demand for their commodities, the whole market structure, remained much the same as it had been before. Nor did the farmers themselves stop to rethink their economic strategy in terms of, say, product diversification and specialty crops. In fact, mechanized farming lent itself to monoculture, the growing of single crops on an ever-larger scale. Consequently, the supply of staple grains soared while the demand stayed relatively constant, and so prices plummeted. Catastrophically, the bottom fell out of American agriculture.

Meanwhile, the farmer was introduced to another new accessory: the mortgage. He suddenly needed to borrow cash each spring to buy the fertilizer that he formerly got for free, to buy pesticides to protect his monoculture crop (which could be wiped out by one kind of bug), and to purchase new mechanical equipment to increase his production to make up for the falling price of his commodities. The farmer now entered a precarious relationship with banks in which each year he literally bet the farm that he could bring in a profitable crop.

The long-term result was the death of the family farm in America, the replacement of agriculture by agribusiness. By 1940, the percentage of the population on farms fell to 23 percent, and by 1980 it had dwindled to 3 percent. In and of itself, this population shift might not have been a bad thing, but it was accompanied by another terrible cost. A way of life became simply a means of production. Human husbandry gave way to the industrial exploitation of land. Left behind was the knowledge of how to care for land, so plainly evinced in today's problems of soil erosion and in pollution from chemical pesticides and fertilizers.

The cycle of overproduction, debt, and foreclosure in the late 1920s was the first sign that accommodating a new technology like the motor vehicle in an established economy could be wildly disruptive. The farming situation would worsen in the 1930s, when cyclical droughts in the plains combined with new mechanical means of soil cultivation to produce the dust bowl, a disaster of Biblical dimensions.

The rest of the American economy soon followed the farmers into the Great Depression, and for much the same reason: the overproduction of consumer goods and the disruption of markets. By the early 1920s, it was obvious to a few of the saner individuals in the auto industry, like Charles Nash, that they were approaching the point of market saturation, where everybody who could buy a car, would have his or her car. By 1926, an industry survey estimated that only one third of the nation's auto dealers were making a profit. No sales gimmick could overcome the fact that the industry was now turning out far more vehicles than it could sell. Overseas markets would not absorb all that excess capacity. Europe had different needs, different social priorities, and car manufacturers of its own. The rest of the world might as well have been living in the twelfth century—no amount of salesmanship was going to put 300 million Chinese peasants behind the wheel.

The boom of the 1920s was based not simply on the steady sales of cars and other consumer products, but on a continual expansion of sales. When these leveled off, the results were disastrous. The building boom associated with the new automobile suburbs started fizzling in early 1928. There were only so many office managers, regional sales directors, and other business executives to buy those new "colonial" houses out in Lazy Acres. The bulk of American workers toiled in the very factories that were overproducing cars and electric waffle irons, and even before they were laid off in the Depression, few blue-collar workers could have afforded a new house in the suburbs and a car to drive there. They were the very ones who remained behind in the cities until after World War II.

The huge public expenditure in paving streets and building new highways had also reached a kind of natural limit in the late twenties; the basic infrastructure for cars was now in place. The slowdown in car and home sales and in road-building affected suppliers down the line: steel makers, tire makers, glassmakers, lumber companies, cement companies. The makers of smaller consumer products like waffle irons, having adopted the assembly-line methods of Ford, also ran up against the wall of market saturation. American industry had geared up for rates of production that could not be justified by flattened demand.

These events led to a fundamental crisis of capital. The boom had generated a tremendous amount of additional wealth for America's upper classes. This surplus money sought an outlet in investment, meaning stocks and bonds, or interest-bearing bank accounts. Through these avenues, the money would be put to use financing new enterprises and expanding existing ones. At least that was the accepted theory. But as sales of everything sagged in the late twenties, corporations quietly scrapped plans for further expansion and new equipment. Political denial, aggressive professional boosterism, and a climate of mendacity in the securities industry combined to float the illusion that the American economy was still expanding. The money that went into the stock market accumulated paper profits and inflated the stocks of companies that could no longer rationally grow. The banks, with no one coming for commercial loans, funneled their depositors' money into the stock market. The result was the runaway bull market of the late 1920s, a fantastic spiral into casino-style unreality that climaxed in the Black Tuesday crash of October 29, 1929.

[Kunstler then describes the massive public-works projects that revived the country economically, including development of the nation’s highways, Robert Moses’ bridges and highways in New York, the economic development programs of the New Deal, the Federal Housing Administration, the Second World War, and Eisenhower’s Interstate Highway System.]

Aside from being nearly killed by an assassin early in his first term, Ronald Reagan was the luckiest President of the century. The oil cartel fell apart while he was in office without America's having to do a thing. Greed, desperation, and a war between Iraq and Iran that spanned both of Reagan's terms foiled the oil cartel's ability to operate in concert and keep prices jacked up. The poorer oil nations, like Nigeria, tried to undersell the others, while the richest ones, like Saudi Arabia, could not resist the temptation to compete by overproducing, thereby glutting the market. Hence, the price of imported crude oil dropped steadily during Reagan's tenure, and he abandoned the alternate-energy research programs that Carter had started.

Reagan professed to believe literally in the fundamentalist Christian doctrine that the end of the world was at hand. At the very least, this should have called into question his concern for the nation's long-term welfare. Unburdened by such mundane cares, he cast aside all restraint in the pursuit of economic "growth," and financed the next phase of suburban expansion by encouraging the greatest accumulation of debt in world history. Why worry about borrowing from the future when you don't believe in the future? His government ran up an unprecedented public debt, his securities regulators allowed corporations to borrow absurd sums by issuing high risk "junk" bonds, and personal credit was extended to any shmo who could sign his name on a retail receipt, until an alarming percentage of ordinary citizens were in hock up to their eyeballs.

Reagan's bank deregulation and tax policies promoted gigantic and unnecessary land development schemes that benefited their backers even when the schemes failed by any normal standards. This is how it worked: a developer and a bank would get together to build a shopping mall outside Denver. The bank would take enormous fees off the top of the total investment, as a reward for its participation. The development company would pay itself a large fee up front for supervising the construction. The money invested in the project would have come from federally insured bank deposits. For any number of reasons the shopping mall might fail to attract enough retail tenants, or customers – too close to other established malls, too far from population centers, whatever – and go out of business. When the tumbleweeds blew through the empty parking lot, the banker could feel perfectly secure knowing that the deposits he had thrown away on a foolish venture would be fully repaid by the U.S. Treasury; while the developer's corporation would seek protection under the bankruptcy laws, and the developer himself would hold onto his personal fortune without liability – minus, perhaps, a few thousand dollars in criminal fines and maybe a term in the slammer. Deals like this happened so many times in the eighties that the whole rotten fabric of bad loans and fraudulent ventures threatened to bankrupt the federal deposit insurance system.

Reagan's energy policy was likewise predicated on the idea that the future didn't matter. By the time he left office, America was importing more than half of its oil, and as long as the price was reasonable, why make a fuss? Meanwhile, try extracting as much petroleum as possible from American fields, wherever they might be: on national park lands, off the California coast, the Alaskan tundra.

Reagan's "voodoo economics" was a strategy to keep the game going at the expense of the future – the game being an economy based on unlimited automobile use and unrestrained land development. By the time Reagan was replaced by George Bush – who had invented the term "voodoo economics" only to become its chief practitioner – the future had arrived and was calling in its notes. Major banks collapsed, largely as a result of foolish real estate ventures. Millions of square feet of office and retail space stood vacant across America. Unemployment climbed, especially in the building trades, and because of the gross overbuilding of the eighties, the government could not artificially stimulate more new construction with the same old tricks. Car sales plummeted – and anyway, half of American drivers owned foreign cars by now. In fact, most of the things that Americans bought, period, were manufactured elsewhere. The nation had entered what was being referred to as a postindustrial economy, but so far it was unclear exactly what that meant – perhaps people selling hamburgers and movie tickets at the mall to employees of other malls on their days off. This was a patent absurdity, of course, but without industries of some kind in America, the prospects for maintaining the consumer economy at the accustomed level seemed rather dim.

The known global reserves of petroleum are expected to last roughly another thirty years. This means that in the lifetimes of most Americans living today, the essential fuel that has powered the suburban consumer way of life will no longer be available. It will not be necessary to run out of petroleum in order to fatally disrupt a petroleum-dependent economy. As the 1970s oil shocks demonstrated, all that it takes to mess things up is some instability of supply and price, and surely we will reach that stage before the wells run dry. Despite a lot of wishful thinking, and a near-religious belief in the "magic of technology," there is no alternative in sight to the internal combustion engine that the masses of motorists could afford.

In any case, by the late 1980s the Great Enterprise of an endless suburban expansion finally crashed up against the ultimate natural limit: Researchers discovered that the burning of fossil fuels was altering the earth's atmosphere so drastically that a projected "global warming" effect could melt the polar ice caps, flood low-lying areas where most of the world's population lived, and destroy world agriculture by disrupting weather patterns, all within the next sixty years.

To Americans, it must seem like a scenario from a 1950s horror movie: The Day the Earth Caught Fire! To contemplate it in the comfort of an air-conditioned sedan, cruising up Interstate 87, pleasured by iced drinks and packaged snacks, must add to the unreality. But the joyride is over. What remains is the question of how we can make the transition to a saner way of living. To do so will certainly require a transformation of the physical setting for our civilization, a remaking of the places where we live and work.

America has now squandered its national wealth erecting a human habitat that, in all likelihood, will not be usable very much longer, and there are few unspoiled places left to retreat to in the nation's habitable reaches. Aside from its enormous social costs, which we have largely ignored, the whole system of suburban sprawl is too expensive to operate, too costly to maintain, and a threat to the ecology of living things. To lose it is tragic not because Americans will be deprived of such wonderful conveniences as K Marts and drive-in churches – we can get along, happily without them – but because it was a foolish waste of resources in the first place, and it remains to be seen whether its components can be recycled, converted to other uses, or moved, or even whether the land beneath all the asphalt, concrete, and plastic, can be salvaged. In the meantime, Americans are doing almost nothing to prepare for the end of the romantic dream that was the American automobile age.

This is a good place to consider in some detail why the automobile suburb is such a terrible pattern for human ecology. In almost all communities designed since 1950, it is a practical impossibility to go about the ordinary business of living without a car. This at once disables children under the legal driving age, some elderly people, and those who cannot afford the several thousand dollars a year that it costs to keep a car, including monthly payments, insurance, gas, and repairs. This produces two separate classes of citizens: those who can fully use their everyday environment, and those who cannot.

Children are certainly the biggest losers – though the suburbs have been touted endlessly as wonderful places for them to grow up. The elderly, at least, have seen something of the world, and know that there is more to it than a housing subdivision. Children are stuck in that one-dimensional world. When they venture beyond it in search of richer experience, they do so at some hazard. More usually, they must be driven about, which impairs their developing sense of personal sovereignty, and turns the parent – usually Mom – into a chauffeur.

The one place outside the subdivision where children are compelled to go is school. They take buses there – a public transit system that operates at huge expense, is restricted to children, and runs only twice a day. Even if children happen to live relatively close to school, there is a good chance that it would not be safe for them to travel there on foot or by bicycle. This is because the detailing of the streets is so abysmal. By detailing, I mean all the big and little design considerations, including the basic dimensions, that make for good relationships between the things along the street, between the things that streets are supposed to connect, and between people's different uses, as, say, between motorists and pedestrians. For example: what are the building setbacks? Can cars legally park alongside the street? Will there be sidewalks, trees, benches where people can rest or simply enjoy the public realm? Will there be lighting, trash baskets, plantings, et cetera?

The suburban streets of almost all postwar housing developments were designed so that a car can comfortably maneuver at fifty miles per hour – no matter what the legal speed limit is. The width and curb ratios were set in stone by traffic engineers who wanted to create streets so ultrasafe (for motorists) that any moron could drive them without wrecking his car. This is a good example of the folly of professional overspecialization. The traffic engineer is not concerned about the pedestrians. His mission is to make sure that wheeled vehicles are happy. What he deems to be ultrasafe for drivers can be dangerous for pedestrians who share the street with cars. Anybody knows that a child of eight walking home from school at three o'clock in the afternoon uses a street differently than a forty-six-year-old carpet cleaner in a panel truck.

Today, we have achieved the goal of total separation of uses in the man-made landscape. The houses are all in their respective income pods, the shopping is miles away from the houses, and the schools are separate from both the shopping and the dwellings. Work takes place in the office park – the word park being a semantic gimmick to persuade zoning boards that a bunch of concrete and glass boxes set among parking lots amounts to a rewarding environment – and manufacturing takes place in the industrial park – ditto. This has some interesting, and rather grave, ramifications.

The amount of driving necessary to exist within this system is stupendous, and fantastically expensive. The time squandered by commuters is time that they cannot spend with their children, or going to the library, or playing the clarinet, or getting exercise, or doing anything else more spiritually nourishing than sitting alone in a steel compartment on Highway 101 with 40,000 other stalled commuters. Anybody who commutes an hour a day in each direction spends seven weeks of the year sitting in his car.
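[A note on the arithmetic: Kunstler's "seven weeks" figure is roughly right if one counts waking time rather than clock time. A minimal check, assuming a two-hour daily round trip on every day of the year and a sixteen-hour waking day – both assumptions mine, not Kunstler's:]

```python
# Sanity check of the "seven weeks a year in the car" claim.
# Assumptions (mine, not from the text): the commuter drives every day
# of the year, and a "week" is measured in waking hours.
hours_per_day = 2                  # one hour each way
days_per_year = 365
waking_hours_per_week = 16 * 7     # sixteen waking hours per day

commute_hours = hours_per_day * days_per_year      # 730 hours
weeks = commute_hours / waking_hours_per_week      # about 6.5

print(f"{commute_hours} hours, or about {weeks:.1f} waking weeks per year")
```

[Counting workdays only (roughly 250 per year) still yields about four and a half waking weeks – a staggering amount of time either way.]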

The costs of all this driving in terms of pollution, which includes everything from increased lung diseases all the way up to global warming, are beyond calculation. The cost to society in terms of money spent building and maintaining roads and paying for traffic police, courts, accidents, insurance, is also titanic. The least understood cost – although probably the most keenly felt – has been the sacrifice of a sense of place: the idea that people and things exist in some sort of continuity, that we belong to the world physically and chronologically, and that we know where we are.

The extreme separation and dispersion of components that used to add up to a compact town, where everything was within a ten-minute walk, has left us with a public realm that is composed mainly of roads. And the only way to be in that public realm is to be in a car, often alone. The present arrangement has certainly done away with sacred places, places of casual public assembly, and places of repose. Otherwise, there remain only the shopping plazas, the supermarkets, and the malls. Now, American supermarkets are not designed to function like Parisian cafes. There is no seating, no table service. They do not encourage customers to linger. Yet some shoppers will spend as much time as their dignity affords haunting the supermarket aisles because it is practically the only place where they can be in the public realm and engage in some purposeful activity around other live human beings. Here they even stand the chance of running into someone they know. A suburbanite could stand on her front lawn for three hours on a weekday afternoon and never have a chance for a conversation.

This vacuum at the center of American life led to the phenomenon of shopping malls. Of course, the concept of a marketplace was hardly new, and large marketplaces under a roof have existed in history too. But the marketplace had always been a public space, part of the fabric of the town, usually at the heart of it, existing in continuity with the rest of town life. By the 1970s, when malls started to multiply across the land, the public realm had been pretty much eliminated from the American scene. Yet that hunger for public life remained. The mall commercialized the public realm, just as the insurance business commercialized fate. What had existed before in an organic state as Main Street, downtown shopping districts, town squares, hotel lobbies, public gardens, saloons, museums, churches, was now standardized, simplified, sanitized, packaged, and relocated on the suburban fringe in the form of a mall. Well, what was so bad about that?

Quite a number of things, actually. For one, the mall existed in isolation, connected to everything else only via the road, and the road was often the type of multilane highway that a pedestrian or bicycle rider might use only at peril – in short, you needed a car to get there. People without cars were just out of luck. It was ironic too, because one reason that people flock to malls was that, once inside, they didn't have to look at all the goddam cars and be reminded of what a depressing environment they lived in.

For another thing, the mall wasn't really a public space. It was a private space masquerading as a public space. Sure, people were free to come and go (during shopping hours), and they were not charged an admission fee to enter, but in reality, they were the guests of the Acme Development Company, or whoever owned the mall. The developer was also entitled to control all the activities that went on inside the mall. This meant no free speech, no right of assembly. In a nation as politically complacent as the United States in the 1970s and '80s, this might seem trivial. But imagine if America got involved in another war as unpopular as Vietnam, and the political temperature rose. Or if our dependence on cheap oil started to cause political problems. Acme Development might not be so tolerant about political rallies held around their philodendron beds, or protest marchers interfering with sales at the Pet-O-Rama shop. Where, then, are you going to have your public assembly? On the median strip of Interstate 87?

Thirdly, the real Main Streets of America developed organically over time, and included both the new and the old, the high rent and low rent. Out at the mall, all rents were necessarily high because of the high cost of construction, maintenance, heat, and air conditioning. The only merchants who could afford such rents, it turned out, were the large chain-store operations – the Radio Shacks, the Gaps, the Footlockers – who had the financial muscle and the proven sales volume to enter into long-term leases. Invariably, these chain stores destroyed local businesses outside the mall, and in so doing they destroyed local economies. The chain stores' profits were funneled to corporate headquarters far away. The chains gave back nothing to the locality except a handful of low-wage service jobs. Since the people who worked in the mall stores were not the owners of the stores, they did not have a long-term stake in their success or failure, and so they had limited incentives to provide good service.

It remains to be seen how the shopping malls of America might evolve over time. The conditions under which they flourished – cheap energy, cars for everyone, a credit-driven consumer economy, special tax breaks for big real estate ventures – may be viewed as abnormal and transitory, a fragile equation that could fall apart like a house of cards if any of the factors changed – for example, if gasoline prices go up enough to erase the profit margins of mass retailers; or if citizens have to establish credit-worthiness before banks issue them charge cards; or if more banks themselves fail. Only one thing is certain: The malls will not be new forever. And none of them were built for the ages.

The public realm suffered in another way with the rise of the automobile. Because the highways were gold-plated with our national wealth, all other forms of public building were impoverished. This is the reason why every town hall built after 1950 is a concrete-block shed full of cheap paneling and plastic furniture, why public schools look like overgrown gas stations, why courthouses, firehouses, halls of records, libraries, museums, post offices, and other civic monuments are indistinguishable from bottling plants and cold-storage warehouses. The dogmas of Modernism only helped rationalize what the car economy demanded: bare bones buildings that served their basic functions without symbolically expressing any aspirations or civic virtues.

[My home town of Spartanburg illustrates this point well.  The “new” US Post Office, built perhaps about 1960, is an ugly white box, with a barren parking lot in front.  The front lawn of my old high school, which is now a “senior resource center,” is now a parking lot.]

The Greek Revival merchants' exchanges and courthouses of the early 1800s symbolically expressed a hopeful view of democracy, a sense of pride and confidence in the future, and significant public expense went into that expression. Public buildings such as the Philadelphia water works or Jefferson's Virginia state capitol at Richmond were expected to endure for generations, perhaps centuries, as the Greek temples had endured since antiquity. These earlier American building types were set in a different landscape, characterized by respect for the human scale and a desire to embellish nature, not eradicate it. Try to imagine a building of any dignity surrounded by six acres of parked cars. The problems are obvious. Obvious solution: Build buildings without dignity.

This is precisely the outcome in ten thousand highway strips across the land: boulevards so horrible that every trace of human aspiration seems to have been expelled, except the impetus to sell. It has made commerce itself appear to be obscene. Traveling a commercial highway like Route 1 north of Boston, surrounded by other motorists, assaulted by a chaos of gigantic, lurid plastic signs, golden arches, red-and-white-striped revolving chicken buckets, cinder-block carpet warehouses, discount marts, asphalt deserts, and a horizon slashed by utility poles, one can forget that commerce ever took place in dignified surroundings.

There is no shortage of apologists for the ubiquitous highway crud.  The self-interest of its promoters in the highway, auto, and construction lobbies is obvious.  Harder to understand are its boosters in academia – for instance, John Brinckerhoff Jackson….

Thus, a Jacksonian student of landscape can observe a Red Barn hamburger joint, he can remark on its architectural resemblance to certain farm structures of the past, measure its dimensions, figure out the materials that went into building it, record the square footage of its parking lot, count the number of cars that come and go, the length of time that each customer lingers inside, the average sum spent on a meal, the temperature of the iceberg lettuce in its bin in the salad bar—all down to the last infinitesimal detail—and never arrive at the conclusion that the Red Barn is an ignoble piece of shit that degrades the community.

The Auto Age, as we have known it, will shortly come to an end, but the automobile will still be with us. Whatever the fate of the petroleum supply, there will be cars and trucks around in any plausible version of the future. They may be smaller, leaner, slower. They will likely be more costly to operate. They may run on electricity, hydrogen, vodka, cow flops, or something we do not know about yet. We will almost surely have proportionately fewer of them per capita, so that each and every adult is not oppressed by the expense of maintaining one. Possibly only the rich will be able to own cars, as in the early days of motoring; the rest of us may rent one when we need it, for a day in the country or a vacation. This was how society managed with the horse and carriage for many decades, and this is how many Europeans still manage with cars. If we are lucky, and wise, and can intelligently redesign our towns to eliminate the absolute need to drive everywhere for everything, and give up some of our more idiotic beliefs about what comprises a Good Life – such as the idea of speed for its own sake, which is practically a religion in America – we may possibly be able to adapt to the new realities without a lot of political trouble.

It is quite possible to have streets that accommodate the automobile and are still charming, as long as you observe some elementary rules, respect the presence of humans, and pay attention to details. The Miami-based architect and urbanist Andres Duany demonstrates this very nicely in his lecture on principles.  Duany shows two slides. The first is an American-style elevated urban megafreeway: twelve lanes jam-packed with cars (presumably moving at a rush-hour crawl) amid a desolate featureless landscape. The second is a Parisian boulevard, which also happens to contain twelve lanes and yet exists within a very gratifying and richly detailed urban environment.

The deficiencies of the American urban freeway are immediately apparent. It is designed solely from the vantage of the traffic engineer. It is monofunctional. Its only purpose is to move cars – and it does not even perform that function very well at certain times of day. No other activity can go on at its margins. It does not respect the presence of humans without automobiles, cyclists and pedestrians, who, in any case, are forbidden by law to be there. The freeway is not part of the urban fabric. Rather, it is superimposed upon it, often physically perched above the city on trestles, as the Southeast Expressway floats above downtown Boston, or else slicing through the city below grade, like the Fisher Freeway in Detroit. When it defines urban spaces, it does so only in a crude and disruptive way, creating "Chinese walls" of noise, danger, and gloom that cut off neighborhoods from each other – as Boston's North End is cut off from the rest of town, and Manhattan is cut off from the East River by the FDR Drive. Being "limited-access" roadways by definition, these freeways connect with the city only at infrequent intervals, and so they embody the discontinuity that afflicts the present urban arrangement.

Observe how the Parisian boulevard behaves. In the center of the boulevard are several express lanes for fast-moving traffic. At each side of the express lane is a median island planted with trees. These medians define an outer slow lane on each side of the boulevard for drivers looking for a local address. There is space for parking along both sides of each median island and along the sidewalk. Finally, the outer edges of the sidewalks are planted with formal, orderly rows of trees. In other words, you have a twelve-lane road in which half the lanes are used for parking and the rest for moving cars at two different speeds, express and local.

Thus, the boulevard is part of the urban fabric of the city. It celebrates the idea of the city as a place with value, a place where a human being would want to be, not just a one-dimensional office slum to be fled after the hours of work. It defines space in a way that allows for multiple functions: motoring, strolling, shopping, business, apartment living, repose. The subtleties of its design make all the difference. It can accommodate twenty parked cars for every fifty linear feet of boulevard, eliminating the need for parking lots. The cars parked along these edges serve another crucial function: they act as a buffer – both physically and psychologically – between the human activities on the sidewalk and the hurtling cars in the express lanes. The two rows of trees per side (four in all) provide additional cushioning. This system works so well that Parisian boulevards are typically lined with outdoor cafes, full of people relaxing in comfort and security. Imagine sitting at a little round table in the breakdown lane of the Santa Monica Freeway at 5:30 in the afternoon.
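[The parking figure is consistent with the layout described: six of the twelve lanes given over to parallel parking yields about twenty cars per fifty linear feet. A quick check, with the 15-foot length of a parallel space my assumption, not Kunstler's:]

```python
# Check of "twenty parked cars for every fifty linear feet of boulevard".
# Assumption (mine): a parallel-parking space runs about 15 feet.
parking_lanes = 6          # half of the twelve lanes, per the text
segment_feet = 50
space_feet = 15            # typical parallel space (assumed)

cars = parking_lanes * (segment_feet / space_feet)
print(f"about {cars:.0f} cars per {segment_feet} feet")   # about 20
```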

The existence of back-alley dwellings allowed poor people to live throughout the city – not just in ghettos reserved exclusively for them – cheek-by-jowl with those who were better off, who were often their employers and landlords. They were part of the neighborhood and accepted as a presence there. The children of the poor saw how sober and responsible citizens lived. They saw something tangible to aspire to in adult life. And mixed into a neighborhood of law-abiding property owners, who knew them, the poor did not indulge in the kind of tribal violence that plagues them today.

Alleys are now verboten. The official reason is that a classic alley is not wide enough to accommodate a fire truck. To make it wide enough for a fire truck, and therefore acceptable under the official codes, would make it the size of a regular street—and then it wouldn't be an alley anymore. The lack of alleys and their ancillary buildings has contributed greatly to the segregation of people by income groups, so that the well-off are all by themselves in their leafy neighborhoods – even their servants must drive to work! – and the poor are off in their own ghettos – meaning the inner cities, because that's where the only cheap rentals are.

Without alleys, garages have moved to the front of the house in America. As a matter of design, the garage in front of the house is a disaster. The gigantic door presents a blank wall to the street. Tarted up with extra moldings or a checkerboard paint job, it can look even worse by drawing more attention to itself. Anyway, it is inescapably the dominant feature of the house's front. And if it takes up a third of the facade, which is often the case, then it disfigures what remains, no matter how elegant. Moreover, when you consider that every house on the street has a similar gaping blank facade, you end up with a degraded street as well as a degraded architecture.

That the parking lots are so much bigger than they need to be is a result of several things. First, the zoning laws in this part of town ordain a minimum lot size. Your business has to occupy a lot of at least one quarter acre, whether it is a hot dog stand or a car dealership, which makes for a lot of dead space between business establishments. In effect, it mandates the same relationships between buildings as you would find in a suburban subdivision.

The zoning laws also require deep setbacks from the street, from the side property lines, and from the rear lot line, which encourages placing the building in the exact center of the lot with parking all around. The parking area, naturally, is paved with asphalt. Landscaping costs more to install and creates obstructions for motorists, not to mention maintenance headaches. So paving the whole lot is the easiest and least expensive solution, whether you need all the parking or not. Empty parking lots are the most common little dead noplaces of the postwar streetscape. Great big noplaces are made up of many little noplaces.

It's a bad practice done partly out of misguided good intentions. Businesses in this part of town are required by the codes to have a minimum number of parking spaces. There is no on-street parking here on South Broadway and the customers' cars have to go somewhere. So the business is expected to make provision. This is a good intention, though it results in a place that is unfriendly to pedestrians and oppressive to look at.

The mobility that Americans prize so highly is the final ingredient in the debasement of housing. The freedom to pick up and move is a premise of the national experience. It is the physical expression of the freedom to move upward socially, absent in other societies. The automobile allowed this expression to be carried to absurd extremes. Our obsession with mobility, the urge to move on every few years, stands at odds with the wish to endure in a beloved place, and no place can be worthy of that kind of deep love if we are willing to abandon it on short notice for a few extra dollars. Rather, we choose to live in Noplace, and our dwellings show it. In every corner of the nation we have built places unworthy of love and move on from them without regret. But move on to what? Where is the ultimate destination when every place is No-place?

The great suburban build-out is over. It was wonderful for business in the short term, and a disaster for our civilization when the short term expired. We shall have to live with its consequences for a long time.

The chief consequence is that the living arrangement most Americans think of as "normal" is bankrupting us both personally and at every level of government. This is the true meaning of the word deficit, which has resounded so hollowly the past ten years as to have lost its power to distress us. Now that we have built the sprawling system of far-flung houses, offices, and discount marts connected by freeways, we can't afford to live in it. We also failed to anticipate the costs of the social problems we created in letting our towns and cities go to hell.

A further consequence is that two generations have grown up and matured in America without experiencing what it is like to live in a human habitat of quality. We have lost so much culture in the sense of how to build things well. Bodies of knowledge and sets of skills that took centuries to develop were tossed into the garbage, and we will not get them back easily. The culture of architecture was lost to Modernism and its dogmas. The culture of town planning was handed over to lawyers and bureaucrats, with pockets of resistance mopped up by the automobile, highway, and real estate interests.

The average citizen – who went to school in a building modeled on a shoe factory, who works in a suburban office park, who lives in a raised ranch house, who vacations in Las Vegas – would not recognize a building of quality if a tornado dropped it in his yard. But the professional architects, who ought to know better, have lost almost as much ability to discern the good from the bad, the human from the antihuman. The consequence of losing our planning skills is the monotony and soullessness of single-use zoning, which banished the variety that was the essence of our best communities. Most important, we have lost our knowledge of how physically to connect things in our everyday world, except by car and telephone.

You might say the overall consequence is that we have lost our sense of consequence. Living in places where nothing is connected properly, we have forgotten that connections are important. To a certain degree, we have forgotten how to think. Doesn't this show in our failure to bring these issues into the political arena? There is a direct connection between suburban sprawl and the spiraling cost of government, and most Americans don't see it yet, including many in government. Likewise, there is a connection between disregard for the public realm – for public life in general – and the breakdown of public safety.

These issues will not enter the public discourse until something of a paradigm shift occurs in American society. By paradigm, I mean a comprehensive world view shared by a critical mass of citizens. At any given time, enough people agree upon a particular model of reality and do whatever is necessary to sustain it. Ideas themselves may evolve slowly or rapidly and credible proofs may lag behind hypotheses. But a collective world view is made up of many ideas, all operating dynamically, and when the consensus about what they all add up to is shaken, the result can be convulsive social change. Enough people move to one side of the raft and suddenly the whole thing flips over. The rapid demise of Leninist communism as a believable model of economic reality is an example.

When I suggest that something similar may happen here, I do not anticipate the demise of capitalism. Capitalism in some form is likely to endure, whatever its shortcomings, for it is the only way known for managing accumulated material assets. I do foresee a necessary change, however, in our effort to create a capitalist economy appropriate to our circumstances – namely, a sustainable economy as opposed to our present exhaustive economy. And we can't have a sustainable economy unless we build a physical setting to house it. The physical setting we presently dwell in itself exhausts our capital. It is, in fact, the biggest part of the problem. The future will require us to build better places, or else the future will belong to other people and other societies.

Even after 1990, when the savings and loan catastrophe left the commercial real estate market in shambles, and the American economy began to slide into a malaise resembling the Great Depression, developers were still building some major projects in the same old foolish manner: single-family detached homes on half-acre lots out in the hills, mini-malls along the connector roads, accountants' offices out in the old cornfields. But these are the mindless twitchings of a brain-dead culture, artificially sustained by the intravenous feeding of cheap oil. Indeed, the continuation of a cheap oil supply through the 1980s – a temporary quirk of politics and history – has been a disaster, allowing us to postpone the necessary redesign of America.

The longer we fail to act on this redesign of our everyday world, the more mired we are apt to become in a psychology of previous investment. So much of the nation's wealth is tied up in badly designed communities, inhuman buildings, and commercial highway crud that we cannot bring ourselves to imagine changing it. But time and circumstances will change our ability to use these things, whether we choose to think about it or not. What will become of all the junk that litters our landscape?

Things that were built in absurd locations, like the vast housing tracts outside Los Angeles on the fringe of the Mojave Desert, may have to be abandoned. Ill-conceived building types, such as the vertical slum housing projects of the big cities, will have to be demolished on a wholesale basis. Today's posh suburbs could easily become tomorrow's slums. If it seems unthinkable, go to Detroit and check out the square miles of mansions-turned-slums off Woodward Avenue. It is a good bet that many of the suburban office buildings put up in the 1980s will never be used as intended – by corporate enterprises that employ hundreds of drones in the arrangement known as "back office" – and what they might be converted to, if anything, is anybody's guess. Many houses and shopping plazas built in the postwar era were so poorly constructed in the first place that they will reach the end of their "design life" before they might be eligible for reuse. In all, a lot of property is apt to lose its value.

There are some things we can predict about the physical arrangement of life in the coming decades. The most obvious is that we will have to rebuild our towns and cities in order to have any kind of advanced economy at all. In fact, this enterprise may turn out to be the engine that powers our economy for years to come, much the same way that the suburban build-out did – with results, one hopes, of more lasting value. To accomplish it, we will have to reacquire the lost art of town planning and radically revise the rules of building, especially the zoning codes that impoverish our present townscapes.

This implies that we shall have to give up mass automobile use. By this, I do not mean an end to all cars but rather, that every individual adult need not make a car trip for every function of living: to go to work, to buy clothes, to have a drink; that every adult need not be compelled to bear the absurd expense of car ownership and maintenance as a requisite of citizenship. The adjustment may be painful for a nation that views car ownership as the essence of individual liberty. Indeed, it is estimated that one sixth of all Americans make their living off of cars in one way or another. But the future will require us to make this adjustment. If we are wise, we will enjoy the compensations of an improved civic life that rehabilitated towns and cities can provide.

Reviving our towns and cities also offers a chance to rehabilitate our countryside. In the future, when we practice a different kind of agriculture than the heavily subsidized, petroleum-intensive, single-crop system we follow today, farming may be down-scaled and regionalized, more food grown and consumed locally. American farmers may learn to produce value-added products – for example, cheese – instead of supplying tons of raw milk to distant cheese factories. European farmers produce an array of value-added products from champagne to Parma hams. They have a richer agriculture and a richer food culture. They even live more comfortable, civilized, middle-class lives than many American farmers, though they operate on a far smaller scale.

Having turned farming into just another industrial enterprise, Americans have lost the culture of agriculture. Where I live there are still dozens of dairy farms in operation. On hardly any of them will you find a household vegetable garden. The farmers have vinyl swimming pools in their side yards, recreational vehicles parked next to the house, motorcycles, TV satellite dishes, but no gardens. Like the rest of us, they get their food at the supermarket….


Appendix M. Excerpts from James Howard Kunstler’s Home from Nowhere (Touchstone, 1996)

Rather than being a classless society, we are very much a class-conscious society, perhaps class-obsessed. It is probably in the nature of human intelligence that we tend to divide ourselves into social categories. The more complex our societies, the more criteria we bring to bear on the question and the more increments of status we perceive. The late Soviet system, for instance, supposedly based on the abolition of class, probably had more subtle distinctions of bureaucratic rank and privilege than the British peerage at the height of empire. American democracy has hardly vanquished the human tendency to make social distinctions. No people on earth brag so much about their equality and no people spend so much time and energy trying to prove that they are better than the next guy.

We like to think that democracy is the glue that holds our society together. Paradoxically, it may act more as a centrifugal force flinging us away from the center and driving us apart, much the same way that the automobile operated on the city – so it is easy to understand why the automobile suits the American spirit so perfectly. In our current national folklore, democracy exists supposedly as a system solely devoted to promoting individual liberty, the right to be left alone to pursue happiness in one's own way, to do whatever we please as we please. Any connection to some idea of the public interest is now severed. This is the position of today's "property rights" extremists, who wish to abolish all attempts to regulate land use. It seems to me that such an extraordinary view of democracy is essentially absurd and cannot sustain communities. Rather, it undermines communities and the institutions they contain, since democracy only has meaning as an organizing principle for the group, not as a shibboleth for individuals living in a vacuum. The current popular conception of democracy, therefore, finds physical expression not in neighborhoods, towns, or cities but only in individual homesteads. This is the meaning behind the other monster of our reigning zeitgeist: the American Dream, the antidote to the industrial city.

The railroad was the enabling mechanism. Both Llewellyn Park and Riverside were made possible by commuter links to New York and Chicago. In time, small commercial nodes formed around the commuter depots of railroad suburbs like these, though in the years before World War Two major retail and everything else remained centralized in the city, supported by an elaborate delivery network of railway express agencies. From its inception the chief characteristic of the American suburb was not of an organically real town, nor a civic place, but a place of fantasy and escape. The notion of life it expressed had the further psychological repercussion of enabling those who lived in such suburbs to believe that they could conduct their industrial activities without suffering any of the unpleasant consequences these activities typically entailed. Along with the suburb itself as a physical artifact, this notion of freedom from the consequences of one's social behavior has also persisted in the mental life of Americans. If anything, it has only become more gross and elaborate over time, so that today millions of Americans are employed in all sorts of destructive enterprises – killing other people's local economies, wrecking towns and cities with inappropriate "development," paving over rural landscapes, ruining ecosystems – without the dimmest sense of remorse or responsibility, returning at night to their homesteads in an artificial wilderness, and the blue light of an electronic hearth with its diverting and reassuring imagery.

As time went on, and ever-larger classes of Americans evacuated the cities, the basic suburban pattern of residential enclaves without the other equipment of civic life persisted and multiplied. The addition of the automobile to the suburban program aggravated its imbalances, ultimately to a grotesque degree. (More about that later.) It is interesting to note, though, how the concept of the American Dream mutated from a set of ideas about liberty to the more explicit notion of a suburban house as the material reward for sacrifice and honest toil. That mutation occurred in the years after World War Two, spurred by our sudden stupendous affluence and egged on by the advertising industry, especially by its operatives in televised political campaigning. Precisely when the suburban equation finally began to fail in the 1980s, when the average price of the suburban house began to exceed the ability of the average family to buy one, this notion of the American Dream mutated once again from a reward into an entitlement, something that the American Way of Life owed to the average citizen as a kind of birthright.

I began this chapter by saying that history is merciless. It is also perverse and ironic. History's first rule is that everything changes. Human beings, on the other hand, like to believe that good things should last forever. When they discover the truth, they often feel cheated, like children who discover there is no Santa Claus. This is the case with the extraordinary postwar economy that boomed and boomed and allowed Americans to build the hugely expensive drive-in fantasy world that is the mature auto suburb. That economy is now fading into history. Perversely, as soon as its demise became manifest, Americans began to insist that they were entitled to it and all its goodies forever, because ... well, because we're number one! As a result, we now find a strong undercurrent of political grievance among those who once were termed middle class and who suddenly realize that the American Dream in the form of a suburban house with two cars has been denied to them. This sense of grievance is apt to build and grow uglier as suburbia and its trappings become increasingly unaffordable to an ever-broader class of Americans.

The idea of a modest dwelling all our own, isolated from the problems of other people, has been our reigning metaphor of the good life for a long time. It must now be seen for what it really is: an antisocial view of human existence. I don't believe we can afford to keep pretending that life is a never-ending episode of Little House on the Prairie. We are going to have to develop a different notion of the good life and create a physical form that accommodates it.

Main Street USA is America's obsolete model for development – we stopped assembling towns this way after 1945. The pattern of Main Street is pretty simple: mixed use, mixed income, apartments and offices over the stores, moderate density, scaled to pedestrians, vehicles permitted but not allowed to dominate, buildings detailed with care, and built to last (though we still trashed it). Altogether it was a pretty good development pattern. It produced places that people loved deeply. That is the reason Main Street persists in our cultural memory. Many people still alive remember the years before World War Two and what it felt like to live in integral towns modeled on this pattern. Physical remnants of the pattern still stand in parts of the country for people to see, though the majority of Americans have moved into the new model habitat called Suburban Sprawl.

For all its apparent success, Suburban Sprawl sorely lacks many things that make life worth living, particularly civic amenities, which Main Street offered in spades. Deep down, many Americans are dissatisfied with suburbia – though they have trouble understanding what's missing – which explains their nostalgia for the earlier model. Their dissatisfaction is literally a dis-ease. They feel vaguely and generally un-well where they are. Nostalgia in its original sense means homesickness. Americans essay to cure their homesickness with costly visits to Disney World. The crude, ineffective palliatives they get there in the form of brass bands and choo-choo train rides leave them more homesick and more baffled as to the nature of their disease than when they arrived – like selling chocolate bars to someone suffering from scurvy – and pathetically, of course, they must return afterward to the very places that induce the disease of homesickness.

In reality, people search for some way to make themselves useful and are rewarded with pay. The will to make oneself useful must precede the finding of a rewarding situation. And such situations are rarely tendered to the unwilling. It would be nice to live in a world where everybody could be a brain surgeon, a movie star, or a starting player in the NFL. But those positions are in short supply, and most of us must seek other avenues of usefulness. This seeking usually entails much trial and error, which forms the basis of our experience and, in theory, improves our character. And so we make our ways in the world, commonly upward from some lower situation to increasingly better ones.

Under the present political psychology none of this is considered necessary. The government is supposed to promote the supply of a commodity called jobs and also the training for these jobs, which is somehow extrinsic both to the institution that goes under the name of regular schooling, and the personal will of each individual to be useful. In reality, the leap from functional illiteracy and chronic idleness to a position as even a sales clerk seems implausible. To expect private employers to happily hire multiple-conviction felons is also probably asking too much. A troubling aspect of the problem is that menial labor is now beneath all Americans, including those who have the skills or ambitions to do nothing else. Much of what is called menial labor really involves the caretaking of things, places, and persons, and it is especially sad that there is so much to take care of in this country with nobody willing to do it.

It may strike some readers as an unbelievable effrontery to state that the poor ought to work in menial jobs. I am not arguing that they ought to live in violence and squalor – just the opposite. Before World War Two this was a nation full of menial employments, and many people so employed lived more decently than today's poor do, particularly in the cities where, for all the cities' historic shortcomings, the poor at least had easy access to a great deal of cultural and civic equipment. Poor people may have lived in cramped tenements in 1911, but they had access to well-maintained parks, low-cost public transit, safe streets, free public schools, excellent public libraries, museums, baths, and infirmaries. Most important, this civic equipment was shared by everybody. People of all stations in life went to parks, museums, and libraries. The poor saw the middle class and the wealthy every day in the public realm of the streets. They observed their behavior, and were constrained in their own behavior by seeing them. The poor saw where the rich lived. A boy from Hell's Kitchen could walk ten minutes across town and stand within a few yards of William H. Vanderbilt's front door on Fifty-ninth Street and Fifth Avenue with no fear of being hassled by private security guards. In short, the poor lived in a civic context that included the entire range of social classes, so that many of the problems of the poor in the cities were also the problems of the middle class and the rich.

Today the poor in most American cities live only in the context of the poor. The only place they see the other America is on television, and then through a wildly distorting lens that stimulates the most narcissistic, nihilistic consumer fantasies. Since the poor, by definition, can't participate fully in consumer culture, the predictable result is rage at what appears to be a cruel tease, and this rage is commonly expressed in crime. What may be equally damaging is that the poor see very little in the way of ordinary polite conduct, very little civil behavior. They do not see people routinely going about honorable occupations. What they do see all around is mayhem, squalor, and disorder, and almost no evidence that it is possible to live a happy life without being a sports hero, a gangster, or a television star.

The problems of the cities are not going to be relieved unless the middle class and the wealthy return to live there. For the moment these classes are off in suburbia, inhabiting those little cabins in the woods grouped together in the subdivisions as a symbolic antidote to the city. They will not return to the cities unless a couple of conditions obtain. One is the economic failure of the suburban equation, a likely event. A second condition is whether the cities themselves can be made habitable.

I believe the first condition will come to pass within the next twenty-five years. All the evidence (discussed in Chapter 3) demonstrates clearly that suburbia is becoming unaffordable and unsustainable, and its denizens seem to dimly apprehend this, like people hearing distant thunder on a still summer’s day.  The economy makes them nervous.  Companies are shedding employees.  They feel anxious, trapped.  For the first time in American history, there is nowhere else left to go, no place to escape to.  What will they do?

I'm afraid they may misunderstand the crisis of the suburbs, particularly as it manifests in the personal catastrophes of lost jobs, declining incomes, falling property values, family breakups, and misbehavior. Poor people are not the only Americans afflicted by the psychology of entitlement. Middle-class suburbanites really believe that they are owed a package of goodies called the American Dream, and when they are suddenly deprived of it, they may get very angry and vote for political maniacs.

The Republicans are now in charge of things at many levels of government, and though they have been shouting the loudest about the crisis of "family values," they are also the chief boosters of suburbia, which is to say, of a profoundly uncivil living arrangement. Their chosen way of life, therefore, is at odds with their most cherished wishes for a civil society, and so it is unlikely that they are going to be able to solve any of the social problems they deplore – even the problems of their own children's behavior.

Suburban moms and dads wonder why their fifteen-year-old children seem so alienated. These kids are physically disconnected from the civic life of their towns. They have no access to the civic equipment. They have to be chauffeured absolutely everywhere – to football practice, to piano lessons, to their friends' houses, to the library, and, of course, to the mall.  All they live for is the day that they can obtain a driver's license and use their environment. Except then, of course, another slight problem arises: they need several thousand dollars to buy a used car and pay for insurance, which is usually exorbitant for teens, often more than the price of their cars. Is it really any wonder that these kids view their situation as some kind of swindle?

Americans are convinced that suburbia is great for kids. The truth is, kids older than seven need more from their environment than a safe place to ride their bikes. They need at least the same things adults need. Dignified places to hang out. Shops. Eating establishments. Libraries, museums, and theaters. They need a public realm worthy of respect. All of which they need access to on their own, without our assistance – which only keeps them in an infantile state of dependency. In suburbia, as things presently stand, children have access only to television. That's their public realm. It's really a wonder that more American children are not completely psychotic.

In order to make American towns and cities habitable again, we will have to take the greater portion of public money now spent on subsidizing car use and redirect it into replacing the civic equipment of the cities that we allowed to be trashed over the past several decades. The cost of doing these things is, fortunately, apt to be less than the cost of continuing to subsidize the suburban automobile infrastructure. For instance, a single new freeway interchange can cost $600,000,000, which is the same cost as building and equipping an entire twenty-mile-long electric trolley line.

If there is any frontier left in America today, it probably exists in the vast amounts of underutilized, reclaimable real estate of our towns and cities. While the underclass occupies certain urban neighborhoods, other enormous districts stand virtually abandoned. Great swaths of inner Detroit, Cleveland, and St. Louis consist of empty, rubble-strewn lots, with hardly a building standing for scores of acres around. While the water and sewer lines may need updating, the infrastructure of streets and building lots already exists, and in a physical form that is much more emphatically civic than suburbia. These vacant wards beg redevelopment and present tremendous business opportunities. It may be hard to imagine suburbanites abandoning the leafy cul-de-sac subdivisions for the inner city, but these urban neighborhoods could become as beautiful and functional as we have the nerve to imagine. There is no reason why Cleveland, Detroit, and Harlem could not become as finely functional and spiritually gratifying as Paris.

Where does the underclass go if the cities are reoccupied by the well off? The underclass ceases to be an underclass and becomes something else: a working class of honorably occupied people who make less money. They share the city with other classes, as was always the case in history until our era. They observe the same standard of public conduct as everybody else. They live on less desirable streets in less desirable buildings, but they need not live either in material or spiritual squalor. They can share the aspirations of the mainstream and they can exercise their will to move upward in society by the traditional means of education, diligence, and respectful behavior.

Making our cities habitable again will take a rededication to forms of building that were largely abandoned in America after World War Two. It will call for devices of civic art that never really caught on here, but have always existed in older parts of the world – for instance, waterfronts that are integral with the rest of the city. The human scale will have to prevail over the needs of motor vehicles. There will have to be ample provision of green space of different kinds – neighborhood squares, wildlife corridors, parks – because people truly crave regular contact with nature, especially pockets of repose and tranquility, and having many well cared-for parcels of it distributed equitably around town improves civic life tremendously.

The transformation I propose will not be possible unless Americans recognize the benefits of a well-designed public realm, and the civic life that comes with it, over the uncivil, politically toxic, socially impoverished, hyper-privatized realm of suburbia, however magnificent the kitchens and bathrooms may be there. I don't believe that we can be an advanced society without cities. Tragically, American cities have become unworthy of the American republic. Our task is to make them worthy, to reconstruct them in a physical form that is worth caring about, and to reinhabit them.  It is unfortunate that people who consider themselves politically progressive sneer at the idea of urban "gentrification" as a supposed affront to the poor. This attitude logically leads to a position that the middle class and the wealthy have no business reinhabiting the cities, and it has led to many misguided efforts to defeat attempts at urban redevelopment (see Chapter 6). This attitude must change. Otherwise, the middle class and wealthy will have to consider themselves morally restricted to life in the suburbs – an untenable proposition.

The common good demands a public realm in which to dwell. It can't sustain itself merely in our hearts or memories. This is, finally, the sentimental fallacy of the suburban patriot: that hanging a cast-iron eagle over the garage door proves you care about your country.

Our car culture also reflects much confusion over the ideas of democracy and freedom. Freedom, as comprehended by the founders, stood for the management of affairs at an appropriate hierarchy of scale, meaning there are certain decisions best left to individuals, families, corporations, communities, counties, states, and nations in that order, and that human happiness depended on the proper match. Their main objection to conditions prevailing in 1776 was to the inappropriateness in scale of the British Crown attempting to regulate the day-to-day affairs of people 3,000 miles away spread across a region many times larger than England itself. The founders were not against hierarchy or authority in general. They were not cultural relativists.

Today, both hierarchy and authority are strictly taboo, and basic institutions such as the criminal justice system stand in disrepute as ineffectual or lacking the will of enforcement. Democracy in our popular culture includes the ideas that all opinions, like votes, have equal value, that all values are relative, and that nobody is better than anybody else. College professors are especially tenacious on these points lately, which is a curious intellectual position for those who are supposed to transmit standards of excellence. The devaluation of standards supports some collateral notions, for instance, that there are no social prerequisites to parenthood, that property ownership carries no obligations to the common good, and that the marketplace is the sole arbiter of what makes life worth living. Paradoxically, under this kind of democracy citizens spend inordinate amounts of time, money, and energy trying to prove that they are better than others by accumulating costly totem objects. There is an obvious relation, by the way, between our present unbalanced notions of property rights vis-à-vis the vestigial common good and our mania for accumulating status totems such as cars: it is the behavior of people who literally don't know their place in the world.

Freedom, in this culture, means that whatever makes you happy is okay.  This is the freedom of a fourteen-year-old child. Freedom to eat a whole box of donuts at one sitting. Freedom to make a mess, to be loud and obnoxious, to blow things up, to inflict injury for the thrill of it, to conceive babies without care or thought for the consequences. Mostly, it is freedom from authority, particularly parental authority which, when it exists at all now, often functions at a level qualitatively no higher than a child's. Under this version of freedom, there is no legitimate claim for any authority to regulate human desires – not even the personal conscience – nor any appropriate scale of management, and all supposed authorities are viewed as corrupt, mendacious, and irrelevant. This view of freedom is not what Hamilton, Jefferson, Madison, and the other founders had in mind. It is not a coincidence that the appeal of cars in our time derives from these crude emotional states masquerading as ideas about democracy and freedom.

The quality of city life suffered so enormously from the onslaught of cars that the majority of Americans soon made up their minds to reject the city, and what they thought it represented, for good. America was big, with seemingly endless room to spread out. The tacit promise of the car was that sooner or later everybody could live somewhere outside of town. The mature auto suburb of our time is the reenactment of life on the frontier. The landscape of the auto suburb is the new wilderness, and Americans pretend not to mind it because wilderness is supposedly America's natural social condition. What matters to us is hearth and home. The outside doesn't matter, except as excess space. Everything outside is merely to be traversed and endured. The freeway-scape is exactly this sort of wasteland. The car makes it endurable, even pleasurable with the recent innovations of air conditioning and stereo sound. It's not an accident that many car advertisements on television are filmed in wastelands that look more like the surface of Mars than anyplace Americans really live. The regime of mass car use is an offshoot of our historical aversion to civility itself. The car allows Americans to persist in the delusion that civic life is unnecessary. As a practical matter, this regime is putting us out of business as a civilization.

We have all the information we need to persuade us by means of rational argument that using cars the way we do is catastrophic. The trouble is we don't care. I attribute this to a particular set of beliefs held by most educated Americans today. We believe there is a technical solution to every problem life presents. That is why the debate about car use has centered on purely technical issues such as air pollution, which can be easily measured, rather than on the developmental needs of children marooned in suburbia, which are harder to quantify. Notice, by the way, that even our elegant technical solutions to problems such as tailpipe emissions – catalytic converters, new fuel formulas, et cetera – have barely kept up with minimum federal air quality standards, because the total number of cars keeps growing, and we've done absolutely nothing to discourage the aggregate growth of car use (in fact, we encourage it). Meanwhile, there has been next to zero debate on the social or spiritual consequences of suburban sprawl, because such a debate would hinge on issues of quality, not on numbers.

This preoccupation with statistics and technical solutions to problems, especially problems of human behavior, is itself a byproduct of something more insidious in modern culture: the belief that science is an adequate replacement for virtue. This dangerous habit of mind can also be expressed as the fallacy of false quantification. It means that no matter how solid your statistics are, numbers don't necessarily tell the whole story, or even the important part of it. It also means that statistics can be used to lie. We tend to select for debate only those aspects of a question that can be quantified, whether they are relevant or not, because the ability to measure something, anything, makes us feel more secure, more in control of our destiny, than grappling with thornier qualitative issues of good and bad or right and wrong. That is why we so often hear the word methodology flung about as a verbal weapon in public policy debates, as in "I challenge your methodology!" The jargon of science makes non-scientists feel more authoritative, at least among themselves. Anybody who can even use the word methodology in an argument sounds more scientifically respectable, whether he has anything worthwhile to say or not. Professionals operating on the fringe of hard science – such as journalists and critics of technology – feel unarmed without the buzzwords of the lab. So, numbers and statistics drive our debates about modern life, to the exclusion of all other frames of reference. What else should we expect from a culture that has ruled "value judgments" to be generally inadmissible?

That few Americans even care about the ominous statistics produced by whatever methodologies (some of them quite sound on their own terms) is something else again. It leads to the inescapable conclusion that this nation doesn't really want to think about the damage cars are causing our society. We lack the will to reflect, and perhaps the requisite virtue to acquire the will. We're too comfortable munching Cheez Doodles on the freeway right now to think about the consequences of continuing this behavior. The relatively feeble public debate about cars currently underway is carried on mainly by a few environmental groups and it is aimed at other environmentalists – in other words, preaching to the choir.

I believe that we Americans have managed to go beyond driving ourselves crazy with cars.  There is a moral and spiritual dimension to these problems that we are unable to reckon. We have the knowledge to do the right thing; we lack only the will to do the right thing. The inescapable conclusion is that our behavior is wicked, and that we are liable to pay a heavy price for our wickedness by losing things we love, including our beautiful country and our democratic republic.

It is most unfortunate that under this regime the moral dimension of life was relegated to the domain of the supernatural, and therefore in our time moral issues have become the special province of evangelical religious bullies, political extremists, and maniacs like the abortion clinic assassins. These are our arbiters of right and wrong, of goodness and badness. The discussion has not taken place on a higher level. Those truly equipped to lead the discussion of values – as Hamilton, Madison, and Jefferson did in their day – have dropped out under the assumption that all cultural values are relativistic and equally valid. This assumption has been solidly lodged in most college curricula, where its influence on impressionable minds is hard to overstate. It is embedded in much of what passes these days for higher culture.

I sense that this is changing, that it is once again becoming possible, if not yet exactly respectable, for educated people to make value judgments, and that is a very good thing. Our times demand it. Our national civitas is failing, and it will not do any longer to pretend that all forms of conduct are equally okay, or that all economic choices are equally favorable, or that all products of human ingenuity are equally beneficial.

The times demand that educated people debate issues of good and bad, right and wrong, beauty and ugliness. Educated people must readmit notions such as virtue and wickedness into the realm of ideas, and arrive at a humane consensus about what they mean. It will no longer do to say that virtue is too complex to be understood – and that, therefore, we prefer no definition of virtue to a possibly imperfect one. In restoring these notions of value to respectability, we may even resurrect the fundamental source of beauty in our world, which is the shame, the original sin, of our innate human imperfection.

It is literally against the law almost everywhere in the United States to build the kind of places that Americans themselves consider authentic and traditional. It's against the law to build places that human beings can feel good in, or afford to live in. It's against the law to build places that are worth caring about.

Is Main Street your idea of a nice business district? Sorry, your zoning won't let you build it, or even extend it where it already exists. Is Elm Street your idea of a nice place to live – you know, the houses with the front porches on a tree-lined street? Sorry, that's against the law, too. All you can build where I live, in upstate New York, is another version of Los Angeles. The zoning laws say so.

This is not a gag. Our zoning laws comprise the basic manual of instructions for how we create the stuff of our communities. Most of these laws have only been in place since World War Two. For the previous 300-odd years of American history we didn't have zoning laws. We had a popular consensus about the right way to assemble a town, or a city. Our best Main Streets and Elm Streets were not created by municipal ordinances, but by cultural agreement. Everybody agreed that buildings on Main Street ought to be more than one story tall, that corner groceries were good to have in residential neighborhoods, that streets ought to intersect with other streets to facilitate movement, that sidewalks were necessary, and that orderly rows of trees planted along them made the sidewalks much more pleasant, that rooftops should be pitched to shed rain and snow, that doors should be conspicuous so you could easily find the entrance to a building, that windows should be vertical to dignify a house. Everybody agreed that communities needed different kinds of housing to meet the needs of different kinds of families and individuals, and the market was allowed to supply it. Our great-grandfathers didn't have to argue endlessly over these matters of civic design. Nor did they have to reinvent civic design every fifty years because everybody forgot what they agreed about.

Everybody agreed that both private and public buildings should be ornamented and embellished to honor the public realm of the street, so they built the kind of town halls, firehouses, banks, and homes that today are on the National Register of Historic Places. We can't replicate any of that stuff. Our laws actually forbid it. Want to build a bank in Anytown, USA? Fine. Make sure that it's surrounded by at least an acre of parking, and that it's set back from the street by at least 75 feet, and that it be no more than one story high. There's your bank. The instructions for a church or a muffler shop are identical. That's exactly what your laws tell you to build. If you deviate from the template, you will actually be punished by not receiving a building permit.

Therefore, if you want to make your communities better, begin at once by throwing out your zoning laws. Get rid of them. Throw them away. Don't revise them. Set them on fire if possible and make a public ceremony of it – a public ceremony is a great way to announce the birth of a new consensus. While you're at it, throw out your "master plans" too. They're invariably just as bad. Replace these things with a new traditional town-planning ordinance, which prescribes a more desirable everyday environment.

 

The place that results from zoning is suburban sprawl. It must be understood as the product of a particular set of instructions. Its chief characteristics are the strict separation of human activities (or uses), mandatory driving to get from one use to the other, and huge supplies of free parking.

After all, it's called zoning because the basic idea is that every activity demands a separate zone of its very own. You can't allow people to live around shopping. That would be harmful and indecent. Better not even allow them within walking distance of it. They'll need their cars to haul all that stuff home, anyway – in case you haven't noticed, most supermarkets don’t deliver these days. While you're at it, let's separate the homes, too, by income gradients. Don't let the $75,000-a-year families live near the $200,000-a-year families – they'll bring down your property values – and, for Godsake, don't let some $25,000-a-year recent college graduate live near any of them, or a $19,000-a-year widowed grandmother on social security. There goes the neighborhood! Now, put all the workplaces in a separate office "park" or industrial "park," and make sure nobody can walk to them either. As for nice public spaces, squares, parks, and the like – forget it, we can't afford them because we spent all our public funds paving the four-lane highways and collector roads and the parking lots, and laying sewer and water lines out to the housing subdivisions, and hiring traffic cops to regulate the movement of people in their cars going back and forth to these segregated uses.

It soon becomes obvious that the model of the human habitat dictated by zoning is a formless, soulless, centerless, demoralizing mess. It bankrupts families and townships. It causes mental illness. It disables whole classes of decent, normal citizens. It ruins the air we breathe. It corrupts and deadens our spirits.

The construction industry likes it because it requires stupendous amounts of cement, asphalt, steel, and a lot of heavy equipment and personnel to push all this stuff into place. The car dealers love it. Politicians used to love it, because it produced big short-term profits and short-term revenue gains, but now they're all mixed up about it because the voters who live in suburban sprawl don't want any more of the same stuff built around them – which implies that at some dark level suburban sprawl dwellers are quite conscious of its shortcomings. They have a word for it: growth. They're now against growth. Their lips curl when they utter the word. It has the same appeal as the word fungus. They sense that any new construction is only going to make the place where they live worse. They're convinced that the future is going to be worse than the past. And they're dead-on right because, in terms of their everyday surroundings, the future has been getting worse throughout their lifetime. They're not hallucinating. Growth means only more traffic, bigger parking lots, and ever-bigger-and-uglier buildings than the monstrosities of the sixties, seventies, and eighties.

So the suburbanites become NIMBYs (Not In My Back Yard) and BANANAS (Build Absolutely Nothing Anywhere Near Anything). If they're successful in their NIMBYism, they'll use their town government to torture developers (i.e., people who create growth) with layer upon layer of bureaucratic rigmarole, so that only a certified masochist would apply to build something there. Eventually, all this unwanted growth leapfrogs over them to cheap, vacant, rural land farther out (controlled by politicians hungry for "rateables"), and then all the new commuters in the farther-out suburb are choking the NIMBY's roads anyway to get to the existing mall in NIMBYville.

Under zoning, alleys have not been permitted, so many localities have no experience with them, and hence, fear the idea.  Our popular image of an alley derives from movies and cartoons of the industrial city, circa 1930-1950, an era of semi-dereliction. This is a very limited view. To the contrary, the alley is a simple device of civic art that produces many beneficial results. It allows the garage to be removed from the front of the house. Without the ugly garage door dominating the facade, houses can be dignified and beautiful again. The alley is the place where your garbage cans go, too. Also the utility lines. Even where alleys with outbuildings already exist, such as the town where I live, rental apartments are currently discouraged under zoning. Under the New Urbanism, they are once again permitted. Alleys are made much safer when people live along them, namely renters in outbuildings. For the ultra-paranoid, there are simple new devices, such as light switches triggered by motion, that would provide an extra measure of security at little cost.

The argument has been made lately that cul-de-sac streets or gated streets are safer than connecting streets because evildoers have a hard time getting in and out.  This is a drastic remedy for an uncivil society and must not be thought of as normal. It is one thing to tame traffic, it is another to create paranoidal fortifications. The best way to bring security to streets is to make them delightful places that honorable and decent citizens will want to walk in. They become, in effect, self-policing. The disadvantages of an interrupted street network in all other respects far outweigh any supposed gain in security.

Because citizens have not been happy with the model of development that zoning gives them, they have turned it into an adversarial process. They have added many layers of procedural rigmarole so that only the most determined and wealthiest developers can withstand the ordeal. In the end, after all the zoning board meetings, flashy presentations, and environmental objections and mitigation, and after both sides' lawyers have chewed each other up and spit each other out, what ends up getting built is a terrible piece of shit anyway, because it's a piece of sprawl equipment – a strip mall, a housing subdivision. Everybody is left miserable and demoralized, and the next project that comes down the road gets even more beat up, whether it's good or bad.

Now, as far as I'm concerned, many projects deserve to get beat up and delayed, even killed. But wouldn't it be better for society if we could agree on a model of good development and make it relatively easy for people to go forward with it? This is the intent of the Traditional Neighborhood Development (TND) ordinance as proposed under the New Urbanism.

Human settlements are like living organisms. They must grow and they will change. But we can decide on the nature of that growth, particularly on the quality and the character of it, and where it ought to go. We don't have to scatter the building blocks of our civic life all over the countryside, impoverishing our towns and ruining farmland. We can put the shopping and the offices and the movie theaters and the library all within walking distance of each other. And we can live within walking distance of all these things. We can build our schools close to where the children live, and the school buildings don't have to look like fertilizer plants. We can insist that commercial buildings be more than one-story high, and allow people to live in decent apartments over the stores. We can build Main Street and Elm Street and still park our cars. It is within our power to create places that are worthy of our affection.

Our system of property taxes punishes anyone who puts up a decent building made of durable materials.  It rewards those who let existing buildings go to hell. It favors speculators who sit on vacant or underutilized land in the hearts of our cities and towns. In doing so it creates an artificial scarcity of land on the free market, which drives up the price of land in general, and encourages ever more scattered development, i.e., suburban sprawl. In tandem with zoning, the taxing of buildings rather than land itself promotes such wasteful practices as putting up cheap one-story burger joints in huge parking lots on prime city land. It is one of the biggest impediments to the free market creation of affordable housing. As a consequence of all these things it is a drag on economic productivity and employment.

This happens because we tax buildings much more heavily than the land under them. These buildings are visited by an official assessor who determines their value. The higher the building's value, the higher the tax. Under this system, a rational person has every reason to put up crappy buildings that will not be highly assessed, or he has every reason to let his property run down, or build nothing at all. This is a major reason for the current desolation of American towns and cities.

The alternative to this is to tax land itself and not the buildings on it. The criterion for assessing the value of land minus buildings is its location, or site. If it is one block away from Main Street, for instance, it is considered to have high site value because it is very close to other things that people like to be near: public utilities, the post office, civic amenities such as parks, museums, libraries, schools, other businesses and other services, and so on. The theory behind this is that in human society land derives value from both explicit public investment (sewers, water lines, streets), and from the aggregate of private human activities that go on around it. This is termed socially created value. Owners of prime real estate derive large benefits from socially created value and therefore should be taxed on that basis rather than on the basis of whether they choose to use or squander those benefits – for example, whether they choose to use it in the form of a vacant lot or a seven-story hotel. I will try to make it clear why our current system favors the vacant lot and discourages the hotel.
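The incentive flip Kunstler describes can be shown with simple arithmetic. The following sketch (not from the book; the tax rate and all dollar figures are invented for illustration) compares two adjacent parcels with identical site value – one vacant, one carrying a hotel – under a conventional building tax and under a land-value tax:

```python
# Illustrative sketch (all dollar figures and the 2% rate are hypothetical)
# of why taxing buildings favors the vacant lot while taxing land favors
# the hotel.

def building_tax(land_value, building_value, rate=0.02):
    """Conventional system: the tax falls on the building's assessed value."""
    return rate * building_value

def land_value_tax(land_value, building_value, rate=0.02):
    """Alternative system: the tax falls on the site's socially created value."""
    return rate * land_value

# Two adjacent downtown parcels with identical site value.
site = 500_000
parcels = {
    "vacant lot": dict(land_value=site, building_value=0),
    "hotel":      dict(land_value=site, building_value=4_000_000),
}

for name, parcel in parcels.items():
    print(f"{name:10s}  building tax: ${building_tax(**parcel):>9,.0f}"
          f"   land tax: ${land_value_tax(**parcel):>9,.0f}")

# Under the building tax the speculator holding the vacant lot pays nothing
# while the hotel owner pays $80,000 a year; under the land tax both pay
# $10,000, so sitting on idle land costs as much as putting it to use.
```

The point of the toy numbers is only the asymmetry: assessing buildings rewards the owner who builds nothing, while assessing land makes the vacant lot and the hotel equally expensive to hold.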

The dead shopping center syndrome became particularly aggravated in the 1980s, when the U.S. government deregulated the banks then known as savings and loan associations. Under the relaxed rules, the directors of S&Ls could profit hugely off any cockamamie real estate venture, whether it succeeded or not. Whenever a given deal was inked – before a single spadeful of earth was turned – the S&L would award itself "points" for being the lender. These points would be translated into instant premiums or bonuses paid to the bank's officers. The developers would likewise award themselves big bonus fees off the top. The development would get built – let's say a strip mall. Whether it succeeded or failed in normal terms meant little to the bankers or the developers, who had already hit the jackpot. The worst thing that could happen was that the bank would fail or the development company might go bankrupt. In either case, the principals would have long since transferred their personal fortunes to the Cayman Islands, Panama, or Switzerland, where all the U.S. Treasury agents who ever lived could not pry loose a single deposit slip. Meanwhile, the money actually invested (i.e., lost) had existed in the form of federally insured bank deposits, so the individual depositors didn't lose either. Why should they care how the banks pissed away their deposits, as long as they got their money back from the U.S. Treasury? In the end, the estimated half-trillion-dollar aggregate losses of all bankrupt S&Ls were merely chalked up to "the deficit," that chimerical many-numbered beast that stalks the TV airwaves at election time.

Let's look for a moment at the American heartland. When settlers arrived in the Midwest, they found a geological legacy of topsoil six feet deep in places. It was unbelievably good, rich, deep topsoil by any standard in the world. This territory had existed as grass-covered plains since the last ice age, undisturbed by any significant human cultivation. For eons, the plains had renewed and enriched themselves through natural cycles of growth, death, and prairie fire, fertilized by stupendous herds of wild quadrupeds. Cereal grains, which are grasses biologically, were perfectly suited to it. So this area, which today includes the states of Illinois, Wisconsin, Minnesota, Iowa, Nebraska, Missouri, Oklahoma, Kansas, parts of the Dakotas, Texas, and Montana (as well as Canada's provinces of Manitoba and Saskatchewan) became the breadbasket of the Western world.

The weather in this region is very severe. Wind sweeps easily over the flat terrain. Rainfall often occurs in the form of intense storms, punctuated by long dry spells – and just as often these storms bring with them destructive tornadoes, lightning, hail, and flooding. Before farming, the thick roots of prairie grasses held together prairie soils in the face of these meteorological assaults.

The mechanization of farming that followed World War One introduced practices that allowed farmers to increase their scale of operation by orders of magnitude and at the same time promoted a much larger scale of regular soil erosion. The dust bowl of the 1930s was a symptom of these aggressive, mechanized farming practices, combined with a drought of several years' duration. It made an impression on our national psyche with frightening dust storms, much-photographed rural desolation, and catastrophic social consequences, but the soils of the American breadbasket have continued to erode steadily, if less dramatically, ever since.

Today, that once-six-foot-deep layer of topsoil has been reduced in places to less than six inches. The other 66 inches have either washed into the Missouri–Mississippi River drainage or blown away. It is estimated that nowadays topsoil losses exceed the weight of grain harvested by five times in Iowa. This erosion is taking place largely because of our methods of cultivation, and the mechanical needs of the gigantic pieces of equipment used to cultivate cereals, especially corn. Herbicides keep the soil between the cornstalks bare and exposed. Additionally, herbicides and pesticides kill any of the living organisms that organically hold soil together. Every time it rains, the water and topsoil form a thick slurry that turns the rivers of the Midwest a rich ocher color, like house paint. Eventually it ends up as new swampland down in Cajun country.

The once-magnificent soils of the American grain belt have been reduced to mere growing mediums, with little nutrient value of their own. We compensate for the absence of nutrients with more and more chemical fertilizers. We do not compensate for the deadness of the soil, its lack of beneficial microorganisms. Erosion is the long-term price we pay for that. The short-term winner has been agribusiness: the chemical and oil companies. A crop may be destroyed by a hailstorm in June, but Monsanto has already been paid for the inputs it sold the farmer in April – his pesticides and herbicides. Fertilizers are made largely of petroleum by-products.

California's Central Valley, agribusiness's other Eden, has somewhat different soil problems, but the cause is similar: unsustainable methods and inappropriate scale. The Central Valley is as big as all of Massachusetts and Connecticut. Pacific-borne winter storms pass over it and dump all of their moisture in the Sierras as snow. The Sierra sends the huge runoff of its winter snows into two major rivers – the Sacramento to the north, and the San Joaquin to the south. These two rivers eventually converge in San Pablo Bay, near San Francisco. Since 1900, their tributaries have been dammed in order to provide flood control, and diverted into aqueducts to water both the cropland of the Central Valley and the huge suburban sprawl agglomerations assembled in the Los Angeles Basin and around the San Francisco Bay area. This has led to ever more bitter quarreling between the thirsty, ever-growing cities and the lettuce growers and peach producers of the Valley. The quarrel has been settled temporarily in favor of the cities, whose happiness is considered to be more important economically these days. Meanwhile, California farmers are beginning to see a scary unexpected consequence of an agribusiness based on continual irrigation: the salinization of the soil.

As I drove from San Francisco to L.A. down Interstate 5 in June of 1994, past little burgs called Raisin and Richgrove, I frequently saw barren farm fields hundreds of acres in size with a weird, dirty-white crust on them, as though they'd been sprinkled with volcanic ash. These were salt precipitates caused by decades of continual irrigation. Crops could no longer be grown there, and there was no technical remedy for the problem.

Nearly all irrigation water, whatever its source, contains traces of salt. More troublesome is that many chemical fertilizers contain compounds of chlorine. Those molecules not used by plants can decay into free chlorine ions that recombine into other compounds, of which sodium chloride is one. So year by year there is a slow but constant increase in the buildup of salts. These salts invade the water table. Over decades, the salinity of underground water can become increasingly concentrated. Where drainage is poor, the water table lies close to the root level of crops, and some crops are subjected directly to salt water. Water also has a tendency to wick up to the surface by capillary action. In hot, arid places, like California's Central Valley, there is naturally a higher evaporation rate. As salinated water wicks back to the surface and evaporates, the salt precipitates out of it and eventually forms a crust on the surface.
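The mechanism described above – water evaporates, salt stays – can be sketched as a toy accumulation model. Every parameter below is invented for illustration (real leaching and application rates vary enormously by soil and climate); the sketch only shows why a trace concentration compounds into a crust over decades:

```python
# Toy model (all parameters hypothetical) of salt accumulating in the root
# zone when irrigation water evaporates but its dissolved salt stays behind.

def salinity_after(years, water_liters_per_year=5_000_000,
                   salt_grams_per_liter=0.3, leached_fraction=0.1):
    """Accumulated salt (tonnes per hectare) after `years` of irrigation.

    Each year the applied water evaporates or transpires, depositing its
    dissolved salt; only `leached_fraction` of the accumulated salt is
    assumed to drain away below the root zone.
    """
    salt_grams = 0.0
    for _ in range(years):
        salt_grams += water_liters_per_year * salt_grams_per_liter  # deposited
        salt_grams *= (1 - leached_fraction)                        # partial leaching
    return salt_grams / 1_000_000  # grams -> tonnes

for y in (10, 30, 50):
    print(f"after {y:2d} years: {salinity_after(y):5.1f} tonnes of salt per hectare")
```

With these made-up numbers, a 0.3 g/L trace builds toward a steady state of about 13.5 tonnes of salt per hectare; the qualitative point is that any nonzero concentration, compounded annually with weak drainage, eventually ruins the field.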

This is not just a problem peculiar to California in the 1990s. Worldwide today, fully one-third of continually irrigated land is being ruined by salinization. Great civilizations of the past have faded to insignificance because salinization ruined their croplands. From 2100 B.C. to 1700 B.C., the Sumerian empire's crop yields fell by 75 percent in lands irrigated by the Tigris and Euphrates, and the cities that depended on them withered away. The prospects for California's croplands are similarly dire.

The destruction of soil is one part of our agricultural predicament. A related problem is our reliance on food that is trucked thousands of miles from where it is grown to where it is eaten. A Caesar salad might be made of ingredients that have traveled at least 1,500 miles to get to your table. Our ability to continue this behavior is based on the belief that we will always have cheap gasoline and be able to maintain our highways at their current high "level of service," as the traffic engineers say. Both of these assumptions stand on quicksand.

In order for a given system to break down, it is not necessary for all the parts of the system to fail. It is not even necessary for one of the parts to fail completely. All that is necessary is for one of the parts to fail partially. The larger the scale of a system, the less adaptable it is to change, and the more vulnerable it will be to the partial failure of one part.

 “The conventional approach is to nail the crop plant with soluble chemical,” he continued.  “The plant doesn’t have a chance.  So it gets hit with this stuff and spurts.  The problem is that this kind of abnormal, fast, fleshy growth is also a sign of weakness in plants. It attracts insects. Then the grower has to come in with insecticides to keep the bugs off and fungicides to keep the disease away. The minute you start down the path of chemical fertilizers, you have to use the rest of these things. The same company that sold you the fertilizer will be happy to sell you the other stuff. Those farms are now 100 percent dependent. If they don't use chemicals, they won't get a crop. They've killed off the natural fertility of their soil."

For all the shortcomings I perceive about my homeland these days, I must admit it has allowed me to function freely in my vocation, which is saying a lot. The only conditions I value more are loving relations with friends and kin and a more generalized gratitude for being born in the first place. I am not religious, but I am aware of a spiritual dimension to this mysterious world. As Ludwig Wittgenstein remarked, it is astonishing that anything exists. I believe we pass this way but once, and that this is the source of man's essentially tragic condition. Yet I believe simultaneously, perhaps incongruously, even obdurately and foolishly, that each of us is an offspring of the intelligent and benevolent organism that is the universe – though this model leaves a lot unaccounted for, from war to root canal therapy – and that we remain part of it, in some fashion, everlastingly.

Strange to relate, as a result of my travels around the United States the past seven years, I begin to come to the disquieting conclusion that we Americans are these days a wicked people who deserve to be punished. The idea embarrasses me, but I nevertheless stand by it. I suppose this is what comes of a vocation that places one, for instance, in a New Jersey gambling casino full of overweight slobs pissing away their kids' college tuition in pursuit of "excitement." I therefore also believe in the existence of genuine evil, as embodied, in the Hannah Arendt sense, by the behavior of many well-known American corporations, especially those that prey on the aspirations of children.


Appendix N. Excerpts from James Howard Kunstler’s The Long Emergency (Grove Press, 2005)

Chapter 1.  Sleepwalking into the Future

Your Reality Check Is in the Mail

What is generally not comprehended about this predicament is that the developed world will begin to suffer long before the oil and gas actually run out. The American way of life—which is now virtually synonymous with suburbia—can run only on reliable supplies of dependably cheap oil and gas. Even mild to moderate deviations in either price or supply will crush our economy and make the logistics of daily life impossible. Fossil fuel reserves are not scattered equitably around the world. They tend to be concentrated in places where the native peoples don't like the West in general or America in particular, places physically very remote, places where we realistically can exercise little control (even if we wish to). For reasons I will spell out, we can be certain that the price and supplies of fossil fuels will suffer oscillations and disruptions in the period ahead that I am calling the Long Emergency.

A coherent, if extremely severe, view along these lines, and in opposition to the cornucopians, is embodied by the "die-off" crowd (http://www.dieoff.com, an Internet site started by Jay Hanson, popularizing the ideas of many who believe that the Industrial Age is a terminal condition of humankind). They believe that the carrying capacity of the planet has already been exceeded ("overshoot") and that we have entered an apocalyptic age presaging the imminent extinction of the human race. They lend zero credence to the cornucopian belief in humankind's godlike ingenuity at overcoming problems. They espouse an economics of net entropy. They view the end of oil as the end of everything. Their worldview is terminal and tragic.

The view I offer places me somewhere between these two camps, but probably a few degrees off center and closer to the die-off crowd. I believe that we face a dire and unprecedented period of difficulty in the twenty-first century, but that humankind will survive and continue further into the future—though not without taking some severe losses in the meantime, in population, in life expectancies, in standards of living, in the retention of knowledge and technology, and in decent behavior. I believe we will see a dramatic die-back, but not a die-off. It seems to me that the pattern of human existence involves long cycles of expansion and contraction, success and failure, light and darkness, brilliance and stupidity, and that it is grandiose to assert that our time is so special as to be the end of all cycles (though it would also be consistent with the narcissism of baby-boomer intellectuals to imagine ourselves to be so special). So I have to leave room for the possibility that we humans will manage to carry on, even if we must go through this dark passage to do it. We've been there before.

Adios Globalism

The so-called global economy was not a permanent institution, as some seem to believe it was, but a set of transient circumstances peculiar to a certain time: the Indian summer of the fossil fuel era. The primary enabling mechanism was a world-scaled oil market allocation system able to operate in an extraordinary sustained period of relative world peace. Cheap oil, available everywhere, along with ubiquitous machines for making other machines, neutralized many former comparative advantages, especially of geography, while radically creating new ones—hypercheap labor, for instance. It no longer mattered if a nation was halfway around the globe, or had no prior experience with manufacturing. Cheap oil brought electricity to distant parts of the world where ancient traditional societies had previously depended on renewables such as wood and dung, mainly for cooking, as many of these places were tropical and heating was not an issue. Factories could be started up in Sri Lanka and Malaysia, where swollen populations furnished trainable workers willing to labor for much less than those back in the United States or Europe. Products then moved around the globe in a highly rationalized system, not unlike the oil allocation system, using immense vessels, automated port facilities, and truck-scaled shipping containers at a minuscule cost-per-unit of whatever was made and transported. Shirts or coffeemakers manufactured 12,000 miles away could be shipped to Wal-Marts all over America and sold cheaply.

The ability to globalize industrial manufacturing this way stimulated a worldwide movement to relax trade barriers that had existed previously to fortify earlier comparative advantages, which were now deemed obsolete. The idea was that a rising tide of increased world trade would lift all boats. The period (roughly 1980-2001) during which these international treaties relaxing trade barriers were made—the General Agreement on Tariffs and Trade (GATT)—coincided with a steep and persistent drop in world oil and gas prices that occurred precisely because the oil crises of the 1970s had stimulated so much frantic drilling and extraction that a twenty-year oil glut ensued. The glut, in turn, allowed world leaders to forget that the globalism they were engineering depended wholly on nonrenewable fossil fuels and the fragile political arrangements that allowed their distribution. The silly idea took hold among the free, civilized people of the West, and their leaders, that the 1970s oil crises had been fake emergencies, and that oil was now actually superabundant. This was a misunderstanding of the simple fact that the North Sea and Alaskan North Slope oil fields had temporarily saved the industrial West when they came online in the early 1980s, and postponed the fossil fuel depletion reckoning toward which the world has been inexorably moving.

Meanwhile, among economists and government figures, globalism developed the sexy glow of an intellectual fad. Globalism allowed them to believe that burgeoning wealth in the developed countries, and the spread of industrial activity to formerly primitive regions, was based on the potency of their own ideas and policies rather than on cheap oil. Margaret Thatcher's apparent success in turning around England's sclerotic economy was an advertisement for these policies, which included a heavy dose of privatization and deregulation. Overlooked was the fact that Thatcher's success in reviving England coincided with a fantastic new revenue stream from North Sea oil, as quaint old Britannia became energy self-sufficient and a net energy-exporting nation for the first time since the heyday of coal. Globalism then infected America when Ronald Reagan came on the scene in 1981. Reagan's "supply-side" economic advisors retailed a set of fiscal ideas that neatly accessorized the new notions about free trade and deregulation, chiefly that massively reducing taxes would actually result in greater revenues as the greater aggregate of business activity generated a greater aggregate of taxes even at lower rates. (What it actually generated was huge government deficits.)

By the mid-1980s deregulated markets and unbridled business were regarded as magic bullets to cure the ills of senile smokestack industrialism. Greed was good. Young college graduates marched into MBA programs in hordes, hoping to emerge as corporate ninja warriors. It was precisely the entrepreneurial zest of brilliant young corporate innovators that produced the wizardry of the computer industry. The rise of computers, in turn, promoted the fantasy that commerce in sheer information would be the long-sought replacement for all the played-out activities of the smokestack economy. A country like America, it was now thought, no longer needed steelmaking or tire factories or other harsh, dirty, troublesome enterprises. Let the poor masses of Asia and South America have them and lift themselves up from agricultural peonage. America would outsource all this old economy stuff and use computers to orchestrate the movement of parts and the assembly of products from distant quarters of the world, and then sell the stuff in our own Kmarts and Wal-Marts, which would become global juggernauts of retailing. Computers, it was believed, would stupendously increase productivity all the way down the line. The jettisoned occupational niches in industry would be replaced by roles in the service economy that went hand in hand with the information economy. We would become a nation of hair stylists, masseurs, croupiers, restaurant owners, and show business agents, catering to one another's needs. Who wanted to work in a rolling mill?

Finally, the disgrace of Soviet communism in the early 1990s resolved any lingering philosophical complaints among the educated classes about the morality of business per se and of the institutions needed to run it. The Soviet fiasco had proven that a state without property laws or banking was just another kind of scaled-up social Ponzi scheme running on cheap oil and slave labor.

I have argued in previous books that capitalism is not strictly speaking an "ism," in the sense that it is not so much a set of beliefs as a set of laws describing the behavior of money as it relates to accumulated real wealth or resources. This wealth can be directed toward the project of creating more wealth, which we call investment, and the process can be rationally organized within a body of contract and property law. Within that system are many subsets of rules and laws that describe the way money in motion operates, much as the laws of physics describe the behavior of objects in motion. Concepts such as interest, credit, revenue, profit, and default don't require a belief in capitalism in order to operate. Compound interest has worked equally well for communists and Wall Street financiers, whatever they personally thought about the social effects of wealth and poverty. People of widely differing beliefs are also equally subject to the law of gravity.

It is therefore not a matter of whether people believe in capitalism (hyper, turbo, neoliberal, or anything else you might call it), but of the choices they make as individuals, and in the aggregate as communities and nations, that determine their destiny. I am going to argue in later chapters that Americans in particular among the so-called "advanced" nations made some especially bad choices as to how they would behave in the twilight of the fossil fuel age. For instance, conditions over the past two decades made possible the consolidation of retail trade by a handful of predatory, opportunistic corporations, of which Wal-Mart is arguably the epitome. That this development was uniformly greeted as a public good by the vast majority of Americans, at the same time that their local economies were being destroyed —and with them, myriad social and civic benefits— is one of the greater enigmas of recent social history. In effect, Americans threw away their communities in order to save a few dollars on hair dryers and plastic food storage tubs, never stopping to reflect on what they were destroying. The necessary restoration of local networks of economic interdependence, and the communities that rely on them, will be a major theme later in this book.

I will also propose that globalism as we have known it is in the process of ending. Its demise will coincide with the end of the cheap-oil age. For better or worse, many of the circumstances we associate with globalism will be reversed. Markets will close as political turbulence and military mischief interrupt trade relations. As markets close, societies will turn increasingly to import replacement for sheer economic survival. The cost of transport will no longer be negligible in a post-cheap-oil age. Many of our agricultural products will have to be produced closer to home, and probably by more intensive hand labor as oil and natural gas supplies become increasingly unstable. The world will stop shrinking and become larger again. Virtually all of the economic relationships among persons, nations, institutions, and things that we have taken for granted as permanent will be radically changed during the Long Emergency. Life will become intensely and increasingly local.

The End of the Drive-In Utopia

I view the period of history we have lived through as a narrative episode in a greater saga of human history. The industrial story has a beginning, a middle, and an end. It begins in the mid-eighteenth century with coal and the first steam engines, proceeds to a robust second act climaxing in the years before World War I, and moves toward a third act resolution now that we can anticipate with some precision the depletion of the resources that made the industrial episode possible. As the industrial story ends, the greater saga of mankind will move on into a new episode, the Long Emergency. This is perhaps a self-evident point, but throughout history, even the most important and self-evident trends are often completely ignored because the changes they foreshadow are simply unthinkable. That process is sometimes referred to as an "outside context problem," something so far beyond the ordinary experience of those dwelling in a certain time and place that they cannot make sense of available information. The collective mental static preventing comprehension is also sometimes referred to as "cognitive dissonance," a term borrowed from social psychology. It helps explain why the American public has been sleepwalking into the future.

The Long Emergency is going to be a tremendous trauma for the human race. It is likely to entail political turbulence every bit as extreme as the economic conditions that prompt it. We will not believe that this is happening to us, that two hundred years of modernity can be brought to its knees by a worldwide power shortage. The prospect will be so grim that some individuals and perhaps even groups (as in nations) may develop all the symptoms of suicidal depression. Self-genocide has certainly been within the means of mankind since the 1950s.

The survivors will have to cultivate a religion of hope, that is, a deep and comprehensive belief that humanity is worth carrying on. I say this as someone who has not followed any kind of lifelong organized religion. But I don't doubt that the hardships of the future will draw even the most secular spirits into an emergent spiritual practice of some kind. There is an excellent chance that this will go way too far, as Christianity and other belief systems have done at various times, in various ways.

If it happens that the human race doesn't make it, then the fact that we were here once will not be altered, that once upon a time we peopled this astonishing blue planet, and wondered intelligently at everything about it and the other things who lived here with us on it, and that we celebrated the beauty of it in music and art, architecture, literature, and dance, and that there were times when we approached something godlike in our abilities and aspirations. We emerged out of depthless mystery, and back into mystery we returned, and in the end the mystery is all there is.

Chapter 2.  Modernity and the Fossil Fuels Dilemma

Global Peak

The key to understanding what is about to happen to us is contained in the concept of the global oil production peak. This is the point at which we have extracted half of all the oil that has ever existed in the world —the half that was easiest to get, the half that was most economically obtained, the half that was the highest quality and cheapest to refine. The remaining oil is the stuff that lies in forbidding places not easily accessed, such as the Arctic and deep under the ocean. Much of the remaining half is difficult to extract and may, in fact, take so much energy to extract that it is not worth getting—for instance, if it takes a barrel of oil to get a barrel of oil out of the ground, then you are engaged in an act of futility. If it takes two barrels of oil to get one barrel of oil, then you are engaged in an act of madness. Much of the remaining half comes in the form of high-sulfur crude, which is difficult to refine, or tar sands and oil shales, which are not liquids but solids that must be mined before they are liquefied for refining, adding two additional layers of expense on their recovery. Quite a bit of the remaining half of the world's original oil supply will never be recovered.
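The net-energy arithmetic behind the "barrel to get a barrel" point can be made concrete in a few lines. The sketch below uses invented, illustrative figures, not field data; the point is simply that what matters is energy returned on energy invested (EROEI), not the gross amount of oil left in the ground:

```python
# EROEI (energy returned on energy invested) = energy out / energy in.
# Figures below are illustrative assumptions only.

def eroei(barrels_out: float, barrels_in: float) -> float:
    """Gross return per unit of energy spent on extraction."""
    return barrels_out / barrels_in

def net_energy(barrels_out: float, barrels_in: float) -> float:
    """Barrels actually delivered to society after extraction costs."""
    return barrels_out - barrels_in

# An easy early-era field: many barrels out per barrel invested.
print(eroei(100, 1), net_energy(100, 1))   # a bonanza
# One barrel in for one barrel out: "an act of futility".
print(eroei(1, 1), net_energy(1, 1))       # break-even, nothing gained
# Two barrels in for one out: "an act of madness".
print(eroei(1, 2), net_energy(1, 2))       # a net energy loss
```

Once EROEI falls toward 1, a deposit ceases to be an energy source at all, however large it is; this is why "quite a bit of the remaining half" will never be recovered.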

To move beyond the world oil production peak means that never again will all the nations of the earth combined extract as much oil from the ground as we did at peak, no matter what happens on the demand side. This has extraordinary implications for oil-based industrial civilization, which is predicated on constant and regular expansion of everything— population, gross domestic product, sales, revenue, housing starts, you name it.

The world oil production peak represents an unprecedented economic crisis that will wreak havoc on national economies, topple governments, alter national boundaries, provoke military strife, and challenge the continuation of civilized life. At peak, the human race will have generated a population that cannot survive on less than the amount of oil generated at peak—and after peak, the supply of oil will decline remorselessly. As that occurs, complex social and market systems will be stressed to the breaking point, obviating the possibility of a smooth ride down from the peak phenomenon.

The best information we have is that we will have passed the point of world peak oil production sometime between the years 2000 and 2008. The date is inexact for several reasons. One is that the reported reserves (oil left in the ground) of private sector and nationalized oil companies tend to be routinely overestimated, variously to benefit the share price of stock or to gain export quota advantages in international markets, as in the case of Organization of Petroleum Exporting Countries (OPEC) members. Another reason is that the "peak" will tend to manifest in several years of oscillating market instability, a volatile period of recurring price shocks and consequential recessions dampening demand and price, presaging a terminal decline. The peak therefore will only be seen in a "rearview mirror" once the terminal decline begins. Signs of sustained market instability therefore tend to suggest the earlier onset of peak but would not be provable except in hindsight.
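The "rearview mirror" point can be illustrated with a small sketch: given a noisy production series, the peak can only be declared after several consecutive years below the running maximum. The function and the numbers below are hypothetical, chosen only to show the lag between the peak year and the year it becomes knowable:

```python
# Why the peak is visible only in hindsight: a toy detection rule.
# We declare the peak only after `confirm_years` consecutive years
# below the running maximum. All numbers are invented.

def peak_in_hindsight(production, confirm_years=4):
    """Return (peak_index, confirm_index), or None if not yet confirmable."""
    best, best_i = float("-inf"), 0
    below = 0
    for i, p in enumerate(production):
        if p > best:
            best, best_i = p, i
            below = 0
        else:
            below += 1
            if below >= confirm_years:
                return best_i, i  # peak year, and year we could first know it
    return None

# An oscillating plateau, then terminal decline (hypothetical figures):
series = [80, 84, 86, 85, 87, 86, 85, 83, 81, 78]
print(peak_in_hindsight(series))  # -> (4, 8): peaked in year 4, known in year 8
```

The oscillations described above lengthen exactly this confirmation lag: each price-shock-and-recession cycle produces new local highs that reset the count, so the terminal decline is only provable years after it has begun.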

(Among the authorities combining to predict the global oil production peak in this range are the Uppsala Hydrocarbon Study Group of the Association for the Study of Peak Oil (ASPO), chaired by Colin J. Campbell, retired geologist for Texaco, British Petroleum, Amoco, and Fina; David L. Goodstein, professor of physics, California Institute of Technology; Matthew R. Simmons, CEO of Simmons & Company International, chief investment banking firm serving the oil industry; Albert Bartlett, professor emeritus, physics department, University of Colorado, Boulder; Jean Laherrère, retired geologist for the French oil company, Total; Kenneth S. Deffeyes, professor emeritus of geology at Princeton University; Walter Youngquist, retired professor of geology at the University of Oregon; L. F. Ivanhoe, coordinator of the M. King Hubbert Center for Petroleum Supply Studies in the Department of Petroleum Engineering at the Colorado School of Mines in Golden, Colorado; Cutler J. Cleveland, director of the Center for Energy and Environmental Studies at Boston University; David Pimentel, professor emeritus, entomology, ecology and systematics, Cornell University; and others.)

In other words, the peak may seem to be a kind of plateau or overhang for a few years as economic stagnation (i.e., lack of growth) curtails demand. During this rollover period, markets may use allocation strategies to keep their best (industrialized) customers supplied at the expense of the cash-starved "loser" nations of the world (once called "developing nations" but more likely to become "nations never to develop"). Then, slowly at first and at an accelerating rate, world oil production will decline, world economies and markets will exhibit increased instability with ever wilder oscillations from prepeak norms, and we will enter a new age of previously unimaginable austerity. These trends are irreversible.

How could such a catastrophe be so close at hand and civilized, educated people in free countries with free news media and transparent institutions be so uninformed about it? I am not one for conspiracies. While they have happened in history, conspiracies almost invariably have to be very small, and limited to tiny circles of individuals. Human beings are not very good at keeping secrets; individual self-interest is not interchangeable with group interest and the two are often in conflict, most particularly among small groups of plotters. I do not believe that the general ignorance about the coming catastrophic end of the cheap-oil era is the product of a conspiracy, either on the part of business or government or news media. Mostly it is a matter of cultural inertia, aggravated by collective delusion, nursed in the growth medium of comfort and complacency. Author Erik Davis has referred to this as the "consensus trance." (Erik Davis, TechGnosis: Myth, Magic and Mysticism in the Age of Information, New York: Three Rivers Press, 1998.)

When we think about it at all, most Americans seem to believe that oil is superabundant, if not limitless. We believe that the world is full of enormous amounts of as-yet-undiscovered oil fields, and that "new technologies" for drilling and extraction will perform prodigious miracles in extending the life of existing oil fields. For many of us, even people who ought to know better, the thinking stops here. The oil corporations know better but they also know that bad news is bad for business, and because there are no ready substitutes for oil they have decided to soft-pedal the news about world peak. Either that or they put a smiley-face spin on the situation. British Petroleum (BP) recast itself as "Beyond Petroleum" in order to gain some points for social responsibility without really changing anything it does.

Colin Campbell, an oil geologist who has worked for many of the leading international oil companies, including BP, put it this way:

The one word they don't like to talk about is depletion. That smells in the investment community, who are always looking for good news and the image, and it's not very easy for them to explain all these rather complicated things, nor indeed do they have any motive or responsibility to do so. It's not their job to look after the future of the world. Their directors are in the business to make money, for themselves primarily and for their shareholders when they can. So I think it's certainly true the oil companies shy away from the subject, they don't like to talk about it, and they are very obtuse about what they do or say about it. They themselves understand the situation as clearly as I do, and their actions speak a lot more than their words. If they had great faith in growing production for years to come, why did they not invest in new refineries? There are very few new refineries being built. Why do they merge? They merge because there's not room for them all. It's a contracting business. Why do they shed staff, why do they outsource people? BP aims to have 30 percent of its staff on contract. This is because it doesn't want long-term obligations to them. The North Sea is declining rapidly. They don't like to say so, but I think only four wildcats were drilled there this year [2002]. It's over! It's finished! And how can BP or Shell and the great European companies stand up and say, well, sorry, the North Sea is over? It's a kind of shock they don't wish to make. It's not evil, or there's no great conspiracy, or anything. It's just practical daily management. We live in a world of imagery and public relations and they do it fairly well, I'd say.

Corporate executives are subject to various other mentally disabling pressures. One is that they understandably tend to believe the economists in their employ, who are violently opposed to economic models that are not based on continual growth. Because the oil peak phenomenon essentially cancels out further industrial growth of the kind we are used to, its implications lie radically outside their economic paradigm. So the oil peak phenomenon has been discounted to about zero among conventional economists, who assume that "market signals" about oil supplies will inevitably trigger innovation, which, in turn, will cause new technology to materialize and enable further growth. If the market signals are not triggering innovation, then the problem must be overstated and growth under the oil regime will resume —after, say, a normal periodic downcycle. This is obvious casuistry, but casuistry can be a great comfort when a problem has no real solution.

Corporate executives fall victim to their own propaganda as much as the general public, in this case the wishful fantasies that someone will come up with alternative fuels in time for all the oil executives to retire with a clear conscience as well as a portfolio full of stock options. This kind of blind optimism is a holdover from the techno-miracle cavalcade of the twentieth century, combined with the mythic production exploits of American industry in World War II—all enabled by now-squandered domestic supplies of petroleum—which has fed the mentality of American exceptionalism.

The American government has had access to better information, too, but that information has only led to a political dilemma for administrations of both parties. Our investment in an oil-addicted way of life—specifically the American Dream of suburbia and all its trappings— is now so inordinately large that it is too late to salvage all the national wealth wasted on building it, or to continue that way of life more than a decade or so into the future. What's more, as we have outsourced manufacturing to other countries, the entire U.S. economy has become more and more dependent on continued misinvestment in American Dream suburbia and its accessories. No politician wants to tell voters that the American Dream has been canceled for a lack of energy resources. The U.S. economy would disintegrate. So, whichever party is in power has tended to ignore the issue or change the subject, or spin it into the realm of delusion—assisted by agencies such as the U.S. Geological Survey, which serve their masters very well by supplying often inaccurate but reassuring reports.

One president in recent times, Jimmy Carter, told the truth to the American public. He told us that our continued hyperdependence on oil was a deadly trap and that we would have to change the way we live in America. He was ridiculed and voted out of office for it. Of course Carter himself, having been trained as a nuclear technician in the Navy, during the Sputnik era, when America rushed to catch up with the Russian space program (another heroic and successful exploit of technological research and development), tended to believe that a crash program in alternative and synthetic fuels would yield some miracle replacement for fossil fuels. And the hopes Carter planted in a series of 1979 speeches still affect our national psychology, though we are no closer to developing significant replacements for oil a quarter century later.

It is a little hard to say what Ronald Reagan and the first George Bush really thought about America's oil predicament, because both affected to subscribe to a branch of evangelical Protestantism that posited an "end times" apocalyptic scenario for the near future, meaning that it wouldn't matter what happened to the world very far into the twenty-first century because the kingdom of Jesus was at hand. Were Reagan and George H. W. Bush only pretending, or did they actually believe the future was irrelevant?

During the Clinton presidency, baby-boomer hippies had matured into yuppies who enjoyed the benefits of cheap oil so much (and were so spoiled by it) that they fell easily into a consensus trance regarding America's energy future: party on. The Alaskan and North Sea oil bonanzas had erased their memories of the brief 1970s oil crises. During most of the 1980s and 1990s, gas prices at the pump were lower in constant dollars than at any other time in history. It was the former-hippie boomer yuppies, after all, who started the SUV craze and bought the McMansions way off in the outermost suburbs. At the same time, stunning advances in computer development (boomer-led), and the rapid growth of the huge new industry that went with it, had induced among the boomer cultural elite a mentality of extreme techno-hubris, leading many to the conviction that our fantastic innovative skills guaranteed a smooth transition into the alternative fuels future—which, of course, squared with the wishful views of conventional economists. It all amounted to an unfortunate self-reinforcing feedback loop of delusion. Clinton Democrats regarded any upticks in oil prices as being a conspiracy between the Republicans and their donor-sponsors in the oil industry. Meanwhile, Democrats tried to compensate for their purblind irresponsibility on energy issues by assuming a position of moral superiority on environmental issues. Yet many yuppie progressive "greens" were the ones who drove their SUVs to environmental rallies and, even worse, made their homes at the far exurban fringe, requiring massive car dependence in their daily lives. The epitome of this attitude was Amory B. Lovins, head of the Rocky Mountain Institute, who devoted his organization's time and energy in the 1990s to the development of a high-mileage "hypercar" that would have only promoted the unhelpful idea that Americans can continue to lead urban lives in the rural setting. Lovins also built the organization's headquarters in a remote part of the Colorado backcountry, which employees could get to only by car.

It can be stated with certainty that George W. Bush was fully informed of the hazards of the oil peak situation by at least one credible authority, Matthew Simmons, a leading oil industry investment banker and a highly regarded public commentator who has spoken forthrightly in scores of conferences and symposia about the hazards presented by the coming global oil peak. Simmons was brought in to advise the Bush campaign as early as 1999 and had many frank discussions with Bush both before and after the election. Of course, the younger George Bush, like his father, as well as being an "oil man," was a self-professed evangelical Christian and in the background of his belief system lurked that dark idea that Armageddon was just around the corner. Could he be relied on to care about the future?

What's So Special About Fossil Fuels, Anyway?

Fossil fuels are a unique endowment of geologic history that allow human beings to artificially and temporarily extend the carrying capacity of our habitat on the planet Earth. Before fossil fuels—namely, coal, oil, and natural gas—came into general use, fewer than one billion human beings inhabited the earth. Today, after roughly two centuries of fossil fuels, and with extraction now at an all-time high, the planet supports six and a half billion people. Subtract the fossil fuels and the human race has an obvious problem. The fossil fuel bonanza was a one-time deal, and the interval we have enjoyed it in has been an anomalous period of human history. It has lasted long enough for the people now living in the advanced industrialized nations to consider it absolutely normative. Fossil fuels provided for each person in an industrialized country the equivalent of having hundreds of slaves constantly at his or her disposal. We are now unable to imagine a life without them—or think within a different socioeconomic model—and therefore we are unprepared for what is coming.
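The "hundreds of slaves" claim can be checked with back-of-envelope arithmetic. The figures below are rough assumptions for illustration (per-capita energy use in an industrialized country, and the sustained work output of a human laborer), not measured data:

```python
# Back-of-envelope check on the "hundreds of energy slaves" claim.
# Assumed, illustrative figures:
#   per-capita primary energy use in an industrialized country:
#     ~90,000 kWh per year
#   sustained human work output: ~75 W for 8 hours/day, 300 days/year

human_kwh_per_year = 0.075 * 8 * 300        # ~180 kWh of work per laborer-year
percapita_kwh_per_year = 90_000

energy_slaves = percapita_kwh_per_year / human_kwh_per_year
print(round(energy_slaves))  # on the order of several hundred "slaves"
```

Even allowing generous error bars on both assumptions, the result stays in the hundreds, which is what makes the subtraction of fossil fuels so consequential.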

Oil and gas were generally so cheap and plentiful throughout the twentieth century that even those in the lowest ranks of the social order enjoyed their benefits—electrified homes, cars, televisions, air conditioning. Oil is an amazing substance. It stores a tremendous amount of energy per weight and volume. It is easy to transport. It stores easily at regular air temperature in unpressurized metal tanks, and it can sit there indefinitely without degrading. You can pump it through a pipe, you can send it all over the world in ships, you can haul it around in trains, cars, and trucks, you can even fly it in tanker planes and refuel other airplanes in flight. It is flammable but has proven to be safe to handle with a modest amount of care by people with double-digit IQs. It can be refined by straightforward distillation into many grades of fuel—gasoline, diesel, kerosene, aviation fuel, heating oil—and into innumerable useful products—plastics, paints, pharmaceuticals, fabrics, lubricants.

Nothing really matches oil for power, versatility, transportability, or ease of storage. It is all these things, plus it has been cheap and plentiful. As we shall see later, the lack of these qualities is among the problems with the putative alternative fuels proposed for the post-cheap-energy era. Cheap, abundant, versatile. Oil led the human race to a threshold of nearly godlike power to transform the world. It was right there in the ground, easy to get. We used it as if there was no tomorrow. Now there may not be one. That's how special oil has been.

The World Leader

After World War II, the American public made two momentous and related decisions. First was the decision to resume the project of suburbanization begun in the 1920s and halted by the Great Depression and war. By the 1950s, the prevailing image of city life was Ralph Kramden's squalid tenement apartment on television's The Honeymooners show. Suburbia was the prescribed antidote to the dreariness of the hypertrophied industrial city—and most American cities had never been anything but that. They were short on amenity, overcrowded, and artless. Americans were sick of them and saw no way to improve them. Historically, a powerful sentimental bias for country life ruled the national imagination. As late as 1900, the majority of U.S. citizens had lived on farms and American culture was still imbued with rural values. As far as many Americans were concerned in the 1950s, suburbia was country living. There was plenty of cheap, open rural land to build on outside the cities, and as soon as mass-production house builders like William Levitt demonstrated how it might be done, suburbia would be thoroughly democratized—country living for everyone. That suburbia turned out to be a disappointing cartoon of country living rather than the real thing was a tragic unanticipated consequence, which I have described in my previous books. (The Geography of Nowhere, New York: Simon and Schuster, 1993; Home from Nowhere, New York: Simon and Schuster, 1996; The City in Mind: Notes on the Urban Condition, New York: Free Press / Simon and Schuster, 2002.)

Second was the political decision in 1955 to build the interstate highway system, the largest public works project in the history of the world. It has been argued elsewhere that the interstate system was conceived for moving troops and evacuating cities in the event of a future war; also that it was an economic stimulus program to prevent a return of the dreaded prewar depression. Perhaps, but I would argue that the public was simply entranced by cars and wanted a state-of-the-art nationwide road system as a kind of present to ourselves for winning the war. It was not anticipated at the time that the interstates would lead to the catastrophic disinvestment in U.S. cities. Nor did many foresee the debasement of the rural landscape as the office parks, strip malls, chain stores, and fry-pits eventually settled beside the freeway off-ramps and came to occupy every hill and dale between the housing developments. The consensus in the 1950s was that suburbanites would continue to commute into the city for work, shopping, and entertainment.

In any case, the result of these two decisions was technological lock-in. Once the investment was made in the infrastructure and furnishings of suburbia, we were stuck with it, and with the enormous amounts of oil required to run it.

Hubbert’s Curve I – American Peak

After 1945, America's position in the world vis-à-vis oil was special and privileged, perhaps to a degree that remains less than fully appreciated. Europe, having fought over distant supplies of oil in two world wars and suffered hugely, never became complacent about it, as reflected in Europeans' compact living arrangements and their high luxury taxes on gasoline. But America, having won those wars and possessing substantial reserves of oil in situ, became overconfident to a dangerous degree about its oil future. When a geologist named M. King Hubbert announced in 1949 that there was, in fact, a set geological limit to the supply of oil that could be described mathematically, and that it didn't lie that far off in the future, nobody wanted to believe him. Hubbert was not a lightweight. Before World War II, he had taught geology at Columbia University and worked for the United States Geological Survey. His theoretical work on the behavior of rock in the earth's crust was highly regarded and led to innovations in oil exploration. Stretching from 1903 to 1989, Hubbert's whole life took place during the high tide of the oil era, and he played a large part in developing its science. But he was a visionary who dared to imagine the final act of the oil drama.

By the mid-1950s, as chief of research for Shell Oil, Hubbert had worked up a series of mathematical models based on known U.S. oil reserves, typical rates of production, and apparent rates of consumption, and, in 1956, he concluded that oil production in the United States would peak sometime between 1966 and 1972. Hubbert also demonstrated that the rate of discovery would plot out a path parallel to that of the rate of production, only decades earlier. Since discovery of new fields in the United States had peaked in the 1930s and declined remorselessly afterward, despite greatly improved techniques in exploration, the conclusion was obvious. Production declines would follow inexorably, Hubbert predicted, despite improved drilling and extraction methods. After this point of maximum production, or "peak," U.S. oil fields would enter a steady and irreversible arc of depletion.  He displayed this information in a simple bell curve.  The peak was the top of the curve.  Nobody took “Hubbert’s curve” seriously.  His was a lone voice in a nation that was having too much fun cruising for burgers to imagine what really lay down the road.
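Hubbert's bell curve is usually illustrated with a logistic model: cumulative production follows an S-curve toward the ultimately recoverable resource (URR), so the annual production rate is bell-shaped and peaks when half the endowment has been extracted. The sketch below uses illustrative parameters only (the URR, steepness, and peak-year values are assumptions chosen for demonstration, not Hubbert's actual 1956 fit):

```python
import math

def hubbert_production(t, urr, k, t_peak):
    """Annual production under the logistic (Hubbert) model: the
    derivative of cumulative Q(t) = urr / (1 + exp(-k*(t - t_peak)))."""
    e = math.exp(-k * (t - t_peak))
    return urr * k * e / (1 + e) ** 2

# Illustrative (assumed) parameters loosely evoking U.S. lower-48 production:
URR = 200e9      # ultimately recoverable resource, in barrels (assumption)
K = 0.06         # curve steepness (assumption)
T_PEAK = 1970    # peak year, matching the text's 1966-1972 window

rates = {year: hubbert_production(year, URR, K, T_PEAK)
         for year in range(1900, 2050)}
peak_year = max(rates, key=rates.get)
print(peak_year)                               # 1970: rate is maximal at t_peak
print(round(rates[peak_year] / 365 / 1e6, 1))  # peak rate, million barrels/day
```

Note that at the peak year the model's rate equals URR × k / 4, and the curve is symmetric about the peak, which is why the discovery curve and the production curve trace parallel paths offset by a few decades.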

The crisis occurred for a very simple reason: The United States had lost pricing power over globally traded oil because, having passed peak, it was pumping its oil at the maximum rate. What's more, as the United States passed peak, net imports rose swiftly from 2.2 million barrels a day just before peak to 6 million a day in 1973. Suddenly the United States was importing roughly a third of its oil. Without surplus capacity, the ability to open the valves and flood the market with "product," the United States had ceded control of world oil prices to somebody else who still did have surplus capacity. That "somebody else" was the Organization of Petroleum Exporting Countries (OPEC), led by Saudi Arabia.

Hubbert's Curve II—The Worldwide Peak

The OPEC embargo brought home the frightening implications of the U.S. oil peak. But M. King Hubbert continued to research the oil depletion story he had pioneered. He eventually came up with a new model that expressed the coming global production peak, the point at which the highest rate of global annual oil production would occur, with a subsequent steady rate of annual falloff thereafter along the arc of depletion. The model was fairly straightforward: Compare consumption with known reserves, then make some modest educated guesses about future rates of consumption. The only tricky part was knowing what was actually in the ground, because all entities engaged in oil production, both private companies and nations, tended to be cagey about what they had, for reasons already discussed. But within the drilling-and-exploration community news of major strikes always got out. Most of the world's oil, in fact, came from a score or so of giant oil fields, called "elephants." By the 1980s, the world had been geologically mapped to the extent that elephant oil fields on the scale of East Texas or the Ghawar field of Saudi Arabia were unlikely to have evaded discovery. The Russians, unhampered by conventional business constraints, had been especially avid in exploring their vast Siberian territories. World oil reserves were pretty much all accounted for. Hubbert lived a long time, and by the 1980s, the "rearview mirror effect" showed that world discovery had indeed peaked in the 1960s. Most importantly, the decreasing rate of discoveries comprised only small fields of minor consequence, which played out quickly. Hubbert's previous estimate about America's peak had been based on his theory, proved correct, that peak discovery preceded peak production by roughly thirty years. Hubbert initially estimated that the world peak would occur between 1990 and 2000. He was a little off, but not by much.
Some experts think the world had, in fact, entered the "bumpy plateau" of the global production peak in the early 2000s, but it was a little too early to get a clear view via the rearview mirror effect.

Subsequent tweaking of Hubbert's model by Kenneth Deffeyes of Princeton; Colin J. Campbell, retired chief of research for Shell Oil; Albert Bartlett of the University of Colorado; and others following Hubbert's 1989 death put the peak somewhere between 2000 and 2010. At the time of this writing [2005], Campbell had estimated the peak coming in 2007 and Deffeyes at 2005. Where world economic and political affairs are concerned, the difference is inconsequential.

The earth's total endowment of conventional liquid petroleum was estimated to be roughly two trillion (2,000 billion) barrels. At peak we will have extracted and burned half of that endowment. Virtually all of the consumption to date took place after 1859 and the bulk of it was disproportionately consumed in only the past fifty years, so the entire oil age, from birth to high point, has been historically very brief. According to Hubbert's model, and assuming at least current levels of world petroleum consumption at 27 billion barrels a year (and notwithstanding a still hugely expanding world population and the continued rapid industrialization of China), then the world has only about thirty-seven years of oil left, in the ideal case that every last drop is pumped out. Of course, it is extremely unlikely that the human race will ever completely drain the world's oil fields. Long before a field gives up its final barrels, the oil becomes so difficult or expensive to pump out of the ground that it takes more than a barrel of oil's worth of energy to pump out one barrel. (Many wells are currently taken out of production when they are only half-depleted because of the economics of extraction.) In the meantime, there will be degrees of difficulty and relative cost between peak and the last drop; these costs will be both attended and greatly aggravated by political competition over the remaining supplies.
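The arithmetic behind the "about thirty-seven years" figure is simple division, sketched below. The growth-adjusted variant is my own illustrative extension of the text's point about expanding population and industrialization, not a figure the text states:

```python
import math

remaining_barrels = 1e12     # ~1 trillion barrels left at peak (text's figure)
annual_consumption = 27e9    # ~27 billion barrels per year (text's figure)

# Flat-consumption horizon: the text's "about thirty-seven years."
years_left = remaining_barrels / annual_consumption
print(round(years_left, 1))  # 37.0

# Illustrative extension (an assumption, not the text's claim): if consumption
# instead grew 2 percent a year, the same endowment would be exhausted sooner.
# Solve remaining = C0 * ((1+g)**T - 1) / g for T:
g = 0.02
years_with_growth = math.log(1 + g * years_left) / math.log(1 + g)
print(round(years_with_growth, 1))  # about 28 years
```

The point of the second calculation is that even modest consumption growth shortens the horizon substantially, reinforcing the text's caveat that the "ideal case" of thirty-seven years is an upper bound.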

Iraq Attack and the Great Sleepwalk

Meanwhile, South Korea, Malaysia, Thailand, Singapore, and especially China were becoming the world's manufacturing workshops as America "outsourced" heavy industry and focused its energies on hypertrophic suburban land development and the consumer infrastructure that went with it—malls, so-called power centers, and the vast highway strips with their fried-food shacks, tanning huts, and muffler shops. Americans were using more oil than ever before, and proportionately more of it was being burned in cars and trucks than for any productive activity. By the 1990s, American households were making a record eleven separate car trips a day running errands and chauffeuring children around. Automobiles were getting larger as the station wagon and van yielded to the supremacy of the sport-utility vehicle (SUV), an expeditionary car based on a light truck chassis and therefore exempt from legislated fuel efficiency standards.

In the 1990s, western Europe felt secure and the former Iron Curtain nations of eastern Europe emerged from the coma of communism. The North Sea fields were producing at full blast. Tourism had exploded in response to cheap air travel. The ethnic brawls of imploding Yugoslavia were an exception, and it is perhaps not insignificant that the region was a geographic friction point between the Christian and the Islamic worlds. The former Soviet Union, or Russia, also contributed to the surge in production as its oil industry was reorganized on a quasi-capitalist basis. As Russia attempted to get back on its feet economically, its only means of obtaining hard currency was oil sales, as its decrepit factories produced nothing anybody would buy in an export market.

Apart from the Balkans, and the awful and innumerable civil wars of post-colonial Africa, and the low-grade insurgencies of Colombia and Peru, and the incomprehensible struggle in Sri Lanka, and the chaos of Chechnya, the nineties were a relatively quiet decade. The great proxy battles of the Cold War were over. The United States and Russia did not even confront each other symbolically anymore.

With gas remaining cheap month after month, employment rising, and the computer revolution promising a "new economy," the American public entered a decade-long sleepwalk of complacency. Was it Bill Clinton's immense good fortune to preside over two terms devoid of apparent crisis? If he had any reservations about the economy becoming hostage to the creation of suburban sprawl, he never voiced them. Like a lot of Sunbelters, he might have viewed sprawl as good to live in and good for business. Nor did he raise any alarms about the approaching global oil peak. He must have received intelligence briefings about it, even while the U.S. Geological Survey issued inflated estimates of total world reserves. The relative calm in world and domestic affairs from 1992 to 2001 led the public instead to a foolish preoccupation with trivialities such as the President's extramarital sex life and other celebrity misdeeds. As Bill Clinton yielded to George W. Bush in 2001, the only trouble on the scene was the disappointing slide in stock valuations and the extraordinary meltdown of Internet-based "dot-com" businesses that were supposed to form the infrastructure of the New Economy.

Then, one September morning that could not have been more beautiful in the eastern United States, nineteen Islamist maniacs hijacked four airliners and changed everything.

Crunch Time

As the United States waged war against "terror"—or what I call militant Islamic fundamentalism— in Afghanistan and Iraq following the 9/11 attacks, strange things coincidentally began to happen in the global oil markets. Princeton geologist Ken Deffeyes would contend that production data finally coming in during 2003 seemed to indicate that world oil had hit a production ceiling back in 2001, above which the world's producers could not penetrate. Was this, in fact, the peak? The geologists in the Hubbert's Peak crowd knew that the peak would be detected only in the rearview mirror, several years beyond its actual occurrence, because the data took so much time to assemble. But these suspicious data suggested that something epochal had occurred. The price of oil was steadily going up, too, leaving behind the $20-a-barrel "ideal price" that the Saudis said would perfectly balance their need for profit against the West's need to maintain industrial growth, and therefore robust demand for oil. The markets seemed to know something.

(Interview with Kenneth Deffeyes of Princeton by Julian Darley of Global Public Media (http://www.globalpublicmedia.com), April 4, 2003: "The Oil and Gas Journal publishes lists of country-by-country world oil production in the last week of each year and lo and behold in the years 2001 and 2002... the numbers showed that the world oil production had not been as big as it was in the year 2000.")

Chapter 3. Geopolitics and the Global Oil Peak

In historical memory, the world has never faced such dangerous circumstances as it does early in the twenty-first century. The nations of the world face not only a life-and-death struggle over crucial energy resources, but an ideological struggle that makes the old capitalist/communist rivalry of the past century seem like a simple soccer match. Communism merely said, "We will bury you," and Comrade Khrushchev meant that in terms of economic and social progress. The avatars of inflamed Islam want to utterly destroy the infidel West, and its Great Satan seducer, the United States, and they mean down to the last beating heart.

This is a much darker time than 1938, the eve of World War II. The current world population of 6.5 billion has no hope whatsoever of sustaining itself at current levels, and the fundamental conditions of life on earth are about to force the issue. The only questions are: What form will the inevitable attrition take, and how, and in which places, and when? Some of these questions will be determined by the gathering calamity of climate change and its associated environmental implications, especially starvation, lack of fresh water, and the rise of epidemic disease (see Chapter 5). In the meantime, the world is faced by the dangerous posturings and maneuverings of nations around the control and possession of oil.

At the heart of this is the United States' sick dependency relationship with the Islamic world. Islamic nations possess most of the remaining oil in the world. We're addicted to that oil. Due to our inattention, narcissism, and almost unbelievably foolish complacency, we have allowed ourselves to become hostages to that addiction. We have enriched the ruling classes of the Middle East beyond the wildest fantasies ever dreamed by any emir, sheikh, pasha, or caliph in all the centuries of their dreaming.  That wealth has transformed a poetic, decorous religion into a virulent agency of potential world conflagration.

The rhetorical use of the term "war on terror" may be particularly unhelpful for Americans struggling to deal with these problems, as "terror" is not a nation or even a group or political entity, but rather a tactic in the service of an enemy that happens to be the widespread Islamic insurgency against American interests led by bin Laden and his al-Qaeda movement. Under Koranic law, jihad in the defense of Islam is the individual duty of every Muslim in the ummah, or worldwide Islamic community. As the United States has either troops, technical personnel, or intelligence agents in Afghanistan, Iraq, Saudi Arabia, Kuwait, Qatar, Turkey, Pakistan, Indonesia, Malaysia, the Philippines, and several former Soviet republics, it is hard to dispute the fact that we have a physical presence in Islamic territory—whatever our reasons—and that is enough to trigger jihad from the ummah, which has repeatedly issued explicit declarations of such. We are therefore at war with that community—not because of our choosing but because it has declared war on us. It is a circumstance that the U.S. government cannot possibly acknowledge honestly, for several reasons. One is that the government officials of many Islamic nations are effectively alienated from this ummah—by dint of their relations with us—and we are accustomed to understanding war only against governments and the armies they control, not against communities of faith, however broad they might be. In addition we are desperately beholden to these governments for our continued supply of lifeblood oil and we are bound to support them no matter how alienated they are from their own community—which only reinforces the jihadi rage and determination of the ummah. Another reason is our now pervasive political correctness, which forbids U.S. leaders from inveighing against another ethnic group or creed, or even acknowledging that deadly conflict may exist between the group and us.
This leads to other forms of impotence, such as our chronic failure to respond to tactical insults such as the first World Trade Center bombing, the attack on the U.S.S. Cole, the Khobar Towers incident, or the bombing of embassies in Tanzania and Kenya, the attack on an international residential compound in Riyadh, the decapitation of American contract workers, and a long list of other hits, and our hesitation to engage wholeheartedly or to complete any military affray we find ourselves in, whether it is against Iraq in 1991 or Somalia, or giving the Taliban a month to escape from Afghanistan after 9/11, or the 2004 Shiite insurgency in Iraq.

While our intractable conflict with insurgent militant Islam has occupied center stage of the geopolitical scene for several years, we have many other concerns with other nations of the world—and they with us. Global relations have entered a period of turbulent flux, and events may turn in any number of strange directions before they resolve in what I believe will be the likely finale of the Long Emergency—with world powers retreating into their own regional corners, left to deal with fateful contraction of their societies due to the depletion of cheap fossil fuel. Whatever else happens, in the meantime we are still stuck with our sick dependency on Middle Eastern oil and the difficulties emanating globally out of it.

Peak and the Fate of Nations

Oil is the world's most critical resource. Without it, nothing works in industrial civilization as currently configured. Few people dispute the idea that the world will eventually run out of oil, and there is a broad recognition that it will happen sometime in this century—but there is next to zero understanding about what happens between now and then, on the way down to "empty." The American public also generally assumes that by the time the oil runs out humankind will have moved on to the next energy system—the current favorite candidate being one based on hydrogen, and that it will arrive just in time, by special delivery, because the free market decrees it is so, and the free market never lets us down.

I don't believe it is going to work that way—and presently I will discuss the issue of alternative energy, the putative replacements for oil. The world will be in trouble long before we run out of oil, when we reach peak production. At absolute peak, there will still be plenty of oil left in the ground—in fact, half of all the oil that ever existed—but it will be the half that is deeper down, harder and costlier to extract, sitting under harsh and remote parts of the world, owned in some cases by people with a grudge against the United States, and this remaining oil will be contested by everyone. At peak and just beyond, there is massive potential for system failures of all kinds, social, economic, and political. Peak is quite literally a tipping point. Beyond peak, things unravel and the center does not hold. Beyond peak, all bets are off about civilization's future.

The public has been generally unconcerned or ignorant of the global oil production peak and what it portends. The previous oil shocks of 1973 and 1979 are viewed as transient difficulties that were overcome, and the illogical conclusion is drawn that all future problems with our oil supplies also will be overcome. The public has heard "experts" and "Cassandras" cry wolf about oil before —and no wolf appeared, life continued normally, so why take them seriously this time? Many people consider the peak oil story another fantasy brought to us by the same alarmists who said that the Y2K computer bug would bring on the end of the world as we know it. The attacks of September 11, 2001, were supposed to change everything, too, but we are still a nation of happy motorists tooling down the highway with our iced beverages and savory snack food products, with Rush Limbaugh cheering us along on the radio. Nobody is prepared for the sinkhole that awaits us down the road.

For many Americans, who have never known a way of life without cheap oil, there is a simple inability to imagine life without it. Some say that just because more oil hasn't been discovered doesn't mean that it isn't there. They are unimpressed by data showing that discovery peaked worldwide forty years ago and has been declining steadily since. In government, the discussion over our oil and natural gas situation has been plagued by misinformation, denial, and secrecy.

But it may be in the nature of crises that the conditions leading to them are ignored until it is too late to do anything about them. It may be hard to form a clear picture of a complex situation through a fog of facts and statistics. You decide. Here again are some of the salient facts of the global oil situation:

·       The total planetary endowment of conventional nonrenewable liquid oil was roughly 2 trillion barrels before humans started using it. Since the mid-nineteenth century, the world has burned through roughly 1 trillion barrels of oil, half the total there ever was, representing the easiest-to-get, highest-quality liquids. The half that remains includes the hardest oil to get, lowest-quality liquids, semisolids, and solids.

·       Worldwide discovery of oil peaked in 1964 and has followed a firm trendline downward ever since.

·       The rate of oil use has accelerated tremendously since 1950. The explosive rate of world population growth has run parallel to our rates of oil use (in fact, oil has enabled the population explosion).

·       The world is now using 27 billion barrels of oil a year. If every last drop of the remaining 1 trillion barrels could be extracted at current cost ratios and current rates of production—which is extremely unlikely—the entire endowment would last only another thirty-seven years.

·       In reality, a substantial fraction of the remaining half of the world's total oil endowment will never be recovered.

·       After peak, world demand will exceed world capacity to produce oil.

·       After peak, depletion will proceed at 2 to 6 percent a year, while world population is apt to continue increasing (for a while).

·       More than 60 percent of the remaining global oil endowment lies under the Middle East.

·       The United States possesses 3 percent of the world's remaining oil reserves but uses 25 percent of world daily oil production.

·       The United States passed peak in 1970 with the rate of production falling by half since then—from roughly 10 million barrels a day in 1970 to just above 5 million in 2003.

·       The ratio of energy expended in getting the oil out of the ground to the energy produced by that oil in the U.S. oil industry has fallen from 28:1 in 1916 to 2:1 in 2004 and will continue falling.
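The cited post-peak depletion rate of 2 to 6 percent a year can be translated into halving times with a short calculation. This is a sketch assuming a constant exponential decline, which real fields only approximate:

```python
import math

def years_to_halve(decline_rate):
    """Years for output to fall by half under a constant exponential
    decline: solve (1 - r)**t = 0.5 for t."""
    return math.log(0.5) / math.log(1.0 - decline_rate)

# The text's post-peak depletion range of 2 to 6 percent a year:
for r in (0.02, 0.06):
    print(f"{r:.0%} decline: production halves in ~{years_to_halve(r):.0f} years")
```

At the low end of the range, world production would halve in roughly a generation; at the high end, in little more than a decade, which is the difference between a managed contraction and a precipitous one.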

The Bumpy Plateau

Looked at closely, the peak would resemble a kind of bumpy plateau because the price and demand data would all appear to wobble inconclusively for a while, perhaps for several years. High price, they say, "destroys" demand. As demand lessens, prices fall. Lower prices prompt demand to pick up again, and prices rise. The global peak period itself will be a period of both confusion and denial. Then, as the inexorable facts of the world peak assert themselves, and the global production line turns down while the demand line continues to rise, all the major systems that depend on oil—including manufacturing, trade, transportation, agriculture, and the financial markets that serve them—will begin to destabilize (including the oil industry itself). The peak will set into motion feedback loops of strange behavior as the boundaries among politics, economics, and collective paranoia dissolve, especially in relation to global markets and supply chains, which depend on a modicum of reliable expectations and transcultural trust. Once the world is headed firmly down the arc of depletion, fuel supplies will be interrupted by geopolitical contests and culture clashes.

Eventually, economic growth as conventionally understood in industrial societies will cease, or continue in only a few places at the expense of other places. On the bumpy plateau, global oil production rates may seem strangely at once robust and flat—robust because at peak the total barrels per day will remain in the highest historical range, but flat because they will never manage to go beyond a certain ceiling. Global production will never again increase. After oscillating at peak a few years, production rates will inexorably drop, and then the question becomes: how steep the drop?

During the singularity of the peak years, there will be no "swing" producer that can ramp up production in order to "flood the market" and keep global crude prices stable, as Saudi Arabia did for many years to the great benefit of the West. All producers will be pumping at the maximum rate. Rising demand among still-growing populations will bid up the price. The lack of a moderating market mechanism, such as surplus supply, to influence price will, by default, lead to allocation-by-politics. The politics of jihad (them) and blood-for-oil (us) will prove to be a very unfavorable basis for allocating scarce-but-indispensable commodities.

The economic stress among virtually all nations, the rich and poor, the advanced and "developing," will be considerable and is certain to lead to increasingly desperate competition for diminishing supplies of oil. Whatever a given nation's official take on the crisis may be, whatever level of denial or panic, all will be players in the ensuing contest for the remaining supplies of oil. The denial about global peak in the United States is already fierce, as investments in car-dependent, oil-addicted infrastructure are greater here than in any other nation and Americans consider their way of life a God-given entitlement. "The American way of life is not negotiable," Vice President Dick Cheney once famously remarked.

George W. Bush asserted in his 2003 State of the Union address that we would continue running that way of life on hydrogen. He did not indicate that the way of life itself might contain a few problems. Two months after that speech, the United States invaded Iraq in order to set up a police station in the fractious Middle East, which contains more than two-thirds of the world's remaining oil.

Flashpoint

The Middle East has been a geopolitical flashpoint for more than half a century. The first oil discoveries in Arabia were made in 1932 and it is not an accident that intense friction around the region has occurred in parallel with the development of its oil endowment. Without oil, the desolate Arabian peninsula would support a tiny fraction of the population now living there. It is otherwise devoid to an extreme of other resources, especially arable land and water. The modern state of Israel would not have been possible, either, without oil to fuel a modern European-type economy. It is a peculiar and tragic fact of history that the founding of the modern state of Israel occurred in tandem with the takeoff of the modern Arab oil states, and likewise, that the rise of Muslim fanaticism – as personified by al-Qaeda, the Taliban, the mullah-ocracy of Iran, Saddam Hussein, the suicide bomb death cult of the Palestinians, the militant sects of Pakistan and Indonesia – has focused its wrath on Israel as the embodiment of all Muslim hatreds for classic Western civilization.  Cairo, Baghdad, Kabul, Tehran, and other Muslim cities contained major Jewish communities in the late nineteenth century.  Until the modern era, Islam had shown tolerance for these embedded populations, which comprised much of the commercial middle class.  The evolution of Jewish settlements in Palestine into the nation of Israel over the twentieth century not only prompted a flight of these embedded populations to the attraction of a "promised" new homeland, but also instigated harassments and expulsions against the remaining Jews in those cities. As this process went forward into the mid-twentieth century, engendering hatred, envy, and resentment, the Muslim world took its cues for extreme anti-Semitism from the European totalitarian movements of the twentieth century—Nazism and Stalinism.
All these tendencies have been further distorted and aggravated by the resource base of oil to which Western industrial society is addicted.

When Mark Twain visited the Holy Land in 1867, he found it weirdly vacant, and wrote in Innocents Abroad:

[T]hese unpeopled deserts, these rusty mounds of barrenness, that never, never do shake the glare from their harsh outlines, and fade and faint into vague perspective; that melancholy ruin of Capernaum; this stupid village of Tiberias, slumbering under its six funereal palms. ... We reached Tabor safely.... We never saw a human being on the whole route.

The industrialization of Europe across the nineteenth century provoked tremendous social change. On the one hand, the Jews of Middle Europe (Germany, Poland, Austria-Hungary, and others) gravitated to the booming cities, began to assimilate into mainstream culture, secularize, and enter cosmopolitan professions. On the other hand, new modes of transportation—railroads, steamships—made emigration easier, and large numbers of Jews chose to leave Europe altogether, many for America and a few for the sparsely populated Holy Land. The Zionist movement, led by Hungarian-born Theodor Herzl, institutionalized this epochal population shift into a nation-building venture. Herzl envisaged a wholesale transfer of millions of Jews out of Europe, both solving the age-old problems of anti-Semitism and fulfilling biblical prophecies of a return to Jerusalem.

The European Jewish pioneers who resettled this desolate land of biblical and Roman ruins in the early twentieth century arrived with up-to-date technologies in agriculture, irrigation, civil engineering, and other accoutrements of Western life such as electricity and telephones, setting the stage for a culture clash with Arab peoples whose manner of living remained stubbornly medieval. By 1914, approximately 100,000 Jews were living in the Holy Land, along with 600,000 Arabs. The Jews regarded the Arabs not unlike the way other European colonizers regarded their colonial natives, with condescension. Arabs there (in a region not yet called Palestine) nonetheless benefited from the area's economic development. Some yielded to modernity, became urban and educated, and established themselves in commerce and the professions, which would lead inevitably to politics. Their population grew.

Meanwhile, World War I broke out in Europe. The mass slaughter and futility of the trenches went on for years and national treasuries groaned under the strain. Both the British and the Germans hoped to gain aid from their Jewish financial networks to continue paying for the war. One result was the Balfour Declaration of 1917, in which Britain essentially took up the cause of Zionism. Another result would be the postwar German resentment at being "stabbed in the back," as Hitler would put it, by Germany's Jews.

At the war's end the geopolitical remnants of Germany's defeated ally, the Ottoman Empire, had to be disposed of, including its possession of the Holy Land, now de facto a pan-European Jewish outpost under British control. The British engineered a quasi-colonial protectorate under a League of Nations mandate in what would now be called Palestine. At the same time they established the kingdom of Transjordan by plucking a Hashemite chief named Abdullah out of Arabia and installing him on a throne in a previously unincorporated zone east of the Jordan River, with the expectation that it would become the de facto homeland of the majority of Arabs in the region—that is, the people who today call themselves Palestinians. The British underestimated the Palestinians' attachment to Zion proper.

Once Palestine was more or less officially established as a Jewish homeland, yet more purposeful settlement occurred in waves, first by those who saw clearly what Hitler intended before World War II, and then, of course, by those who survived the Holocaust and arrived later. Just before and then after the war, the oil resources of Arabia, under the Saudi royal family, went into development, first by the British and then by American companies. Oil would soon bring unheard-of riches to a people who had lately been a sparse tribe of camel-riding nomads in an ocean of sand and rock. The establishment of Israel as a sovereign nation in 1948—an ethnic anomaly along a Muslim strip nine thousand miles long reaching from Morocco to Java—provided the Islamic world with a focus for the anxieties that oil unleashed, amplified by the difficult adjustment to sudden enormous wealth and the cultural incursions of modernity that came with it. And this was occurring just as the European nations were systematically dismantling their obsolete colonial empires and withdrawing back across the Mediterranean, leaving expanded Muslim populations to govern themselves.

Another factor in setting the tone for chronic Arab-Israeli strife was the refusal of Muslim states around Israel to absorb Palestinian Arabs displaced after the struggles that attended the founding of Israel in the late 1940s. The Palestinians would become pawns and proxies both of the cold war Soviets and of their Islamic cousins in a series of battles lasting half a century that has never been resolved nor ever quite added up to a major regional war lasting more than a couple of weeks. It became the role of the Palestinians to act out all the Muslim world's real and imagined injustices. Had they been absorbed into other Arab countries, those nations would not have so easily diverted themselves from their own internal contradictions.

Against this background, we move to the present situation. Israel has become the region's sole nuclear power. It operates a fleet of submarines armed with nuclear missiles that roams all over the globe, impervious to threats. Israel is said to possess at least two warheads programmed to strike each of the capital cities of its enemies. Egypt has long ceased to be a military threat to Israel. The Soviet Union no longer exists as anyone's patron. By default, Egypt has become the financial ward of the United States. Egypt and Israel officially still observe the 1978 Camp David peace accords, but the status quo depends on the continued rule of Hosni Mubarak and not much else, and he has begun to show signs of failing health in his mid-seventies.

Israel has been indispensable to the Arab world as the default distraction from local or domestic issues, the stand-in for all of Islam's quarrels with the outside world. Without Israel to focus on, all the anxieties and political frustrations of many Muslim states would turn inward, in the form of revolution, insurrection, and civil strife. Paradoxically, a strong Israel is more valuable to the current generation of Arab politicians than a vanquished Israel would be.

Problematic as it may have been for the United States as an ally and client, Israel has been indispensable as the regional policeman of last resort. The Israeli intelligence service in particular has been a tremendous benefit to the United States as a window on the dark interstices of Arab politics. But in the new age of asymmetrical warfare, the constant deadly harassments of terror acts, suicide bombings, and armed incursions against civilians threaten the viability of "normal" life in Israel. A negotiated resolution of the Arab–Israeli conflict will not happen as long as the neighboring states enjoy the artificial prosperity of oil wealth. For that matter, the modern state of Israel, being a creature of the industrial era and as dependent on oil as any industrial society, may not survive the fossil fuel crises of the coming decades. The exploding Palestinian population itself might be the ultimate weapon that would overwhelm the experiment of a modern Jewish state, but as the oil runs out, the region will probably not support continued population growth by any group. Life will become much more desperate when the struggle for resources intensifies, and it is not hard to imagine circumstances that would turn Palestine back into the kind of sparsely populated wasteland that Mark Twain encountered only 150 years ago.

For half a century, the Arab-Israeli conflict has been a mask over the much graver contradictions of the West's ever-growing dependence on the oil resources of a handful of Arab and other Muslim nations in the Middle East (e.g., the Persians of Iran). Whatever else was going on in that region, the United States and Europe have enjoyed, with few interruptions, a remarkably steady supply of the single most indispensable resource to industrial civilization, and at favorable prices—especially once the United States passed its own oil production peak in 1970 and lost its role as the global swing producer that could push prices down by simply pumping more. Plain avarice and a lack of other options held this handful of Middle East oil nations in a thrall of codependency with their addicted patrons. In a very few years all the other oil-producing nations of the world will be past their individual peaks. This will leave Saudi Arabia, Iraq, and Kuwait in a special and uncomfortable position, with an energy-starved world of armed and dangerous nations glaring hungrily at them (and menacingly at each other). Then these oil-rich few will face the prospect of their own inevitable depletion. The tensions arising from these linked prospects have, I believe, contributed hugely to the pronounced mood of political animus currently gripping the Islamic world.

That world has entered a period of the most extreme turbulence, from Algeria to Pakistan, extending through the old frontiers of the former Soviet Union and down into Indonesia, the world's most populous Muslim nation. And this turbulence coincides with the climactic phase of the Arab oil bonanza. It is a multidimensional turbulence—religious, ethnic, ideological, economic—and it is taking place on an underlayer of ecological desperation, as populations in many Muslim nations grossly overshoot the carrying capacities of the places they inhabit.

Even nonreligious observers must regard with awe the potential that the Middle East now holds for setting off a civilization-ending war, a virtual Armageddon. The approaching end of the oil age seems to have propelled the Islamic world into an accelerating mood of dark, bottomless fury as it sees its wealth squandered, looks out on its seething populations, and views a harsh and hopeless post-oil future.

The Main Event

The reason I have made a distinction between Arabia and Saudi Arabia is that sooner or later the Arabian peninsula will cease to be the personal possession of the al-Saud clan. When that happens, we will be dealing with a very different Arabia. Arabia is the center of gravity of the Islamic world and the world oil markets. In Baer's phrase, Arabia is "the fulcrum on which the global economy teeters." American policy for the past quarter-century has been based on the delusion that Saudi Arabia is stable and that America can enjoy regular supplies of its oil at a fair price indefinitely.

Saudi governance is already imploding into its own internal vacuum of extreme cupidity, self-deceit, sloth, apathy, and inertia. The modernity that the 30,000-odd al-Saud family members found themselves in, especially at the highest royal levels, was a purposeless modernity of dissipation. The thousands of princes on the oil dole have received welfare benefits—anywhere from $19,000 to $270,000 a month—unmatched by any royal family at any time or place in history, as well as a steady income of "commissions," bribes, and other kickbacks in the construction industry and arms deals. Their positions have also permitted extremely liberal expropriations of other families' private property, a kind of blank-check eminent domain system for the benefit of the al-Saud clan members only. They have squandered much of these immense fortunes on pleasure-seeking, toy-buying, jet-setting, and self-aggrandizing land development. (Prince Abdul Aziz, a favorite of the disabled King Fahd, built his own $4.6 billion theme park featuring scale models of Mecca and other Islamic holy places, with actors carrying on within.) At least a trillion dollars of Arabian money has been parked in American securities markets for the past decade; the rapid withdrawal of it could bring American finance to its knees (and probably global finance with it).

Because the Arabian birth rate is among the highest in the world, the annual oil welfare allotment of ordinary Saudi subjects—that is, non-royal family members—fell from a high of $28,600 in 1981 to less than $7,000 in 2003. Seventy percent of all jobs in the kingdom and 90 percent of private sector jobs are filled by foreigners. (Robert Baer, "The Fall of the House of Saud," Atlantic Monthly, May 2003, p. 58.) The depressing bill of particulars about the daily details of life in the feudal kingdom is a prescription for anomie of the most extreme kind. Few adult Arabian nationals work at real jobs. The sexes are completely segregated. The temperature is over 100 degrees half of the year. The nation leads the world in beheadings. Higher education is free, but half the Ph.D.s awarded are in Islamic studies. Liquor and movies are forbidden (the ruling elite can do whatever they like within their private compounds, or travel free on the national airline to places where such recreations are allowed). Unemployed young men have little to do but contemplate the futility of their prospects—and the overthrow of the regime that maintains the status quo.
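
The arithmetic behind that per-capita decline is worth making explicit. A minimal sketch follows, using the text's own 1981 and 2003 figures; treating the total payout pool as fixed is a simplifying assumption for illustration only, so the computed rate is an upper bound on how much population growth alone could explain:

```python
# Illustrative: a fixed payout pool diluted by a growing population.
# The per-capita figures ($28,600 in 1981; $7,000 in 2003) are from the text
# (Baer); assuming the total pool stayed constant is a simplification.
def implied_growth_rate(start_share: float, end_share: float, years: int) -> float:
    """Annual population growth rate that would dilute a fixed pool this much."""
    return (start_share / end_share) ** (1 / years) - 1

rate = implied_growth_rate(28_600, 7_000, 2003 - 1981)
print(f"Implied annual population growth: {rate:.1%}")  # about 6.6% per year
```

Since actual population growth, even at the world's highest rates, runs below that figure, the gap suggests the pool itself also shrank over those two decades.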

The very religious doctrine that the al-Sauds have been financially underwriting for decades, and the enormous infrastructure supporting it—from the thousands of madrassas, or Islamic academies, around the region, to the al-Qaeda training camps, to mosques in the United States and Europe—is dedicated to extirpating exactly the kind of decadence embodied in the behavior of the royal family. The kingdom has huge numbers of young men without jobs or prospects of jobs, and the religious schools have been the chief means of occupying them. Clerics inside Arabia openly call for jihad against all infidels, by clear implication including the ruling family. The al-Sauds have been pouring money into this enterprise of their own self-destruction in the hope that they might divert fundamentalists' attention away from their own wretched excesses to the sins of the West, and of hegemonic America in particular, and so far it has worked.

The chief concern from the point of view of America and the rest of the industrialized world is that a revolution against the al-Saud clan would likely be carried out by these fundamentalists—as opposed, say, to democratic reformers, socialists, competing royals, or any other conceivable rivals to power—and a further implication is that such a revolutionary government in Arabia could be headed by the world's leading Arabian Islamic radical, Osama bin Laden, or someone like him. This would almost certainly lead to a cessation of direct Arabian oil imports for the United States. Since oil is fungible, meaning a batch of light crude from one region can be traded for a batch of light crude in another region, the United States might be able to work around such a boycott in world oil markets. But a revolutionized Arabia could simply reduce total production. In fact, it would be in its interest to do so, as oil is the only exportable resource it possesses and the nation would be far better off husbanding whatever it has left. What's more, fundamentalists in power, unlike al-Saud princes, would not require billions of dollars in spending money for the maintenance of yachts and trips to Monte Carlo, so they could more easily afford a necking-down in production. Disburdened from supporting several thousand decadent princelings, a revolutionary government could reduce oil production and still maintain subsidies for ordinary Arabian citizens. This assumes, of course, that the infrastructure of oil production would not be damaged in the first place during a revolutionary struggle. Were al-Qaeda to direct such a revolution, its ultimate strategic goal would be the establishment of a pan-Islamic ummah with Arabia at its center, containing Islam's two holiest sites, Mecca and Medina.

Of course, the global industrial economy would not easily tolerate a significant reduction in Arabian oil exports, nor an Arabia as a permanent jihadi base, and any number of possible scenarios might be spun out of that circumstance. One is that the United States might attempt to intervene in support of a threatened Saudi regime, including an attempt to physically take over and secure the Aramco oil assets on the ground, a gambit that could fail spectacularly. Destruction of the Arabian oil infrastructure has been a nightmare for U.S. government strategic planners, second only perhaps to the spread of nuclear weapons. From the insurgents' point of view, the system can be attacked at a few key points using weapons and explosives easily obtained in the international arms bazaar, and hitting some select targets would afford them tremendous leverage against their adversaries at low cost.

The Abqaiq refinery, the world's largest, processes more than half of the nation's crude. From there it is piped thirty miles or so to a couple of terminals on the Persian Gulf coast, Ras Tanura and Ju'aymah, where it is loaded on tankers at the offshore facility called the Sea Island. The terminals are heavily defended against external attacks, but Saudi intelligence, the army, the police, and even Aramco have been infiltrated by bin Laden sympathizers and it would be next to impossible to stop a determined insider bent on sabotage within the facilities themselves. Pipelines running from the giant oil fields of the Persian Gulf west to Yanbu'al Bahr on the Red Sea are indefensible. Most of that oil goes through the Suez Canal to Europe. The pipeline could be taken out with a camel and a few pounds of Semtex explosive at any point along a five-hundred-mile transit. Action against wellheads, of which there are thousands, could cause irreversible damage to the oil fields themselves. If the Saudi oil infrastructure were crippled, the global economy would stagger, with the U.S. economy leading the way off an economic cliff. Normality, as it has been understood in the United States for a long time, would end very quickly.

Many Arabians regard oil as a curse. They have lived with this bonanza a little more than half a century. It has turned their lives upside down and inside out and has devastated their traditional culture. An Arabian proverb of our time goes something like this: "My father rode a camel, I drive a Rolls-Royce, my son flies a jet airplane, and his son will ride a camel." The fatalism is revealing. How will Arabians live after the oil bonanza ends? Many in the generations under forty years of age have never known life without air conditioning, automobiles, and shopping malls. Even under the most favorable circumstances, the Arabian oil endowment would not last more than another fifty years. There will come a time—before the end of the twenty-first century—when circumstances will compel a return to traditional lifeways. The prospect is sobering. The region will not support anywhere near the number of current inhabitants, who owe their existence in one way or another to the subsidy of their oil endowment. There must be many Arabians who follow this thread to its logical conclusions and feel the terrible weight of destiny bearing down on them. It is their palpable and inescapable fate—and why wouldn't the apprehension of this destiny propel a demoralized people into a desperate religious hysteria?

Of course, it is for the Arabians to sort out what they will do in this fast-approaching future. But the United States will face a quandary in the even shorter term, if confronted one way or another with the precipitous cutoff of its Arabian oil imports, by either sabotage or politics. America has demonstrated its willingness to invade sovereign nations in the Middle East—and ultimately justifies it on the basis of the Carter Doctrine, which states that oil supplies represent vital U.S. interests to be defended by military force if necessary. In regard to the problems presented by Arabia, there are two basic questions for us: What might the United States do? And what can it do?

Iraq and Iran

The third complaint, this one made by the American antiwar lobby, is that "it's just about the oil." Of course it was about oil. The Iraq invasion was a desperate attempt by the United States to establish political stability in the Middle East, where so much of the industrial world's oil comes from. But members of the antiwar lobby were just as likely to be car-dependent suburbanites as Bush supporters were. At least that was my observation among my fellow middle-aged yuppies in upstate New York. None of them had traded in their giant cars or scaled down their driving habits or moved closer to town or done anything to make their lives less reliant on liberal supplies of Middle Eastern oil. One family in my neighborhood had a sign in their yard that said "War Is Not the Answer"—and had two SUVs parked in the driveway. The American public, including the educated minority, seemed eerily clueless about the connection between their own living arrangements and our problems overseas.

Europe and Russia

France was prescient enough over the past three decades to construct a network of nuclear generating plants that supply roughly 80 percent of the country's electric power—far more than any other nation. France also employed a uniform design standard for all its reactors that has made for an exceptionally safe and well-run system. Alone among the nations of Europe, France is planning a new generation of nuclear power installations. Germany and Belgium, for example, are looking to shut down their existing nuclear power facilities. In any case, all the nations of Europe, including France, will have problems when oil and natural gas grow scarce. France may differ somewhat in being able to keep the lights on longer.

Great Britain is in a particular state of denial and jeopardy over its energy prospects. The bonanza of the North Sea oil and gas fields induced a dangerous sense of euphoria there. For two decades Great Britain was a net oil exporter. Its newfound oil riches supercharged the economy. Now that the North Sea fields have passed their peak and are depleting at about 5 percent a year, Britain faces a bleak energy future. To make matters worse, the North Sea bonanza provoked a spate of suburban development across England that will prove to be an additional burden. All of Western Europe for the foreseeable future will have to rely on natural gas imports from Russia—a dangerous dependence in any case—and Great Britain will find itself near the end of a long pipeline delivery system, ahead only of Ireland, which faces an even more dire energy future. Any time Russia sneezes, Great Britain will have to worry about catching a cold. Germany is not much better off in that respect, made worse by its status as Russia's historical enemy. Germany has next to zero oil and gas resources and not much of a Plan B, despite a big push to develop wind farms on the North Sea.
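
A steady depletion rate compounds quickly, which is why the North Sea picture is bleaker than "5 percent a year" may sound. A minimal sketch (the 5 percent rate is from the text; the starting output of 100 is just an index of peak production):

```python
# Illustrative sketch: how a field declining at a steady 5% per year shrinks.
# The 5% rate is from the text; indexing peak output at 100 is arbitrary.
def remaining_output(start: float, rate: float, years: int) -> float:
    """Output after compounding an annual decline rate for a number of years."""
    return start * (1 - rate) ** years

start, rate = 100.0, 0.05
for years in (10, 20, 30):
    print(f"After {years} years: {remaining_output(start, rate, years):.1f}% of peak")
```

At 5 percent a year, output halves roughly every thirteen and a half years, so within a generation a former net exporter can find itself importing most of its supply.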

That said, the nations of Europe enjoy some advantages over the United States in facing the Long Emergency. Although all European countries have some suburban development, it is nowhere comparable to the complete fiasco of American suburbia, and they did not trash their towns and cities in the process, as America did. The quality of compact urbanism, the scale of it, and its integral nature, even in large European cities, is much more sustainable than anything found in America. Most European cities, including the big ones, are still composed primarily of buildings under seven stories at their centers. Because of their high density and the fact that the middle class and even the rich inhabit their centers, even small European cities have high levels of amenity and culture. If there were a major supply interruption in oil, most Europeans would still be able to get to work and carry on the business of their societies. Public transit is still excellent in most of Europe at all scales, from city subways, streetcars, and buses to major passenger rail—though England, in its twenty-year North Sea oil mania, allowed its train system to go to seed. As in America, the megacities of Europe will suffer from their gigantism in the Long Emergency and unquestionably undergo significant contraction. But their centers may hold; in America, most cities no longer have any center.

Finally, the Europeans did not allow local agriculture to be overwhelmed by corporate gigantism and industrial totalism. There is still a clear distinction between urban and rural life in Europe, and virtually all cities and towns are surrounded by active agricultural hinterlands. The degree of local value-added activity associated with European agriculture—winemaking, cheese and olive oil production, and the like—remains high and retains high levels of craft quality. The European Union has led to some decay of the special protections and subsidies enjoyed by farmers, but local agriculture is still an ongoing enterprise there. The cultural memory has not been erased, as it has in the United States. Since local food production will be a crucial issue in the Long Emergency, the Europeans have better prospects of being able to feed themselves. Only one other factor looms ominously offstage at this point: the effects of climate change. Will Europe heat up, or will hydrothermal changes in the Gulf Stream plunge it into icy cold? Something is happening and we do not know the answer yet.

Global Turbulence

Whether the nations of the world fall into war over the remaining oil remains to be seen. What is certain is that we are entering a new period of world history, the uncharted territory of a post-oil world. We will be in it long before the middle of the twenty-first century. Eventually, all the nations will have to contend with the problems of the Long Emergency: the end of industrial growth, falling standards of living, economic desperation, declining food production, and domestic political strife. A point will be reached when the great powers of the world no longer have the means to project their power any distance. Even nuclear weapons may become inoperable, considering how much their careful maintenance depends on other technological systems linked to our fossil fuel economy.

Before long, all nations will retreat into themselves, either in autarky or anarchy. Many of them—including possibly even the United States—will probably follow the example of the Soviet Union and fragment into smaller autonomous units, as life becomes intensely local everywhere. I have not mentioned South America so far in this chapter, for the simple reason that I think it will remain off on the geopolitical margins into the Long Emergency. This does not mean it will be a safer and happier place than the rest of the world. South American countries will have to contend with exactly the same problems of energy scarcity, falling food production, and all the rest. Given the already high level of violent anarchy in many regions of South America, we can expect at least a continuation of armed conflict and disorder. South American nations will not, however, be in much of a position to project their power into the Eastern Hemisphere. Mexico, Colombia, and Venezuela may find themselves combatants in their own oil wars, though they are all well past peak.

Australia and New Zealand may fall victim to desperate Chinese adventuring, or to anarchy emanating from Southeast Asia. Or perhaps they will be left alone. In any case, both will be starved for fossil fuels. Africa, too, has been left to the margins in this discussion. While it contains some major oil producers, the continent is already a poster child for the kind of hardship and chaos that will become common elsewhere. The oil-producing nations there may easily become too disorderly to even support the continued exploitation of their oil fields.

In the Long Emergency the world would become a larger place again. Globalism, as a set of economic relations, will fizzle out. The 12,000-mile supply lines from the factories of Asia to the Wal-Marts of Pennsylvania will be a thing of the past. Commercial sea lanes might become indefensible. In fact, the coastlines of all nations may become prey to a new species of stateless freebooting raiders, as the area around the Molucca Sea is now. I believe the Pacific coast of North America will be especially vulnerable to raids emanating from the disintegrating nations of Asia. Air transport could become a rarity or a prerogative of only small and diminishing elites. Finally, the international oil trade itself would become so chaotic and unmanageable that no region on earth would be able to rely on distant energy supplies any longer. Nations, and even more likely regions within nations, would have to fall back on their own resources, and sink or swim.

Chapter 4. Beyond Oil: Why Alternative Fuels Won't Rescue Us

Based on everything we know right now, no combination of so-called alternative fuels or energy procedures will allow us to maintain daily life in the United States the way we have been accustomed to running it under the regime of oil. No combination of alternative fuels will even permit us to operate a substantial fraction of the systems we currently run—in everything from food production and manufacturing to electric power generation, to skyscraper cities, to the ordinary business of running a household by making multiple car trips per day, to the operation of giant centralized schools with their fleets of yellow buses. We are in trouble.

The known alternatives to conventional oil that I will discuss in this chapter include natural gas, coal and tar sands, shale oils, ethanol, nuclear fission, solar, wind, water, tidal power, and methane hydrates. We will certainly use many of these things, and the various systems they entail, as much as we can, but they will not make up for the depletion of our oil supply. To some degree, all of the non-fossil fuel energy sources actually depend on an underlying fossil fuel economy. You can't manufacture metal wind turbines using wind energy technology. You can't make lead-acid storage batteries for solar electric systems using any known solar energy systems.

The pseudo-fuel hydrogen will be considered in its own special category, as the popular hopes about it are based on higher orders of unreality. The so-called "hydrogen economy" centered around hydrogen-powered cars, as promised by President Bush in his 2003 State of the Union message, is at this point a fantasy, and an especially dangerous one insofar as it promotes complacency about the predicament we face. If there is ever going to be a hydrogen economy, then we are not going to segue seamlessly into it when the fossil fuel economy begins to wobble. At best, the world is going to suffer an interval of economic chaos and social stress between the end of the fossil fuel age and whatever comes next. The question is how long this interval will last: ten years, a hundred years, a thousand years, or forever.

The belief that "market economics" will automatically deliver a replacement for fossil fuels is a type of magical thinking like that of the cargo cults of the South Pacific.

This age-old tendency of humans to believe in magical deliverance and to wish for happy outcomes has been aggravated by the very technological triumphs that the oil age brought into existence. Technology itself has become a kind of supernatural force, one that has demonstrably delivered all kinds of miracles within the memory of many people now living—everything from airplane travel to moving pictures to heart transplants. There's no question that technology has prolonged life spans, relieved misery, and made everyday life luxurious for a substantial lucky minority. (The diminishing returns and unintended consequences of technology are important topics that will be explored later in Chapter Six.) A hopeful public, including leaders in business and politics, views the growing problem of oil depletion as a very straightforward engineering problem of exactly the kind that technology and human ingenuity have so successfully solved before, and it therefore seems reasonable to assume that the combination will prevail again. There are, however, several defects in this belief.

One is that we tend to confuse and conflate energy and technology. They go hand in hand but they are not the same thing. The oil endowment was an extraordinary and singular occurrence of geology, allowing us to use the stored energy of millions of years of sunlight. Once it's gone it will be gone forever. Technology is just the hardware and programming for running that fuel, but not the fuel itself. And technology is still bound to the laws of physics and thermodynamics, which both say you can't get something for nothing, and there is no such thing as perpetual motion. All of this is to say that much of our existing technology simply won't work without petroleum, and without the petroleum "platform" to work off, we may lack the tools to get beyond the current level of fossil-fuel based technology. Another way of putting it is that we have an extremely narrow window of opportunity to make that happen. In the meantime, here are the problems with the various alternative fuels, based on what we know now.

The Hydrogen Economy

The widespread belief that hydrogen is going to save technological societies from the fast-approaching oil and gas reckoning is probably a good index of how delusional our oil-addicted society has become. The idea is enticing because the only by-product of burning hydrogen is water vapor, and that would seem to obviate most of the world's global warming and air pollution worries. And hydrogen is also a superabundant chemical element. It would be nice, neat, and simple if all the powered infrastructure and equipment of our society could simply be switched to hydrogen, but it's not going to happen. A few things may run on hydrogen, but not America's automobile and trucking fleets. In the long run hydrogen will not replace our lost oil and gas endowments.

Proposals for switching from an oil and gas to a hydrogen economy are generally associated with fuel cell technology. A single fuel cell is basically a piece of plastic between a couple of carbon plates that are sandwiched between two end plates acting as electrodes. These plates have channels that distribute the fuel and oxygen. They are modular and can be stacked to produce different amounts of power. Fuel cells can operate at efficiencies two to three times that of the internal combustion engine, and require no moving parts. In a kind of reverse electrolysis, hydrogen introduced through a catalytic metal membrane combines with oxygen to produce water vapor and an electric current, which then does useful work. In a fuel-cell car, for example, electricity from the fuel cell would power an electric motor and make the car go. However, due to the cost of making pure hydrogen, most current schemes for mass-market fuel cells propose using natural gas or methanol as the fuel, and that would produce carbon dioxide just like any tailpipe.

Fuel cells have been around a long time. Sir William Robert Grove demonstrated the process in 1839. In the late 1950s, NASA began to build a compact fuel cell electricity generator for use on space missions. Cost was not a constraint. The fuel cells and hydrogen to run them weighed much less than batteries, an important consideration when firing loads into space on rockets. Later, in manned spacecraft, the astronauts could also drink the water that the fuel cells produced.

There is no question that fuel cells exist and that they work. But huge and confounding questions arise over the economics of hydrogen. The problem is that hydrogen is not exactly a fuel. It's more accurately a "carrier" of energy than a fuel. It takes more energy to manufacture hydrogen than the hydrogen itself produces. So at this time hydrogen production depends on the other known energy sources that are all problematic for one reason or another—namely, oil, natural gas, coal, nuclear, hydro, solar, biomass, wind. To some extent, the term "hydrogen economy" is a disguise for "nuclear economy," because nuclear energy may be the advanced societies' only realistic resort where large-scale electric generation is concerned, and the subtext is that an expanded and updated array of nuclear plants could produce large amounts of hydrogen economically. But I will get back to the question of nuclear energy itself later in this chapter.
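The "carrier, not a fuel" point above is at bottom an arithmetic claim: energy is lost at every step of making, storing, and using hydrogen. A minimal sketch of that round trip follows; the efficiency figures are illustrative assumptions, not measured values.

```python
# Sketch of hydrogen's round-trip energy arithmetic.
# All efficiencies below are illustrative assumptions, not measurements.
ELECTROLYSIS_EFF = 0.70  # fraction of input electricity stored in the H2
STORAGE_EFF = 0.90       # fraction surviving compression and storage
FUEL_CELL_EFF = 0.50     # fraction of the H2's energy recovered as electricity

def round_trip(input_kwh):
    """Electricity recovered after making, storing, and using hydrogen."""
    return input_kwh * ELECTROLYSIS_EFF * STORAGE_EFF * FUEL_CELL_EFF

if __name__ == "__main__":
    kwh_in = 100.0
    kwh_out = round_trip(kwh_in)
    print(f"{kwh_in:.0f} kWh in -> {kwh_out:.1f} kWh out")
```

Under these assumptions roughly two-thirds of the primary energy is lost before any useful work is done, which is why hydrogen only makes sense where primary electricity is abundant and cheap.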

Of course hydrogen is produced commercially now and has many industrial and chemical uses. But compared with the oil we burn, the amount of hydrogen used by industry is minuscule. Using hydrogen as an industrial catalyst or chemical ingredient is one thing, but it is quite another to propose burning hydrogen as an energy commodity. Where running hundreds of millions of cars is concerned, hydrogen just doesn't scale, as the engineers say. The amount of hydrogen needed to power the U.S. car fleet would be orders of magnitude greater than current production, and producing it would be a net energy loss. We'd get less energy out of the hydrogen than we would put in to create the hydrogen, so what would be the point? The "hydrogen economy" fantasy also does not address the issue of replacing oil and gas to heat tens of millions of houses and other buildings.

The longer you look at the particulars of a proposed "hydrogen economy," the more laughable a fantasy it appears to be. But it is instructive in showing the limits of our thinking, for instance, our blindness to other solutions for America's extreme car dependency in the coming permanent oil crisis. Instead of finding a new fuel to run suburbia, a far more sane and intelligent response might be for Americans to live in traditional walkable communities served by public transit. However, the psychology of previous investment, aggravated by our national mythology about individualism and country living, has so far prevented mainstream America from even considering this alternative. We've poured so much money into suburbia and its accessories that we cannot now allow ourselves to imagine giving it up. And the paradoxical bundle of ideas that combines the liberating nature of endless motoring with entitlement to a settled home in the rural landscape (the American Dream) still exerts awful pressures on our capacity to dream of other living arrangements. Americans who travel to Europe regularly and enjoy the life of the walkable city there also regularly vote against higher-density building proposals back home in Minneapolis and Nashville.

The upshot of all this is that there is not going to be a "hydrogen economy." We may use hydrogen in some new ways, and may continue producing commercial hydrogen chemical products. An enlarged nuclear power infrastructure may lower the cost of making hydrogen by electrolysis. But we are not going to run places like Hackensack, New Jersey, or Anaheim, California, on hydrogen. We are not going to replace the current U.S. automobile and truck fleet with hydrogen-powered cars. And, in the event that yet more miraculous technological breakthroughs occur that would alter the known laws of thermodynamics to make hydrogen as cheap as Texas oil once was, then there is going to be a Long Emergency between now and that rosy future.

Hydroelectric Power

Our national system of interdependent giant regional distribution grids is widely considered to be in dangerously decrepit condition. This was underscored by the great regional blackout of 2003, which took out power from New York all the way to Detroit. The giant energy companies themselves seem to anticipate a major systems change to what they call "distributed generation," meaning that people will get their power closer to home. The trouble is, the big companies are far from confident about how that would be accomplished. There was a lot of excitement in the 1990s about the development of home fuel-cell generators. These units, each about the size of a refrigerator, would generate all household current as needed by means of fuel cells. Power lines could be dispensed with. One weak spot in the theory was that the fuel cells would run on natural gas, a commodity now in depletion. Another weak spot was that research and development by several companies, led by General Electric, had so far failed to engineer an affordable home generation unit. So far, distributed generation has come to naught. The upshot has been that the giant regional grids, with their long ranks of towers and power lines and substations, are not being maintained because the utility companies are still betting that they will be obsolete sooner rather than later. It may not be long, though, before a critical point is reached where the equipment will not be reparable. And in the Long Emergency we will certainly not have the financial resources to replace it. After that, all electric power may be local, and some localities will be luckier than others.

Solar and Wind Power

…Postwar houses in the southeastern states especially were able to dispense with all the traditional architectural appurtenances for managing uncomfortable weather—porches, high ceilings, transom windows—and the result was a species of phenomenally ugly air-conditioned bunkers utterly sealed off from the surrounding habitats.

I have run a modest solar electric operation for four years at a remote Adirondack vacation house.  We’re off the grid there, unable to hook into any public utility power lines….

The system cost about $3,000 in 2001. If we were hooked up to the grid, we wouldn't have used that amount of electricity at current prices in thirty summers—effectively for the rest of my lifetime. We didn't get it to save money. We got it because it was our only option for having some electricity at our summer place. As I said, it is a very modest system. If you were to run something closer to a normal American household on solar power—meaning a refrigerator, a clothes dryer (another energy-sucking devil), televisions, desktop computers, and so on—you would need something more like a twenty-four-cell battery bank running off sixteen solar panels. The hardware alone would run close to $20,000 (not including installation). The time needed to monitor and service the batteries would necessarily be greater, and batteries do go bad. Even with careful maintenance, the whole battery bank might have to be changed every ten years at the cost of thousands of dollars. The solar panels themselves would last quite a bit longer than the batteries, but even they are subject to ultraviolet degradation and exposure to water and ice. Of course, in certain regions of the country limited seasonal sunlight might make solar electric only marginally worthwhile even if there were no alternative.
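The economics sketched above can be put in rough numbers. The hardware and battery figures below follow the text; the grid price and household consumption are illustrative assumptions added for the comparison.

```python
# Rough lifetime-cost sketch for a household-scale off-grid solar system.
# Hardware and battery-life figures follow the text; the grid price and
# household usage are illustrative assumptions.
HARDWARE_COST = 20_000.0         # panels, batteries, electronics (per the text)
BATTERY_SWAP_COST = 5_000.0      # assumed cost of replacing the battery bank
BATTERY_LIFE_YEARS = 10          # per the text, even with careful maintenance
GRID_PRICE_PER_KWH = 0.10        # assumed utility rate, dollars
HOUSEHOLD_KWH_PER_YEAR = 10_000  # assumed typical usage

def solar_cost(years):
    """Hardware plus periodic battery replacements over `years`."""
    swaps = max(0, (years - 1) // BATTERY_LIFE_YEARS)
    return HARDWARE_COST + swaps * BATTERY_SWAP_COST

def grid_cost(years):
    """What the same electricity would cost from the grid."""
    return years * HOUSEHOLD_KWH_PER_YEAR * GRID_PRICE_PER_KWH

if __name__ == "__main__":
    for years in (10, 20, 30):
        print(years, solar_cost(years), grid_cost(years))
```

Under these assumptions the off-grid system does not break even against grid power for roughly three decades, which supports the text's point that people install such systems for independence, not savings.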

It is possible that improved batteries and more efficient solar cells may be engineered. So far, however, the battery problem has been particularly vexing. The technology has not changed much in nearly a century. The lead-acid wet-cell batteries in my circa-2001 solar electric system are not substantially different from the battery in a 1912 Oldsmobile, and although researchers have been working doggedly in recent years to improve battery technology, their work has yielded only modest refinements. For example, lithium-based batteries work well in laptop computers and LED lights, but so far they have not been economically scalable for household solar power systems. This is one of the main reasons that electric cars have been such a flop during the past decade: The batteries could not be improved to make them significantly less bulky or lighter, or to increase the travel range between charges. What's more, electric cars would have carried a base price 30 percent higher than comparable gasoline models, while the batteries would have to be replaced every few years for many thousands of dollars more. These problems left the electric car in oblivion. But electric cars were developed in the first place not in expectation of oil shortages but to mitigate the separate problem of air pollution. In 2001, the California legislature mandated that 10 percent of all cars sold in the state be low-emission vehicles by 2003. In 2003, having failed abysmally to interest the public in buying electric cars, California rescinded the mandate. Meanwhile, General Motors shelved the development of its once-touted EV (electric vehicle). As of late 2003, both Ford and General Motors were turning their attention to fuel-cell cars instead—the idea being that a fuel-cell car would be in effect an electric car, using an electric motor, only without the bothersome batteries. However, fuel-cell cars are problematic for reasons already discussed pertaining to hydrogen and natural gas.

There is a set of erroneous popular notions to the effect that renewable energy systems such as solar power, wind power, and the like are available as freestanding replacements for our fossil-fuel-based system, that they are pollution-free and problem-free—that renewables represent something akin to perpetual motion, a gift from the sun. The operation of a solar electric system, like the one I run on an Adirondack lake, does not itself produce pollution, but the manufacturing of the components certainly does. The batteries, the panels, the electronics, the wires, and the plastics all require mining operations and factories using fossil fuels. And the components were transported by diesel truck to the marina dock from far away, and ferried to the site via motorboat. This gets back to the question as to whether these systems could exist without the platform of an oil or coal economy to produce them. I don't think so….

Fossil fuels allowed the human race to operate highly complex systems at gigantic scales. Renewable energy sources are not compatible with those systems and scales. Renewables will not be able to take the place of oil and gas in running those systems. The systems themselves will have to go. Even many "environmentalists" and "greens" of our day seem to think that all we have to do is switch inputs. Instead of running all the air conditioners of Houston on oil- or gas-generated electricity, we'll use wind farms, or massive solar arrays; we'll have super-fuel-efficient cars and keep on commuting over the interstate highway system. It isn't going to happen. The wish to keep running the same giant systems at gigantic scale using renewables is the heart of our illusions about solar, wind, and water power.

Synthetic Oil

Coal can be processed into very high-grade synthetic oil and gasoline, as it is itself just a solid hydrocarbon version of the same prehistoric organic goop from which oil was formed. The Nazis were able to do a lot with coal during World War II. They had to because they possessed almost no oil of their own. But they had rich supplies of coal. In the 1930s, when the United States was getting half its total energy from coal, Germany was still getting 90 percent from coal—and only 5 percent from oil. When Adolf Hitler came to power in 1933, he had already enlisted the help of the giant chemical company I. G. Farben in a scheme to produce significant quantities of synthetic oil from coal. The process had been invented in Germany in 1913 by Nobel Prize-winning chemist Friedrich Bergius, and I. G. Farben owned the patents. It involved adding hydrogen to coal under high temperature and pressure, in the presence of a catalyst. The process was energy-intensive and expensive, but price was no object for Hitler. By September 1939, as he prepared to invade Poland, Germany was running fourteen hydrogenation plants for making synthetic gasoline and aviation fuel, with six more on the drawing board.

Biomass

Forget biomass. It's only a cruder variation of thermal depolymerization. The idea is that we would supplement our fossil fuel-burning power plants by adding organic materials such as cornstalks, switchgrass, willow sticks, and sawdust. Biomass schemes are predicated entirely on the assumption of an underlying fossil fuel platform, especially in terms of agricultural waste products such as cornstalks grown under an industrial agriculture regime using massive petroleum and natural gas "inputs" for artificial manufactured fertilizers, harvesting, and transport. This applies in particular to all schemes promoting ethanol (alcohol derived from plants) as an "environmentally friendly" additive to gasoline. The amount of petroleum and natural gas needed to produce the corn to make the ethanol would more than cancel out any benefit from using a supposedly non-fossil fuel.
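The net-energy objection to corn ethanol is an accounting claim, and it can be put in rough numbers. The figures below are illustrative assumptions, not measurements; published estimates vary widely, and some put the ratio slightly above one.

```python
# Illustrative energy-return sketch for corn ethanol.
# All values are rough assumptions in megajoules per liter of ethanol
# produced; published estimates vary widely.
ETHANOL_ENERGY_MJ = 21.0  # energy content of one liter of ethanol

fossil_inputs_mj = {
    "fertilizer (natural gas)": 6.0,
    "field work (diesel)": 5.0,
    "distillation (natural gas)": 9.0,
    "transport (diesel)": 3.0,
}

total_input = sum(fossil_inputs_mj.values())
eroei = ETHANOL_ENERGY_MJ / total_input  # energy returned on energy invested
print(f"fossil energy in: {total_input:.0f} MJ, "
      f"ethanol out: {ETHANOL_ENERGY_MJ:.0f} MJ")
print(f"energy returned on energy invested: {eroei:.2f}")
```

A ratio below 1.0 means the fuel consumes more fossil energy than it delivers, which is exactly the cancellation the text describes.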

In fact, we will surely have to resort to one particular form of "biomass" use in the future, but not in any way resembling the fantasies proposed by the corporate and environmentalist tech-meisters. That is, we'll probably have to burn a lot of wood to stay warm in the Northern Hemisphere, which means that many of us in advanced industrial societies will be returning in some respects to preindustrial modes of living. In this event, I think we can expect a fairly massive devastation of forest in those places—such as America east of the Mississippi—where forests had been able to recover during the many decades when coal, oil, and natural gas reigned in home heating. The future deforestation of North America (and Europe) could be as rapid and dramatic as the extermination of the American bison in the decades after the Civil War.

Nuclear Energy

Since the so-called "alternative" energy sources described above are all in one way or another implausible on a long-term basis without the subsidy of oil, the only remaining alternative is nuclear energy. About 20 percent of the electricity generated in the United States today comes from plants powered by nuclear reactors. In France it is closer to 70 percent (much of the rest is hydroelectric). Despite the fact that the use of nuclear power has become rather routine, it is extremely problematic in the long term for reasons that go far beyond, but include, plain energy economics, and it is fraught with potentially great political tribulation. But in the short-to-medium term, it might be all we really have to fall back on.

What the nuclear option comes down to is this: Unless we want living standards in the United States to slide far beyond premodern levels in the absence of cheap oil and natural gas, we will have to use nuclear fission as our principal method for generating electricity for some time into the twenty-first century while we scramble to make other arrangements. However, even if the United States embarks on an aggressive policy of building a new generation of nuclear reactors, life will still have to change drastically. It's really a question of whether we want these changes to happen with the lights on or the lights off. What distinguishes modern life most from premodern life is our access to electricity, and especially liberal, regular supplies of it.

We surely will have to reform our land-use habits and the oil-based transportation system that has allowed us to run our car-crazy suburban environments. We'll have to drastically change the way we grow our food and where we grow it. Social organization may be quite different in the decades ahead. Features of contemporary life that we have taken for granted, such as commercial aviation and canned entertainments, may fade into history. Politics that evolved to suit the fossil fuel fiesta, both on the right and the left, may morph beyond recognition around new forms, patterns, and values. But if we want the enterprise of civilization to continue as a general proposition, we'll have to keep the lights on, and the only way to do that by the mid-twenty-first century will be by using nuclear reactors to generate electricity.

I am not entirely convinced that we can do this for long without the fossil fuel platform to support the construction, manufacture, maintenance, mining, and processing activities that are necessary to create and service nuclear reactors. But the power obtainable from nuclear fission is so much greater than that of solar-electric, wind, biomass, and all other "alternative" fuels that an investment of any remaining fossil fuel in nuclear power could be more than a break-even or dead-loss proposition, and might, in turn, buy the human race more time to make more sustainable arrangements. Thirty years from now, we may have to resort to coal to service nuclear reactors, or perhaps synthetic oil derived from coal. But the basic energy equation of nuclear power vis-à-vis coal is very plain: One single atom of fissionable uranium will produce 10 million times as much energy as the burning of a single carbon atom. Uranium will produce 2 million times as much energy per unit mass as oil.
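The mass-based comparison in the paragraph above can be checked against commonly cited round-number energy densities. The two constants below are approximate textbook values, stated here as assumptions.

```python
# Checking the uranium-versus-oil comparison with commonly cited,
# approximate energy densities (round-number assumptions).
U235_FISSION_J_PER_KG = 8.0e13   # joules from complete fission of 1 kg U-235
OIL_COMBUSTION_J_PER_KG = 4.2e7  # joules from burning 1 kg of crude oil

ratio = U235_FISSION_J_PER_KG / OIL_COMBUSTION_J_PER_KG
print(f"fission yields about {ratio:,.0f} times as much energy per kg as oil")
```

The ratio comes out on the order of two million, consistent with the figure quoted in the text.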

There is enough naturally occurring conventional uranium around to generate electricity based on current technology for perhaps a hundred years. Naturally occurring uranium is composed of two isotopes: It is 99.3 percent U-238 and 0.7 percent U-235. U-235 is the more fissionable of the two. Most nuclear power plants today use enriched uranium, in which the concentration of U-235 is increased from 0.7 percent to about 4 to 5 percent. Uranium is relatively cheap—about $30 per kilogram (2.2 pounds). The amount of uranium needed to supply electricity for a family of four for a lifetime would fit in a beer can.

There are 109 licensed nuclear power reactors in the United States and about 400 in the world. Reactors work by producing heat from controlled nuclear fission—that is, neutrons released by fissioning uranium atoms strike adjacent nuclei and split off more neutrons in a self-sustaining chain reaction. The United States, meanwhile, still has no operating national nuclear waste storage program. Spent fuel rods can be reprocessed in such a way that enough fissionable material is recovered from one batch to run a given reactor for an additional year. Ultimately, though, the waste has to go somewhere and has been accumulating all over the nation for decades. The average reactor will produce about 1.5 tons of waste per year. When incorporated in a stable glass matrix, this would amount to around five cubic yards of waste. Since the first commercial nuclear power plant began producing electricity in 1957, the total amount of accumulated spent fuel is 9,000 tons. It would all fit inside a space equivalent to a high school gymnasium with room to spare.

In July 2002, President George W. Bush signed House Joint Resolution 87, allowing the U.S. Department of Energy to take the next step in establishing a safe repository at Yucca Mountain. The DOE is currently preparing an application to obtain a Nuclear Regulatory Commission license to proceed with construction of the repository. This has ended the long political deadlock, though not the profound questions of ultimate safety. It takes five hundred years for the spent, stored waste of a nuclear reactor to decay to the point at which it is only as dangerous as naturally occurring uranium ore.

In reality, there may only be such a thing as relative safety. But it is worth considering that many more lives have been lost in the coal industry than in the nuclear power industry in the past five decades. In the past forty years, not a single fatality has occurred as a result of the operation of a civilian nuclear power plant in the United States, Western Europe, Japan, or South Korea. The Chernobyl nuclear power plant accident on April 26, 1986, in the former Soviet Union, was another matter. Thirty-one people died as a direct result of the explosion and fire that followed. The largest estimates of cancer deaths related to the Chernobyl accident are in the low thousands, with an unknown number of cancer cases yet to present in people who were children at the time of the explosion. About twenty square miles of land became uninhabitable for a long time. In comparison, there were no deaths in the 1979 accident at Three Mile Island, Pennsylvania. Radioactive gases were vented, but there is no accepted evidence that this harmed the public.

The Chernobyl reactor was a Russian-designed RBMK model infamous for its built-in lack of safety features. It was designed in the spirit of Soviet expedience to both produce electricity and make bomb-grade material at the same time. The reactor didn't have a containment shell. It was also designed in such a way that if the reactor happened to overheat, the reaction rate automatically increased rather than decreased. It was, in short, an accident waiting to happen. Sixteen such RBMK reactors were built in the former Soviet Union. Many of them are still operating. Reactors in the United States and the West, including Japan and South Korea, are designed very differently.

No new U.S. nuclear plants have begun commercial operation since 1996, and most date from the 1970s and 1980s. No nuclear plants were under construction from the 1990s to the time of this writing, and no proposed ones have begun the difficult licensing and approval process. In essence, after Three Mile Island and Chernobyl, nuclear energy became a politically toxic subject, and the cheap-oil fiesta that ran from the 1986 price crash until the attacks of September 11, 2001, allowed the American public and their leaders to stop even thinking about nuclear energy. This situation is apt to change, especially as the United States begins to experience the coming natural gas crunch, which will chiefly affect electric power generation.

The use of so-called breeder reactors could extend the horizon of obtainable electricity from nuclear power further into the future. Breeder reactors use the widely available uranium isotope U-238, together with small amounts of fissionable U-235, to produce a fissionable isotope of plutonium, Pu-239. Plutonium, however, is tremendously dangerous both as a persistent radioactive poison and as a material for bomb-making, and therefore the security requirements for running breeder reactors may be beyond the organizational means of the society we are apt to become in the future, namely one with much weaker central authority, less police power, and reduced financial resources. This is perhaps another way of stating that social stability has been an indirect benefit conferred to us by cheap oil, and in the absence of that oil we can't assume the complex social organization needed to run nuclear energy safely.

In any case, the United States shut down its only prototype breeder reactor and currently has no significant breeder research, development, and demonstration program. Other countries are not doing much better. Work continues in Japan and in Russia but has ceased in the United Kingdom and France.

Chapter 5. Nature Bites Back: Climate Change, Epidemic Disease, Water Scarcity, Habitat Destruction, and the Dark Side of the Industrial Age

There is near unanimity among the scientific community that global warming is happening. There is also a definite consensus emerging that the term "climate change" may be more accurate than "global warming" to describe what we are in for. The mean temperature of the planet is going up. The trend is unmistakable. Average global land temperature was 46.9 degrees Fahrenheit when modern measurements began and had reached 49.2 degrees F in 2003. The rate of change has also increased steadily. The total increase of 2.3 degrees might seem trivial, but it has tremendous implications. And the rise in temperature happens to correlate exactly with the upward scale of fossil fuel use since the mid-nineteenth century.

It may not matter anymore whether global warming is or is not a byproduct of human activity, or if it just represents the dynamic disequilibrium of what we call "nature." But it happens to coincide with our imminent descent down the slippery slope of oil and gas depletion, so that all the potential discontinuities of that epochal circumstance will be amplified, ramified, reinforced, and torqued by climate change. If global warming is a result of human activity, fossil fuel-based industrialism in particular, then it seems to me the prospects are poor that the human race will be able to do anything about it, because the journey down the oil depletion arc will be much more disorderly than the journey up was. The disruptions and hardships of decelerating industrialism will destabilize governments and societies to the degree that concerted international action—such as the Kyoto Protocol or anything like it—will never be carried out. In the chaotic world of diminishing and contested energy resources, there will simply be a mad scramble to use up whatever fossil fuels people can manage to lay their hands on. The very idea that we possess any control over the process seems to me further evidence of the delusion gripping our late-industrial culture—the fatuous certainty that technology will save us from the diminishing returns of technology.

So for the purposes of this book, the relevant question concerning global warming and climate change is not whether human beings caused it or whether we will come up with some snazzy means to arrest it, but simply what the effects are likely to be and what they signify about the way we will live later on in this century.

Environmental Destruction

The damage to global ecologies by human activity accelerated rapidly with the onset of industrialism. The twentieth century, with its oil-nurtured bloom of human population, was especially harsh. Everywhere, biological complexity was compromised or reduced to monoculture. Habitats were wrecked. Species were exterminated. Terrain and water were poisoned. The amount of asphalt paving alone in the United States represents an ecological insult beyond calculation. These man-made environmental catastrophes will combine with and be reinforced by the new problems of climate change in the Long Emergency.

Harvard biologist Edward O. Wilson warns that China's current program to mitigate huge population increases with gigantic water projects may have dire consequences. Irrigation and other withdrawals have already depleted the Yellow River, which, starting in 1972, has run bone-dry part of the year in Shandong province, where one-fifth of China's wheat and one-seventh of its corn is produced. In 1997, the river stopped flowing for a record 226 days. The groundwater levels of the northern China plains have plummeted. The water table in major grain-producing areas is falling at the rate of five feet a year. Of China's 617 cities, 300 already face water shortages. Of China's approximately 23,000 miles of major rivers, 80 percent no longer support fish life.

The Xiaolangdi dam project now under way along the Yellow River in north China is exceeded in size only by the Three Gorges Dam on the Yangtze in south China. In addition, the Chinese government intends to siphon water from the Yangtze—which has not yet run dry—and send it north by a canal system to the Yellow River and to Beijing. When it is running, the Yellow River is already one of the most particle-laden in the world. Because of that, it is estimated that the Xiaolangdi dam would silt up within thirty years of completion. The $58 billion project is reminiscent of another centrally planned mega-project that ended in grief: the Soviet Union's scheme to divert the rivers feeding the Aral Sea in order to irrigate gigantic cotton farms in Kazakhstan. The project turned one of the world's largest inland bodies of fresh water into salty desert.

Like China, the United States is divided roughly in half between wet and dry. Though the human population of the United States is proportionately much smaller than China's, the amount of effort America has expended on manipulating habitats and altering terrain is as impressive in its own way as China's birthrate. Especially significant is the stupendous amount of paving laid down in the United States during the past hundred years. It prevents rain from being absorbed as groundwater and sends it instead into rivers, and ultimately into the ocean. The effect of this is the inability of water tables and wetlands to recharge and the diminishing ability of the terrain to support life. In the United States, only 2 percent of the country's rivers and wetlands remain free-flowing and undeveloped. As a result, the country has lost more than half of its wetlands.

The U.S. average of 1,300 gallons of water per day, per citizen, is the highest use rate in the world, and some sixty times the average for many third world nations. Low-density suburban sprawl is the fastest-growing sector of water use in the United States now. Both suburban Atlanta and suburban Denver are virtually tapped out, unable to increase their water supply under any circumstances. Dallas and San Antonio are not far behind. Las Vegas hallucinates its future water supply, and southern California is at the mercy of the Sierra and Rocky Mountain snowpacks, which in recent years have shown alarming declines. Global warming implies that a greater proportion of the annual precipitation in the American west will fall as rain rather than snow. The snowpack acts as a storage reservoir, releasing water in summertime when demand peaks. If that precipitation falls as rain instead, it will flow into rivers and streams and run off into the Pacific Ocean at a time of year when demand is lower. The result will be summer crises in both water supply and power generation. A joint study by a consortium of U.S. agencies and institutes projected that over the first half of the twenty-first century a one-third drop in reservoir levels along the Colorado River would cut hydropower generation by as much as 40 percent. The same study also predicted reduced flows in the Sacramento River and the Columbia River.

Much of Florida is barely a few feet above sea level. It is not necessary for the tides to wash over Dade County, or for a category-five hurricane to strike, for "normal" American life to be endangered there. Most Floridians live within ten miles of the coast. On the Atlantic side, they depend on the Biscayne aquifer for their fresh water. More than 90 percent of Florida's population depends on groundwater as the source of drinking water for public and private wells. If ocean levels rise even marginally, seawater will invade the Biscayne aquifer and Floridians will have to make other arrangements. At the upper margins of global warming prediction, a reduction of the West Antarctic and Greenland ice sheets similar to past reductions could cause sea level to rise ten or more meters. A sea-level rise of ten meters would flood about 25 percent of the U.S. population, with the major impact being mostly on the people and infrastructures in the Gulf and East Coast states.

The nation currently has about 50,000 separate municipal or county water systems, and to aggravate matters, the existing infrastructure of pipes in most U.S. cities and towns is decrepit. Much of it was originally installed in the early decades of the previous century. Water main breaks run around 238,000 incidents a year. In Atlanta in recent years, the water coming out of the highly stressed Chattahoochee River was so turbid that at times citizens could not see the drains in their filled-up bathtubs. The urgent need to replace this massive infrastructure will confront the reality of a nation entering functional bankruptcy in the Long Emergency. Climate change, competition for water, and polluted water sources will also be exacerbated by failures in the electric grid caused by oil and gas supply disruptions. Even if water is available, localities may lack the power to push it through their treatment plants and municipal pipes.

Return of the Grim Reaper

The huge rise in world population and relative remission of global warfare in the decades since 1945 has also seen a tremendous increase in the factory farming of animals, both in sheer numbers and in scale of operation. This has led to many unhappy consequences, some of them rather arcane. For instance, when hurricanes Floyd and Irene successively struck North Carolina in 1999, the damage they caused was due not so much to high winds as to flooding from torrential rains. North Carolina had somewhat recently developed an enormous pig factory-farming industry, which was very hard hit by the hurricane-caused floods. As these storms sent local streams over their banks, untold quantities of pig manure and hundreds of thousands of drowned swine carcasses were distributed over the lowlands of eastern North Carolina. In a matter of days, the dead swine began to rot. Groundwater was compromised for months afterward, and homeowners who used wells—the majority of residents in these rural counties—had to make other arrangements for their water. The situation could have been much worse had the hurricanes struck earlier in the season and been followed by a few days of late-summer heat.

Despite miraculous advances in medical technology, genetic typing, and immunology, the nations of the world are not much better prepared for a severe flu epidemic than they were for the 1918 outbreak. Epidemic influenza is extremely difficult to counteract. Flu vaccines developed in any given year are notoriously ineffective against new strains that come along the following year. It takes seven months or more to create, test, manufacture, and distribute a vaccine developed in direct response to a new virus, and by that time the disease can burn through global populations. If a pandemic broke out today, hospital facilities would be overwhelmed. Nurses and doctors would be infected along with the rest of the population.

Methods of factory farming in recent decades have included massive dosing of the animals with antibiotics; the predictable result has been the evolution of germs and bugs that are now resistant to drugs, in particular the bacteria responsible for food poisoning: Salmonella, E. coli, and Campylobacter. It takes years to develop, test, and gain approval for new antibiotic drugs. So while pharmaceutical companies are slowly developing potent new classes of antibiotics, resistance is developing at a rate faster than the drug companies can develop replacements. The overuse of antibiotics in livestock has been mirrored by the overuse of antibiotics in regular medicine. Within the last few years there has been an emergence of bacteria resistant to vancomycin – a last defense drug for some illnesses, including deadly blood infections and pneumonia caused by Staphylococcus bacteria.

Factory farming of animals has been behind the frightening and mystifying mad cow problem. The effort to economically hyperrationalize meat production on a gigantic scale led to the use of slaughterhouse waste in cattle feed as a protein booster. The material used included the brains and spinal cords of cattle, sheep, and pigs, turning livestock, in effect, into cannibals—and they are not even supposed to be carnivores. In England, where proportionately more sheep and lambs are raised than in the United States, sheep's brains and nerve tissue infected with the neurological disease scrapie made their way into cattle via commercial feeds, and the cattle began presenting horrifying symptoms—loss of motor control, raging fits, seizures, and ultimately death. Autopsy showed that the affected cows' brains were riddled with channels and holes, like sponges; hence the name of the disease: bovine spongiform encephalopathy (BSE). The disease first came to public attention in 1986. In the years since, 155 human beings, mostly in England, came down with an odd variant of a rare condition called Creutzfeldt-Jakob disease (CJD). Autopsy showed spongy brain degeneration very similar to that found in BSE-infected cattle. CJD had previously been encountered as a medical curiosity in such exotic milieus as the more isolated parts of Indonesia, where there was a long tradition of eating the brains of enemies slain in tribal warfare. The CJD that showed up in Europe presented slightly different and terrifying symptoms. With this variant disease, called vCJD, patients showed not just dementia but also extraordinary behavioral problems—wild rages, violence, screaming. Unlike previously known CJD, which almost always appeared in victims over sixty years old, the new variant showed up in younger adults. It was inexorably fatal. It was also believed to have an exceptionally long incubation period, longer even than that of the AIDS virus, somewhere between ten and twenty years.
Because of the exceptionally dramatic course of the illness and the long incubation period, a very strong reaction set in once the public became informed of the problem. This led to the wholesale slaughter of British cattle and the collapse of the English beef industry.  Years later, English beef is still regarded with suspicion in Europe.

The Social and Economic Consequences of Disease

Western Europe had been a forgotten backwater of the known world after the fall of Rome and the consequent shift of wealth and power to distant Constantinople. A brief climate cooling had accompanied the fall of Rome and its aftermath. Europe had endured centuries of darkness, cultural amnesia, and squalor. The climate then underwent a general warming from about A.D. 900 to 1300. Life in Europe improved. Under the mode of social organization generally labeled feudalism, European populations increased along with the food supply. Much of the surplus wealth that the feudal kings and lords of Europe managed to acquire was spent in the ongoing project of the Crusades, an attempt to defend Christianity by pushing back the conquests of a militant Islam that had subsumed the old Christianized peoples of the Middle East, then moved aggressively through North Africa up into Spain and France, and also into Christian Asia Minor by way of the Seljuk Turks. For three centuries the armies raised for the Crusades also had been an outlet for a European peasantry multiplying under favorable conditions, and thus a brake on population growth.

By 1291 the Crusades were over, ending in a stalemate, with the Muslims shut out of northern Europe and the Europeans chased out of the Holy Land. Over the next peaceful quarter-century Europe's population reached a critical level about equal to the solar carrying capacity of the region. That is, the people could raise enough food to feed themselves and no more. Some towns were beginning to suffer from scarcity of firewood. The tweak of climate change beginning in 1315 lowered the carrying capacity of Europe instantly. Grain production suffered markedly for three years running and a general famine commenced. Even when "normal" weather patterns returned after 1318, there was a scarcity of seed grain to resume full food production and the famine lingered. The mortality rate was high and all classes eventually suffered. Ten to 15 percent of the population died, most from disease induced by weakened immunity. The famine certainly provided a vivid and tangible sense of limits for the number of people the region could support, a warning from the earth to its inhabitants that was, of course, interpreted as a punishment visited by God for man's wickedness.

By 1325, agriculture in Europe had recovered, and for the next twenty years the population resumed rising back to the solar carrying capacity of the region. Before long, the military effort that had been put into the Crusades for so long was now directed into the first skirmishes of the Hundred Years' War, a contest between England and France over the control of French territory. This was explicitly a struggle for extra carrying capacity and resources.

The chief beneficiaries of the Crusades had been Venice, Genoa, and the other Mediterranean ports. And it was as a result of the trade these port cities generated with far-flung corners of the world that the bubonic plague stole into Europe from somewhere in Asia in 1347. What made the plague so terrible was not just the sordidness of the disease itself, or even the shockingly high mortality rate, but the fact that once established it recurred in the same region or city for several years running. So many people died that there were labor shortages all over Europe. By the end of the 1300s peasant revolts broke out in England, France, Belgium, and Italy. Feudalism, based on a surplus of agricultural peasants tied to a particular place, unraveled. The general notion of wealth and status began shifting from land to money. Though urban areas suffered grievously during the plague years, in the aftermath of the epidemic displaced peasants and rural artisans gravitated into depopulated towns and cities, found opportunities there, and began to take part in the civic relations that would lead to the new commercial society we now identify with the Renaissance.

A contemplation of these circumstances that occurred seven hundred years ago gives us an idea of what to expect in the Long Emergency. One big difference is that now we can see it coming. However, we in America flatter ourselves to think that we are above this kind of general catastrophe — because our technologic prowess during the cheap-oil fiesta was so marvelous that all future problems are (supposedly) guaranteed to be solved by similar applications of ingenuity. This was certainly the consensus among the scientists, computer geniuses, and biotech millionaires I rubbed elbows with this year [2005] at get-togethers such as the Pop Tech conference. They were uniformly uninterested in the issues of the global oil peak and natural gas depletion and utterly convinced that the industrial societies would be rescued by hydrogen, wind power, and solar electricity, all to be figured out by their cohort of techno-geniuses in due time. If there is anything we have been stupendously bad at in the preceding century of wonders, it is recognizing the diminishing returns of our technologic prowess. Some of our greatest achievements, such as industrialized farming and the interstate highway system, have produced dreadful diminishing returns (e.g., national epidemics of obesity and diabetes and the fiasco of suburban sprawl). This persistent failure or weakness pretty much negates the value of our ability to see what's coming. If anything, in the turbulence of the Long Emergency we are more likely to see not technologic progress but a lot of technologic regress—the loss of information, ability, and confidence.

The current urban population of the world, 3.2 billion, is greater than the entire population of the world in 1960. Seventy-eight percent of the urban dwellers in the so-called developing world live in slums. From the West African littoral to the mountain sides of the Andes to the banks of the Nile, the Ganges, the Mekong, and the Irrawaddy, new gigantic slums spread like immense laboratory growth media, waiting to host epidemic disease cultures. Lagos, Nigeria, for example, grew from a city of 300,000 in 1950 to over 10 million today. But Lagos, writes Mike Davis, "is simply the biggest node in the shanty-town corridor of 70 million people that stretches from Abidjan to Ibadan: probably the biggest continuous footprint of urban poverty on earth." Most of the world's new, exploding slums have only the most rudimentary sanitary arrangements, open sewers running along the corridorlike "streets." In the slums of Bombay, there is an estimated one toilet per five hundred inhabitants. Currently two million children die every year from waste-contaminated water in the world's slums. The enormity of this urban disaster is poorly comprehended in advanced nations like the United States, where the drinking water is still safe and even the poor have flush toilets connected to real sewers. But the slums of the world will probably be the breeding ground of the next pandemic, and chances are, once it is under way, the wealthy nations will not be spared.

Chapter 6. Running on Fumes: The Hallucinated Economy

The most significant characteristic of modern civilization is the sacrifice of the future for the present, and all the power of science has been prostituted to this purpose.

—William James

The entropic mess that our economy has become is the final blowoff of late oil-based industrialism. The destructive practices known as "free-market globalism" were engendered by our run-up to and arrival at the world oil production peak. It was the logical climax of the oil "story." It required the breakdown of all previous constraints—logistical, political, moral, cultural —to maximize the present at the expense of the future, and to do so for the benefit of a very few at the expense of the many. In America, free-market globalism became the reigning orthodoxy of both political parties, challenged only by cranks wearing nose-rings at the very margins of society. The moment that the world recognizes the passing of the oil production peak as a reality, globalism will be dead both in theory and practice.

During the years of its brief reign, free-market globalism was regarded as a permanent institution by a broad consensus of leaders from the most august Harvard economists to the most vulgar corporate buccaneers. The news media and their left-right punditry all bought it, too. The idea was that humanity had arrived at an advanced level of sociopolitical evolution, a new economy that would eventually deliver heaven on earth, where everyone everywhere would be rich. The key word was "eventually." Globalism pretended to promise the same nirvana as communism had failed to deliver in its time, and came into full flower just as communism lost its legitimacy. Globalism also had the same tendency to impoverish and enslave huge populations while enriching the elite who managed its operations. The American people were sold on it, even while it destroyed their towns, their landscapes, and their vocations. What a shock, then, to find out that the so-called global economy was just a set of transient economic relations made possible by two historically peculiar circumstances: twenty-odd years of relative international peace and reliable supplies of cheap oil.

Who Needs the Future?

Globalism was primarily a way of privatizing the profits of business activity while socializing the costs. This was achieved by discreetly discounting the future for the sake of short-term benefits. The process also depended on the substitution of corporate monocultures and virtualities for complex social ecosystems wherever possible, for instance, Wal-Marts and theme parks for towns. Globalism was operated by oligarchical corporations on a gigantic scale, made possible by cheap oil. By "oligarchical" I mean that power was vested in small numbers of people running large organizations who were not accountable for their actions to many of the people who were subject to those actions. By "corporation," I mean a group enterprise given the legal status of a "person," with "rights," but in fact devoid of any human qualities of ethics, humility, mercy, duty, or loyalty that would constrain those rights. As Wendell Berry put it, "a corporation, essentially, is a pile of money to which a number of persons have sold their moral allegiance.... It can experience no personal hope or remorse. No change of heart. It cannot humble itself. It goes about its business as if it were immortal, with the single purpose of becoming a bigger pile of money." (Wendell Berry, "The Idea of a Local Economy," Orion Online (http://www.orion.com), 2002.) The corporate oligarchs of, say, Wal-Mart, Archer Daniels Midland, and the Disney Corporation were not necessarily evil people, but it was in the nature of their actions that a great deal of harm came to localities and local people in them. Under the banner of free-market globalism, the chief side effect of oligarchical corporatism making its money piles bigger was the systematic destruction of local economies and therefore local communities.
Thus, the richest nation in the world in the early twenty-first century had become an amazing panorama of ruined towns and cities with broken institutions and demoralized populations—surrounded by Wal-Marts and Target stores.

The free-market part of the equation referred to the putative benefit of unrestrained economic competition between individuals, and because corporations enjoyed the legal status of persons, they were assumed to be on an equal footing with other persons in a given locality. Thus, Wal-Mart was considered the theoretical equal of Bob the appliance store owner, and if Bob happened to lose in the retail competition because he couldn't order 50,000 coffee-makers at a crack from a factory 12,000 miles away in Hangzhou, and receive a deep discount for being such an important customer, well, it wasn't as though he hadn't been given the chance.

The free market also referred to an extreme version of the old idea of comparative advantage, which had meant originally that every locality has something special it is good at producing, or some raw material in ready supply, and that a larger macro economy is made up of such specialist trading partners. Under globalism, this was modified to mean that for the sake of "efficiency" such trading partners ought to forget everything else and pump out as much of their specialty as possible (using the money received to buy goods and services from other specialists). There were a number of problems with this simplistic idea. One was that cheap oil subsidized the whole system, and the system would have been impossible without it.

Cheap oil had allowed populations to explode in precisely those parts of the world that had had, for millennia, a high infant mortality rate and modest life expectancy. Cheap oil was behind the "green revolution" that increased the food supply in the nonindustrial world. Oil was also behind many of the medicines and preventives that had neutralized tropical diseases. Now, suddenly, most of those children actually survived, grew up, and produced more children who survived and grew up, and over the course of the twentieth century, the global populations hurtled into extreme numerical overshoot. Populations were, in effect, eating oil, notably in food exports from the United States, where agribusiness had completely taken over from agriculture. Local farmers in Africa, Asia, or South America couldn't compete with corporate Archer Daniels Midland's oil-and-gas-based grain crops and U.S. government subsidies. There was no point in even bringing their hardscrabble crops to market when sacks of cheap American wheat sat on the docks of Pusan or Colombo. Farmers in those places felt that they had no choice but to migrate to the city and find some other way to get by. The only comparative advantage that these people possessed was their willingness to work for next to nothing. Cheap oil and free-market globalism turned comparative advantage into a new kind of feudalism, with the corporations as the lords and the overabundant locals as the serfs. And then, when the comparative advantage of cheap labor ($5 a day) of one place, such as Mexico, was superseded by the cheaper labor (99 cents a day) of another place, such as Sri Lanka, the corporations just moved their operations.

The idea of comparative advantage works when there is a complex local economy intact in the background of each trading partner's specialized item of production, with a variety of social roles and occupational niches to support the long-term project of community. But a locality geared to doing only one thing for export is ultimately a slave system based on the extractive economics of mining. In the extreme version of comparative advantage, under the regime of hyper-turbo late-oil-age industrialism, with its ultracheap transport and instant communications that defeated any advantages of geography, the only comparative advantages left were cheap labor and free capital. One group had all the cheap labor and another group had all the capital, and for a while one group made all the things that the other group "consumed." Thus, comparative advantage became, for a time, a con game strictly for the benefit of large corporations, which ended up enjoying all the advantages while the localities sucked up the costs.

The corporations benefiting from this regime often had no physical home of their own, even in their country of origin—and not a few American corporations had moved their official address to Caribbean pseudonations, where the banking and tax laws were more agreeable. The corporations had no allegiance to any particular place or the people of that place, so the destruction they wreaked was as manifest in the ravaged towns of Ohio and upstate New York as in the environmental degradation of China. America was hardly immune to the consequences of free-market globalism. In effect, the American heartland was overtaken by a new kind of corporate colonialism, emanating from our own culture, but no less destructive than the imposition of foreign rule.

Americans failed to recognize the essential fraudulence of the idea that this destruction was "creative" and would lead to a higher good —in other words, that the end justified the means, even as they watched their towns die around them. Corporations such as Wal-Mart and its imitators used their wealth and muscle to set up "superstores" on the cheap land frontier outside small towns and put every other retail merchant out of business, often destroying most of the town's middle class. They also, incidentally, destroyed the local capacity to produce goods. And the American public went along with it for the greater good of paying a few dollars less for a hair dryer. Bargain shopping justified the extermination of the middle class and all its relations with the locality. The American people were gulled into the fantasy that every day of the year would be like Christmas, Wal-Mart style. The public enjoyed this bonanza of supercheap manufactured goods without reckoning any of the collateral costs, which were astronomical.

The local merchants who were put out of business had been the caretakers of the town. They often owned at least two buildings in town —their homes and the buildings in which they did business—and they generally took good care of them. The physical decrepitude that is now the most visible characteristic of American towns is the direct result of extirpating that class of local people. These individuals also were generally the caretakers of the town's institutions. They sat on the library boards, the school and hospital boards, the planning board. They ran the local charities. They were invested in the history of a place and their living actions had to honor the memory of their forebears and the prospects of generations to come after. Every virtue that grew out of these local relations of person and place was traduced by the big-box national retail corporations, and the American public was absolutely complicit in the hosing that it got.

This raises an interesting question: Is one led to a determinist view that this outcome was an inevitable result of circumstances? Did Americans sell out their towns, their neighbors, the memory of their ancestors, and the future of their grandchildren because they were helplessly in thrall to the blandishments of a cheap-oil economy? I honestly don't know, though I tend to view the outcome as the result of many collective bad choices made by the public and its leaders. But were those choices inescapable? Certainly the process was insidious and played out over several generations.

There is a kind of narrative arc to a story like the industrial revolution. It had a beginning that is fairly easy to establish, say, from Newcomen's first steam engine in 1725, deployed to pump water out of the British coal mines. It had a middle, which I put around 1900 with the factory system fully established, cities at their peak of development, and the conversion from coal to petroleum under way. We are reluctant to identify the climax or the beginning of the end because we are afraid that we will suffer in it, and it is very hard to imagine a world without technological amenities, or fewer of them, or the process in which they become lost to us, and our comfort and safety perhaps with them. In any case, I'd propose that the industrial "story" climaxed in America during the 1970s. The climax was coincident with our passing of the American oil peak. And what we have been experiencing with the so-called free-market global economy and all the disruptive damage around us is a manifestation of our slow and painful arrival at the end of the story. This final phase has taken about thirty years so far, and will probably be complete within this decade. And it will certainly be coincident with the passing of the global oil peak. The economy of the past three decades has been increasingly freakish and bears some examination.

First Heyday of the Corporation

The elevation of abstract finance as a valid realm above the "real" world of hard assets and actual commodities gained legitimacy as the industrial revolution advanced. As commodities and finished goods multiplied, the amount of paper created as media of exchange for these things multiplied. John Law's ideas eventually proved correct. Paper finance had a life independent of moneybags in a cellar. The dynamic growth in manufacturing and trade was an engine of wealth in its own right and could only be practically represented by paper certificates agreed to have a certain meaning and value.

In the early days of the United States there were very few corporations, and of those almost all were created for the building of public works such as canals, roads, and bridges. Their officers could be held personally responsible for failures and disasters. Their charters lasted between ten and forty years, often requiring the termination of the corporation on completion of a specific task. In the 1840s, as railroads began to be organized, the nature of corporations changed. Railroads certainly functioned as public amenities but, unlike canals, they were organized as private money-making ventures. Factories, too, began to organize on a scale much larger than the individual workshop. Technologic innovation prompted the need for corporations to define their own purposes, not have one imposed by the government.

By the 1850s, the idea of limited liability began to be adopted in law. Officers of corporations were no longer held personally liable for the financial vicissitudes of a venture, apart from cases of criminal wrongdoing—and there was broad latitude in this, too, if only because the law lagged behind new swindles being innovated alongside new technologies. Under limited liability, a corporation could go bankrupt, but the personal assets of its executives and stockholders enjoyed protection. A corporation could be sued for some misfeasance, and perhaps ruined, put out of business, but the officers were not necessarily subject to civil damages. By 1886, the U.S. Supreme Court decided that corporations essentially had to be treated as "natural persons" under the law, specifically the Fourteenth Amendment to the Constitution, which had been crafted recently to protect freed slaves in the post-Civil War South. A corporation was able to use this new "identity" as a means to escape onerous regulations that might abridge its life, liberty, or property. Finally, the life of this fictitious corporate "person" was no longer deemed to be limited to any term of years but would be permitted a kind of immortality, to continue on past the lives of its founders.

The emergence of the modern corporation, along with new industrial technologies and the increased energy inputs of fossil fuels, led to the economic free-for-all of the late nineteenth century. Great complex ventures such as Standard Oil and the Union Pacific Railroad rose, financed by the issuance of shares. There were problems with this new way of doing things. All-out competition between companies in a given field tended to resolve in monopolies, and rather swiftly, too. There were many opportunities for mischief among the corporate officers, such as the "watering" of stock and the cornering of markets in commodities, leading some industrialists to be called "robber barons." Means had to be devised for regulating the immense amount of tradable paper generated; these were institutionalized in banks and stock exchange protocols. Standards and norms of operation for the trading of paper "assets" were established so that the public could agree on the value of things in order to trade them fairly. People got used to the idea of stocks (shares in a company) and bonds (units of debt owed by a company or government at interest) as elements of daily life—at least well-off city people did—and these instruments became normative devices for managing surplus capital, i.e., wealth. Skepticism about the reality of these items persisted, especially among the large rural population, and was reinforced by a business cycle that remorselessly went bust at intervals, leaving families wrecked as if they had been hit by tornadoes, and shaking the very consensus of hopeful expectation that underlay the acceptance of abstract finance in the first place.

What sustained fundamental faith in all these novelties of finance and capital was the continually upward-ratcheting industrial growth despite periodic reversals, which was made possible by the constant increase in available energy, that is, fossil fuels. In America particularly, offering surplus ecological carrying capacity to Europe's saturated habitats, a massive wave of immigration between 1880 and 1920 sustained the idea that growth was a permanent feature of the modern economic landscape. The business cycle might go boom and bust, but when the next boom occurred, there would always be more. More growth. More available energy. More commodities. More finished goods. More grain and beef. More immigrants coming from the constrained ecologies of Europe. More demand for things. More jobs. More production.

The Entropy Express

That economy featured, most importantly, the rehabilitation of finance. American life, with its twin engines of suburbanization and factory production of consumer goods for the whole world, became so quickly and obviously successful that a new consensus formed supporting the value of the dollar and its paper accessories in capital markets, chiefly stocks and bonds. This is not to say that the securities markets boomed in the 1950s and 1960s— it took until then just to recover the value levels of the pre-1929 crash—but stocks and bonds did regain respectability, legitimacy. Those who had lived through the Great Depression, meaning virtually all the men who had served in the wartime army, had very modest expectations about the role of finance in the postwar economy. In the 1950s and 1960s, Americans bought stocks for the annual dividends they paid, not to flip them for a quick profit. In fact, share prices remained relatively very flat during this period. The whole notion of investment was different than it would become later in the twentieth century. In the 1950s and 1960s, stock and bond values were linked much more directly with the successful production of real goods. General Motors derived its profits and paid its dividends on the basis of auto sales, not as today, primarily from leveraging interest rates and other abstract numbers games removed from the actual making of products. In sum, the public attitude about the role of finance was extremely conservative. Finance was not an "industry" per se, but a set of institutions designed to keep the idea of money and its accessories credible, so as to allow real industries to function. A small fraction of the public bought securities, a tiny fraction of the public actually made their livings in finance, and the majority of this tiny fraction —the workaday stockbrokers, bankers, commodities traders, and so on —had incomes that would seem laughable by today's standards. They were middle-class.

Indeed, the middle class in America was never broader. Differences in pay scales from the very top to the bottom in American life at the mid-twentieth century were amazingly democratic by today's standards. From 1947 to 1968, the wage inequality between top executives and lowly workers actually dropped steadily. In the 1960s, automobile assembly line workers made more money than college professors. In 1960, the pay of company CEOs was on average forty-one times the pay of the company workers; by the year 2000 the multiple for CEO pay reached 531 times the pay of a worker.

Banking also regained respectability after the calamities of the 1930s. Federal deposit insurance, which had been instituted in the depths of the Great Depression, and only for deposits under $2,500, was raised to $10,000 in 1950, and the middle class was induced to feel confident about keeping its money in banks again. Interest rates remained modest, but so did inflation. The influx of savings made money available in capital markets to invest in new ventures. It was real money derived from work already done, pay already earned, true capital. Before the great orgy of mergers and consolidation that began in the 1970s, retail banking was largely local and community-centered. Bankers made loan decisions based on firsthand knowledge of projects going on in their communities— not, as today, based on bundling and selling clumps of mortgages for generic suburban developments they have never laid eyes on.

The baby boom generation, the offspring of those who fought in World War II, grew up in this period of extraordinary financial stability and economic promise, and it became their lifelong benchmark for normality. Other bogeymen lurked in the shadows of American culture during the Eisenhower years—nuclear war, racial inequality, Sputnik—but few Americans doubted the soundness of the dollar or the sanctity of the New York Stock Exchange. In fact, the consensus about the rightness of the U.S. economy was so broad and sturdy that the baby boomers revolted violently against its chief manifestation, belief in the value of money, as soon as they entered adolescence. The assassination of John F. Kennedy in 1963 was certainly a crisis point for the collective boomer psyche, since it shattered virtually all of the shared sense of security about the rightness of American political and economic life. The death of JFK sent pubescent baby boomers into a deep funk, from which they emerged with a new and rather strange worldview.

The Final Fiesta

In America, globalism meant the accelerated dismantling of the nation's manufacturing base and its reassignment to other countries where labor was dirt cheap and environmental regulations did not apply. It also meant the ramping up of a "service economy" or, more properly, the myth of a service economy to replace the old manufacturing economy. I say "myth" because it was essentially absurd. It was like the old joke about the village that prospered because the inhabitants were all employed taking in each other's laundry. In fact, far fewer actual things of value were being created in the service economy. It was yet another temporary and protean manifestation of the tremendous entropy produced by inputs of cheap fossil fuel.

This wasn't the only myth, however. Another myth was something called the "digital economy." Computers came on the scene in a big way in the early 1980s, and by the mid-1980s the personal computer began to democratize the "information revolution." Computers changed a lot of things about the way business was done, but at the expense of enormous diminishing returns, which were rarely calculated into the dominant statistical analyses of our national condition. It was assumed, for instance, that computers greatly boosted productivity. Much of that gain was either illusory or fraught with collateral social and economic losses of other kinds. Companies that reported higher productivity were shedding employees like mad, and the entire ethos of work in America was being transformed from one of people having secure careers and permanent positions with reliable companies to one of institutionalized insecurity for practically everyone below top management in a new general atmosphere of Darwinian corporate ruthlessness—under the rubric of "free-market competition." The computer revolution created an enormous structure of exploitation in the "service" sector of retail, for instance, from underpaid workers at the big-box stores to the supplier plants in China, where one factory was pitted against another to see who could fill Wal-Mart orders for less.

Another way of looking at the productivity myth was as a shifting of burdens from companies to the public. For example, most companies, government agencies, and schools computerized their phone systems, eliminating numbers of live human beings answering the phones. The net result was that it became nearly impossible to make contact with a live person at any company or institution in America. The public now had to waste astronomical amounts of time wading through tedious recorded phone menus or listening to Muzak while placed automatically on "hold," often getting disconnected, or ending up in voice-mail limbo. Communication was hampered by computerization, not facilitated by it. One of the obvious lessons was that human beings were actually better computers than computers. Human receptionists were much more adept than computers at evaluating requests and routing concerns in the proper direction. Under the new universal computer-managed regime, though, it was often impossible for customers to even order products from the companies that sold them.

The outfitting of corporate America with computer networks and systems for bookkeeping, inventory, shipping, and tracking certainly generated a lot of business and sales activity for the computer industry itself, and the boom of the 1990s was, of course, largely based on this tremendous installation of digital infrastructure and its regular updating every two or three years as the computers got more powerful. But that too was fraught with diminishing returns and unanticipated consequences—another manifestation of entropy. The computerization of corporate America promoted the hemorrhaging of jobs and whole industries to offshore locations and the "outsourcing" of whole departments to other countries. Additional diminishing returns associated with the victory of national chain retail were the wholesale destruction of American communities, including both the "hardware" of towns and the "software" of social roles and networks associated with them. Computers only assisted predatory corporations in more successfully parasitizing existing value in victimized localities. They were most efficient at sucking the lifeblood out of complex communities. They helped "convert" complexity into simplicity (one big box instead of twenty-seven local businesses) and entropize society.

Ultimately, the computer revolution led to the "dot-com economy" of the late 1990s, which amounted to a classic bubble over the perceived (or misperceived) moneymaking potential of the internet. A few gigantic successes were scored in Web-based businesses. Soon, investment banks were backing stock offerings on hundreds of businesses, many of which amounted only to a dream or a wish on paper. Vast amounts of money were raised in initial public offerings for laughable ventures, but the public had lost its critical faculties. Many investors knew nothing about computers anyway, or were intimidated by them. They had seen immense fortunes made by Microsoft, Apple, Oracle, Sun Microsystems, and the like. They even used Web-based businesses such as Google and eBay, and they assumed that some of the bright young dudes in black outfits and stylish eyeglasses behind the public offerings would be the next Bill Gates or Larry Ellison. Hundreds of other ventures were capitalized and geared up, and a stunning percentage of them failed. The diminishing returns of overinvestment had struck again. Entropy expressed itself in the form of mass delusion. The stock market, especially the tech sector, lost credibility, but there was still plenty of hallucinated wealth left in the American economy and, as we shall see later, it went somewhere else.

The Sprawl Economy and Funny Money

What one also saw in the America of the 1980s and 1990s was commoditization and conversion of public goods into private luxuries, the impoverishment of the civic realm, and, to put it bluntly, the rape of the landscape—a vast entropic enterprise that was the culminating phase of suburbia. The dirty secret of the American economy in the 1990s was that it was no longer about anything except the creation of suburban sprawl and the furnishing, accessorizing, and financing of it. It resembled the efficiency of cancer. Nothing else really mattered except building suburban houses, trading away the mortgages, selling the multiple cars needed by the inhabitants, upgrading the roads into commercial strip highways with all the necessary shopping infrastructure, and moving vast supplies of merchandise made in China for next to nothing to fill up those houses.

The economy of suburban sprawl was a systemic self-organizing response to the availability of inordinately cheap oil with ever-increasing entropy expressed in an ever-increasing variety of manifestations from the destruction of farmland to the decay of the cities, to widespread psychological depression, to the rash of school shooting sprees, to epidemic obesity. Americans didn't question the validity of the suburban sprawl economy. They accepted it at face value as the obvious logical outcome of their hopes and dreams and defended it viciously against criticism. They steadfastly ignored its salient characteristic: that it had no future either as an economy or as a living arrangement. Each further elaboration of the suburban system made it less likely to survive any change in conditions, most particularly any change in the equations of cheap oil.

It wasn't until the traumas of the 1970s that the finance sector mutated from being an adjunct of the industrial economy to becoming a leading "industry" in its own right helping to "drive" the economy. Among the distortions and perversions engendered by the "stagflation" economy was the rise of corporate cannibalism in the form of "creative" mergers and acquisitions, specifically hostile takeovers, the aggressive use of voting stock shares to gain control of companies that did not wish to sell, with the subsequent filleting and sell-off of assets, and discarding of the bones and offal (employee payrolls and obligations, careers, livelihoods, communities). The business culture celebrated the "corporate buccaneers" who engaged in these shenanigans as superstars the way Andy Warhol had elevated junkies and drag queens to celebrity a decade earlier. Of course, many businesses would not have been vulnerable to takeover if the entire manufacturing sector had not been wobbling with a range of problems from antiquated plants and equipment (steel) to inadaptable management (cars) to dismal quality control (electronics). The truth was that by the mid-1970s, American industry was uniformly showing signs of fatigue.

Banking underwent radical consolidation and change, too, following the disorders of the 1973 oil embargo and the Iran price-jacking of 1979. Big banks began gobbling up small banks, which had the effect of hypercommoditizing credit, turning it into just another "consumer" activity carried on at mass scale. Loans became ever more abstract "units" of generic "product," such as commercial mortgages, traded in bundles and clumps like scrap metal. As local business and local ownership became irrelevant, so did local banking and local lending for local ventures. The hypercommoditizing of lending disconnected bankers from knowledge of the ventures they lent money for—just so many strip malls or condominiums—which also tended to reinforce the generic predictability of suburban development all over the country. But this insidious surrender of human judgment would also work in the collective public consciousness to further abstract the nature of assets from the meaning of value or money as a general proposition. The entropy in this kind of banking produced huge diminishing returns that eventually showed up as a landscape defaced by ugly, clownish buildings deployed in wastelands of parking, built by people who didn't care about the places they were exploiting.

Parallel to the consolidation of commercial banks was the deregulation of the savings-and-loan (S&L) sector. These special banks, or "thrifts," were first chartered as a means to provide for long-term home mortgages. Before Ronald Reagan took office, S&Ls had to keep at least 80 percent of their assets in home loans, by law. A typical S&L would offer 3 percent interest on money deposited with it, and make mortgage loans to homebuyers at 6 percent. The 3 percent difference or "spread" covered the bank officers' salaries, and paid building rent and owner profits. Obviously this required a stable currency. The severe inflation and interest rate hikes of the 1970s threatened to drive the thrifts out of business. In 1980, Congress began eliminating the interest rate ceilings on S&Ls, and simultaneously raised deposit insurance from $40,000 to $100,000 per account for S&Ls. The 1982 Garn-St. Germain Act allowed S&Ls to invest up to 40 percent of their money in ventures not related to housing. This freed the owners of the S&Ls to invest in any cockamamie scheme, routinely awarding their banks substantial "points" for lending large sums to construct shopping centers, malls, condo complexes, and so forth, things associated with suburban sprawl development. Often these projects were egregiously unnecessary or redundant, but still very profitable for the bankers and developers. The bankers kept the "points" regardless of whether the projects failed, and these fees could be substantial for developments in the $100 million range. The developers also made out on super-generous fee payments to companies they controlled for superintending construction. If the projects did happen to fail, so much the better. They could be resold (with more "points" garnered). Accounting irregularities abounded, often involving the multiple resale of defaulted properties at inflated appraisal values to conceal previous losses.
The more confusingly complex the deals were, the more resistant they were to oversight and the more beneficial they would be to the participants. It was an obvious racket. The orgy of fraud led to the abandonment of the notion, ever associated with credit and money, that foolish lending would be rewarded by failure and ruin. In the case of the S&Ls and their officers, failure and ruin were heavily rewarded with federal deposit insurance payouts. As a nice side racket, thrift bankers could stash their money in multiple accounts in their own banks, and when their banks failed, they enjoyed $100,000 payouts per account in federal deposit insurance. When the S&Ls went belly up, of course, multitudes of ordinary citizens whose savings were lost also had to be paid out in federal deposit insurance. The result was roughly a half-trillion-dollar tab for the U.S. Treasury. Only a few of the most blatant offenders went to jail when their frauds were discovered, and several senators and congressmen saw their careers ended. But the most remarkable aspect of the S&L debacle was that the American public hardly felt any pain over it. The federal deposit insurance payouts were all subsumed in the gigantic deficits rung up during the Reagan and George H. W. Bush administrations.
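The traditional S&L "spread" arithmetic described above can be sketched in a few lines. The 3 percent deposit rate and 6 percent mortgage rate are the illustrative figures from the text; the deposit base is a hypothetical number, and the sketch assumes, for simplicity, that every deposited dollar is lent out:

```python
# Back-of-the-envelope sketch of the traditional S&L "spread."
# Rates (3% paid on deposits, 6% charged on mortgages) are the
# illustrative ones from the text; the deposit base is hypothetical.

deposits = 10_000_000        # hypothetical deposit base, in dollars
deposit_rate = 0.03          # interest paid to savers
mortgage_rate = 0.06         # interest charged to homebuyers

interest_paid = deposits * deposit_rate              # owed to depositors
interest_earned = deposits * mortgage_rate           # assumes all deposits lent out
spread_income = interest_earned - interest_paid      # covers salaries, rent, profit

print(f"Annual spread income: ${spread_income:,.0f}")  # $300,000 on a $10M base
```

The sketch also makes plain why the 1970s inflation was lethal to the thrifts: once depositors demanded rates above 6 percent, the spread on old fixed-rate mortgages turned negative.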

Of perhaps greater impact on the finance sector, and on the meaning of money generally, was a set of new speculative activities based on the trading of "creative" financial instruments largely inspired by computerization. Computers could calculate large arrays of variables in ways never before possible. Even if they could not really predict the direction of markets—because of markets' essential nonlinear nature—the computer did increase the number of ways to play markets (and to lift the level of abstraction of "money" ever further away from real value-producing activities). They also enabled money—or, more precisely, the electronic notions and markers of money—to be transferred around the world at the speed of light. This allowed many new opportunities for playing minuscule changes in currency valuations and interest spreads around the world. It led to the "creative" invention of innumerable new "derivatives," or instruments based on values derived from other markers of value, contracts, or bets on the prospect of the changing value of anything: stocks, mortgages, interest differentials, weather. In short, it was a way of turning all risk, as defined in investment terms, into a casinolike panoply of betting options in a new global investment casino.

During the formative years of the computer revolution, some players assumed that they had super-slick formulas or equations for beating the odds. They also employed strategies for "hedging" their bets so that one potential losing position would be covered by a winning position somewhere else. When extremely large figures were bet, even tiny value spreads could yield fantastic profits. It worked even better if the bets were leveraged—that is, if one had to put up only a fraction of the total bet in one's own cash—using money notionally "borrowed" from other sources to make up the rest, which was especially cool if the player was on a winning streak with snazzy hedging equations and the "borrowed" notional money never really entered the picture except as pixels on a computer. It was also a recipe for disaster. The poster child for the worst-case derivative fiasco was a hedge fund company called Long Term Capital Management (LTCM).
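The mechanism by which leverage turns "tiny value spreads" into fantastic profits (and, symmetrically, into catastrophic losses) can be sketched numerically. All of the figures here are hypothetical, chosen only to illustrate the multiplication effect; they are not LTCM's actual positions:

```python
# Minimal sketch of leveraged betting on a tiny spread.
# All figures are hypothetical illustrations, not real positions.

equity = 100_000_000       # the fund's own capital, in dollars
leverage = 25              # borrow enough to control 25x the equity
position = equity * leverage

spread = 0.001             # a minuscule 0.1% move in the bet's favor
profit = position * spread
return_on_equity = profit / equity

print(f"Profit: ${profit:,.0f}")
print(f"Return on equity: {return_on_equity:.1%}")
# A 0.1% move on the position is a 2.5% gain on equity.
# The identical move against the position is an equally magnified loss,
# and the "borrowed" portion still has to be repaid.
```

This is the whole story of LTCM in miniature: the leverage multiplier works identically in both directions, which is why a sudden divergence in spreads produced collateral calls the firm could not meet.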

Hedge funds are largely unregulated on the assumption that their customers are wealthy, knowledgeable investors aware of the high risks and not in need of regulatory protection. LTCM was a kind of glorified and very high-toned "boiler room" operation run out of an anonymous corporate box building in suburban Connecticut, far from Wall Street. The firm was started in 1993 by a former vice chairman of Salomon Brothers (and champion bond trader), John Meriwether, who had had to resign from that investment house when an employee under his supervision was caught making false bids on U.S. Treasury auctions. So Meriwether went off with some of his most aggressive Salomon colleagues and opened his own shop far from lower Manhattan. The stars of LTCM's small staff of hotshot traders, computer nerds, econometricians, Ph.D. physicists, and math whizzes included two academic economists who won the Nobel Prize for their "contributions" to the understanding of option pricing—Myron Scholes and Robert Merton. Also on board was a former vice chairman of the Board of Governors of the Federal Reserve System, David Mullins. This crew of geniuses devised extremely complex mathematical models for hedged investment plays in global markets, mostly betting on variations in interest rates between U.S., Japanese, and European sovereign bonds. They would identify patterns in cycles of rising and falling rates and make their plays on the metatheory that markets invariably impose equilibrium on rates, which would revert to predictable norms. Within the range of differentials was a vast realm of small change that they figured no one else but LTCM could see, gazillions of "loose nickels that could be endlessly vacuumed up," in the words of Myron Scholes.

The enterprise, originally capitalized at $1.25 billion, had many leading banks among its clientele. Its moves were ultrasecretive. As a matter of policy, LTCM would not open its books or reveal its trading positions to even its best clients. Five years after its founding, it would control $134 billion in assets. For many of those halcyon years in the mid-1990s, LTCM showed annual returns in the 40 percent range. The principal partners were making scores of millions of dollars a year for themselves in vacuumed-up nickels. Merton and Scholes had such supernatural confidence in their own models that they calculated their chance of failure at zero in the lifetime of the universe and even over numerous repetitions of the universe. (Roger Lowenstein, When Genius Failed, New York: Random House, 2000.) In essence, they believed that computers combined with their own fabulous equations had given them the godlike power to completely eliminate risk in their business.

Disaster came rather sooner than the life of even this universe, in August 1998 to be exact, when the economic basket case, Russia, defaulted on its debt. With $4.8 billion in equity, LTCM had managed to leverage itself to the hilt by "borrowing" (in computer pixels) more than $125 billion from banks and securities firms and had entered into derivatives contracts (bets) with more than $1 trillion at stake. The Russian default set off a chain reaction of international flight away from low-quality sovereign bonds to U.S. Treasury certificates. This upset the reversion to predictable norms, or convergence, built into LTCM's models—interest rates were diverging wildly all of a sudden—which completely queered the immense bets they had placed on the spreads. In addition, LTCM happened to be in exactly the wrong place at the moment of the Russian default—reportedly 8 percent of its book, or $10 billion of LTCM's notional assets, were actually in Russian positions (bets). Russia, a gangster economy with nebulous property laws, shady banking, and little to no legal contract enforcement, was a risky investment, but if one had conquered risk, well.... Only now, instead of vacuuming up gazillions of nickels, LTCM was suddenly hemorrhaging hundreds of millions of dollars in collateral calls from their counterparties, those they had bets with.

With so much notional money at risk, in a world of money so abstracted from any real activity besides the trading of abstractions, the LTCM meltdown raised concern among leading bankers that the entire whirring skein of global digital (hallucinated) capital might unravel in a shitstorm of cross-defaults, leaving behind a lot of notional ruin—including the metanotion that any sort of financial paper issued by any nation or company had value. The chairman of the New York Federal Reserve Bank called in the heads of virtually every major bank in the city—all of which had lent LTCM piles of money to leverage their trades for fantastic profits—and in a matter of days the banks were persuaded to pony up several hundred million dollars each to recapitalize LTCM, under a new agreement that left the firm with only 10 percent of the action. Thus was LTCM saved from actually tanking and taking countless other international entities down with it.

The U.S. Treasury itself never actually contributed any money to the LTCM bailout. But the day after the bailout, September 29, 1998, Federal Reserve Chairman Alan Greenspan reduced interest rates a quarter of a point (25 basis points) to 5.25 percent, in the hopes of stabilizing the wobbling international bond market. He dropped them a second time a month later. From that point on Greenspan's Fed embarked on a long trail of lower interest rates—the magical generation of ever-easier credit—that spawned yet another episode of destructive mischief in the entropic economy: the real estate bubble, perhaps the last act in the sorry drama of the hallucinated economy.

Home: The Last Refuge of Value

The house buying-and-selling orgy of the early twenty-first century was set off by the Federal Reserve's policy, over a five-year period from 1998 to 2003, of steadily reducing to nearly nothing the interest rate that it charged banks to borrow money, which worked its way through the lending system so that mortgage rates fell to historically supernatural lows. The low interest rates were joined by a further decay of lending practices so that practically anyone over age twenty-one with no record of creditworthiness could get a low or even zero down-payment mortgage. Other factors favored a flight of capital from other avenues of investment into houses. With interest rates under 2 percent, normal savings accounts and money market funds had become a joke. Economic pundits beat their breasts about America's pitifully low rate of savings —the conventional means for raising honest capital before the something-for-nothing fever seized the collective national imagination—but only chumps would save in passbook accounts at 1.75 percent. The dot-com meltdown had left a lot of the moneyed middle class feeling hosed by, and wary of, the equities sector. Perhaps even the deep resounding horror of the 9/11 attacks inspired a kind of bunker mentality that translated into a nesting mania. Wasn't that the appeal of Martha Stewart, the goddess of domesticity? So much of the surplus wealth remaining in America at the end of the twentieth century landed in the real estate sector under the theory that real estate was at least real. Finally there was the federal tax policy of the mortgage interest deduction that gave homeowners a substantial advantage over renters, which has always biased the U.S. market not only in terms of personal dwelling choices but in terms of housing typologies offered by the building industry.

True, the population of the United States was growing, but not at a rate that justified the construction of so many new McHouses, as the "units" were called in the pop-up subdivisions. Behind the phenomenal spurt of new construction was the still-accelerating flight not only from the cities, but also from the older suburbs, which were now infected by the spreading rot of the urban core. And propelling that spread was the fact that all through the late 1980s and 1990s, and into the new millennium, oil had only become cheaper in constant dollars until it stood at about $10 a barrel when the younger George Bush took office. This meant that, if nothing else, the nation could continue the suburban sprawl fiesta that had become the virtual replacement for the old manufacturing economy. It was, in short, another self-reinforcing feedback loop, a self-organizing system shaping the American landscape into a nightmarish diagram of motoring hyper-squalor. And underneath all of that was the credit creation machine of Alan Greenspan's Federal Reserve, manufacturing money electronically that wasn't really there, wasn't being accumulated through the traditional, and ultimately only, real means of savings on earnings from doing real work producing real things. It was a magic act.

The supernaturally low interest rates provoked an orgy of buying, and the orgy of buying bid up the prices of the houses, and as the prices of the houses levitated, the owners entered another new and strange zone of hallucinated wealth accumulation using the latest contrivance: the refinanced mortgage. Re-fi's allowed house owners to use their houses as though they were automatic teller machines. Say a person bought a house in 1999 for $250,000 and the house was appraised in 2003 at $400,000; that person could refinance with a substantial "cash out" privilege, converting the imagined increase of value into disposable income, which could then be used to buy motorboats, home theater plasma TV screens, or trips to Las Vegas. Refinancing prestidigitated an estimated $1.6 trillion for the American economy over a five-year period, and much of that "money" was deployed purchasing "consumer" goods—mostly made outside the United States. From 1999 to 2004 roughly a third of all house owners indulged in cash-out re-fi mortgages. The racket seemed without hazard when housing values only went up, up, up. Behind every extravagant cash extraction lay the belief that at some future date the house would be worth a lot more than the re-fi price and could be readily flipped. In super-hot markets such as the Boston suburbs or Long Island or Marin County, properties rarely stayed on the market for more than a few days. Often bidding wars broke out between hysterical buyers, going beyond the asking price.
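The cash-out refinance arithmetic in the example above can be made concrete. The $250,000 purchase price and $400,000 appraisal are the text's own figures; the remaining loan balance and the 80 percent loan-to-value limit are hypothetical simplifications added here for illustration:

```python
# Sketch of the cash-out refinance ("re-fi") arithmetic from the text.
# Purchase price and appraisal come from the example; the remaining
# balance and the 80% loan-to-value cap are hypothetical assumptions.

purchase_price = 250_000   # house bought in 1999
appraisal = 400_000        # 2003 appraised value
remaining_balance = 230_000  # hypothetical: little principal paid down
ltv = 0.80                   # assumed lender cap on loan-to-value

new_loan = appraisal * ltv           # refinance at the inflated appraisal
cash_out = new_loan - remaining_balance

print(f"Cash extracted: ${cash_out:,.0f}")
# The owner walks away with spendable cash conjured from the paper
# appreciation, while carrying a larger mortgage than before.
```

The catch the text describes follows directly: the "cash out" is simply new debt secured against an appraisal that may not hold, which is why the racket only seemed hazard-free while prices went up, up, up.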

Lending practices decayed further. New types of lending companies, such as Ditech, came along hawking "miracle loans" on TV with no closing costs, no down payment, making it possible for customers to sleepwalk into owning substantial properties or refinancing existing ones. Outfits like Ditech were a peculiar kind of financial animal, a mutant spawn of what previously had been known as the "sub-prime" market, meaning companies originally designed to serve high-risk borrowers, people with lousy credit records, deadbeats, bottom feeders, habitual bankrupts, schnorrers. After the mid-1990s, there was hardly a technical distinction to be made anymore between high-risk borrowers and everybody else in the casino atmosphere of America society. No one was at risk anymore because in the something-for-nothing economy it was impossible to be a loser. Or so went the herd thinking.

The decay of mortgage standards was abetted by the rise of the giant "government-sponsored entities" (GSEs), Fannie Mae (Federal National Mortgage Association) and Freddie Mac (Federal Home Loan Mortgage Corporation). Fannie Mae started as a part of New Deal policy to stimulate the housing industry. In 1968, President Lyndon Johnson privatized Fannie Mae to get it off the federal budget. It then became a private shareholder-owned company with certain obligations to the public (to make mortgages easier to obtain) in exchange for certain privileges, which included exemption from taxes and oversight, and access to a stupendous line of credit from the U.S. Treasury. Technically, what Fannie Mae does is purchase mortgages from banks where the loans originate. Its sibling, Freddie Mac, was created in 1970 to prevent Fannie Mae from monopolizing the entire secondary mortgage market. Their mortgages are backed by the U.S. government. The existence of these GSEs has diluted, if not eliminated, the discipline inherent to the risky business of mortgage lending, because they are able to purchase such a large percentage of mortgages generated nationwide. The original lenders, knowing they can "flip" the mortgages to the GSEs and be done with them, are far less concerned with the creditworthiness of the borrowers. To the GSEs the borrowers were not even people, merely numbers massed on a video screen. The combined debt possessed by Fannie Mae and Freddie Mac stood at around $3 trillion in 2004, equal to nearly half of the national debt. They are the only Fortune 500 companies that are not obligated to inform the public if they get into financial trouble.

By the time you read this [written in 2005, well before the mortgage meltdown], it is very likely that the housing bubble will have begun to come to grief. With interest rates at rock bottom into the first half of 2004, practically everyone who could have refinanced has now done so. There cannot be another round of re-fi unless interest rates go to zero, which is unlikely to happen and, of course, re-fi doesn't make much sense when interest rates rise, which is what they did in the second half of 2004. In fact, re-fi lending tapered off smartly by late 2004. Housing prices will probably remain inflated for a period of time beyond the end of the re-fi spree because of the end-cycle hangover phenomenon, the persistence of delusional thinking on the part of wishful sellers who refuse to believe that the boom is over and they might have missed out.

In February 2004, Fed Chairman Greenspan made the bizarre suggestion in a public statement that house buyers might consider adjustable-rate mortgages, but the idea seemed insane in a financial climate in which interest rates had nowhere to be adjusted but upward. An upward adjustment would leave many such house buyers in a terrible predicament: the mortgage payment would go up just when the value of the house had reached its absolute peak and was very likely to fall, as other house owners (especially those with poor credit records, those living marginal lives, those who had lost their jobs since re-fi) lost control of their finances, were forced to sell, or stumbled into default and repossession. Why Greenspan made that suggestion has never been adequately explained. The only possibility is that there was no other way to keep the economy levitated.

The economic wreckage is liable to be impressive. If large numbers of house owners cannot make their mortgage payments, Fannie Mae and Freddie Mac, and by extension the federal government, would be the big losers. The failure of the GSEs would make the S&L fiasco of the 1980s look like a bad night of poker, and would pose a far graver situation than the LTCM flameout. It could easily bring on cascading failures that might jeopardize global finance. This time, the American public would feel the pain.

The boom in suburban houses must necessarily be understood as part and parcel of the suburban predicament—the fact that it was part of the greatest misallocation of resources in world history. The entropic aftereffects are likely to be severe. The housing subdivisions, as much as the freeways, the malls, the office parks, and the fast-food huts, represent an infrastructure for daily living that will not be reusable, except perhaps as salvage. I will discuss the destiny of these places in the final chapter.

Chapter 7. Living in the Long Emergency

I had an odd and illuminating experience a while back, driving from Saratoga ten miles north to the little town of Corinth, New York. The town lies just inside the "blue line" boundary of the state-designated Adirondack "park"—an area actually larger than Yellowstone, but dotted with towns, businesses, factories, a few Wal-Marts, plenty of fast-food establishments, and the usual furnishings of life found in America nowadays. Here in the old Northeast, the land was settled long before the parks movement got going, so the Adirondack Park was an overlay on what already existed.

Corinth (population 2,500) is a paper mill town located on a big bend in the Hudson River upstream of Glens Falls. Above Corinth, the topography gets rugged; the river changes character and becomes increasingly boulder-strewn and riffly. The paper mill closed in 2003 and there is no longer a major employer in the town. No one knows what will become of the town and its inhabitants. For the moment, they seem to get along scrounging a living off the fumes of the cheap-oil economy. They drive long distances on well-maintained state and county roads to low-wage jobs elsewhere—running forklifts in the Target store regional warehouse down in Wilton, being cashiers in the Wal-Mart north of Glens Falls, perhaps frying hamburgers in Saratoga. Or they fix cars, or work on the county highway crew.

On the little two-block Main Street of Corinth, neither the buildings nor the inhabitants look healthy. The buildings show the scrofulous residue of several generations of twentieth-century renovation—exfoliating asphalt shingles from the 1950s, dented aluminum siding from the 1960s, moldy cedar siding from the 1970s (the "environmental" look), and vinyl siding from after that, coated with the inevitable gray-green patina of auto emissions. The shop fronts that are actually occupied—about half the total—contain secondhand stores, hair salons, a pizza joint, and a Chinese takeout. The inhabitants of the town are generally not young. Many are obese, and many of these are cigarette smokers. You see them mostly getting in and out of their cars. No one walks.

If the folks who lived along this highway put in gardens to make up for the escalating inadequacies of an industrial farming system starved for fossil fuel "inputs," would they be able to feed themselves? Did any vernacular knowledge survive in a populace conditioned to think that food came from the supermarket? Did they know anything about cabbage loopers, powdery mildew, or anthracnose? Would they be able to prevent catastrophic crop loss? How would they defend their crops against deer, rabbits, woodchucks? Would any of them know how to build a garden wall, or even a fence? Where would they get fencing material? Would they have to sit out among the potato hills and the bean rows at night with loaded shotguns? And what would they do for light when they heard something munching out there? Would they know how to keep chickens, sheep, and cattle, including breeding and birthing them?

The more I thought about everything I was looking at, the more implausible its future all seemed, and the more fraught with ramifying complications. And of course, the accumulated infrastructure of daily life found along Route 9N between Saratoga and Corinth was minor compared to the vast suburban precincts elsewhere across America, much of it in places where nobody could grow an ear of corn or a potato under any circumstances. What would people in the suburban buzzard flats of Phoenix, Las Vegas, or Los Angeles do when the age of cheap oil and natural gas was over? What will this world be like and what will happen to the people of the United States?

I confess I have had occasion to ruminate on these questions before, but each time I actually do it away from my desk, out in the real world of America where the rubber meets the road, so to speak, where people actually live and work and go about their daily lives, I am confronted by a renewed sense of wonder—and nausea. Sometimes my despondency is overwhelming, and one can well understand why the public hasn't wanted to think about these issues, even in the face of obvious and growing peril. A reasonable person could easily conclude that the way of life we have concocted can't possibly go on much longer, and leave it there. I suspect a great many Americans do exactly that because, so far, by early winter 2004, nothing had changed that much. The massive system seemed to have a momentum of its own that defied occurrences such as a doubling of crude oil prices over the past year. Anyway, life in the United States was so frantic—between the grinding job insecurity, and the war in Iraq, and the horrendous traffic, and orange terror alerts, and child abductions, and the maxed-out credit cards, and the hurricane of the week, and the lack of medical insurance—there was already too much in the here-and-now to worry about.

My role as an author is to think about things that the public is indisposed to dwell on, and to present a framework for understanding a particular set of challenges. What follows, then, is admittedly a personal vision. Some of the ideas I'll present may shock you. Social, political, and economic conditions that Americans assumed had been put behind us for all time may return with a vengeance, especially conditions of social inequality in a world roiled by ferocious competition for declining resources—and in a world with so many weapons available. It bears repeating that just because I say a particular unpleasant thing may happen doesn't mean I want it to happen, or that I endorse its happening. On the whole, I retain confidence in human resilience, courage, ingenuity, and even fairness, and I will spell out comprehensively the positive things that can come out of the difficulties that lie ahead.

First, we have to separate what we wish for from what we're actually doing and what can be done. I believe there is a course of action that is appropriate to what we face, and is actually inevitable, whether we go there voluntarily or have to be dragged kicking and screaming into that future: the comprehensive downscaling, rescaling, downsizing, and relocalizing of all our activities, a radical reorganization of the way we live in the most fundamental particulars. Nothing else will permit us to carry on a semblance of civilized life, most especially not wishing for some mysterious deus ex machina, a.k.a. "them," to deliver a miracle energy source to replace our lost oil and natural gas endowments so that we can continue living in a drive-in utopia. Because human social and economic systems are essentially self-organizing in the face of circumstance, the big questions are how much disorder must we endure as things change, and how hard will we struggle to continue a particular way of life with no future?

The Next Economy

The salient fact about life in the decades ahead is that it will become increasingly and intensely local and smaller in scale. It will do so steadily and by degrees as the amount of available cheap energy decreases and the global contest for it becomes more intense. The scale of all human enterprises will contract with the energy supply. We will be compelled by the circumstances of the Long Emergency to conduct the activities of daily life on a smaller scale, whether we like it or not, and the only intelligent course of action is to prepare for it. The downscaling of America is the single most important task facing the American people. As energy supplies decline, the complexity of human enterprise will also decline in all fields, and the most technologically complex systems will be the ones most subject to dysfunction and collapse—including national and state governments. Complex systems based on far-flung resource supply chains and long-range transport will be especially vulnerable. Producing food will become a problem of supreme urgency.

The U.S. economy of the decades to come will center on farming, not high-tech, or "information," or "services," or space travel, or tourism, or finance. All other activities will be secondary to food production, which will require much more human labor. Places that are unsuited for local farming will obviously suffer, and I will discuss this later in the chapter. To put it simply, Americans have been eating oil and natural gas for the past century, at an ever-accelerating pace. Without the massive "inputs" of cheap gasoline and diesel fuel for machines, irrigation, and trucking, or petroleum-based herbicides and pesticides, or fertilizers made out of natural gas, Americans will be compelled to radically reorganize the way food is produced, or starve.

For the past hundred years the trend has been for fewer people to be engaged in farming, and for farming to be organized on an ever more gigantic corporate scale. In that short span of time farming has transitioned from work done by people using knowledge and tools to work done by machines with minimal human presence, almost by remote control. There is a reason that farming is called agriculture. The culture part stands for the body of knowledge, skill, principles, and methodology acquired over thousands of years. Most of that knowledge has been jettisoned in the rush to turn farms into something like automated factories. In fact, the current system is explicitly called "factory farming" by those who run it. The technology of factory farming promotes the expansion of farms by orders of magnitude above what had been the upper limit for traditional nonindustrial farms. Increasingly farming has changed from being organized on a family and community basis to being corporate and national, even global, with few benefits for the localities where it takes place and with devastating effects on local ecologies and social relations. The diminishing returns of technology in farming have been especially vicious. Few other human activities demand so much respect for natural systems, and the abuse of natural systems has been monumental under the regime of industrial farming. The genetic modification of monoculture crops is only the latest (and possibly the final) technological insult among many previous ones, and comes at the climax of the industrial blowout. Diminishing returns are nature's way of biting back. The "winners" in recent decades have been corporations that could enjoy the economies of scale conferred by gigantism. Their only benefit has been monetary profit. The "losers" can be summarized generally as the future and its inhabitants. They stand to lose not only future wealth, but also their civilization.

All human enterprise can tend toward diminishing returns and unsustainability, but some modes have far more long-term prospects than others and some are socially suicidal, even in the short term. Many civilizations, from the Sumerians to the Maya, have faltered when overinvestments in the scale and complexity of food production produced ruinous diminishing returns. On American farms in the early 1800s, the balance between calories expended and calories produced as food was about even. This occurred as tools reached a high stage of refinement but before machines replaced human labor and traditional knowledge. It implies a distinction between tools and machines, between work done with tools and work done by machines. Production improved while entropy was kept to a minimum. Under the current industrial farming system it takes sixteen calories of "input" to produce one calorie of grain, and seventy calories of input to produce one calorie of meat. (Douglas Harper, Changing Works: Visions of a Lost Agriculture, Chicago and London: University of Chicago Press, 2001.) A hundred years ago, just before the introduction of the fossil fuel-based technologies, more than 30 percent of the American population was engaged in farming. Now the figure is 1.6 percent. The issue is not moral, academic, or aesthetic. Rather it's a matter of those ratios being made possible only because cheap oil and automation made up for so much human labor. We did what we did in the twentieth century because we could. Of course, not all farm labor amounts to slavery or serfdom. Depending on how farming is organized, it can result in a very satisfactory way of life and rewarding social relations. Agriculture in the United States was organized very differently in Pennsylvania and South Carolina 150 years ago, and not simply because of climatic differences.
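The input-output ratios Kunstler cites from Harper lend themselves to simple arithmetic. As a minimal illustrative sketch (not from the book), the following computes the fossil-fuel calories implied by those ratios for a daily diet; the 2,000-calorie figure and the 25 percent meat share are hypothetical assumptions chosen only for illustration.

```python
# Illustrative arithmetic using the ratios cited from Harper:
# 16 input calories per calorie of grain, 70 per calorie of meat.
GRAIN_RATIO = 16
MEAT_RATIO = 70

def fossil_input(diet_kcal, meat_share):
    """Fossil-fuel calories needed to produce a diet of diet_kcal calories,
    of which meat_share (a fraction 0..1) comes from meat and the rest
    from grain. The diet split is an assumption, not a figure from the text."""
    meat_kcal = diet_kcal * meat_share
    grain_kcal = diet_kcal - meat_kcal
    return grain_kcal * GRAIN_RATIO + meat_kcal * MEAT_RATIO

# A hypothetical 2,000-calorie daily diet with a quarter of it from meat:
print(fossil_input(2000, 0.25))  # 59000.0 input calories per person per day
```

Under these assumptions, each person's daily food embodies roughly thirty times its dietary energy in fossil-fuel inputs, which is the point of the paragraph: the ratios only pencil out while the inputs are cheap.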

As industrial agriculture reached its climax in the early twenty-first century, the fine-grained, hierarchical complex relations between the soil and the human beings and animals associated with food production have been destroyed or replaced by artificial substitutes. Farmland has in effect been strip-mined for short-term gain. Instead of soil stewardship achieved by acquired knowledge of practices such as crop rotation, manuring, and fallowing, corporate farmers just dump industrial fertilizers and toxins on ground that has been transformed from an ecology of organisms to a sterile growth medium for crop monocultures. Iowa prairie soils 150 years ago had about twelve to sixteen inches of topsoil; now they have only about six to eight inches of topsoil. The loss continues. The "Dust Bowl" of the 1930s was the coincidence of a periodic drought with a decade of zealous overplowing as tractors came broadly into use. The diminishing returns of mechanized plowing were not understood until a catastrophe had been set in motion. The human race had no prior experience with tractors.

A natural lack of rainfall on the Great Plains and in the deserts of California has been compensated for in recent times with heroic amounts of irrigation. In the case of the Great Plains, we've been depleting underground reservoirs (aquifers) of what is essentially fossil water accumulated over the geologic ages and not subject to timely restoration. California's industrial farming system has been made possible by colossal water diversion projects, based on dams (built with fossil fuels) that are silting up and are not likely to continue to function beyond the twenty-first century. The distance people live from the sources of their food has expanded to the extent that today the average Caesar salad travels more than 2,500 miles from the place where the lettuce is grown to the table on which it ends up. Fruit wholesalers in New York City get more apples from Chile than from upstate New York. The few remaining farmers in my part of the state don't even cultivate gardens for their own households. They get their food from the supermarket, like everyone else. Their ecological relationship to the land has been rendered minimal and abstract by technology.

The history of industrialized farming has been remarkably short. Mowing, reaping, and threshing machines powered by animals have barely been on the scene for a century and a half, and engine-driven ones much more recently. The tractor came into common use only eighty years ago. Ditto for electric milking machines and refrigerated bulk storage. In upstate New York, the tractor revolution was not complete until after World War II, that is, within the author's lifetime. Many farmers were still using horses as recently as the 1950s. Yet the loss of knowledge and traditional practice since then has been stupendous. Even if we summon the desire to return to smaller-scale and less oil-dependent ways of farming, the knowledge needed to accomplish it will be hard to reclaim.

Certainly the greatest obstacle to restoring local agriculture, where it is even possible, is the need to reallocate land. In most localities east of the Mississippi, what open land remains near any town or city is considered to have value only for suburban development. It has been so many generations since we have collectively thought about land any other way that our culture will be at a loss to form a new consensus and act swiftly as the times will demand. I daresay we will still be debating zoning issues, such as the right to keep chickens in residential subdivisions, while many Americans starve. Suburbanites may squander their remaining energies in all kinds of futile political efforts to prop up the putative entitlements of suburban living and to preserve the illusion that this way of life can continue.

The result could be years of collective paralysis, indecision, and cognitive dissonance, culminating in social upheaval. In the crisis of the Long Emergency, it will be especially difficult to reallocate or transfer open land already owned to widespread freehold ownership by individual new farmers. The aggressive subdivision of property in the twentieth century has produced an extremely fragmented landscape, even in places that once contained excellent farmland. It will be hard to assemble contiguous parcels into holdings large enough to become farms. There may be more people who wish to resettle on rural land than available land for them, but not enough of them will have the necessary skill to run a farm, not to mention the wealth needed to buy land. One implication is that valuable farmland may tend to remain in the hands of those lucky enough to already be in possession of it, and the less lucky may be enlisted to work on it as hired help. Extreme conditions may lead to the formation of a new peasantry—an exploited class of laboring people tied to the land by contract, custom, or desperate circumstance. That, of course, is essentially feudalism, and while it is not an outcome I savor, it is among the "unthinkable" possible futures that we had better get used to thinking about. Such drastically different social arrangements raise other disturbing issues. What will the role of children be? Will they be part of the workforce? Many of the presumed achievements in social reform of the past century may go out the window with the hardships of a post-cheap-energy world.

I doubt that education would continue to exist as we currently know it. A wild card in a neofeudal scenario might be the potential for violent political upheaval against the propertied classes. Another is the spread of epidemic disease among people already suffering from the stresses associated with plummeting standards of living. Feudalism tends to fail when the supply of surplus labor crashes.

In any case, I'm not optimistic that government could intervene in the reallocation of land for farming. This is exactly the kind of problem against which central government as we know it would prove ineffective, as big government will be subject to the same encumbrances of scale as big agriculture or big business, while the competency of government to redistribute wealth is always questionable—and in the Long Emergency land will be wealth. If government does attempt to reallocate land on an emergency basis, it might only foment a resistance that would threaten whatever remaining legitimacy it had. Property rights are at the heart of the nation's operating system, so to speak, and to mess with them might be explosive. The Long Emergency will present conditions Americans have never experienced, and the non-rich masses may resort to the kind of desperate action that other historically put-upon people have taken. America is just not that special, nor immune to either the hazards of circumstance or the tendencies of human nature. Revolution might occur, nullifying previous land tenure agreements, but with central government already disabled it might be limited to some localities and not others….

The American scene further into the twenty-first century will have to include more working animals. The horse population in the United States reached its height around 1915 at about 21 million (two years after Henry Ford introduced the assembly-line method of production for his Model T) and declined sharply afterward. The 1920s was a kind of horse holocaust as the automobile and tractor came into broad use and the sudden oversupply of horses sent them by the trainload to rendering plants, like so much scrap. The low point of the U.S. horse population came in the mid-1950s at about 500,000. There are about 7 million horses in the United States today. Of these, 725,000 are used in racing. About 2.5 million are in use on farms—up by about a half-million from the early 1990s. A horse is generally able to begin useful work at four years and it can labor for more than twenty years depending on how well it is cared for. Unlike machines, horses can reproduce themselves. A substantial fraction of production on farms organized around horse power has to be dedicated to growing their feed. Obviously, relations between humans and working animals can range from respectful and loving to careless and cruel, and social norms of decent behavior toward them will have to be reestablished. We are likely, however, to find ourselves living in a world in which this kind of cruelty is more visible. People may be killed by horses, mules, oxen, and bulls, but roughly 50,000 people are killed in automobile crashes every year in the United States and there is no public outcry about it.

I don't believe that working animals will replace all the things done by engines all of a sudden, but they are sure to be an increasing presence in our lives, and a time may come when we live with far fewer engines indeed and many more working animals. There is no reason to assume, as we move further away from the oil age, that some miracle replacement for oil will allow a return to industrial agriculture—especially insofar as replacing "soil amendments" made out of oil. You can't make fertilizer or pesticides out of wind power alone. Producing hydrogen by electrolysis from nuclear power and then converting that hydrogen into chemical fertilizers and pesticides would be ridiculously expensive, and even under the best circumstances it would take at least a decade to build a new generation of nuclear power plants dedicated to the task at the necessary scale. We will just have to do farming differently, on a smaller scale, locally, the hard way. An obvious model for this kind of agriculture already exists in the American Amish community, which has stubbornly resisted the blandishments of high-tech through the entire oil-drunk extravaganza of the twentieth century.

The Amish are descended from Anabaptists who emerged from the turmoil of the Reformation. A Dutch Catholic priest named Menno Simons united one branch of Anabaptists into the eponymous Mennonites around 1550. Around 1700 a Swiss bishop named Jacob Amman broke away from the Mennonites and his followers called themselves Amish. Both groups are represented in America. Amish settlement in Pennsylvania began around 1720, under William Penn's "holy experiment" in religious tolerance. Their belief system revolves around the central tenet of being separate from the secular world. This has made them seem increasingly different in custom and manners from other Americans as the twentieth century advanced and a hypersecular consumer society evolved. (A hundred years ago, with a third of the U.S. population on farms, an Amish farming family and a non-Amish farming family might appear superficially similar in ways of working, social organization, and even costume.)

Amish farming practices today are impressively productive and efficient, even though carried on without electricity or motor vehicles. Wendell Berry has compared their operations favorably with industrial farming in several books. (Wendell Berry, Amish Economy, Versailles, KY: Adela Press, 1996; Home Economics: Fourteen Essays, San Francisco: North Point Press, 1987; The Unsettling of America: Culture and Agriculture, San Francisco: Sierra Club Books, 1977.) Can Amish farming practices be separated from the stringencies of Amish religion and social organization? The Long Emergency could provoke a broad renewal of religious observance in America out of misery and desperation, but I do not imagine that any large numbers of ordinary Americans will rush to become Amish. If anything, I expect Americans to turn to the cruder branches of evangelical and Pentecostal Christianity, which will provide simplistic explanations for the dire circumstances in which we find ourselves (and justifications for extreme behavior). I also believe these denominations will seek to reinforce the very hyperindividualist philosophies that evolved with the consumer economy of the twentieth century, and may therefore do a poor job of supporting the kind of cooperative behavior required to restore agricultural communities. Without strong communities based on integral social and economic roles, the revival of small-scale, nonindustrial farming is apt to be haphazard.

I don't know what it will take in the way of spiritual fortification to enable Americans to feed themselves without cheap oil. In a world short of diesel fuel and natural gas, people will have to find other ways to make crops, whatever they believe in. The models are there and the knowledge is there, but it is not in general circulation the way a knowledge of auto mechanics is today. There are plenty of non-Amish people practicing small-scale organic agriculture, and organizations supporting them, such as the Northeast Organic Farming Association (NOFA), which assists small-scale farmers with local marketing, with the preservation of traditional knowledge and technical help, and with political activism to prepare the public for inevitable change. NOFA farmers are often disparaged by an unappreciative public as "boutique farmers," but their activities are vitally important in keeping knowledge alive. There is also a widespread secular subculture of people working commercially in a diverse range of "obsolete" farm-related crafts—harness makers, smiths, farriers, makers of horse-drawn tilling machinery, breeders of draft horses, mules, and oxen—who advertise in periodicals such as Small Farmer's Journal and Mother Earth News. These craftspeople manage to keep alive skills that Americans will need desperately when no more trips to the Wal-Mart are possible. The existing literature on small-scale organic farming is vast.

Making a transition out of industrial food production will involve the reestablishment of multiple complex systems on a local basis, including systems of social organization once common in America but surrendered in recent decades. The difficulties of this transition will depend on how rapid the onset of the Long Emergency actually is. I believe that the disorders and instabilities of the post-peak oil singularity will assert themselves rather quickly, long before the world runs out of oil. The quicker they come on, the harsher they will be.

The End of Suburbia

The future is now here for a living arrangement that had no future.

We spent all our wealth acquired in the twentieth century building an infrastructure of daily life that will not work very long into the twenty-first century. It's worth repeating that suburbia is best understood as the greatest misallocation of resources in the history of the world. There really is no way to fully calculate the cost of doing what we did in America, even if you try to tote up only the monetary costs (leaving out the social and environmental ones). Certainly it is somewhere up in the tens of trillions of dollars when one figures in all the roads and highways, all the cars and trucks built since 1905, the far-flung networks of electricity, telephone, and water lines, the scores of thousands of housing subdivisions, a similar number of strip malls, thousands of regional shopping malls, power centers, big-box pods, hamburger and pizza shacks, donut shops, office parks, central schools, and all the other constructed accessories of that life. I have described it at length in other books. The question now is: What will become of it?

Suburbia has a tragic destiny. More than half the U.S. population lives in it. The economy of recent decades is based largely on the building and servicing of it. And the whole system will not operate without liberal and reliable supplies of cheap oil and natural gas. Suburbia is going to lose its value catastrophically as it loses its utility. People who made bad choices and invested the bulk of their life savings in high-priced suburban houses will be in trouble. They will be stuck with houses in unfavorable locations—surrounded by similar dysfunctional artifacts of sprawl—and if they are lucky enough to sell them at all, they will only create an identical set of problems for some greater fool of a buyer. Even fantastic bargains will end up being no bargain. The loss of hallucinated wealth will be stupendous and the disruption of accustomed suburban logistics will be a nightmare for those stuck there. Perhaps a greater question is this: Will the collapse of suburbia as a viable mode of living tear the nation apart, both socially and politically?

Cities, Towns, and Country

American life in the twenty-first century has the best chance of adjusting to the Long Emergency in a physical pattern of small towns surrounded by productive farmland. I am not optimistic about our big cities—at least not about them remaining big. America's big cities created themselves in tandem with the industrial revolution. They were products of it and servants of it. They were the setting of the first and second acts of the industrial revolution, as suburbia has been the setting for the third act. There was no medieval Kansas City, no Renaissance Minneapolis. The great cities of America, particularly New York and Chicago, became global symbols for the most dynamic and thrilling aspects of everything associated with the "modern," which is to say the cutting edge of advanced techno-industrial ism —the glamour of skyscrapers, trains, airplanes, the magic of electricity, movies, telephones, and radio, and all the other miracles of the age. For the non-rich, American cities were always problematic, beginning with the dreary building typologies and continuing to their clunky diagrammatic layouts, their poor street detailing, obeisance to the obnoxious operations of industry, gross commercialization, and finally their abject surrender to the needs of cars. Worst was the sheer overwhelming scale that trapped and oppressed the human spirit. The largest of our cities assumed a scale that had never been seen before in history—as industrialism itself had not been seen before—and this demoralizing hypertrophy produced huge diminishing returns in the quality of life for the industrial masses, especially the workers who crowded the extensive tenement slums. Some of these problems were overcome. The awful sanitation and disease of the nineteenth-century city were vanquished by the great public works of the early twentieth century and the germ theory of modern medicine. Electricity became available to the masses of city dwellers around the same time. 
Electric streetcars and new subway systems improved life for everyone. The thrill, the charge, the zing of the early twentieth-century hypermetropolis, the business, social, and cultural opportunities, were, for many, compensations for the vicissitudes of scale, the oppression of crowds, and the obliteration of any connection to nature. After 1900, of course, it all depended on cheap oil—and during most of the twentieth century, America was the world's leading producer of it.

But by 1950, the growth of America's big cities was complete and they entered a swift and implacable phase of contraction. Their second act was over. The postwar action was moving to the suburbs. For the next five decades the cities struggled with their losses. Some of them did better than others. New York and Los Angeles retained their dynamism as the financial and media capitals of the two coasts. Los Angeles itself was the prototypical suburban metroplex more than it was a city in any historical sense. While the older big cities contracted, cities based on the post-1950 suburban format grew explosively—Atlanta, Charlotte, Orlando, Houston, Dallas, Phoenix, Las Vegas—but their growth was virtually all suburban. Boston and San Francisco each went into a coma between 1930 and 1980, and then both enjoyed a considerable dynamic revival, and for similar reasons: They became the capitals of the computer industry on their respective coasts, as New York and Los Angeles were the money and media centers. For the rest of America's cities, though, the story in the late twentieth century was much grimmer.

Detroit, which was the world's seventh-richest metropolis in 1950, became, by 1975, a giant suburban donut with a burnt-out hole in the middle. Its middle-class population had decamped to the easily built-upon flat terrain of its hinterlands, leaving the center to a large group of deracinated southern sharecroppers who had migrated to Detroit just in time for the domestic automobile industry to shed tens of thousands of the jobs they migrated there for. By the year 2000, there was hardly any city left where central Detroit had been. Wildflower meadows lay where urban blocks once stood. St. Louis was a similar story, though its decline had actually begun earlier in the twentieth century when it lost out to Chicago as the Midwest's transportation and commodities exchange hub. By 2000, St. Louis was a ghost town surrounded by its suburban donut. Buffalo, New York, was the vaunted "city of the future" in 1900, the "electric metropolis," which, because of the immense generating capacity of Niagara Falls, was expected to become a rival to its older sister at the other end of New York state. By 1980, Buffalo was an economic invalid. The story was similar for Philadelphia, Cleveland, Baltimore, Newark, Pittsburgh, Kansas City, Indianapolis, Cincinnati, Milwaukee. Many attempts were made to rescue these cities, often in the form of skyscraper megaprojects, sports stadiums, performing arts complexes, aquariums, and other grand gestures, but the urban sclerosis just got worse. The third act of the industrial age was equally unkind to the cities occupying the next tier down in scale: Syracuse, Rochester, Worcester, Trenton, Akron, Louisville, Nashville, Des Moines, Chattanooga, and a dozen other once-dynamic small cities all entered the twenty-first century as basket cases surrounded by suburbs that sucked the vitality out of them.

The industrial cities will never again be what they were in the twentieth century. They require too much energy to run and the industrial activities they were designed for are already defunct. Like big corporations, big farms, and big governments, big cities will not be suited to the reduced scale of life in the post-cheap-oil future. Their contraction will accelerate in the Long Emergency. What's more, the superdynamic suburban metroplexes of the past five decades—places like Phoenix, Las Vegas, Houston, Atlanta, Orlando, and so on—will decline even more rapidly and catastrophically than the old industrial big cities once the Long Emergency gets going, because everything in them was designed solely in relation to cars. The biggest cities, New York, Chicago, and Los Angeles, will join Detroit and hemorrhage their populations. They are liable to become dangerously unsanitary and unsafe. Unless the United States ramps up a Project Apollo-style program of nuclear power plant construction, the electric grid is going to be in deep trouble, and there is no question that the North American natural gas supply is already in depletion. The ramifications for big cities are tremendous. As cited earlier, in Chapter 3, the U.S. natural gas pipelines have never been seriously interrupted. The pressure has never dropped below the critical stage at which, for instance, furnaces in large buildings go out. What will happen to the water pipes in a sixty-story residential building in Chicago if the regional natural gas pipeline goes down in February for thirty-six hours? What will happen is that the pipes will burst and every apartment will become uninhabitable. What will happen when the gas pipelines are repressurized and pilot lights don’t automatically restart in some buildings? It is a recipe for gas explosions….

In the Long Emergency, the focus of society will have to return to the town or small city and its supporting agricultural hinterland. Those towns and small cities will have to be a lot denser. Most of the towns and small cities of America are in a coma today. The luckier ones, which are generally tourist towns, have had a residue of boutique commerce barely holding the downtown buildings together. Typically, though, downtown buildings in small towns are unoccupied above the ground floors because the landlords will not invest in expensive renovation under strict building codes while new, cheap suburban-style garden apartments pop up on the fringe. The unluckier small towns of our nation—and they are the majority—lie in various stages of dereliction and ruin, their industry gone, their populations aged or idle, the infrastructure rotting. Even solid brick buildings fall apart in a few years when they are not inhabited. Once the roof leaks, all bets are off.

The people who lived in the small industrial towns of America one hundred years ago never would have believed what became of these places at the turn of the new millennium. The desolation and loss would have been inconceivable to a people who had done such a fine job of building communities. Even from the closer vantage point of 1950, the destruction has been incredible—as though World War II had been fought in Schenectady, New York, rather than Bastogne. Tragically, the destruction was all the result of economic suicide, of bad decisions made for bad reasons, by people grown too complacent and greedy to care much about the future. In the Long Emergency, there will be a tremendous cultural reaction to the epic carelessness of those generations responsible. Their legacy will be looked on not with nostalgia, but with contempt and amazement.

The psychosocial infrastructure of our communities is also going to change a great deal. One of the basic confusions inherent to the suburban experiment was the idea that people could live an urban life in the rural setting. The Long Emergency will revise that. People who live in rural areas, or resettle there, must prepare to lead rural lives and follow rural vocations. That means food production, farming. Those who live in towns, even small towns, will work at activities appropriate to the town: trade, education, medicine, and so forth. The distinction between town and country will be much less ambiguous. There will be no more twenty-six-mile drives to the Super Wal-Mart or the vet.

Much of America east of the Mississippi is full of towns like the ones here in upstate New York where I live. They are standing there, waiting to be reused, with much of their original equipment intact. The lucky suburbanites will be the ones with the forethought to trade in their suburban McHouses for property in the towns and small cities, and prepare for a vocational life doing something useful and practical on the small scale, whether it is publishing a newsletter, being a paramedic, or fixing bicycles. I will make a distinction in a section ahead between the destiny of the Old South and the northern states of what I call the Old Union.

Commerce in the Long Emergency

Just as farming evolved to reach gigantic scales of operation and adopted corporate modes of organization the past half century due to the economics of cheap oil, so did everyday commerce evolve to a colossal scale under monopolistic global corporate enterprises that ruthlessly destroyed complex local and regional networks of economic interdependence. I have already described how the process occurred and the character of the national retail chains involved. The Long Emergency will put all of them out of business. Wal-Mart, Kmart, Target, and Home Depot are all going to wither and die. They were strictly manifestations of the cheap oil final blowout and they will not survive beyond it under the conditions of the Long Emergency. Wal-Mart will not be able to profitably run its "warehouse on wheels" when the price of oil fluctuates chronically (always along an upward trend line), and supplies become less than completely reliable. Wal-Mart and the others will not be able to maintain relationships with suppliers twelve thousand miles away in China when we are locked in desperate competition with that nation over oil supplies, or when the Asian shipping lanes are effectively shut down by anarchy on the high seas. Even if Wal-Mart could get ultracheap manufactured goods from some other place in the world, it may have no one to sell to when the American middle class becomes an impoverished former middle class, and the current consumer credit structure has cratered.

The idea of a consumer culture itself is going to die with the national chain stores. We will never again experience the explosion of products, choices, and nonstop marketing that characterized the late twentieth century. The public may look back on the big-box shopping era with deep and mournful nostalgia, but we are apt to discover that happiness is still possible without the extraordinary advertising-driven compulsive materialism of recent decades. We will still have commerce. We will have trade. There will be shopping. We will have some kind of medium of exchange. But we are not going to live in a perpetual blue-light special sale of cornucopian wretched excess. We will not be consumed by our consumption. Nor will our children or grandchildren.

We are going to have to rebuild from the bottom up those complex webs of local economic interdependence that the national chain store movement has destroyed. It is a tremendous and daunting task. The transition out of corporatist hyper-retailing will be very painful. If political friction with China were to occur rapidly—say, in a crisis over Taiwan—we might even endure a period of chaos in the everyday markets. We became addicted to ultracheap household goods as surely as we became addicted to imported oil. Many common products we depend on aren't manufactured in the United States anymore—all sorts of things from bicycle tires to dish towels. We are not going to restart factories for these things overnight. In fact, because we will be living in a far lower-energy society, our basic capacity for manufacturing anything will necessarily be much lower. Many of our pre-1945 factories have been either demolished or stand in a state of hopeless dereliction, and I would argue that we will not be able to do any kind of manufacturing based on the scale of a 1950 factory model anyway. What manufacturing we do manage to reestablish may be what used to be called cottage industry, based more on craft skills than the assembly line of modern production….

The conditions of the Long Emergency will militate against corporate organization as we have known it, that is, against commercial enterprises scaled to operate virtually like sovereign states run by oligarchies. Whether their demise is a good or a bad thing remains to be seen. Large-scale corporate enterprise has brought humankind much material comfort in two centuries, but at the price of fantastic unintended consequences (externalized costs) ranging from the destruction of local communities to climate change. Large-scale corporations will be vulnerable to the collapse of capital formation markets that must accompany the end of the cheap-oil fiesta. Corporate enterprise can certainly be reorganized on the small, local community scale, but it will not be the same as General Motors. Corporate enterprise in the Long Emergency may revert to being more public in nature and far less sovereign in power. There may be one exception: The most visible kind of corporate organization that might survive the Long Emergency may be the church. Whether Catholic or Pentecostal or something new we haven't seen yet, the church won't have to rely on oil supplies. Organized religion doesn't have to traffic in awkward material products, only in beliefs, and it can operate at many scales simultaneously. Because American culture is constitutionally allergic to religious governance, we may have problems if churches are the only large organizations left standing—that is, assuming we still have the same constitution.

What We Live In

It is hard to say how much of the existing late-twentieth-century building infrastructure might be adaptively reused in these reestablished local economic networks. There is a lot of it. More than 80 percent of everything ever built in America was built after World War II, and most of it was designed solely to be used in connection with cars. There will probably be three considerations: (1) how walkable it is—in many places, especially the metroplexes of the Sunbelt, the strictures of zoning have left retail buildings extremely isolated from both the residential neighborhoods and the original town centers; what had seemed minutes away in an air-conditioned car may feel more like the Bataan Death March on an August afternoon when one must get there on foot; (2) whether the buildings can be heated, or need to be heated—given the depleted state of the North American natural gas supply, it's unlikely that the abandoned big-box buildings will be heatable, which obviously would be more of a problem in the northern tier of the country; an empty Kmart in Biloxi, Mississippi, can, and probably will, become anything from an infirmary to a Pentecostal roller rink, but in Wisconsin it's likely to be a different story; (3) whether the roofs can be kept in repair—unfortunately, most of the commercial structures built in America after World War II have flat roofs. They are penetrated by all kinds of vents and mechanicals, and generally these have been sealed using oil-derived materials over synthetic rubber flashing.

Even with regular maintenance, their design life is probably twenty years at most. Once water gets in under the roof, deterioration proceeds very quickly. Electric wiring is especially susceptible. It is hard to say whether the snow and ice in the colder states would be more damaging than the extreme ultraviolet light and torrential rains of hot places such as Florida or Texas. We should probably conclude that the abandoned big-box structures will not last more than one generation under any circumstances. Pretty much the same thing can be said about malls, strip malls, and chain restaurant buildings. Eventually they will be the salvage yards and mines of the future.

Our communities therefore will have to reorganize physically as well as socially and economically. The existing small cities and towns of America offer the best opportunities for that, as they already contain the appropriate building types deployed in an appropriate walkable street-and-block template. Many of the building codes created over recent decades will probably have to be ignored or abandoned as communities seek to rebuild within the context of a much more austere economy. These building codes were extremely restrictive, especially the recent handicapped-access laws, which required elevators in virtually every building over one story that was not a private house. Because commercial builders of this period generally chose the path of least resistance, this tended to result in nothing but one-story buildings. State fire codes also mandated extremely onerous requirements for multiple stairwells in buildings over one story—often to such a ridiculous degree that they would take up most of the internal space of a proposed building, thus defeating the purpose. This was particularly true where the renovation of existing older buildings was concerned, including Main Street buildings in towns all over America. By the time a renovator theoretically finished putting in elevators, handicapped ramps, and multiple egresses, there would hardly be any room left for apartments or offices in the building, or they would be awkward to organize around all the internal infrastructure. These excessive regulations were devised not just for safety and fairness, but for reasons of legal liability. They represent, more than anything, a fear of lawyers.

In the Long Emergency we will not be able to afford this over-regulation. We will no longer be living in a strictly horizontal environment defined by cars, parking lots, and one-story buildings deployed arbitrarily all over the landscape. Our daily environments will have to be much more defined by walking distances. Our towns will have to be much more compact. They will be more vertical—within limits. We will have to get a much greater percentage of our two- to five-story buildings back in full service, and we will have to build new ones to fill in the spaces of the old parking lots. As I've already suggested, the skyscraper will be an anachronism. Buildings above seven stories may be out of the question for practical reasons. If we abandon the building codes of the twentieth century, our buildings may become less safe. Winter heating is also likely to be a big problem, not only in terms of available fuel, but also because we will have less central heating and a greater emphasis on individual room heaters, which tend to be more dangerous as a rule. This may have additional implications for our plumbing needs. Modern plumbing went hand-in-hand with central heating. If you can't heat an entire building, the pipes are likely to freeze somewhere, and if they freeze at even one point, the whole building has a problem.

Many of the modular construction materials used by commercial builders—the plastic artificial stuccos, fabricated epoxy panels, and so on—will no longer be available. We will probably have to return to traditional masonry and wood construction. Reinforced concrete may not even be possible if there is a shortage of steel "rebar." Some materials may come from the disassembled strip malls, big-box stores, and other obsolete structures of the twentieth century. As I've already suggested, the increment of redevelopment will be very small compared to the scale we're accustomed to. One new building on a single lot may be a big deal in the Long Emergency. There may be relatively little building activity per se, and the construction "industry" as we have known it almost certainly will not survive—certainly not the so-called "production housing" sector.

Local communities of the kind I have described, where businesses are owned by the people who work in them, and where social and economic roles form a rich matrix of interrelationships, are generally good at creating local institutions to care for those who are weak, disabled, or old—who are their neighbors. America, in fact, was full of such local institutions in an earlier period. Much as they acquired a kind of folkloric stigma, places like poorhouses and county farms of the nineteenth century were arguably far more humane than the conditions that the weak and indigent are subject to today….

Transportation in the Long Emergency

Similarly problematic is whether the project of restoring America's railroads could be organized at a scale equal to the gigantic task. In the Long Emergency, all large-scale enterprises will have trouble operating in virtually every sphere of activity. Rebuilding the railroads is a project perhaps comparable to building the interstate highway system, or nearly so. Could that be done today from scratch? Or was it something that could have been brought off only in a particular moment of history—by a hyperaffluent nation awash in cheap oil led by a generation of victorious World War II veterans schooled in the heroics of war production? We certainly can't return to the special economic conditions of nineteenth-century America that made the initial building of railroads possible—namely, huge amounts of open land or sparsely settled land, coupled with federal right-of-way land grants, allowing railroad companies to profit greatly in real estate on the side.

One answer to this dilemma might be that the U.S. railroads in their glory days never ran as a single monolithic system. Many dozens of regional lines were stitched together, employing standards of gauge, switching, and operating protocols that made the combined organisms seem to function as a single system. It might be argued that the attempt to run freight and passenger rail systems under the near-monolithic administration of Conrail and Amtrak in recent decades has been a management fiasco precisely for reasons of excessive scale.

If the American railroads are to revive at all under the conditions of the Long Emergency, they will probably do so in a piecemeal way as a stitched-together patchwork of regional lines. Some will operate better than others. Their operation could be hampered by social turbulence and political unrest. One final thing worth noting on the subject of rail: From 1890 to about 1920, American localities managed to construct hundreds of local and interurban streetcar lines that added up to a magnificent national system (independent of the national heavy rail system). Except for two twenty-mile gaps in New York state, one could ride the trolley lines from New England clear out to Wisconsin. The story of the conspiracy by General Motors and other companies to destroy the U.S. interurban system is well documented. The salient point, however, is how rapidly the system was created in the first place, and how marvelously well it served the public in the period before the automobile became established. Light rail of some kind—which does not require elaborate roadbeds—may become the basis for regional transportation systems of the future. The lines can be laid along the existing rights-of-way of any of our roads, from the interstates to the streets of our towns, and they can usefully transport people and freight. If energy conditions become really dire, they can use draft animals to convey the cars more efficiently than wagons on roads with broken pavements.

U.S. waterways are probably the most forgotten and neglected elements in our national transportation system. Who, living in the year 1920, would ever have believed that the entire waterfront of Manhattan Island would be devoid of commercial docks at the end of the twentieth century? Or that Boston's harbor would host only a few tour boats? Our current wet dreams about turning so many waterfronts into parks are a good indication of how shortsighted even well-intentioned civic leaders have been in our time. If we want to conduct trade further along in the twenty-first century, whatever our commerce consists of, it will have to rely much more on water transport. It will be slow, but it will be doable, and where inland waterways are concerned, dependable. It would have to be integrated with whatever rail systems we can put back in service. Parts of the American West lacking navigable waterways will obviously not benefit.

Education

Suburban schools may be in better condition physically, with more abundant supplies, but that, too, is changing. Gigantic alienating suburban schools are producing so much anxiety and depression that multiple slayings have occurred at regular intervals in them in recent years. All schooling in America, whether inner city or suburban, is still based on the obsolete model of a 1911 factory, with several thousand workers in attendance whose routines have to be regimented. Its chief mission is custodial, despite the putative reforms of recent decades. It conditions children and young adults to spend the bulk of their day in one place. It uniformly infantilizes young adults, including even many of those who succeed on its terms. The rest merely suffer in it, while losing the opportunity to learn manual skills that would make them useful and productive members of a community. The system will not endure much longer in this form.

The huge centralized suburban schools that look like medium-security prisons with their fleets of yellow buses will rapidly become obsolete when the first serious oil market disruptions of the Long Emergency occur. The inner-city schools will be too broken to fix. School will have to be reorganized on a local basis, at a much smaller scale, in smaller buildings. Children will have to live closer to the schools they attend.

In the Long Emergency schooling will be required for fewer years, and children may have to work part of the day or part of the year. Because everything will be local, the ability to support education will depend on local economic conditions and the level of social stability, and there will be broad variation. Some localities may become so distressed that public school will cease to exist. The more fortunate localities will be those where small-scale agriculture is possible, but more intensive local agriculture by nonindustrial methods implies a much different division of labor, and older children may have to assume more responsibility and grow up faster. The romanticization of childhood may prove to have been one of the luxuries of the cheap-oil age. Basic schooling, in the formal sense, might not go beyond the equivalent of today's eighth grade. Sorting of children into vocational or academic tracks will probably be based on self-evident social and economic status rather than any formal administrative system. Only a tiny minority of young people will be able to enjoy a college education. Vocational training is much more likely to occur in the context of a workplace rather than the school, as in the apprentice system.

However many grades education entails in the future, children and teachers would benefit tremendously from being in physically smaller institutions, in smaller classes, where all will at least have the chance to know one another. Many of the problems of education today are the unintended consequences of consolidated administration—the attempt by school districts to save money by operating fewer but bigger buildings, running fewer but larger bus fleets, and employing fewer nonteaching managers. This was made possible by a chain of connected circumstances. The suburban development pattern erased the essential quality of locality per se. Typically, by the 1960s, suburban children couldn't walk anywhere, including to school, so wherever they did go to school a bus was required to get them there. Cheap oil made the school bus fleet a normative part of the system. (Think of a school bus fleet as a public transit system that runs only twice a day for people under eighteen and you may grasp the basic profligacy of the system.) Once that was established, school districts gathered their pupils from ever-larger geographic population "sheds" and bused them to ever more gigantic consolidated "facilities." The economies of scale they strove to enjoy were typical of any large enterprise during the cheap-oil age. The effect of all this on the students, though, was always secondary to the administrative benefits, and the purpose of school somehow got lost, so that, paradoxically, even the richest suburban high schools with Olympic swimming pools, food courts, and hectares of playing fields produced alienated students dogged by anomie, depression, and a pervasive anxiety about their future roles in a consumer society.

In the Long Emergency, this scale of educational enterprise will no longer be feasible, and the attenuation of childhood no longer affordable. But the system as it currently exists may be unreformable. For one thing, the psychology of previous investment will weigh heavily on school districts that have built elaborate facilities. They will not surrender to circumstance until it is simply no longer possible to carry on, meaning there is not likely to be any planning or preparation for change. Any effort to reorganize schooling would likely follow a fairly comprehensive collapse of the current system, and would probably occur on a haphazard local basis—perhaps evolving out of home schooling groups. There will certainly be a need for capable teachers, and in a world of vanishing occupational niches, teaching may be considered a desirable job with more social status than it currently offers.

The Regional Outlook: Sunset in the Sunbelt

If the American economy loses traction, Mexico will suffer by another order of magnitude, because the health of its economy is so closely linked to ours. As this occurs, the chance for political turmoil in Mexico will increase. During the Mexican Revolution and its attendant civil struggles that persisted from 1911 to 1940, nearly 10 percent of Mexico's population fled into the United States. The population of Mexico was under 15 million in 1920. In 2005 Mexico's population will reach 106 million. According to the 2000 census, there were more than 4.8 million illegal Mexican immigrants in the United States out of a total of 8 million legal and illegal Mexican immigrants combined. Mexicans comprised almost a third of all 28 million immigrants in the United States, and the proportion was increasing steadily. The Mexican immigrant population is highly concentrated, with 78 percent living in just four states, and nearly half living in California alone.

The affluence created in the final decades of the cheap-oil blowout made the United States, and southern California in particular, an irresistible objective for Mexican immigration. Jobs were plentiful and wages, compared with those in Mexico, were high. The U.S.-Mexico border today is under only partial control at best. At what point does illegal immigration become extremely undesirable, perhaps even intolerable? If there is such a point—and some Americans would deny that there is—would the United States have to defend its southern border? If so, how will the U.S. government defend this border in a time when the American military is apt to be overcommitted in the Middle East and elsewhere? Ultra-right-wing militias have already sprung up along the U.S. side of the border, manned by exactly the new class of economic losers who will be increasing in numbers as the Long Emergency deepens—poorly educated, underemployed, angry whites. A state of violent anarchy may grip America's southwestern states as militias array themselves against illegal immigrants. And at some point, the immigrants could certainly arm themselves in kind. There is no shortage of small arms in the world, even among the most impoverished societies, as recent experience has shown. At what point does this border conflict turn into a de facto border war?

The reconquista or Aztlan movement among Mexican nationals living in the United States cannot be dismissed as racist political paranoia, either. (Aztlan: the legendary ancestral homeland of the Aztecs, which they left in journeying southward to found their capital, Tenochtitlan.) Reconquista is established in the streets and in the universities. Charles Truxillo, a professor of Chicano studies at the University of New Mexico, declared in 2000 that a "Republic of the North" should be brought into existence "by any means necessary." He proposed that the "inevitable" creation of a sovereign Chicano nation would comprise the present states of California, Arizona, New Mexico, Texas, and part of Colorado. The belief that this territory was stolen in the nineteenth century is supported unofficially by the Mexican government, which effectively does nothing to stem the flood of immigrants going north—and actually benefits from it as it acts as a safety valve on Mexico's own internal political problems over pervasive poverty.

The acronym MEChA stands for Movimiento Estudiantil Chicano de Aztlan or "Chicano Student Movement of Aztlan." MEChA is an explicitly separatist organization that encourages anti-American activities and civil disobedience on behalf of "la raza" (i.e., the brown race). Other related separatist groups go under the names of Brown Berets de Aztlan, OLA (Organization for the Liberation of Aztlan), La Raza Unida Party, and the Nation of Aztlan. Some of them mix elements of criminal gangsterism with politics. Although the activism of these organizations varies from somewhat radical to extremely radical, they share the same objectives, the "liberation of Aztlan," as the constitution of MEChA puts it. These groups are firmly established in the barrios of Los Angeles and other southwestern cities. The MEChA symbol is an eagle clutching a machete and a stick of dynamite with a burning fuse.

The Land of NASCAR

Electrification in the South happened differently. In 1900, the South had few cities, not many factories, and many agricultural trading towns without the means to supply their own power. The enormous flat expanses afforded few exploitable hydroelectric sites. Lack of electricity across vast reaches of the South kept the region mired in backwardness through the 1920s — just when a depression in farm commodity prices occurred as a result of mechanization. Electrification of the South was therefore a mostly rural project; it was accomplished by harnessing water power in the rugged Tennessee and Cumberland river systems, and sending that power to sparsely settled customers over great distances. The required infrastructure alone, in the form of dams, transmission lines, towers, and relay stations, was so formidable that large federal government subsidies were needed to make it happen. The electrification of the rural South was not begun until the 1930s, with the federally sponsored projects of the Tennessee Valley Authority (TVA). The scale was fantastic. No sooner were they completed than World War II had commenced. When it was over, the South was ready to launch itself as a new economic phenomenon. The elements were all in place: cheap distributed electricity and cheap oil, meaning that geography had lost its constraints. Air conditioning went first into public places—theaters, restaurants, and department stores—in the 1950s, and finally into middle-class homes on a widespread basis in the 1960s.

It is true that the middle class in metro-burbs like Atlanta has been culturally diluted by an influx of people from other parts of the country, and even other nations, but a significant residue of the original stock remains, with all its latent encoded behavior in place. As suburbia fails in an oil-challenged world, this middle class will lose its entitlements to a comfortable way of life – to large air-conditioned houses beyond walking distance to anything, to secure employment, to easy motoring, bargain shopping at the Wal-Mart, and cheap, mass-produced food products.

This group of people will be very angry and bewildered at the loss of their entitlements —only so recently acquired—and they will express their anger both politically and extralegally. Politically, the southern middle class is the most solidly conservative of any group in the nation. What "conservative" actually means anymore may be subject to debate, but for the sake of this argument let's say it includes the following attitudes: (1) that the United States is an exceptional nation whose citizens enjoy special dispensations from a Christian god for their good works in being a beacon of liberty for the rest of the world; (2) that being an American is characterized by rugged individualism that tends to exalt the family, clan, or tribe while it discounts the common good of the community and even the rights of others not of the family, clan, or tribe; and (3) that firearms should be liberally distributed among the populace with the expectation that they will be used to defend individual liberty. Anyone can see that against a background of grave economic distress, attitudes like these could lead to a great deal of delusional thinking, dangerous politics, and possibly mayhem.

The Old South, like other regions of the United States, will face the necessity of building a new economy and radically reorganizing its land-use practices in order to create that economy, which will surely have to put food production at its center. Both elements of this enormous task would be daunting under the most favorable circumstances, let alone under conditions of political grievance, paranoia, and turmoil. The project of suburbia was itself a radical experiment in the reassignment of land uses, and much of what had been good farmland until a few decades ago has been built on and paved over, especially land at the edge of towns and cities that had been the proximate source of food supply until the era of continental-scale industrial agriculture. Those suburban housing tracts and big-box stores are not going to be disassembled or moved any time soon, if ever. Can families feed themselves on what they grow on a half-acre lot? The idea of suburbanites in such numbers as found around Atlanta all resorting to subsistence gardening on a successful basis seems unlikely. The majority may not even try. Desperation might provoke movements of unprecedented numbers of people away from the southern suburbs and the metroplexes to what remains of the rural landscape across these large states. There is still a lot of undeveloped land there. But how do you reallocate this land? Could the process remain orderly and peaceable, given the number of small arms available and the regional cultural predisposition to violence?

Christianity Inflamed

…In a way, fundamentalist religion made the corporate predations of community-destroyers easier. It made secular community seem optional, dispensable, provisional, something easily replaced by Wal-Mart.  It squared nicely with the ethos of hyperindividualism, in which bargain shopping trumped any aspect of civic amenity. The churches, meanwhile, sought to benefit from the same economies of scale as those enjoyed by the giant retail chains. Increasingly, the churches were organized on a mass basis and housed in buildings that looked like Wal-Mart with gigantic parking facilities. In fact, evangelical churches were renowned for taking over the leases of dead chain stores in dying malls because the rents were so cheap. Southern evangelicalism became a kind of Wal-Mart of the spirit. Political leaders went bargain shopping in them for voting souls.

It was also a rather severe belief system, with its emphasis on sin and punishment, which provided a means of social control needed originally among rural poor folk inclined to casual violence. These themes redounded in the stories of judgment and "end times" that often preoccupied fundamentalists, especially those parts of scripture concerned with actual apocalypse, such as the book of Revelation. It is perhaps a strange and unfortunate coincidence that the hardships presented by the Long Emergency will seem to be a playing out of the fundamentalists' most cherished and extreme prophecies. It is not my contention that the Long Emergency represents apocalypse, as my argument is based on the continuation of human life and the project of civilization in particular. But to many evangelicals, the end of oil-based comforts and amenities may amount to the end of the world. It will certainly be the end of many habits, practices, assumptions, and ideas. The conditions of the Long Emergency will probably reinforce the belief among the pious that evil forces or persons are behind the troubles of the post-peak oil world, and offer justification for punitive actions taken in the face of those conditions.

Of course, like anything organized now at the giant scale, southern fundamentalist religion may suffer in the Long Emergency. The huge industrial sheds built as churches will be hard to heat—it does get to freezing in Atlanta in January—and cheap air conditioning will cease to exist. Members will not be able to travel very far to attend. Churches will have to scale down by necessity, just as schools will. If they become smaller, there may be more of them, requiring more pastors, and there may be quite a bit of competition, or even friction, among them, with multiplying factions, denominations, and new sects. Organized religion, and especially action taken in its name, may collide also with the individualist ethos that rules at times other than Sunday morning. Some individualists may peg their allegiance not to organized churches of unrelated semistrangers but to family, or extended family in the form of tribes, clans, and gangs struggling for survival in an energy-scarce world. In the early phases of the Long Emergency, individualist forces might compete for power with fundamentalist pseudocommunities. These families, tribes, clans, and gangs may eventually prevail and mutate into a ruling class, coopting religion and enlisting its ministers to pacify a frightened and surly populace, especially as it is needed for farm labor. What I am describing, of course, is an old story, seen historically in many places, from the Catholicism of medieval Europe, to the Christianization of American slaves, to the orthodox church of nineteenth-century Russia. The populace, however, might be fewer in number than today's due to attrition from starvation, epidemic disease, and incidental violence.

Racial Conflict in the Long Emergency

The question of race relations naturally presents itself in any meditation on America's cities and their future. By this I mean relations between whites and African Americans specifically—and from here I will refer to African Americans as blacks for the sake of brevity. While it might make some readers uncomfortable, it would be irresponsible for me to duck the issue. The public discussion about race relations has been disingenuous during the still-ongoing era of political correctness. Political correctness itself came about largely as a defense against the partial failure of the social justice project of the late twentieth century. It is probably useful to begin by describing what has happened recently before turning to where things stand now.

The civil rights movement of the 1960s had been primarily about removing the legal obstacles to full participation for blacks in mainstream American society dominated by whites. The presumption by educated "progressives" through the 1970s was that once legal barriers came down, blacks would seek to participate and assimilate into mainstream culture. This assumption must have seemed reasonable at the time, given the obvious earnestness of the era's black leadership—Martin Luther King Jr., A. Philip Randolph, Thurgood Marshall, James Farmer, et al.—but the assumption proved to be wrong as times rapidly changed.

When the legal barriers came down, enough blacks demurred in assimilating culturally to create a crisis in the civil rights movement. There were two responses. One was the creation of extralegal cultural remedies of the type known as affirmative action in order to further stimulate assimilation. The other response was black separatism, a simple opting out of the need to assimilate. Malcolm X was the avatar of a new separatism, and a martyr to it. The "Black Power" movement gained traction after his 1965 assassination, paradoxically, just after the passage of the most sweeping federal legislation mandating equality before the law since the Civil War. That Black Power seemed romantic, sexy, and glamorous tended to conceal its retrograde impact.

As a practical matter, Black Power led quickly to some very counterproductive collective behavior. One was the officially sanctioned re-segregation of facilities only recently desegregated. By 1970—five years after Congress put an end to "Jim Crow" laws—college administrators were caving in to demands by militant student groups to charter separate black student unions and separate dormitories. Black studies followed as a separate academic discipline. This set in motion a debilitating ethos that persists to this day—the complaint that a "structurally racist" white-dominated society prevents blacks from full participation in a culture many have already opted out of. White "progressives" have tragically supported this ideology, along with a repudiation of mainstream culture itself in order to discount the value of assimilating into it in the first place. The result has been extremely unfortunate.

While it is true that many blacks have joined the middle class, at least in terms of jobs and pay, a disturbing aura of cultural separatism persists, supported by the multiculturalists in education, with terribly demoralizing effects on that substantial minority of the minority who never made it into the middle class. For the past two decades, lower-class blacks especially have been encouraged only to become more separate, more different in behavior, more divorced from mainstream norms of speech, manners, and costume. This dislocation is reflected ominously in pop music. Hip-hop has to be taken seriously because it is so pervasive, and it presents a range of compelling cultural meanings. The most threatening, of course, is its association with criminal behavior—the rhetoric of gangsterism, the glorification of gunplay and murder, and the grandiose imagery of unearned riches. Street mythology has it that hip-hop clothes, accessories, and lingo are extensions of jailhouse fashion. Less obvious is how much these childish conventions of manner—exaggerated clumsy body language, pants many times too large, hats worn sideways—infantilize their followers. Children do not engage in politics, and so one of the worst aspects of this sector of pop culture has been the wholesale depoliticization of the black population, especially young adults. Another result of this surrender of politics to entertainment has been an amazing dearth of black political leadership at a time when it couldn't be more desperately needed to resolve the unfinished business of the social justice project.

There are real political issues facing the black underclass minority in America, and the outstanding one would seem to be how much longer significant numbers of them can afford to put off growing up. The twenty-year-long peak oil blowoff has made this experiment in arrested development possible. If nothing else, it has kept enough surplus wealth sloshing through the economy to keep the party going. The Long Emergency will force the issue. No group of Americans will be able to party through it. Even among the nominally poor today, standards of living have a long way to fall. What remains of the post-welfare reform social safety net may unravel altogether.

The grievance and belligerence that smolders under the surface of the hip-hop saturnalia is unattached to any coherent political claims beyond the debatable clichés of "structural racism." But that belligerence is more a fashion statement than a political message. Glowering behind sunglasses in a rap video is a show business convention now, but the stringencies of the Long Emergency will change the way such posturing is interpreted. The Long Emergency will be such a hardship for everybody, of all races and sexes, that claims of prior special grievance will be dismissed. The Long Emergency will demand so much of individuals in terms of personal responsibility, civic cooperation, and adult skills, that large numbers of people will be unprepared to cope, and the rest won't be disposed to excuse the truculence or misbehavior of those who cannot. They will be too busy working to feed themselves and to stay warm. The remaining question is whether this lowered threshold of tolerance will operate within a context of law, or whether the social fabric will be so tattered by hardship and destitution that the mechanisms of justice will no longer be in force.

Since I believe that life in the Long Emergency will become profoundly local, then the answer really depends on how successful a given locality may be at maintaining civic institutions, including the police and the courts, and, of course, how fairly these things might operate. There is liable to be wide variation. We know from historical experience that racial justice has not been well served in the Old South. We might flatter ourselves to think that it has been better served in other parts of the country. It is obvious that the regional demographics have changed. For the past fifty years, lower-class black culture has been identified with inner cities, the result of a "great migration" that sent several waves of southern agricultural serfs north to cities, just as the national economy was well on its way to shedding its manufacturing sector and the good jobs that went with it. The outcome of that has been extremely discouraging for all concerned, both blacks and whites, those trapped in the violent purposelessness of postindustrial city life, and those dispersed to the alienating precincts of the automobile suburbs. But the cities will not remain in their present shape and condition. Since the 1990s, a reverse migration has been under way with northern urban blacks returning to the southeastern states. In many cases they are returning as members of the middle class. Both California and New York saw the largest numbers of these outmigrations. Those left behind in the urban ghettos may find themselves even more economically and culturally isolated as the Long Emergency begins.

At their worst, the rap videos played on cable TV resemble the war chants of a conflict that has not yet been joined. Only among a group as narcissistically lost and clueless as white suburban America would these messages be welcomed as just another species of entertainment. In the disorders of the Long Emergency, when the poor become really poor by world standards, the urban ghettos may explode again, and the next time it happens it will be in the context of a much more desperate society than the one that witnessed the 1992 Rodney King incident and its aftermath. It is unlikely to be confined to the ghettos themselves but will likely resolve into a more generalized and protracted guerilla warfare of the kind that has been going on in third-world countries for decades, and it will occur against a background of widespread turbulence everywhere.

American exceptionalism offers no protection from these potential disorders. Any place can become a Beirut under certain unfavorable circumstances. We can only hope to hear the appeal of those "better angels of our nature" invoked by Lincoln the last time the United States went through an internal convulsion.

Ideas, Morals and Manners in the Long Emergency

The circumstances of the Long Emergency will be the opposite of what we currently experience. There will be hunger instead of plenty, cold where there was once warmth, effort where there was once leisure, sickness where there was health, and violence where there was peace. We will have to adjust our attitudes, values, and ideas to accommodate these new circumstances and we may not recognize the people we will soon become or the people we once were. In a world where sheer survival dominates all other concerns, a tragic view of life is apt to reassert itself. This is another way of saying that we will become keenly aware of the limitations of human nature in general and its relation to ubiquitous mortality in particular. Life will get much more real. The dilettantish luxury of relativism will be forgotten in the boneyards of the future. Irony, hipness, cutting-edge coolness will seem either quaint or utterly inexplicable to people struggling to produce enough food to get through the winter. In the Long Emergency, nobody will get anything for nothing.

I believe these hardships will prompt a return to religious practice in all regions of America, with tendencies toward extremism that will be worse in some places than in others. In the absence of legitimate or effective secular authority, church authority may take its place, perhaps for a long time to come. People desperate for legitimate authority to assist them in organizing their survival will probably accept more starkly hierarchical social relations in general and disdain democracy as a waste of effort. They will be easily led and easily pushed around. This, along with the emergence of a substantial agricultural laboring class, suggests that the ranks of society will be much more distinct in the Long Emergency, with far less movement between the ranks. Do not expect more social equality—expect much less.

Norms of personal conduct may change drastically. Standards of morality will replace the cant of therapeutics. We will be uninterested in the "root causes" of misbehavior and expeditious in dealing with the sheer fact of it, meaning justice is likely to be harsh and swift. Quite a bit of injustice may be a by-product of that, including the persecution of groups and individuals by authorities seeking to impose order at nearly any cost. We will be a lot less inclined to entertain excuses for anything. Personal responsibility will be unavoidable, perhaps excessive. Adolescence as we have known it could disappear and childhood will afford fewer special protections. Reestablished traditional divisions of labor may undo many of the public victories of the feminist revolution. In the context of new circumstances, these altered relations will come to seem normal and inevitable.

My Long Emergency

The United States had an intense interest in halting the rise of energy prices, for reasons beyond the obvious ones. As well as being hurricane season, autumn was the season for debacles in the finance markets, and since the U.S. economy had become virtually synonymous with the housing bubble (i.e., the suburban-sprawl industry), the prospect for continuing this kind of economic behavior seemed rather ominous. The housing bubble was the suburban home-building industry on steroids. Since the NASDAQ crash of 2001, houses had eclipsed paper securities as Americans' favorite investment medium. The Economist reported that 23 percent of the units built in 2005 were bought on pure speculation and were unlived in. Another 13 percent were bought as "second homes." From 2001 through 2005, consumer spending and residential construction together accounted for 90 percent of the total growth in GDP (gross domestic product). And over two-fifths of all private-sector jobs created since 2001 were in housing-related sectors, such as construction, real estate, and mortgage brokering.

Mortgage lending was out of control. The traditional mortgage of 20 percent down with a fixed interest rate on the balance was a relic of the past, replaced by "creative" loans that disregarded any notion of risk. So-called sub-prime loans, made to deadbeats and persons with spotty records of creditworthiness, now made up over 50 percent of mortgages. Norms and standards for lending had vanished in the frenzy to pawn off billets of bundled debt onto a reckless bond market where poor judgment was magically converted into tradable "instruments" and those capital flows fed into other dicey investment sectors, such as the derivatives trade.
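The appeal of these "creative" loans is easiest to see in the monthly payment. The sketch below compares a standard amortizing payment with an interest-only payment and a teaser-rate payment on the same loan; the loan amount, rates, and terms are illustrative assumptions, not figures from the text.

```python
# Illustrative payment comparison for a hypothetical $300,000 loan.
# All numbers here are assumptions for the sketch, not from the source.

def fixed_payment(principal, annual_rate, years):
    """Standard amortizing payment: M = P*r / (1 - (1+r)**-n)."""
    r = annual_rate / 12.0   # monthly interest rate
    n = years * 12           # total number of payments
    return principal * r / (1.0 - (1.0 + r) ** -n)

def interest_only_payment(principal, annual_rate):
    """'Creative' interest-only loan: the principal never shrinks."""
    return principal * annual_rate / 12.0

loan = 300_000.0
print(f"30-yr fixed at 6%:     ${fixed_payment(loan, 0.06, 30):,.2f}/mo")
print(f"interest-only at 6%:   ${interest_only_payment(loan, 0.06):,.2f}/mo")
print(f"teaser-rate ARM at 2%: ${interest_only_payment(loan, 0.02):,.2f}/mo")
```

The interest-only and teaser-rate structures cut the initial payment sharply, which is what let marginal borrowers and "flippers" qualify; the risk simply moved downstream, to the rate reset and to the bond market holding the bundled debt.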

The houses themselves, going up in the farthest asteroid belts of suburbia, were on steroids, too. The home-building industry had gotten into the habit of building ever larger ones, for the same reason that the car industry liked to sell huge SUVs: because the profit margin per pumped-up unit was much greater. The average size of a new house had gone up by 20 percent since 1987. Yawning atriums, lawyer foyers, and other forms of unusable pretentious space had become standard, home theaters were not unusual, and the master suite mutated into the master resort. In the fall of 2005, sliding into the heating season, with the price of gas higher than ever, the suburban builders were stuck with inventories of houses that even well-off professionals might now be reluctant to buy.

Around Halloween there were signs that the years-long housing bull market was cooling. The Boston Globe reported a 20 percent fall in prices paid below prices listed for high-end houses. The numbers for November were also down in some of the nation's real estate hot spots. The big national production builders like Pulte and KB Homes took hits on their stock. Pulte dropped the price of its Las Vegas ranchburger tract houses 25 percent in 2005. Anyway, the price of housing in most of the hot markets, such as the San Francisco Bay area, had lost any historically comprehensible relationship to salaries or rents. In Miami, sixty thousand condominium units were either under construction or emerging from the permit approval process. An estimated 70 percent of the buyers there were "flippers" or speculators, and those who had bought with creative interest-only or adjustable-rate mortgages stood to be reamed if the market cooled and they could not expedite their flips. Property owners tended to stave off default as long as possible, and the banks often went an extra mile to accommodate them. It might be spring of 2006 before a picture of the full scope of the housing bubble debacle began to emerge. By Christmas of 2005, the financial markets hadn't cratered, despite tremors such as the September bankruptcy filing of Delta Airlines; the demise of Delphi, General Motors' chief supplier of auto parts; and the designation of GM bonds to "junk" status. Altogether, the stock market showed a minuscule loss for 2005.

Apart from the hurricane extravaganza, it had been a mild fall in the United States, weather-wise. By Thanksgiving the price of oil sank back into the high $50s, thanks to those coordinated releases of strategic petroleum reserves from Europe. In early December it ratcheted back above $60 as the SPR releases stopped and the oil already en route worked its way through the distribution system. This was about the same price it had been just before Katrina struck in late August, which seemed a little odd, given the many disruptions, except for another development little noticed by all but the oil-market insiders: Heavy sour crude was replacing production of light sweet crude as a percentage of American imports at an impressive rate. Much of the build in U.S. crude-oil inventories in 2005 may have been heavy sour crude (apparently, no one tracks inventories of light sweet crude versus inventories of heavy sour crude). Heavy sour crude is harder and more expensive to refine and yields less gasoline than light sweet, so it is worth less per barrel. Some grades of heavy sour crude oil sold for as much as $17 per barrel less than light sweet crude oil in 2005, which basically reflected a shortage of light sweet crude oil. In any case, the average price of oil in 2005, $56 a barrel, ended up 40 percent above the price of oil in 2004.


Appendix O. Excerpts from Ellen Hodgson Brown’s The Web of Debt, 2nd revised edition (Third Millennium Press, 2007, 2008)

Foreword by Reed Simpson, M.Sc., Banker and Developer

In my experience, in fact, the chief source of bank robbery is not masked men looting tellers' cash tills but the blatant abuse of the extension of credit by white collar criminals. A common practice is for loan officers to ignore the long-term risk of loans and approve those loan transactions with the highest fees and interest paid immediately - income which can be distributed to the principal executives of the bank. Such distribution is buried within the bank's owner/manager compensation and is distributed to the principal owners as dividends and stock options. That helps explain why, in my home state of Kansas, a major bank in Topeka was run into bankruptcy after its chairman entered into a development and construction loan involving a mortgaged 5,000 acre residential development tract in the "exurbs" far outside of Houston, Texas. The development included curbs, gutters, pavement, street lighting, water, sewer, electricity - everything but homes and families! If the loan had been metered out in small phases to match market absorption, the chairman of that once-fine institution would not have been able to disburse to himself and his friends the enormous up-front loan fees and interest owing to that specific transaction, or to the many loans he made just like it. During the 1980s, developers from across the country beat a path to sleepy Topeka and other areas sporting similar financial institutions, just to have a chance to dance with these corrupt lenders. The managers and developers got rich, leaving the banks' shareholders and the taxpayers to pay the bill.

These are just individual instances of corruption, but they indicate a mind-set to exploit and a system that can be exploited. Ellen Brown's book focuses on a more fundamental fraud in the banking system - the creation and control of money itself by private bankers, in a debt-money system that returns a steady profit in the form of interest to the debt-money producers, saddling the nation with a growing mountain of unnecessary and impossible-to-repay debt. The fact that money creation is nearly everywhere a private affair is largely unknown today, but the issue is not new. The control of the money system by private interests was known to many of our earlier leaders, as shown in a number of quotes reprinted in this book, including these:

The real truth of the matter is, as you and I know, that a financial element in the large centers has owned the Government ever since the days of Andrew Jackson.

-- President Franklin Delano Roosevelt, November 23, 1933, in a letter to Colonel Edward Mandell House

Some people think the Federal Reserve Banks are U.S. government institutions. They are not ... they are private credit monopolies which prey upon the people of the U.S. for the benefit of themselves and their foreign and domestic swindlers, and rich and predatory money lenders. The sack of the United States by the Fed is the greatest crime in history. Every effort has been made by the Fed to conceal its powers, but the truth is the Fed has usurped the government. It controls everything here and it controls all our foreign relations. It makes and breaks governments at will.

-- Congressman Louis T. McFadden, Chairman, House Banking and Currency Committee, June 10, 1932

Web of Debt gives a blow by blow account of how a network of private bankers has taken over the creation and control of the international money system and what they are doing with that control. Credible evidence is presented of a world power elite intent on gaining absolute control over the planet and its natural resources, including its subservient "human resources" or "human capital." The lifeblood of this power elite is money, and its weapon is fear. The whole of civilization and all of its systems hang on this fulcrum of the money power. In private hands, where it is now, it can be used to enslave nations and ensure perpetual wars and bondage. Internationally, the banksters and their governmental partners use these fraudulent economic tools to weaken or defeat opponents without a shot being fired. Witness the recent East Asian financial crisis of 1997 and the Russian ruble collapse of 1998. Economic means have long been used to spark wars, as a pretext and prelude for the money power to stock and restock the armaments and infrastructure of both sides.

Brown's book is thus about more than just monetary theory and reform. By exposing the present unsustainable situation, it is a first step toward loosening the malign grip on the world held by a very small but powerful financial faction. The book can serve to spark an open dialogue concerning the most important topic of our monetary system, one that is practically off limits today in conventional economic circles due to intimidation and fear of the consequences an honest discourse might bring. Brown is not afraid of stepping on the black patent leather wingtips of the money power and their academic economist servants. Her book is a raised clenched fist of defiance and truth smashing through their finely spun web of disinformation, distortion, deceit, and boldfaced lies concerning money, banking, and economics. It exposes the covert financial enemy that has gotten inside the gates of our Troy, making it our first line of defense against the unrestricted asymmetrical warfare which is presently directed against the people of America and the world.

This book not only exposes the problem but outlines a sound solution for the ever-increasing debt and other monetary woes of the nation and the world. It shows that ending the debt-money fractional reserve banking system and returning to an honest debt-free monetary system could provide Americans with a future that is prosperous beyond our imagining. An editorial directed against Lincoln's debt-free Greenbacks, attributed to The London Times, said it all:

If that mischievous financial policy which had its origin in the North American Republic during the late war in that country, should become indurated down to a fixture, then that Government will furnish its own money without cost. It will pay off its debts and be without debt. It will become prosperous beyond precedent in the history of the civilized governments of the World. The brains and wealth of all countries will go to North America. That government must be destroyed or it will destroy every monarchy on the globe.

-- REED SIMPSON, M.Sc., Overland Park, Kansas; American Bankers Association Graduate School of Banking; London School of Economics, Graduate School of Economics; University of Kansas Graduate School of Architecture

-- November 2006

Introduction.  Captured by the Debt Spider

Dr. Carroll Quigley was a writer and professor of history at Georgetown University, where he was President Bill Clinton’s mentor.  Professor Quigley wrote from personal knowledge of an elite clique of global financiers bent on controlling the world. Their aim, he said, was "nothing less than to create a world system of financial control in private hands able to dominate the political system of each country and the economy of the world as a whole." This system was "to be controlled in a feudalist fashion by the central banks of the world acting in concert, by secret agreements.” He called this clique simply the "international bankers." Their essence was not race, religion or nationality but was just a passion for control over other humans. The key to their success was that they would control and manipulate the money system of a nation while letting it appear to be controlled by the government.

The international bankers have succeeded in doing more than just controlling the money supply. Today they actually create the money supply, while making it appear to be created by the government. This devious scheme was revealed by Sir Josiah Stamp, director of the Bank of England and the second richest man in Britain in the 1920s. Speaking at the University of Texas in 1927, he dropped this bombshell:

The modern banking system manufactures money out of nothing. The process is perhaps the most astounding piece of sleight of hand that was ever invented. Banking was conceived in iniquity and born in sin.... Bankers own the earth. Take it away from them but leave them the power to create money, and, with a flick of a pen, they will create enough money to buy it back again.... Take this great power away from them and all great fortunes like mine will disappear, for then this would be a better and happier world to live in.... But, if you want to continue to be the slaves of bankers and pay the cost of your own slavery, then let bankers continue to create money and control credit.

Professor Henry C. K. Liu is an economist who graduated from Harvard and chaired a graduate department at UCLA before becoming an investment adviser for developing countries. He calls the current monetary scheme a "cruel hoax." When we wake up to that fact, he says, our entire economic world view will need to be reordered, "just as physics was subject to reordering when man's world view changed with the realization that the earth is not stationary nor is it the center of the universe." The hoax is that there is virtually no "real" money in the system, only debts. Except for coins, which are issued by the government and make up only about one one-thousandth of the money supply, the entire U.S. money supply now consists of debt to private banks, for money they created with accounting entries on their books.

It is all done by sleight of hand; and like a magician's trick, we have to see it many times before we realize what is going on. But when we do, it changes everything. All of history has to be rewritten.

The following chapters track the web of deceit that has engulfed us in debt, and present a simple solution that could make the country solvent once again. It is not a new solution but dates back to the Constitution: the power to create money needs to be returned to the government and the people it represents. The federal debt could be paid, income taxes could be eliminated, and social programs could be expanded; and this could all be done without imposing austerity measures on the people or sparking runaway inflation. Utopian as that may sound, it represents the thinking of some of America's brightest and best, historical and contemporary, including Abraham Lincoln, Thomas Jefferson and Benjamin Franklin. Among other arresting facts explored in this book are that:

·       The "Federal" Reserve is not actually federal. It is a private corporation owned by a consortium of very large multinational banks. (Chapter 13.)

·       Except for coins, the government does not create money. Dollar bills (Federal Reserve Notes) are created by the private Federal Reserve, which lends them to the government. (Chapter 2.)

·       Tangible currency (coins and dollar bills) makes up less than 3 percent of the U.S. money supply. The other 97 percent exists only as data entries on computer screens, and all of this money was created by banks in the form of loans. (Chapters 2 and 17.)

·       The money that banks lend is not recycled from pre-existing deposits. It is new money, which did not exist until it was lent. (Chapters 17 and 18.)

·       Thirty percent of the money created by banks with accounting entries is invested for their own accounts. (Chapter 18.)

·       The American banking system, which at one time extended productive loans to agriculture and industry, has today become a giant betting machine. By December 2007, an estimated $681 trillion was riding on complex high-risk bets known as derivatives — 10 times the annual output of the entire world economy. These bets are funded by big U.S. banks and are made largely with borrowed money created on a computer screen. Derivatives can be and have been used to manipulate markets, loot businesses, and destroy competitor economies. (Chapters 20 and 32.)

·       The U.S. federal debt has not been paid off since the days of Andrew Jackson. Only the interest gets paid, while the principal portion continues to grow. (Chapter 2.)

·       The federal income tax was instituted specifically to coerce taxpayers to pay the interest due to the banks on the federal debt. If the money supply had been created by the government rather than borrowed from banks that created it, the income tax would have been unnecessary. (Chapters 13 and 43.)

·       The interest alone on the federal debt will soon be more than the taxpayers can afford to pay. When we can't pay, the Federal Reserve's debt-based dollar system will collapse. (Chapter 29.)

·       Contrary to popular belief, creeping inflation is not caused by the government irresponsibly printing dollars. It is caused by banks expanding the money supply with loans. (Chapter 10.)

·       Most of the runaway inflation seen in "banana republics" has been caused, not by national governments over-printing money, but by global institutional speculators attacking local currencies and devaluing them on international markets. (Chapter 25.)

·       The same sort of speculative devaluation could happen to the U.S. dollar if international investors were to abandon it as a global "reserve" currency, something they are now threatening to do in retaliation for what they perceive to be American economic imperialism. (Chapters 29 and 37.)

·       There is a way out of this morass. The early American colonists found it, and so did Abraham Lincoln and some other national leaders: the government can take back the money-issuing power from the banks. (Chapters 8 and 24.)

The bankers' Federal Reserve Notes and the government's coins represent two separate money systems that have been competing for dominance throughout recorded history. At one time, the right to issue money was the sovereign right of the king; but that right got usurped by private moneylenders. Today the sovereigns are the people, and the coins that make up less than one one-thousandth of the money supply are all that are left of our sovereign money. Many nations have successfully issued their own money, at least for a time; but the bankers' debt-money has generally infiltrated the system and taken over in the end. These concepts are so foreign to what we have been taught that it can be hard to wrap our minds around them, but the facts have been substantiated by many reliable authorities….

Today, Federal Reserve Notes and U.S. dollar loans dominate the economy of the world; but this international currency is not money issued by the American people or their government. It is money created and lent by a cartel of international bankers, and this cartel has the United States itself hopelessly entangled in a web of debt. By 2006, combined personal, corporate and federal debt in the United States had reached a staggering 44 trillion dollars - four times the collective national income, or $147,312 for every man, woman and child in the country. The United States is legally bankrupt, defined in the dictionary as being unable to pay one's debts, being insolvent, or having liabilities in excess of a reasonable market value of assets held. By October 2006, the debt of the U.S. government had hit a breathtaking $8.5 trillion. Local, state and national governments are all so heavily in debt that they have been forced to sell off public assets to satisfy creditors. Crowded schools, crowded roads, and cutbacks in public transportation are eroding the quality of American life. A 2005 report by the American Society of Civil Engineers gave the nation's infrastructure an overall grade of D, including its roads, bridges, drinking water systems and other public works. "Americans are spending more time stuck in traffic and less time at home with their families," said the group's president. "We need to establish a comprehensive, long-term infrastructure plan." We need to but we can't, because government at every level is broke.
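
The figures quoted above can be cross-checked with a line of arithmetic. A minimal sketch (assuming the $44 trillion and $147,312 figures from the text) showing that they imply a population of roughly 299 million, consistent with U.S. census estimates for 2006:

```python
# Consistency check of the figures quoted above: $44 trillion in combined
# debt at $147,312 per capita implies the U.S. population of the period.

total_debt = 44e12          # combined personal, corporate and federal debt (2006)
per_capita = 147_312        # quoted debt per man, woman and child

implied_population = total_debt / per_capita
print(f"implied population: {implied_population:,.0f}")  # roughly 299 million
```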

If governments everywhere are in debt, who are they in debt to? The answer is that they are in debt to private banks. The "cruel hoax" is that governments are in debt for money created on a computer screen, money they could have created themselves.

Money in the Land of Oz

Before World War I, two opposing systems of political economy competed for dominance in the United States.  One operated out of Wall Street, the New York financial district that came to be the symbol of American finance.  Its most important address was 23 Wall Street, known as the “House of Morgan.” J.P. Morgan was an agent of powerful British banking interests.  The Wizards of Wall Street and the Old World bankers pulling their strings sought to establish a national currency that was based on the "gold standard," one created privately by the financial elite who controlled the gold. The other system dated back to Benjamin Franklin and operated out of Philadelphia, the country's first capital, where the Constitutional Convention was held and Franklin's "Society for Political Inquiries" planned the industrialization and public works that would free the new republic from economic slavery to England. The Philadelphia faction favored a bank on the model established in provincial Pennsylvania, where a state loan office issued and lent money, collected the interest, and returned it to the provincial government to be used in place of taxes. President Abraham Lincoln returned to the colonial system of government-issued money during the Civil War; but he was assassinated, and the bankers reclaimed control of the money machine. The silent coup of the Wall Street faction culminated with the passage of the Federal Reserve Act in 1913, something they achieved by misleading Bryan and other wary Congressmen into thinking the Federal Reserve was actually federal.

Today the debate over who should create the national money supply is rarely heard, mainly because few people even realize it is an issue. Politicians and economists, along with everybody else, simply assume that money is created by the government, and that the "inflation" everybody complains about is caused by an out-of-control government running the dollar printing presses. The puppeteers working the money machine were more visible in the 1890s than they are today, largely because they had not yet succeeded in buying up the media and cornering public opinion.

Economics is a dry and forbidding subject that has been made intentionally complex by banking interests intent on concealing what is really going on. It is a subject that sorely needs lightening up, with imagery, metaphors, characters and a plot; so before we get into the ponderous details of the modern system of money-based-on-debt, we'll take an excursion back to a simpler time, when the money issues were more obvious and were still a burning topic of discussion. The plot line for The Wizard of Oz has been traced to the first-ever march on Washington, led by an obscure Ohio businessman who sought to persuade Congress to return to Lincoln's system of government-issued money in 1894. Besides sparking a century of protest marches and the country's most famous fairytale, this little-known visionary and the band of unemployed men he led may actually have had the solution to the whole money problem, then and now....

Chapter 1. Lessons from The Wizard of Oz

In refreshing contrast to the impenetrable writings of economists, the classic fairytale The Wizard of Oz has delighted young and old for over a century. It was first published by L. Frank Baum as The Wonderful Wizard of Oz in 1900. In 1939, it was made into a hit Hollywood movie starring Judy Garland, and later it was made into the popular stage play The Wiz. Few of the millions who have enjoyed this charming tale have suspected that its imagery was drawn from that most obscure and tedious of subjects, banking and finance. Fewer still have suspected that the real-life folk heroes who inspired its plot may actually have had the answer to the financial crisis facing the country today!

The economic allusions in Baum's tale were first observed in 1964 by a schoolteacher named Henry Littlefield, who called the story "a parable on Populism," referring to the People's Party movement challenging the banking monopoly in the late nineteenth century. Other analysts later picked up the theme. Economist Hugh Rockoff, writing in the Journal of Political Economy in 1990, called the tale a "monetary allegory." Professor Tim Ziaukas, writing in 1998, stated:

"The Wizard of Oz" ... was written at a time when American society was consumed by the debate over the "financial question," that is, the creation and circulation of money.... The characters of "The Wizard of Oz" represented those deeply involved in the debate: the Scarecrow as the farmers, the Tin Woodman as the industrial workers, the Lion as silver advocate William Jennings Bryan and Dorothy as the archetypal American girl.

The Wizard of Oz has been called "the first truly American fairytale.” The Germans established the national fairytale tradition with Grimm's Fairy Tales, a collection of popular folklore gathered by the Brothers Grimm specifically to reflect German populist traditions and national values. Baum's book did the same thing for the American populist (or people's) tradition. It was all about people power, manifesting your dreams, finding what you wanted in your own backyard. According to Littlefield, the march of Dorothy and her friends to the Emerald City to petition the Wizard of Oz for help was patterned after the 1894 march from Ohio to Washington of an "Industrial Army" led by Jacob Coxey, urging Congress to return to the system of debt-free government-issued Greenbacks initiated by Abraham Lincoln. The march of Coxey's Army on Washington began a long tradition of people taking to the streets in peaceful protest when there seemed no other way to voice their appeals. As Lawrence Goodwin, author of The Populist Moment, described the nineteenth century movement to change the money system:

[T]here was once a time in history when people acted.... [F]armers were trapped in debt. They were the most oppressed of Americans, they experimented with cooperative purchasing and marketing, they tried to find their own way out of the strangle hold of debt to merchants, but none of this could work if they couldn't get capital. So they had to turn to politics, and they had to organize themselves into a party.... [T]he populists didn't just organize a political party, they made a movement. They had picnics and parties and newsletters and classes and courses, and they taught themselves, and they taught each other, and they became a group of people with a sense of purpose, a group of people with courage, a group of people with dignity.

Like the Populists, Dorothy and her troop discovered that they had the power to solve their own problems and achieve their own dreams. The Scarecrow in search of a brain, the Tin Man in search of a heart, the Lion in search of courage actually had what they wanted all along. When the Wizard's false magic proved powerless, the Wicked Witch was vanquished by a defenseless young girl and her little dog. When the Wizard disappeared in his hot air balloon, the unlettered Scarecrow took over as leader of Oz.

The Wizard of Oz came to embody the American dream and the American national spirit. In the United States, the land of abundance, all you had to do was to realize your potential and manifest it. That was one of the tale's morals, but it also contained a darker one, a message for which its imagery has become a familiar metaphor: that there are invisible puppeteers pulling the strings of the puppets we see on the stage, in a show that is largely illusion.

The March on Washington that Inspired the March on Oz

Money reform advocates today tend to argue that the solution to the country's financial woes is to return to the "gold standard," which required that paper money be backed by a certain weight of gold bullion. But to the farmers and laborers who were suffering under its yoke in the 1890s, the gold standard was the problem. They had been there and knew it didn't work. William Jennings Bryan called the bankers' private gold-based money a "cross of gold." There was simply not enough gold available to finance the needs of an expanding economy. The bankers made loans in notes backed by gold and required repayment in notes backed by gold; but the bankers controlled the gold, and its price was subject to manipulation by speculators.  Gold's price had increased over the course of the century, while the prices laborers got for their wares had dropped. People short of gold had to borrow from the bankers, who periodically contracted the money supply by calling in loans and raising interest rates. The result was "tight" money - insufficient money to go around. As in a game of musical chairs, the people who came up short wound up losing their homes to the banks.

The solution of Jacob Coxey and his Industrial Army of destitute unemployed men was to augment the money supply with government-issued United States Notes. Popularly called "Greenbacks," these federal dollars were first issued by President Lincoln when he was faced with usurious interest rates in the 1860s. Lincoln had foiled the bankers by making up the budget shortfall with U.S. Notes that did not accrue interest and did not have to be paid back to the banks. The same sort of debt-free paper money had financed a long period of colonial abundance in the eighteenth century, until King George forbade the colonies from issuing their own currency. The money supply had then shrunk, precipitating a depression that led to the American Revolution.

To remedy the tight-money problem that resulted when the Greenbacks were halted after Lincoln's assassination, Coxey proposed that Congress should increase the money supply with a further $500 million in Greenbacks. This new money would be used to redeem the federal debt and to stimulate the economy by putting the unemployed to work on public projects. The bankers countered that allowing the government to issue money would be dangerously inflationary. What they failed to reveal was that their own paper banknotes were themselves highly inflationary, since the same gold was "lent" many times over, effectively counterfeiting it; and when the bankers lent their paper money to the government, the government wound up heavily in debt for something it could have created itself. But those facts were buried in confusing rhetoric, and the bankers' "gold standard" won the day.

The Silver Slippers: The Populist Solution to the Money Question

The Greenback Party was later absorbed into the Populist Party, which took up the cause against tight money in the 1890s. Like the Greenbackers, the Populists argued that money should be issued by the government rather than by private banks. William Jennings Bryan, the Populists' loquacious leader, gave such a stirring speech at the Democratic convention that he won the Democratic nomination for President in 1896. Outgoing President Grover Cleveland was also a Democrat, but he was an agent of J. P. Morgan and the Wall Street banking interests. Cleveland favored money that was issued by the banks, and he backed the bankers' gold standard. Bryan was opposed to both. He argued in his winning nomination speech:

We say in our platform that we believe that the right to coin money and issue money is a function of government.... Those who are opposed to this proposition tell us that the issue of paper money is a function of the bank and that the government ought to go out of the banking business. I stand with Jefferson ... and tell them, as he did, that the issue of money is a function of the government and that the banks should go out of the governing business.... [W]hen we have restored the money of the Constitution, all other necessary reforms will be possible, and ... until that is done there is no reform that can be accomplished.

He concluded with these famous lines:

You shall not press down upon the brow of labor this crown of thorns, you shall not crucify mankind upon a cross of gold.

Since the Greenbackers' push for government-issued paper money had failed, Bryan and the "Silverites" proposed solving the liquidity problem in another way. The money supply could be supplemented with coins made of silver, a precious metal that was cheaper and more readily available than gold. Silver was considered to be "the money of the Constitution." The Constitution referred only to the "dollar," but the dollar was understood to be a reference to the Spanish milled silver dollar coin then in common use. The slogan of the Silverites was "16 to 1": 16 ounces of silver would be the monetary equivalent of 1 ounce of gold. Ounces is abbreviated oz, hence "Oz." The Wizard of the Gold Ounce (Oz) in Washington was identified by later commentators as Marcus Hanna, the power behind the Republican Party, who controlled the mechanisms of finance in the administration of President William McKinley.

An Allegory of Money, Politics and Believing in Yourself

The moral also worked for the nation itself. The economy was deep in depression, but the country's farmlands were still fertile and its factories were ready to roll. Its entranced people merely lacked the paper tokens called "money" that would facilitate production and trade. The people had been deluded into a belief in scarcity by defining their wealth in terms of a scarce commodity, gold. The country's true wealth consisted of its goods and services, its resources and the creativity of its people. Like the Tin Woodman in need of oil, all it needed was a monetary medium that would allow this wealth to flow freely, circulating from the government to the people and back again, without being perpetually siphoned off into the private coffers of the bankers.

Sequel to Oz

The Populists did not achieve their goals, but they did prove that a third party could influence national politics and generate legislation. Although Bryan the Lion failed to stop the bankers, Dorothy's prototype Jacob Coxey was still on the march. In a plot twist that would be considered contrived if it were fiction, he reappeared on the scene in the 1930s to run against Franklin D. Roosevelt for President, at a time when the "money question" had again become a burning issue. In one five-year period, over 2,000 schemes for monetary reform were advanced. Needless to say, Coxey lost the election; but he claimed that his Greenback proposal was the model for the "New Deal," Roosevelt's plan for putting the unemployed to work on government projects to pull the country out of the Depression. The difference was that Coxey's plan would have been funded with debt-free currency issued by the government, on Lincoln's Greenback model. Roosevelt funded the New Deal with borrowed money, indebting the country to a banking cartel that was surreptitiously creating the money out of thin air, just as the government itself would have been doing under Coxey's plan without accruing a crippling debt to the banks.

After World War II, the money question faded into obscurity. Today, writes British economist Michael Rowbotham, "The surest way to ruin a promising career in economics, whether professional or academic, is to venture into the 'cranks and crackpots' world of suggestions for reform of the financial system." Yet the claims of these cranks and crackpots have consistently proven to be correct. The U.S. debt burden has mushroomed out of control, until just the interest on the federal debt now threatens to be a greater tax burden than the taxpayers can afford. The gold standard precipitated the problem, but unbuckling the dollar from gold did not solve it. Rather, it caused worse financial ills. Expanding the money supply with increasing amounts of "easy" bank credit just put increasing amounts of money in the bankers' pockets, while consumers sank further into debt. The problem has proven to be something more fundamental: it is in who extends the nation’s credit. As long as the money supply is created as a debt owed back to private banks with interest, the nation’s wealth will continue to be drained off into private vaults, leaving scarcity in its wake.

Today's monetary allegory goes something like this: the dollar is a national resource that belongs to the people. It was an original invention of the early American colonists, a new form of paper currency backed by the "full faith and credit" of the people. But a private banking cartel has taken over its issuance, turning debt into money and demanding that it be paid back with interest. Taxes and a crushing federal debt have been imposed by a financial ruling class that keeps the people entranced and enslaved. In the happy storybook ending, the power to create money is returned to the people and abundance returns to the land. But before we get there, the Yellow Brick Road takes us through the twists and turns of history and the writings and insights of a wealth of key players. We're off to see the Wizard....

Chapter 2. Behind the Curtain: The Federal Reserve and the Federal Debt

A Game of Smoke and Mirrors

As for keeping "reserves," Wright Patman [Chairman of the House of Representatives’ Banking and Currency Committee in the 1960s] decided to see for himself. Having heard that Federal Reserve Banks hold large amounts of cash, he visited two regional Federal Reserve banks, where he was led into vaults and shown great piles of government securities (I.O.U.s representing debt). When he asked to see their cash, the bank officials seemed confused. He repeated the request, only to be shown some ledgers and bank checks. Patman wrote:

The cash, in truth, does not exist and never has existed. What we call "cash reserves" are simply bookkeeping credits entered upon the ledgers of the Federal Reserve Banks. These credits are created by the Federal Reserve Banks and then passed along through the banking system.

Where did the Federal Reserve get the money to acquire all the government bonds in its vaults? Patman answered his own rhetorical question:

It doesn't get money, it creates it. When the Federal Reserve writes a check for a government bond it does exactly what any bank does, it creates money, it created money purely and simply by writing a check. [When] the recipient of the check wants cash, then the Federal Reserve can oblige him by printing the cash - Federal Reserve notes - which the check receiver's commercial bank can hand over to him. The Federal Reserve, in short, is a total money-making machine.

Taking It to Court

First National Bank of Montgomery vs. Daly was a courtroom drama worthy of a movie script.  Defendant Jerome Daly opposed the bank’s foreclosure on his $14,000 home mortgage loan on the ground that there was no consideration for the loan.  “Consideration” (“the thing exchanged”) is an essential element of a contract.  Daly, an attorney representing himself, argued that the bank had put up no real money for his loan.

The courtroom proceedings were recorded by Associate Justice Bill Drexler, whose chief role, he said, was to keep order in a highly charged courtroom where the attorneys were threatening a fist fight. Drexler hadn’t given much credence to the theory of the defense, until Mr. Morgan, the bank's president, took the stand. To everyone's surprise Morgan admitted that the bank routinely created the money it lent “out of thin air," and that this was standard banking practice.

"It sounds like fraud to me," intoned Presiding Justice Martin Mahoney amid nods from the jurors. In his court memorandum, Justice Mahoney stated:

Plaintiff admitted that it, in combination with the Federal Reserve Bank of Minneapolis … did create the entire $14,000.00 in money and credit upon its own books by bookkeeping entry. That this was the consideration used to support the Note dated May 8, 1964, and the Mortgage of the same date. The money and credit first came into existence when they created it. Mr. Morgan admitted that no United States Law or Statute existed which gave him the right to do this. A lawful consideration must exist and be tendered to support the Note.

The court rejected the bank's claim for foreclosure, and the defendant kept his house. To Daly, the implications were enormous. If bankers were indeed extending credit without consideration - without backing their loans with money they actually had in their vaults and were entitled to lend - a decision declaring their loans void could topple the power base of the world. He wrote in a local news article:

This decision, which is legally sound, has the effect of declaring all private mortgages on real and personal property, and all U.S. and State bonds held by the Federal Reserve, National and State banks to be null and void. This amounts to an emancipation of this Nation from personal, national and state debt purportedly owed to this banking system. Every American owes it to himself … to study this decision very carefully ... for upon it hangs the question of freedom or slavery.

Needless to say, however, the decision failed to change prevailing practice, although it was never overruled. It was heard in a Justice of the Peace Court, an autonomous court system dating back to those frontier days when defendants had trouble traveling to big cities to respond to summonses. In that system (which has now largely been phased out), judges and courts were pretty much on their own. Justice Mahoney went so far as to threaten to prosecute and expose the bank.  He died less than six months after the Daly trial, in a mysterious accident that appeared to involve poisoning.

The "Impossible Contract"

There are other legal grounds on which the bankers' fractional reserve loans might be challenged besides failure of consideration and fraud. In theory, at least, these loan contracts could be challenged because they are collectively impossible to perform. Under state civil codes, a contract that is impossible to perform is void. The impossibility in this case arises because the banks create the principal but not the interest needed to pay back their loans. The debtors scramble to find the interest somewhere else, but there is never enough money to go around. As in a grand game of musical chairs, when the music stops, somebody has to default. In an 1850 treatise called The Importance of Usury Laws, a writer named John Whipple did the math. He wrote:

If 5 English pennies ... had been [lent] at 5 per cent compound interest from the beginning of the Christian era until the present time (say 1850), it would amount in gold of standard fineness to 32,366,648,157 spheres of gold each eight thousand miles in diameter, or as large as the earth.

Thirty-two billion earth-sized spheres! Such is the nature of compound interest: interest calculated not only on the initial principal but on the accumulated interest of prior payment periods. The interest "compounds" along an exponential curve that is virtually flat at first but turns nearly vertical after 100 years. Debts don't usually grow to these extremes, because most loans run for 30 years or less, while the curve remains relatively flat. But the premise still applies: in a system in which money comes into existence only by borrowing at interest, the system as a whole is always short of funds, and somebody has to default.
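For readers who want to check the arithmetic, the growth rate behind Whipple's illustration is easy to sketch. The short Python fragment below verifies only the exponential growth of the debt itself, not Whipple's conversion into gold spheres, and contrasts it with simple (non-compounding) interest on the same 5-penny loan:

```python
def compound_value(principal, rate, years):
    """Value of a debt at compound interest: P * (1 + r)^n."""
    return principal * (1 + rate) ** years

def simple_value(principal, rate, years):
    """Value of the same debt at simple (non-compounding) interest."""
    return principal * (1 + rate * years)

# Whipple's example: 5 pennies at 5 percent from A.D. 1 to 1850.
compounded = compound_value(5, 0.05, 1850)   # on the order of 10**39 pennies
uncompounded = simple_value(5, 0.05, 1850)   # only 467.5 pennies

# The curve is exponential, not linear: at 5 percent the debt doubles
# roughly every 14 years (the "rule of 72": 72 / 5 is about 14.4).
```

The gap between the two results (roughly 10**39 pennies versus a few hundred) is the whole point of Whipple's example.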

Bernard Lietaer helped design the single currency system (the Euro) and has written several books on monetary reform. He explains the interest problem like this:

When a bank provides you with a $100,000 mortgage, it creates only the principal, which you spend and which then circulates in the economy. The bank expects you to pay back $200,000 over the next 20 years, but it doesn't create the second $100,000 – the interest. Instead, the bank sends you out into the tough world to battle against everybody else to bring back the second $100,000.

The problem is that all money except coins now comes from banker-created loans, so the only way to get the interest owed on old loans is to take out new loans, continually inflating the money supply; either that, or some borrowers have to default. Lietaer concluded:

[G]reed and competition are not a result of immutable human temperament.... [G]reed and fear of scarcity are in fact being continuously created and amplified as a direct result of the kind of money we are using.... [W]e can produce more than enough food to feed everybody, and there is definitely enough work for everybody in the world, but there is clearly not enough money to pay for it all. The scarcity is in our national currencies. In fact, the job of central banks is to create and maintain that currency scarcity. The direct consequence is that we have to fight with each other in order to survive.
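Lietaer's point can be reduced to a toy accounting model. The sketch below is an illustrative simplification (not drawn from his books): each loan injects only its principal into circulation, so across all loans the interest owed can never be covered out of the existing money stock.

```python
def system_shortfall(loans):
    """Given (principal, interest) pairs, return the gap between what is
    owed back and the money the loans put into circulation."""
    in_circulation = sum(principal for principal, _ in loans)
    owed = sum(principal + interest for principal, interest in loans)
    return owed - in_circulation

# Lietaer's mortgage example: $100,000 principal, $100,000 interest owed.
gap = system_shortfall([(100_000, 100_000)])   # $100,000 is missing
```

In this model the gap always equals the total interest, whatever the mix of loans; it can be closed only by issuing new loans (inflating the money supply) or by someone defaulting, which is exactly the dilemma the text describes.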

The Money Supply and the Federal Debt

To keep the economic treadmill turning, not only must the money supply continually inflate but the federal debt must continually expand. The reason was revealed by Marriner Eccles, Governor of the Federal Reserve Board, in hearings before the House Committee on Banking and Currency in 1941. Congressman Wright Patman asked Eccles how the Federal Reserve got the money to buy government bonds.

"We created it," Eccles replied.

"Out of what?"

"Out of the right to issue credit money."

"And there is nothing behind it, is there, except our government's credit?"

"That is what our money system is," Eccles replied. "If there were mo debts in our money system, there wouldn't be any money.”

That explains why the federal debt never gets paid off but just continues to grow. The federal debt has not been paid off since the presidency of Andrew Jackson, nearly two centuries ago. Rather, in all but five fiscal years since 1961 (1969 and 1998 through 2001), the government has spent more than it collected, adding to the national debt. Economist John Kenneth Galbraith wrote in 1975:

In numerous years following the [civil] war, the Federal Government ran a heavy surplus. [But] it could not pay off its debt, retire its securities, because to do so meant there would be no bonds to back the national bank notes. To pay off the debt was to destroy the money supply.

The federal debt has been the basis of the U.S. money supply ever since the Civil War, when the National Banking Act authorized private banks to issue their own banknotes backed by government bonds deposited with the U.S. Treasury. (This complicated bit of chicanery is explored in Chapter 9.) When President Clinton announced "the largest budget surplus in history" in 2000, and President Bush predicted a $5.6 trillion surplus by the end of the decade, many people got the impression that the federal debt had been paid off; but this was another illusion. Not only did the $5.6 trillion budget "surplus" never materialize (it was merely an optimistic ten-year projection based on an anticipated surplus for the year 2001 that never came to pass), but the figure entirely ignored the principal owing on the federal debt. Like the deluded consumer who makes the minimum monthly interest payment on his credit card bill and calls his credit limit "cash on hand," politicians who speak of "balancing the budget" include in their calculations only the interest on the national debt. By 2000, when President Clinton announced the largest-ever budget surplus, the federal debt had actually topped $5 trillion; and by October 2005, when the largest-ever projected surplus had turned into the largest-ever budget deficit, the federal debt had mushroomed to $8 trillion. M3, the broadest measure of the money supply, stood at $9.7 trillion that same year: not much more than the debt itself. It is hardly an exaggeration to say that the money supply is the federal debt and cannot exist without it. Commercial loans alone cannot sustain the money supply, because they zero out when they get paid back. In order to keep money in the system, some major player has to incur substantial debt that never gets paid back; and this role is played by the federal government.

That is one reason the federal debt can't be paid off, but today there is an even more compelling reason: the debt has simply grown too large. To get some sense of the magnitude of an 8-plus trillion dollar debt, if you took 7 trillion steps you could walk to the planet Pluto, which is a mere 4 billion miles away. If the government were to pay $100 every second, in 317 years it would have paid off only one trillion dollars of debt. And that's just for the principal. If interest were added at the rate of only 1 percent compounded annually, the debt could never be paid off in this way, because the debt would grow faster than it was being repaid. Paying an $8-plus trillion debt off in a lump sum through taxation, on the other hand, would require increasing the tax bill by more than $100,000 for every family of four, a non-starter for most families.
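The payment arithmetic in the paragraph above can be verified in a few lines of Python (figures rounded as in the text):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600          # about 31.6 million seconds

# Paying $100 every second...
annual_payment = 100 * SECONDS_PER_YEAR        # about $3.16 billion per year
years_per_trillion = 1e12 / annual_payment     # about 317 years per trillion

# ...while even 1 percent annual interest on an $8 trillion debt
annual_interest = 0.01 * 8e12                  # is $80 billion per year,
# far more than the $3.16 billion being paid in, so the debt keeps growing.
```

The second calculation confirms the text's stronger claim: at any interest rate above roughly 0.04 percent, $100-per-second payments cannot even keep pace with the interest on $8 trillion.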

In the 1980s, policymakers openly declared that "deficits don't matter." The government could engage in "deficit spending" and simply allow the debt to grow. This policy continues to be cited with approval by policymakers today. The truth is that nobody even expects the debt to be paid off, because it can't be paid off - at least, not while money is created as a debt to private banks. The government doesn't have to pay the principal so long as it keeps "servicing" the debt by paying the interest. But according to David M. Walker, Director of the U.S. General Accounting Office and Comptroller General of the United States, just the interest tab will soon be more than the taxpayers can afford to pay. When the government can't pay the interest, it will have to renege on the debt, and the economy will collapse.

How did we get into this witches' cauldron, and how can we get out of it? The utopian vision of the early American colonists involved a money system that was quite different from what we have today. To understand what we lost and how we lost it, we'll take a journey back down the Yellow Brick Road to eighteenth century America.

Chapter 3. Experiments in Utopia: Colonial Paper Money as Legal Tender

Representation Without Taxation

The new paper money did more than make the colonies independent of the British bankers and their gold. It actually allowed the colonists to finance their local governments without taxing the people. Alvin Rabushka, a senior fellow at the Hoover Institution at Stanford University, traces this development in a 2002 article called "Representation Without Taxation." He writes that there were two main ways the colonies issued paper money. Most colonies used both, in varying proportions. One was a direct issue of notes, usually called "bills of credit" or "treasury notes." These were I.O.U.s of the government backed by specific future taxes; but the payback was deferred well into the future, and sometimes the funds never got returned to the treasury at all. As in a bathtub without a drain, the money supply kept increasing without a means of recycling it back to its source. However, the funds were at least not owed back to private foreign lenders, and no interest was due on them. They were just credits issued and spent into the economy on goods and services.

The recycling problem was solved when a second method of issue was devised. Colonial assemblies discovered that provincial loan offices could generate a steady stream of revenue in the form of interest by taking on the lending functions of banks. A government loan office called a "land bank" would issue paper money and lend it to residents (usually farmers) at low rates of interest….

King George Steps In

…The directors of the Bank of England asked [Benjamin Franklin] what was responsible for the booming economy of the young colonies.  Franklin replied:

That is simple.  In the colonies we issue our own money.  It is called Colonial Scrip.  We issue it to pay the government’s approved expenses and charities.  We make sure it is issued in proper proportion to make the goods pass easily from the producers to the consumers.... In this manner, creating for ourselves our own paper money, we control its purchasing power, and we have no interest to pay to no one. You see, a legitimate government can both spend and lend money into circulation, while banks can only lend significant amounts of their promissory bank notes, for they can neither give away nor spend but a tiny fraction of the money the people need. Thus, when your bankers here in England place money in circulation, there is always a debt principal to be returned and usury to be paid. The result is that you have always too little credit in circulation to give the workers full employment. You do not have too many workers, you have too little money in circulation, and that which circulates, all bears the endless burden of unpayable debt and usury.

Banks were limited to lending money into the economy; and since more money was always owed back in principal and interest (or “usury”) than was lent in the original loans, there was never enough money in circulation to pay the interest and still keep workers fully employed. The government, on the other hand, had two ways of getting money into the economy: it could both lend and spend the money into circulation. It could spend enough new money to cover the interest due on the money it lent, keeping the money supply in "proper proportion" and preventing the "impossible contract" problem — the problem of having more money owed back on loans than was created by the loans themselves.
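Franklin's "proper proportion" argument can be expressed as a small accounting sketch. This is a toy model of the claim as paraphrased in the text, not a historical source: a lender that can only lend leaves a system-wide shortfall equal to the interest, while an issuer that both lends the principal and spends the interest into circulation leaves none.

```python
def bank_only_shortfall(principal, rate):
    """A lender that can only lend: borrowers owe principal plus interest,
    but only the principal circulates, so the interest is missing."""
    owed = principal * (1 + rate)
    in_circulation = principal           # the interest was never created
    return owed - in_circulation         # shortfall = principal * rate

def lend_and_spend_shortfall(principal, rate):
    """A government that lends the principal AND spends the interest into
    circulation: what circulates matches what is owed, so no shortfall."""
    owed = principal * (1 + rate)
    in_circulation = principal + principal * rate
    return owed - in_circulation         # zero
```

For example, on a 1,000-unit issue at 5 percent, the bank-only model leaves 50 units unpayable somewhere in the system, while the lend-and-spend model leaves nothing missing; that difference is the whole of Franklin's point as the text presents it.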

After extolling the benefits of colonial scrip to the citizens of Pennsylvania, Franklin told his listeners, "New York and New Jersey have also increased greatly during the same period, with the use of paper money; so that it does not appear to be of the ruinous nature ascribed to it." Jason Goodwin observes that it was a tricky argument to make. The colonists had been stressing to the mother country how poor they were — so poor, they were forced to print paper money for lack of precious metals. Franklin's report demonstrated to Parliament and the British bankers that the pretext for allowing paper money had been removed. The point of having colonies was not, after all, to bolster the colonies' economies. It was to provide raw materials at decent rates to the mother country. In 1764, the Bank of England used its influence on Parliament to get a Currency Act passed that made it illegal for any of the colonies to print their own money. The colonists were forced to pay all future taxes to Britain in silver or gold. Anyone lacking in those precious metals had to borrow them at interest from the banks.

Chapter 4.  How the Government Was Persuaded to Borrow Its Own Money

President John Adams is quoted as saying, "There are two ways to conquer and enslave a nation. One is by the sword. The other is by debt." Sheldon Emry, expanding on this concept two centuries later, observed that conquest by the sword has the disadvantage that the conquered are likely to rebel. Continual force is required to keep them at bay. Conquest by debt can occur so silently and insidiously that the conquered don’t even realize they have new masters. On the surface, nothing has changed. The country is merely under new management. "Tribute" is collected in the form of debts and taxes, which the people believe they are paying for their own good. "Their captors," wrote Emry, "become their 'benefactors' and 'protectors.'... Without realizing it, they are conquered, and the instruments of their own society are used to transfer their wealth to their captors and make the conquest complete."

Colonies in the seventeenth and eighteenth centuries all had the same purpose - to enhance the economy of the mother country. That was how the mother country saw it, but the American colonists had long opposed any plan that would systematically drain their money supply off to England. The British had considered the idea of a land bank as far back as 1754, as a way to provide a circulating medium of exchange for the colonies; but the idea was rejected by the colonists when they learned that the interest the bank generated would be subject to appropriation by the King. It was only after the American Revolution that British bankers and their Wall Street vassals succeeded in pulling this feat off by stealth, by acquiring a controlling interest in the stock of the new United States Bank.

The first step in that silent conquest was to discredit the paper scrip issued by the revolutionary government and the States. By the end of the Revolution, that step had been achieved. Rampant counterfeiting and speculation had so thoroughly collapsed the value of the Continental that the new country's leaders were completely disillusioned with what they called "unfunded paper." At the Constitutional Convention, Alexander Hamilton, soon to become Washington's first Secretary of the Treasury, summed up the majority view when he said:

To emit an unfunded paper as the sign of value ought not to continue a formal part of the Constitution, nor ever hereafter to be employed; being, in its nature, repugnant with abuses and liable to be made the engine of imposition and fraud.

The Founding Fathers were so disillusioned with paper money that they simply omitted it from the Constitution. Congress was given the power only to "coin money, regulate the value thereof," and "to borrow money on the credit of the United States…." An enormous loophole was thus left in the law. Creating and issuing money had long been considered the prerogative of governments, but the Constitution failed to define exactly what "money" was. Was "to coin money" an eighteenth-century way of saying "to create money"? Did this include creating paper money? If not, who did have the power to create paper money? Congress was authorized to "borrow" money, but did that include borrowing paper money or just gold? The presumption was that the paper notes borrowed from the bankers were "secured" by a sum of silver or gold; but in the illusory world of finance, then as now, things were not always as they seemed....

The Bankers' Paper Money Comes in Through the Back Door

While the Founding Fathers were pledging their faith in gold and silver as the only "sound" money, those metals were quickly proving inadequate to fund the new country's expanding economy. The national war debt had reached $42 million, with no silver or gold coins available to pay it off. The debt might have been avoided if the government had funded the war with Continental scrip that was stamped "legal tender," making it "money" in itself; but the revolutionary government and the States had issued much of their paper money as promissory notes payable after the war. The notes represented debt, and the debt had now come due. The bearers expected to get their gold, and the gold was not to be had. There was also an insufficient supply of money for conducting trade. Tightening the money supply by limiting it to coins had quickly precipitated another depression. In 1786, a farmers' rebellion broke out in Massachusetts, led by Daniel Shays. Farmers brandishing pitchforks complained of going heavily into debt when paper money was plentiful. When it was no longer available and debts had to be repaid in the much scarcer "hard" coin of the British bankers, some farmers lost their farms. The rebellion was defused, but visions of anarchy solidified the sense of an urgent need for both a strong central government and an expandable money supply.

The solution of Treasury Secretary Hamilton was to "monetize" the national debt, by turning it into a source of money for the country. (To monetize means to convert government debt from securities evidencing debt (bills, bonds and notes) into currency that can be used to purchase goods and services.) He proposed that a national bank be authorized to print up banknotes and swap them for the government's bonds. The government would pay regular interest on the debt, using import duties and money from the sale of public land. Opponents said that acknowledging the government's debt at face value would unfairly reward the speculators who had bought up the country's I.O.U.s for a pittance from the soldiers, farmers and small businessmen who had actually earned them; but Hamilton argued that the speculators had earned this windfall for their "faith in the country." He thought the government needed to enlist the support of the speculators, or they would do to the new country's money what they had done to the Continental. Vernon Parrington, a historian writing in the 1920s, said:

In developing his policies as Secretary of the Treasury, [Hamilton] applied his favorite principle, that government and property must join in a close working alliance. It was notorious that during the Revolution men of wealth had forced down the continental currency for speculative purposes; was it not as certain that they would support an issue in which they were interested? The private resources of wealthy citizens would thus become an asset of government, for the bank would link "the interest of the State in an intimate connection with those of the rich individuals belonging to it."

Hamilton thought that the way to keep wealthy speculators from destroying the new national bank was to give them a financial stake in it. His proposal would do this and dispose of the government's crippling debts at the same time, by allowing creditors to trade their government bonds or I.O.U.s for stock in the new bank.

Jefferson, Hamilton’s chief political opponent, feared that giving private wealthy citizens an ownership interest in the bank would link their interests too closely with it. The government would be turned into an oligarchy, a government by the rich at war with the working classes. A bank owned by private stockholders, whose driving motive was profit, would be less likely to be responsive to the needs of the public than one that was owned by the public and subject to public oversight. Stockholders of a private bank would make their financial decisions behind closed doors, without public knowledge or control.

But Hamilton's plan had other strategic advantages, and it won the day. Besides neatly disposing of a crippling federal debt and winning over the "men of wealth," it secured the loyalty of the individual States by making their debts, too, exchangeable for stock in the new Bank. The move was controversial; but by stabilizing the States' shaky finances, Hamilton got the States on board, thwarting the plans of the pro-British faction that hoped to split them up and establish a Northern Confederacy.

Promoting the General Welfare: The American System Versus the British System

Hamilton’s goal was first and foremost a strong federal government. He was the chief author of The Federalist Papers, which helped to get the votes necessary to ratify the Constitution and formed the basis for much of it. The Preamble to the Constitution made promoting the general welfare a guiding principle of the new Republic. Hamilton's plan for achieving this ideal was to nurture the country's fledgling industries with protective measures such as tariffs (taxes placed on imports or exports) and easy credit provided through a national bank. Production and the money to finance it would all be kept "in house," independent of foreign financiers.

Senator Henry Clay later called this the "American system" to distinguish it from the "British system" of "free trade." (The term "free trade" is used to mean trade between nations unrestricted by such things as import duties and trade quotas. Critics say that in more developed nations, it results in jobs being "exported" abroad, while in less developed nations, workers and the environment are exploited by foreign financiers.) Clay was a student of Matthew Carey, a well-known printer and publisher who had been tutored by Benjamin Franklin. What Clay called the "British system" was rooted in the dog-eat-dog world of Thomas Hobbes, John Locke and Scottish economist Adam Smith. Smith maintained in his 1776 book The Wealth of Nations that if every man pursued his own greed, all would automatically come out right, as if by some "invisible hand." Proponents of the American system rejected this laissez-faire approach in favor of guiding and protecting the young country with a system of rules and regulations. They felt that if the economy were left to the free market, big monopolies would gobble up small entrepreneurs; foreign bankers and industrialists could exploit the country's labor and materials; and competition would force prices down, ensuring subjugation to British imperial interests.

The British model assumed that one man's gain could occur only through another's loss. The goal was to reach the top of the heap by climbing on competitors and driving them down. In the American vision of the "Common Wealth," all men would rise together by leavening the whole heap at once. A Republic of sovereign States would work together for their mutual benefit, improving their collective lot by promoting production, science, industry and trade, raising the standard of living and the technological practice of all by cooperative effort. It was an idealistic reflection of the American dream, which assumed the best in people and in human potential. You did not need to exploit foreign lands and people in pursuit of "free trade." Like Dorothy in The Wizard of Oz, you could find your heart's desire in your own backyard.

That was the vision, but in the sort of negotiated compromise that has long characterized politics, it got lost somewhere in the details.

Hamilton Charters a Bank

Hamilton argued that to promote the General Welfare, the country needed a monetary system that was independent of foreign masters; and for that, it needed its own federal central bank. The bank would handle the government's enormous war debt and create a standard form of currency. Jefferson remained suspicious of Hamilton and his schemes, but Jefferson also felt strongly that the new country's capital city should be in the South, in his home state of Virginia. Hamilton (who did not care where the capital was) agreed on the location of the national capital in exchange for Jefferson’s agreement on the bank.

When Hamilton called for a tax on whiskey to pay the interest on the government's securities, however, he went too far. Jefferson's supporters were furious. In the type of political compromise still popular today, President Washington proposed moving the capital even closer to Mt. Vernon. Congress passed Hamilton's bill; but the President still had to sign it. Washington was concerned about the continued opposition of Jefferson and the Virginians, who thought the bill was unconstitutional. The public would have to use the bank, but the bank would not have to serve the public. Hamilton assured the President that to protect the public, the bank would be required to retain a percentage of gold in "reserve" so that it could redeem its paper notes in gold or silver on demand. Hamilton was eloquent; and in 1791, Washington signed the bill into law.

The new banking scheme was hailed as a brilliant solution to the nation’s economic straits, one that disposed of an oppressive national debt, stabilized the economy, funded the government's budget, and created confidence in the new paper dollars. If the new Congress had simply printed its own paper money, speculators would have challenged the currency's worth and driven down its value, just as they had during the Revolution. To maintain public confidence in the national currency and establish its stability, the new Republic needed the illusion that its dollars were backed by the bankers' gold, and Hamilton’s bank successfully met that challenge. It got the country up and running, but it left the bank largely in private hands, where it could still be manipulated for private greed. Worse, the government ended up in debt for money it could have generated itself, indeed should have generated itself under the Constitution.

How the Government Wound Up Borrowing Its Own Bonds

The charter for the new bank fixed its total initial capitalization at ten million dollars. Eight million were to come from private stockholders and two million from the government. But the government did not actually have two million dollars, so the bank (now a chartered lending institution) lent the government the money at interest.  The bank, of course, did not have the money either. The whole thing was sleight of hand.

The rest of the bank's shares were sold to the public, who bought some in hard cash and some in government securities (the I.O.U.s that had been issued by the revolutionary government and the States). The government had to pay six percent interest annually on all the securities now held by the bank - those exchanged for the "loan" of the government's own money, plus the bonds accepted by the bank from the public. The bank's shareholders were supposed to pay one-fourth the cost of their shares in gold; but only the first installment was actually paid in hard money, totaling $675,000. The rest was paid in paper banknotes. Some came from the Bank of Boston and the Bank of New York; but most of this paper money was issued by the new U.S. Bank itself and lent back to its new shareholders, through the magic of "fractional reserve" lending.

Within five years, the government had borrowed $8.2 million from the bank. The additional money was obviously created out of thin air, just as it would have been if the government had printed the money itself; but the government now owed principal and interest back to the bank. To reduce its debt to the bank, the government was eventually forced to sell its shares, largely to British financiers. Zarlenga reports that Hamilton, to his credit, opposed these sales. But the sales went through, and the first Bank of the United States wound up largely under foreign ownership and control.

When Political Duels Were Deadly

He remains a controversial figure, but Hamilton earned his place in history. He succeeded in stabilizing the shaky new economy and getting the country on its feet, and his notions of "monetizing" debt and "federalizing" the banking system were major innovations. He restored the country's credit, gave it a national currency, made it economically independent, and incorporated strong federal provisions into the Constitution that would protect and nurture the young country according to a uniquely American system founded on "promoting the General Welfare."

Those were his positive contributions, but Hamilton also left a darker legacy. Lurking behind the curtain in his new national bank, a privileged class of financial middlemen were now legally entitled to siphon off a perpetual tribute in the form of interest; and because they controlled the money spigots, they could fund their own affiliated businesses with easy credit, squeezing out competitors and perpetuating the same class divisions that the "American system" was supposed to have circumvented. The money power had been delivered into private hands; and they were largely foreign hands, the same interests that had sought to keep America in a colonial state, subservient to an elite class of oligarchical financiers.

Who were these foreign financiers, and how had they acquired so much leverage? The Yellow Brick Road takes us farther back in history, back to when the concept of "usury" was first devised....

Chapter 5. From Matriarchies of Abundance to Patriarchies of Debt

When Money Could Grow

It was only after the Indo-European invasions of the second millennium B.C. that moneylending became the private enterprise of the infamous moneychangers. The Goddess Inanna was superseded as the source of supreme kingship by the male god Enlil of Nippur, and the matriarchal system of shared communal abundance was forcibly displaced by a militant patriarchal system. The cornucopia of the Horned Goddess became the bull horns of the Thunder God, representing masculine power, virility and force.

In the temple system, the community extended credit and received the money back with interest. In the system that displaced it, interest on debts went into private vaults to build the private fortunes of the moneychangers. Interest was thus transformed from a source of income for the community into a tool for impoverishing and enslaving people and nations. Unlike corn and cows, the gold the moneylenders lent was inorganic. It did not "grow," so there was never enough to cover the additional interest charges added to loans. When there was insufficient money in circulation to cover operating expenses, farmers had to borrow until harvest time; and the odd man out in the musical chairs of finding eleven coins to repay ten wound up in debtor's prison. Historically, most slavery originated from debt.

The Proscription Against Usury

"Usury" is now defined as charging "excess" interest, but originally it meant merely charging a fee or interest for the use of money. Usury was forbidden in the Christian Bible, and anti-usury laws were strictly enforced by the Catholic Church until the end of the Middle Ages. But in Jewish scriptures, which were later joined to the Christian books as the "Old Testament," usury was forbidden only between brothers. Charging interest to foreigners was allowed and even encouraged. The "moneychangers" thus came to be associated with Jews, but they were not actually the Jewish people. In fact, the Jewish people may have suffered more than any other people from the moneychangers' schemes, which were responsible for much antisemitism.

(See Deuteronomy (New World Translation) 15:6: "[Y]ou will certainly lend on pledge to many nations, whereas you yourself will not borrow; and you must dominate over many nations, whereas over you they will not dominate." 23:19: "You must not make your brother pay interest...." 23:20: "You may make a foreigner pay interest, but your brother you must not make pay interest.")

In the informative documentary video The Money Masters, Bill Still and Patrick Carmack point out that when Jesus threw the moneychangers out of the temple, it was actually to protect the Jewish people. Half-shekels, the only pure silver coins of assured weight without the image of a pagan Emperor on them, were the only coins considered acceptable for paying the Temple tax, a tribute to God. But half-shekels were scarce, and the moneychangers had cornered the market for them. Like the modern banking cartel, they had monopolized the medium of exchange and were exacting a charge for its use.

Despite the injunctions in the New Testament, there were times when the king needed money. In the Middle Ages, England was short of gold, which had left during the Crusades. In 1087, when King William (Rufus) needed gold to do business with the French, he therefore admitted the moneylenders, on condition that the interest be demanded in gold and that half be paid to the king. But the moneylenders eventually became so wealthy at the expense of the people that the Church, with urgings from the Pope, prohibited them from taking interest; and in 1290, when they had lost their usefulness to the king, most Jews were again expelled from the country. This pattern, in which Jews as a people have been persecuted for the profiteering of a few and have been used as scapegoats to divert attention from the activities of the rulers, has been repeated over the centuries.

Money as a Simple Tally of Accounts

Meanwhile, England was faced with the problem of what to use for money when the country was short of gold. The coinage system was commodity-based. It assumed that "money" was something having value in itself (gold or silver), which was bartered or traded for goods or services of equal value. But according to Stephen Zarlenga, who has traced the origins and history of money in his revealing compendium The Lost Science of Money, the use of coins as money did not originate with merchants trading in the marketplace. The first known coins were issued by governments; and their value was the value stamped on them, not the price at which the metal traded. Zarlenga quotes Aristotle, who said:

Money exists not by nature but by law. [It acts] as a measure [that] makes goods commensurate and equates them.... There must then be a unit, and that fixed by agreement.

Money was a mere fiat of the law. Fiat means "let it be done" in Latin. "Fiat money" is money that is legal tender by government decree. It is simply a "tally," something representing units of value that can be traded in the market, a receipt for goods or services that can legally be tendered for other goods or services. In Mandarin China, where paper money was invented in the ninth century, this sort of fiat currency funded a long and prosperous empire. Fiat money was also used successfully in medieval England, but in England it was made of wood.

The English tally system originated with King Henry I, son of William the Conqueror, who took the throne in 1100 A.D. The printing press had not yet been invented, and taxes were paid directly with goods produced by the land. Under King Henry's innovative system, payment was recorded with a piece of wood that had been notched and split in half. One half was kept by the government and the other by the recipient. To confirm payment, the two halves were matched to make sure they "tallied." Since no stick splits evenly, and since the notches tallying the sums were cut right through both pieces of wood, the method was virtually foolproof against forgery. The tally system has been called the earliest form of bookkeeping, according to historian M. T. Clanchy in From Memory to Written Record, England 1066-1307:

Tallies were ... a sophisticated and practical record of numbers. They were more convenient to keep and store than parchments, less complex to make, and no easier to forge.

Only a few hundred tallies survive, Clanchy writes, but millions were made. Tallies were used by the government not only as receipts for the payment of taxes but to pay soldiers for their service, farmers for their wheat, and laborers for their labor. At tax time, the treasurer accepted the tallies in payment of taxes. By the thirteenth century, the financial market for tallies was sufficiently sophisticated that they could be bought, sold, or discounted. Tallies were used by individuals and institutions to register debts, record fines, collect rents, and enter payments for services rendered. In the 1500s, King Henry VIII gave them the force of a national currency when he ordered that tallies must be used to evidence the payment of taxes. That meant everyone had to have them. In War Cycles, Peace Cycles, Richard Hoskins writes that by the end of the seventeenth century, about 14 million pounds’ worth of tally-money was in circulation. Zarlenga cites a historian named Spufford, who said that English coinage had never exceeded half a million pounds up to that time. The tally system was thus not a minor monetary experiment, as some commentators have suggested. During most of the Middle Ages, tallies may have made up the bulk of the English money supply. The tally system was in use for more than five centuries before the usury bankers' gold-based paper banknotes took root, helping to fund a long era of leisure and abundance that flowered into the Renaissance.

A Revisionist View of the Middle Ages

Modern schoolbooks generally portray the Middle Ages as a time of poverty, backwardness, and economic slavery, from which the people were freed only by the Industrial Revolution; but reliable early historians painted a quite different picture. Thorold Rogers, a nineteenth century Oxford historian, wrote that in the Middle Ages, "a labourer could provide all the necessities for his family for a year by working 14 weeks." Fourteen weeks is only a quarter of a year! The rest of the time, some men worked for themselves; some studied; some fished. Some helped to build the cathedrals that appeared all over Germany, France and England during the period, massive works of art that were built mainly with volunteer labor. Some used their leisure to visit these shrines. One hundred thousand pilgrims had the wealth and leisure to visit Canterbury and other shrines yearly. William Cobbett, author of the definitive History of the Reformation, wrote that Winchester Cathedral "was made when there were no poor rates; when every labouring man in England was clothed in good woollen cloth; and when all had plenty of meat and bread…." Money was available for inventions and art, supporting the Michelangelos, Rembrandts, Shakespeares, and Newtons of the period.

Chapter 6. Pulling the Strings of the King: The Moneylenders Take England

The image of puppet and puppeteer has long been a popular metaphor for describing the Money Power pulling the strings of government. Benjamin Disraeli, British Prime Minister from 1868 to 1880, said, "The world is governed by very different personages from what is imagined by those who are not behind the scenes." Nathan Rothschild, who controlled the Bank of England after 1820, notoriously declared:

I care not what puppet is placed upon the throne of England to rule the Empire on which the sun never sets. The man who controls Britain's money supply controls the British Empire, and I control the British money supply.

In the documentary video The Money Masters, narrator Bill Still uses the puppet metaphor to describe the transfer of power from the royal line of English Stuarts to the German royal House of Hanover in the eighteenth century:

England was to trade masters: an unpopular King James II for a hidden cabal of Money Changers pulling the strings of their usurper, King William III, from behind the scenes.  This symbiotic relationship between the Money Changers and the highest British aristocracy continues to this day.  The monarch has no real power but serves as a useful shield for the Money Changers who rule the City…. In its 20 June 1934 issue, New Britain magazine of London cited a devastating assertion by former British Prime Minister, David Lloyd George, that “Britain is the slave of an international financial bloc.”

A Dutch-bred King Charters the Bank of England on Behalf of Foreign Moneylenders

The man who would become King William III began his career as a Dutch aristocrat. He was elevated to Captain General of the Dutch Forces and then to Prince William of Orange with the backing of Dutch moneylenders. His marriage was arranged to Princess Mary of York, eldest daughter of the English Duke of York, and they were married in 1677. The Duke, who was next in line to the English throne, became King James II in 1685 but was deposed in the Glorious Revolution of 1688; William and Mary became King and Queen of England in 1689.

William was soon at war with Louis XIV of France. To finance his war, he borrowed 1.2 million pounds in gold from a group of moneylenders, whose names were to be kept secret. The money was raised by a novel device that is still used by governments today: the lenders would issue a permanent loan on which interest would be paid but the principal portion of the loan would not be repaid. The loan also came with other strings attached. They included:

(1) The lenders were to be granted a charter to establish a Bank of England, which would issue banknotes that would circulate as the national paper currency.

(2) The Bank would create banknotes out of nothing, with only a fraction of them backed by coin. Banknotes created and lent to the government would be backed mainly by government I.O.U.s, which would serve as the "reserves" for creating additional loans to private parties.

(3) Interest of 8 percent would be paid by the government on its loans, marking the birth of the national debt.

(4) The lenders would be allowed to secure payment on the national debt by direct taxation of the people. Taxes were immediately imposed on a whole range of goods to pay the interest owed to the Bank.
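Terms (1)-(4) describe what finance now calls a perpetuity: the principal is never retired, so interest is owed every year, forever. A quick sketch (the arithmetic is mine; only the 1.2 million pound principal and the 8 percent rate come from the text):

```python
# Perpetual-loan arithmetic for the 1694 charter terms described above.
principal = 1_200_000      # pounds borrowed by the Crown in 1694
rate_percent = 8           # the 8 percent named in term (3)

# Integer arithmetic to keep the pound figure exact.
annual_interest = principal * rate_percent // 100
print(annual_interest)     # -> 96000 pounds owed every year, indefinitely
```

Because the principal is never repaid, the lenders' income stream has no end date, which is why the arrangement is described as "the birth of the national debt."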

The Bank of England has been called "the Mother of Central Banks." It was chartered in 1694 to William Paterson, a Scotsman who had previously lived in Amsterdam. A circular distributed to attract subscribers to the Bank's initial stock offering said, "The Bank hath benefit of interest on all moneys which it, the Bank, creates out of nothing." The negotiation of additional loans caused England's national debt to go from 1.2 million pounds in 1694 to 16 million pounds in 1698. By 1815, the debt was up to 885 million pounds, largely due to the compounding of interest. The lenders not only reaped huge profits, but the indebtedness gave them substantial political leverage.
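The jump from 16 million pounds in 1698 to 885 million by 1815 requires no dramatic interest rates; a back-of-the-envelope check (my own arithmetic, using only the figures quoted above) shows that a modest constant rate, compounded over the period, is enough:

```python
# What constant annual rate compounds 16 million pounds (1698)
# into 885 million pounds (1815)?
years = 1815 - 1698                          # 117 years
implied_rate = (885 / 16) ** (1 / years) - 1
print(f"{implied_rate:.1%}")                 # roughly 3.5% per year
```

A rate in the neighborhood of the 8 percent actually charged would have compounded far faster, consistent with the text's note that new borrowing, not interest alone, drove part of the growth.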

The Bank's charter gave the force of law to the "fractional reserve" banking scheme that put control of the country's money in a privately owned company. The Bank of England had the legal right to create paper money out of nothing and lend it to the government at interest. It did this by trading its own paper notes for paper bonds representing the government's promise to pay principal and interest back to the bank -- the same device used by the U.S. Federal Reserve and other central banks today.

John Law Proposes a National Paper Money Supply

Popular acceptance of the bankers' privately-issued money scheme is credited to the son of a Scottish goldsmith named John Law, who has been called "the father of finance." In 1705, Law published a series of pamphlets on trade, money and banking, in which he claimed to have found the true "Philosopher's Stone," referring to a mythical device used by medieval alchemists to turn base material into gold. Paper could be converted into gold, Law said, through the alchemy of paper money. He proposed the creation of a national paper money supply consisting of banknotes redeemable in "specie" (hard currency in the form of gold or silver coins), which would be officially recognized as money. Paper money could be expanded indefinitely and was much cheaper to make than coins. To get public confidence, Law suggested that a certain fraction of gold should be kept on hand for the few people who actually wanted to redeem their notes. The goldsmiths had already established through trial and error that specie could support about ten times its value in paper notes. Thus a bank holding $10 in gold could safely print and lend about $100 in paper money. This was the "secret" that the Chicago Federal Reserve said was discovered by the goldsmiths: a bank could lend about ten times as much money as it actually had, because a trusting public, assuming their money was safely in the bank, would not come to collect more than about 10 percent of it at any one time.
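The goldsmiths' ten-to-one rule of thumb can be stated in a few lines (a minimal sketch; the function name and dollar figures are illustrative, not from the text):

```python
# Fractional-reserve rule of thumb described above: notes in circulation
# are limited only by the fraction of holders expected to redeem at once.
def max_notes(specie, reserve_ratio=0.10):
    """Paper notes a bank can keep circulating if only `reserve_ratio`
    of note-holders ever come for their gold at the same time."""
    return specie / reserve_ratio

# $10 in gold "safely" supports about $100 in circulating paper.
print(max_notes(10))  # -> 100.0
```

The arithmetic also exposes the system's fragility: if redemptions ever exceed the assumed 10 percent, the bank cannot pay, which is the classic bank run.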

This scheme became the basis of the banking system known as "central banking," which remains in use today. A private central bank is chartered as the nation's primary bank and lends to the national government. It lends the central bank's own notes (printed paper money), which the government swaps for bonds (its promises to pay) and circulates as a national currency. The government's debt is never paid off but is just rolled over from year to year, becoming the basis of the national money supply.

Until the twentieth century, banks followed the model of the goldsmiths and literally printed their own supply of notes against their own gold reserves. These were then multiplied many times over on the "fractional reserve" system. The bank's own name was printed on the notes, which were lent to the public and the government. Today, federal governments have taken over the printing; but in most countries the notes are still drawn on private central banks. In the United States, they are printed by the U.S. Bureau of Engraving and Printing at the request of the Federal Reserve, which "buys" them for the cost of printing them and calls them "Federal Reserve Notes." Today, however, there is no gold on "reserve" for which the notes can be redeemed. Like the illusory ghosts in the Haunted House at Disneyland, the dollar is the fractal of a hologram, the reflection of a debt for something that does not exist.

The Tallies Leave Their Mark

Although the tallies were wiped off the books and fell down the memory hole, they left their mark on the modern financial system. The word "stock," meaning a financial certificate, comes from the Middle English for the tally stick. Much of the stock in the Bank of England was originally purchased with tally sticks. The holder of the stock was said to be the "stockholder," who owned "bank stock." One of the original stockholders purchased his shares with a stick representing £25,000, an enormous sum at the time. A substantial share of what would become the world's richest and most powerful corporation was thus bought with a stick of wood! According to legend, the location of Wall Street, the New York financial district, was chosen because of the presence of a chestnut tree enormous enough to supply tally sticks for the emerging American stock market.

Stock issuance was developed during the Middle Ages, as a way of financing businesses when usury and interest-bearing loans were forbidden. In medieval Europe, banks run by municipal or local governments helped finance ventures by issuing shares of stock in them. These municipal banks were large, powerful, efficient operations that fought the moneylenders' private usury banks tooth and nail. The usury banks prevailed in Europe only when the revolutionary government of France was forced to borrow from the international bankers to finance the French Revolution (1789-1799), putting the government heavily in their debt.

In the United States, the usury banks fought for control for two centuries before the Federal Reserve Act established the banks' private monopoly in 1913. Today, the U.S. banking system is not a topic of much debate; but in the nineteenth century, the fight for and against the Bank of the United States defined American politics. And that brings us back to Jefferson and his suspicions of foreign meddling….

Chapter 7. While Congress Dozes in the Poppy Fields: Jefferson and Jackson Sound the Alarm

…Jefferson is quoted as saying:

If the American people ever allow the banks to control the issuance of their currency, first by inflation and then by deflation, the banks and corporations that will grow up around them will deprive the people of all property, until their children will wake up homeless on the continent their fathers occupied.

A similar wakeup call is attributed to Jackson, who told Congress in 1829:

If the American people only understood the rank injustice of our money and banking system, there would be a revolution before morning.

Jefferson was instrumental in Congress's refusal to renew the charter of the first U.S. Bank in 1811. When the Bank was liquidated, Jefferson's suspicions were confirmed: 18,000 of the Bank's 25,000 shares were owned by foreigners, mostly English and Dutch. The foreign domination the Revolution had been fought to eliminate had crept back in through the country's private banking system. Congressman Desha of Kentucky, speaking in the House of Representatives, declared that "this accumulation of foreign capital was one of the engines for overturning civil liberty," and that he had "no doubt King George III was a principal stockholder."

When Congress later renewed the Bank's charter, Andrew Jackson vetoed it. He too expressed concern that a major portion of the Bank's shareholders were foreigners. He said in his veto bill:

Is there no danger to our liberty and independence in a bank that in its nature has so little to bind it to our country? ... Of the course which would be pursued by a bank almost wholly owned by the subjects of a foreign power, ...there can be no doubt.... Controlling our currency, receiving our public monies, and holding thousands of our citizens in dependence, it would be more formidable and dangerous than a naval and military power of the enemy.

Who were these "subjects of a foreign power" who owned the bank? In The History of the Great American Fortunes, published in 1936, Gustavus Myers pointed to the formidable British banking dynasty the House of Rothschild. Myers wrote:

Under the surface, the Rothschilds long had a powerful influence in dictating American financial laws. The law records show that they were the power in the old Bank of the United States.

Jefferson Realizes Too Late the Need for a National Paper Currency Issued by the Government

Jefferson was out of town when the Constitution was drafted, serving as America's minister to France during the dramatic period leading up to the French Revolution. But even if he had been there, he would probably have gone along with the majority and voted to omit paper money from the Constitution. After watching the national debt mushroom, he wrote to John Taylor in 1798, "I wish it were possible to obtain a single amendment to our constitution ... taking from the federal government the power to borrow money. I now deny their power of making paper money or anything else a legal tender."

It would be several decades before Jefferson realized that the villain was not paper money itself. It was private debt masquerading as paper money, a private debt owed to bankers who were merely "pretending to have money." Jefferson wrote to Treasury Secretary Gallatin in 1815:

The treasury, lacking confidence in the country, delivered itself bound hand and foot to bold and bankrupt adventurers and bankers pretending to have money, whom it could have crushed at any moment.

Jefferson wrote to John Eppes in 1813, "Although we have so foolishly allowed the field of circulating medium to be filched from us by private individuals, I think we may recover it.... The states should be asked to transfer the right of issuing paper money to Congress, in perpetuity." He told Eppes, "the nation may continue to issue its bills [paper notes] as far as its needs require and the limits of circulation allow. Those limits are understood at present to be 200 millions of dollars."

Writing to Gallatin in 1803, Jefferson said of the private national bank, "This institution is one of the most deadly hostility against the principles of our Constitution.... [S]uppose a series of emergencies should occur.... [A]n institution like this ... in a critical moment might overthrow the government." He asked, "Could we start toward independently using our own money to form our own bank?"

The Constitution gave Congress the power only to "coin money," but Jefferson argued that Constitutions could be amended….

Jackson Battles the Hydra-headed Monster

Jackson believed in a strong Presidency and a strong union. He stood up to the bankers on the matter of the bank, which he viewed as operating mainly for the upper classes at the expense of working people. He warned in 1829:

The bold efforts the present bank has made to control the government are but premonitions of the fate that awaits the American people should they be deluded into a perpetuation of this institution or the establishment of another like it.

Whether Congress itself had the right to issue paper money, Jackson said, was not clear; but "If Congress has the right under the Constitution to issue paper money, it was given them to be used by themselves, not to be delegated to individuals or to corporations." His grim premonitions about the Bank appeared to be confirmed when mismanagement under its first president led to financial disaster, depression, bankruptcies, and unemployment. But the Bank began to flourish under its second president, Nicholas Biddle, who petitioned Congress for a renewal of its charter in 1832. Jackson, who was then up for re-election, expressed his views on this bid in no uncertain terms. "You are a den of vipers and thieves," he railed at a delegation of bankers discussing the Bank Renewal Bill. "I intend to rout you out, and by the eternal God, I will rout you out." He called the bank "a hydra-headed monster eating the flesh of the common man." He swore to do battle with the monster and to slay it or be slain by it.

In the 1832 election, Jackson ran on the Democratic Party ticket against Henry Clay, whose party was now called the National Republican Party. Its members considered themselves "nationalists" because they saw the country as a nation rather than a loose confederation of States, and because they promoted strong nation-building measures such as the construction of inter-state roads. Clay advocated a strongly protectionist platform that kept productivity and financing within the country, allowing it to grow up "in its own backyard," free from economic attack from abroad. It was Clay who first called this approach the "American system" to distinguish it from the "British system" of "free trade." The British system was supported by Jackson and opposed by Clay, who thought it would open the country to exploitation by foreign financiers and industrialists. To prevent that, Clay advocated a tariff favoring domestic industry, congressionally-financed national improvements, and a national bank.

More than three million dollars were poured into Clay's campaign, then a huge sum; but Jackson again won by a landslide.

Chapter 8. Scarecrow with a Brain: Lincoln Foils the Bankers

Both Jackson and Lincoln were targets of assassination attempts, but for Lincoln they started before he was even inaugurated. He had to deal with treason, insurrection, and national bankruptcy within the first days of taking office. Considering the powerful forces arrayed against him, his achievements in the next four years were nothing short of phenomenal. His government built and equipped the largest army in the world, smashed the British-financed insurrection, abolished slavery, and freed four million slaves. Along the way, the country managed to become the greatest industrial giant the world had ever seen. The steel industry was launched, a continental railroad system was created, the Department of Agriculture was established, a new era of farm machinery and cheap tools was promoted, a system of free higher education was established through the Land Grant College System, land development was encouraged by passage of a Homestead Act granting ownership privileges to settlers, major government support was provided to all branches of science, the Bureau of Mines was organized, governments in the Western territories were established, the judicial system was reorganized, labor productivity increased by 50 to 75 percent, and standardization and mass production was promoted worldwide.

How was all this accomplished, with a Treasury that was completely broke and a Congress whose members hadn’t themselves been paid? As Benjamin Franklin might have said, "That is simple." Lincoln tapped into the same source of funding that had gotten the impoverished colonists through the American Revolution and a long period of internal development before that: he authorized the government to issue its own paper fiat money. National control was reestablished over banking, and the economy was jump-started with a 600 percent increase in government spending and cheap credit directed at production. A century later, Franklin Roosevelt would use the same techniques to pull the country through the Great Depression; but Roosevelt's New Deal would be financed with borrowed money. Lincoln's government used a system of payment that was closer to the medieval tally. Officially called United States Notes, these nineteenth century tallies were popularly called "Greenbacks" because they were printed on the back with green ink (a feature the dollar retains today). They were basically just receipts acknowledging work done or goods delivered, which could be traded in the community for an equivalent value of goods or services. The Greenbacks represented man-hours rather than borrowed gold. Lincoln is quoted as saying, "The wages of men should be recognized as more important than the wages of money." Over 400 million Greenback dollars were printed and used to pay soldiers and government employees, and to buy supplies for the war.

The Greenback system was not actually Lincoln's idea, but when pressure grew in Congress for the plan, he was quick to endorse it. The South had seceded from the Union soon after his election in 1860. To fund the War Between the States, the Eastern banks had offered a loan package that was little short of extortion - $150 million advanced at interest rates of 24 to 36 percent. Lincoln knew the loan would be impossible to pay off. He took the revolutionary approach because he had no other real choice. The government could either print its own money or succumb to debt slavery to the bankers.

The Wizard Behind Lincoln's Curtain

Lincoln's economic advisor was Henry Carey, the son of Matthew Carey, the printer and publisher mentioned earlier who was tutored by Benjamin Franklin and tutored Henry Clay. Clay was the leader of the Philadelphia-based political faction propounding the "American system" of economics. In the 1920s, historian Vernon Parrington called Henry Carey "our first professional economist." Thomas DiLorenzo, a modern libertarian writer, has called him "Lincoln's (and the Republican Party's) economic guru." Carey was known around the world during the Civil War and its aftermath, and his writings were translated into many European and Asian languages.

According to Parrington, Carey began his career as a classical laissez-faire economist of the British school; but he came to believe that American industrial development was being held back by a false financial policy imposed by foreign financiers. To recognize only gold bullion as money gave the bankers who controlled the gold a lock on the money supply and the economy. The price of gold was established in a world market, and the flow of bullion was always toward the great financial centers that were already glutted with it. To throw the world's money into a common pool that drained into these financial capitals was to make poorer countries the servants of these hubs. Since negative trade balances were settled in gold, gold followed the balance of trade; and until America could build up an adequate domestic economy, its gold would continue to drain off, leaving too little money for its internal needs.

Carey came to consider "free trade" and the "gold standard" to be twin financial weapons forged by England for its own economic conquest. His solution to the gold drain was for the government to create an independent national currency that was non-exportable, one that would remain at home to do the country's own work. He advocated a currency founded on "national credit," something he defined as "a national system based entirely on the credit of the government with the people, not liable to interference from abroad." Like the wooden tally, this paper money would simply be a unit of account that tallied work performed and goods delivered. Carey also supported expanding the monetary base with silver.

Carey's theories were an elaboration of the "American system" propounded by Henry Clay and the National Republican Party. Their platform was to nurture local growth and development using local raw materials and local money, freeing the country from dependence on foreign financing. Where Jackson's Democratic Party endorsed "free trade," the National Republican Party sought another sort of freedom, the right to be free from exploitation by powerful foreign financiers and industrialists. Free traders wanted freedom from government. Protectionists looked to the government to keep them free from foreign marauders. Clay's protectionist platform included:

Government regulation of banking and credit to deter speculation and encourage economic development;

Government support for the development of science, public education, and national infrastructure; (Infrastructure is defined as "the set of interconnected structural elements that provide the framework for supporting the entire structure." In a country, it consists of the basic facilities needed for the country's functioning, providing a public framework under which private enterprise can operate safely and efficiently.)

Regulation of privately-held infrastructure to ensure it met the nation’s needs;

A program of government-sponsored railroads, and scientific and other aid to small farmers;

Taxation and tariffs to protect and promote productive domestic activity; and

Rejection of class wars, exploitation and slavery, physical or economic, in favor of a "Harmony of Interests" between capital and labor.

Lincoln also endorsed these goals. He eliminated slavery, established a national bank, and implemented and funded national education, national transportation, and federal development of business and farming. He also set very high tariffs. He made this common-sense observation:

I don't know much about the tariff, but I know this much: When we buy manufactured goods abroad we get the goods and the foreigner gets the money. When we buy the manufactured goods at home, we get both the goods and the money.

The Legal Tender Acts and the Legal Tender Cases

The Greenback system undergirded Lincoln's program of domestic development by providing a much-needed national paper money supply. After Jackson had closed the central bank, the only paper money in circulation consisted of the banknotes issued privately by individual state banks; and they were basically just private promises to pay later in hard currency (gold or silver). The Greenbacks, on the other hand, were currency. They were "legal tender" in themselves, money that did not have to be repaid later but was "as good as gold" in trade. Like metal coins, the Greenbacks were permanent money that could continue to circulate in their own right. The Legal Tender Acts of 1862 and 1863 made all the "coins and currency" issued by the U.S. Government "legal tender for all debts, public and private." Government-issued paper notes were made a legal substitute for gold and silver, even for the payment of pre-existing debts.

In the twentieth century, the Legal Tender Statute (31 U.S.C. Section 5103) applied this definition of "legal tender" to Federal Reserve Notes; but it was an evident distortion of the intent of the original Acts, which made only currency issued by the United States Government legal tender. Federal Reserve Notes are issued by the Federal Reserve, a private banking corporation; but that rather obvious discrepancy was slipped past the American people with the smoke-and-mirrors illusion that the Federal Reserve was actually federal.

Did the Greenbacks Cause Price Inflation?

Lincoln's Greenback program has been blamed for the price inflation occurring during the Civil War, but according to Irwin Unger in The Greenback Era (1964): "It is now clear that inflation would have occurred even without the Greenback issue." War is always an inflationary venture. What forced prices up during the Civil War was actually a severe shortage of goods. Zarlenga quotes historian J. G. Randall, who observed in 1937:

The threat of inflation was more effectively curbed during the Civil War than during the First World War. Indeed as John K. Galbraith has observed, "it is remarkable that without rationing, price controls, or central banking, [Treasury Secretary] Chase could have managed the federal economy so well during the Civil War."

Greenbacks were not the only source of funding for the Civil War. Bonds (government I.O.U.s) were also issued, and these too increased the money supply, since the banks that bought the bonds were also short of gold and had no other way of paying for the bonds than with their own newly-issued banknotes. The difference between the government-issued Greenbacks and the bank-issued banknotes was that the Greenbacks were debt-free legal tender that did not have to be paid back. As Thomas Edison reasonably observed in an interview reported in The New York Times in 1921:

If the Nation can issue a dollar bond it can issue a dollar bill. The element that makes the bond good makes the bill good also. The difference between the bond and the bill is that the bond lets the money broker collect twice the amount of the bond and an additional 20%. Whereas the currency, the honest sort provided by the Constitution pays nobody but those who contribute in some useful way. It is absurd to say our Country can issue bonds and cannot issue currency. Both are promises to pay, but one fattens the usurer and the other helps the People.

The Greenbacks did lose value as against gold during the war, but this was to be expected, since gold was a more established currency that people naturally preferred. Again the problem for the Greenback was that it had to compete with other forms of currency. People remained suspicious of paper money, and the Greenback was not accepted for everything. Particularly, it could not be used for the government's interest payments on its outstanding bonds. Zarlenga notes that by December 1865, the Greenback was still worth 68 cents to one gold dollar, not bad under the circumstances. Meanwhile, the Confederates' paper notes had become so devalued that they were worthless. The Confederacy had made the mistake of issuing money that was not legal tender but was only a bond or promise to pay after the War. As the defeat of the Confederacy became more and more certain, its currency's value plummeted.

In 1972, the United States Treasury Department was asked to compute the amount of interest that would have been paid if the $400 million in Greenbacks had been borrowed from the banks instead. According to the Treasury Department's calculations, in his short tenure Lincoln saved the government a total of $4 billion in interest, just by avoiding this $400 million loan.
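The text does not say what interest rate or term underlies the Treasury's figure, but compound interest makes the order of magnitude plausible. A minimal sketch, assuming a purely hypothetical 6 percent annual rate (an assumption for illustration, not a figure from the source):

```python
# Hypothetical illustration: at an assumed 6% annual compound rate,
# the interest on a $400 million loan grows roughly tenfold over
# about four decades. Neither the rate nor the term is given in the
# text; both are assumptions here.
principal = 400_000_000  # the Greenback issue, in dollars
rate = 0.06              # assumed annual interest rate

for years in (20, 30, 40):
    total_owed = principal * (1 + rate) ** years
    interest = total_owed - principal
    print(f"{years} years: interest = ${interest / 1e9:.2f} billion")
```

Under these assumptions, interest alone approaches $4 billion after about four decades, which is the scale of the Treasury's reported figure.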

Chapter 9. Lincoln Loses the Battle with the Masters of European Finance

The Confederacy was not the only power that was bent on destroying Lincoln's Union government. Lurking behind the curtain pulling the strings of war were powerful foreign financiers. Otto von Bismarck, Chancellor of Germany in the second half of the nineteenth century, called these puppeteers "the masters of European finance." He wrote:

I know of absolute certainty, that the division of the United States into federations of equal force was decided long before the Civil War by the high financial powers of Europe. These bankers were afraid that the United States, if they remained in one block and as one nation, would attain economic and financial independence, which would upset their financial domination over Europe and the world. Of course, in the "inner circle" of Finance, the voice of the Rothschilds prevailed. They saw an opportunity for prodigious booty if they could substitute two feeble democracies, burdened with debt to the financiers, …in place of a vigorous Republic sufficient unto herself. Therefore, they sent their emissaries into the field to exploit the question of slavery and to drive a wedge between the two parts of the Union. ... The rupture between the North and the South became inevitable; the masters of European finance employed all their forces to bring it about and to turn it to their advantage.

The European bankers wanted a war that would return the United States to its colonial status, but they were not necessarily interested in preserving slavery. Slavery just meant that the owners had to feed and care for their workers. The bankers preferred "the European plan" - capital could exploit labor by controlling the money supply, while letting the laborers feed themselves. In July 1862, this ploy was revealed in a notorious document called the Hazard Circular, which was circulated by British banking interests among their American banking counterparts. It said:

Slavery is likely to be abolished by the war power and chattel slavery destroyed. This, I and my European friends are glad of, for slavery is but the owning of labor and carries with it the care of the laborers, while the European plan, led by England, is that capital shall control labor by controlling wages. This can be done by controlling the money. The great debt that capitalists will see to it is made out of the war, must be used as a means to control the volume of money. To accomplish this, the bonds must be used as a banking basis.... It will not do to allow the greenback, as it is called, to circulate as money any length of time, as we cannot control that.

The system the bankers wanted to preserve was what Henry Clay and Henry Carey had called the "British system," with its twin weapons of "free trade" and the "gold standard" keeping the less industrialized countries in a colonial state, supplying raw materials to Britain’s factories. The American South had already been subjugated in this way, and the bankers had now set their sights on the North, to be reeled in with usurious war loans; but Lincoln had refused to take the bait. The threat the new Greenback system posed to the bankers' game was reflected in an editorial that is of uncertain origin but was reportedly published in The London Times in 1865. It warned:

[I]f that mischievous financial policy, which had its origin in the North American Republic, should become indurated down to a fixture, then that Government will furnish its own money without cost. It will pay off debts and be without a debt. It will have all the money necessary to carry on its commerce. It will become prosperous beyond precedent in the history of the civilized governments of the world. The brains and the wealth of all countries will go to North America. That government must be destroyed, or it will destroy every monarchy on the globe.

Bismarck wrote in 1876, "The Government and the nation escaped the plots of the foreign financiers. They understood at once, that the United States would escape their grip. The death of Lincoln was resolved upon." Lincoln was assassinated in 1865.

The Worm in the Apple: The National Banking Act of 1863-64

The European financiers had failed to trap Lincoln's government with usurious war loans, but they achieved their ends by other means. While one faction in Congress was busy getting the Greenbacks issued to fund the war, another faction was preparing a National Banking Act that would deliver a monopoly over the power to create the nation's money supply to the Wall Street bankers and their European affiliates. The National Banking Act was promoted as establishing safeguards for the new national banking system; but while it was an important first step toward a truly national bank, it was only a compromise with the bankers, and buried in the fine print, it gave them exactly what they wanted. A private communication from a Rothschild investment house in London to an associate banking firm in New York dated June 25, 1863, confided:

The few who understand the system will either be so interested in its profits or so dependent upon its favors that there will be no opposition from that class while, on the other hand, the great body of people, mentally incapable of comprehending ... will bear its burdens without complaints

The Act looked good on its face. It established a Comptroller of the Currency, whose authority was required before a National Banking association could start business. It laid down regulations covering minimum capitalization, reserve requirements, bad debts, and reporting. The Comptroller could at any time appoint investigators to look into the affairs of any national bank. Every bank director had to be an American citizen, and three-quarters of the directors of a bank had to be residents of the State in which the bank did business. Interest rates were limited by State usury laws; and if no laws were in effect, then to 7 percent. Banks could not hold real estate for more than five years, except for bank buildings. National banks were not allowed to circulate notes they printed themselves. Instead, they had to deposit U.S. bonds with the Treasury in a sum equal to at least one-third of their capital. They got government-printed notes in return.

So what was the problem? Although the new national banknotes were technically issued by the Comptroller of the Currency, this was just a formality, like the printing of Federal Reserve Notes by the Bureau of Engraving and Printing today. The currency bore the name of the bank posting the bonds, and it was issued at the bank's request. In effect, the National Banking Act authorized the bankers to issue and lend their own paper money. The banks "deposited" bonds with the Treasury, but they still owned the bonds; and they immediately got their money back in the form of their own banknotes. Topping it off, the National Banking Act effectively removed the competition to these banknotes. It imposed a heavy tax on the notes of the state-chartered banks, essentially abolishing them. It also curtailed competition from the Greenbacks, which were limited to specific issues while the bankers' notes could be issued at will. Treasury Secretary Salmon P. Chase and others complained that the bankers were buying up the Greenbacks with their own banknotes. Zarlenga cites a historian named Dewey, who wrote in 1903:

The banks were accused of absorbing the government notes as fast as they were issued and of putting out their own notes in substitution, and then at their convenience converting the notes into bonds on which they earned interest [in gold].

The government got what it needed at the time - a loan of substantial sums for the war effort and a sound circulating currency for an expanding economy - but the banks were the real winners. They not only got to collect interest on money of which they still had the use, but they got powerful leverage over the government as its creditors. The Act that was supposed to regulate the bankers wound up chartering not one but a whole series of private banks, which all had the power to create the currency of the nation.

The National Banking Act was recommended to Congress by Treasury Secretary Chase, ironically the same official who had sponsored the Greenback program the Act effectively eliminated. In a popular 1887 book called Seven Financial Conspiracies That Have Enslaved the American People, Sarah Emery wrote that Chase acquiesced only after several days of meetings and threats of financial coercion by bank delegates. He is quoted as saying later:

My agency in procuring the passage of the National Bank Act was the greatest financial mistake of my life. It has built up a monopoly that affects every interest in the country. It should be repealed. But before this can be accomplished, the people will be arrayed on one side and the banks on the other in a contest such as we have never seen in this country.

Although Lincoln was assassinated in 1865, it would be another fifty years before the promise of his debt-free Greenbacks was erased from the minds of a people long suspicious of the usury bankers and their gilded paper money. The "Gilded Age" - the period between the Civil War and World War I - was a series of battles over who should issue the country's currency and what it should consist of.

Skirmishes in the Currency Wars

Chase appeared on the scene again in 1869, this time as Chief Justice of the Supreme Court. He wrote the opinion in Hepburn v. Griswold 75 U.S. 603, holding the Legal Tender Acts to be unconstitutional. Chase considered the Greenbacks to be a temporary war measure. He wrote that the Constitution prohibits the States from passing "any ... law impairing the obligation of contracts," and that to compel holders of contracts calling for payment in gold and silver to accept payment in "mere promises to pay dollars" was "an unconstitutional deprivation of property without due process of law."

In 1871, however, with two new justices on the bench, the Supreme Court reversed and found the Legal Tender Acts constitutional. In the Legal Tender cases (Knox v. Lee, 79 U.S. 457, 20 L.Ed. 287; and Juilliard v. Greenman, 110 U.S. 421, 4 S.Ct. 122, 28 L.Ed. 204), the Court declared that Congress has the power "to coin money and regulate its value" with the objects of self-preservation and the achievement of a more perfect union, and that "no obligation of contract can extend to the defeat of legitimate government authority."

In 1873, an Act the Populists would call the "Crime of '73" eliminated the free coinage of silver. As when King George had banned the use of locally-issued paper scrip a century earlier, the result was "tight" money and hard times. A bank panic followed, which hit the western debtor farmers particularly hard.

In 1874, the politically powerful farmers responded by forming the Greenback Party. Their proposed solution to the crisis was for the government to finance the building of roads and public projects with additional debt-free Greenbacks, augmenting the money supply and putting the unemployed to work, returning the country to the sort of full employment and productivity seen in Benjamin Franklin’s time. The Greenbacks could also be used to redeem the federal debt. Under the "Ohio Idea," all government bonds not specifying payment in gold or silver would be repaid in Greenbacks. The plan was not adopted, but the Scarecrow had shown he had a brain. The Timid Lion had demonstrated the courage and the collective will to organize and make a difference.

In 1875, a Resumption Act called for redemption by the Treasury of all Greenbacks in "specie." The Greenbacks had to be withdrawn and replaced with hard currency, producing further contraction of the money supply and deeper depression.

In 1878, the Scarecrow and the Tin Woodman joined forces to form the Greenback-Labor Party. They polled over one million votes and elected 14 Representatives to Congress. They failed to get a new issue of Greenbacks, but they had enough political clout to stop further withdrawal of existing Greenbacks from circulation. The Greenbacks then outstanding ($346,681,016 worth) were made a permanent part of the nation's currency.

In 1881, James Garfield became President. He boldly took a stand against the bankers, charging:

Whosoever controls the volume of money in any country is absolute master of all industry and commerce ... And when you realize that the entire system is very easily controlled, one way or another, by a few powerful men at the top, you will not have to be told how periods of inflation and depression originate.

President Garfield was murdered not long after releasing this statement, when he was less than four months into his presidency. Depression deepened, leaving masses of unemployed to face poverty and starvation at a time when there was no social security or unemployment insurance to act as a safety net. Produce was left to rot in the fields, because there was no money to pay workers to harvest it or to buy it with when it got to market. The country was facing poverty amidst plenty, because there was insufficient money in circulation to keep the wheels of trade turning. The country sorely needed the sort of liquidity urged by Lincoln, Carey and the Greenbackers; but the bankers insisted that allowing the government to print its own money would be dangerously inflationary. That was their argument, but critics called it "humbuggery"….

Chapter 10. The Great Humbug: The Gold Standard and the Straw Man of Inflation

The Remarkable Island of Guernsey

While U.S. bankers were insisting that the government must borrow rather than print the money it needed, the residents of a small island state off the coast of England were quietly conducting a 200-year experiment that would show the bankers' inflation argument to be a humbug. Guernsey is located among the British Channel Islands, about 75 miles south of Great Britain. In 1994, Dr. Bob Blain, Professor of Sociology at Southern Illinois University, wrote of this remarkable island:

In 1816 its sea walls were crumbling, its roads were muddy and only 4 1/2 feet wide. Guernsey's debt was 19,000 pounds. The island's annual income was 3,000 pounds of which 2,400 had to be used to pay interest on its debt. Not surprisingly, people were leaving Guernsey and there was little employment.

Then the government created and loaned new, interest-free state notes worth 6,000 pounds. Some 4,000 pounds were used to start the repairs of the sea walls. In 1820, another 4,500 pounds was issued, again interest-free. In 1821, another 10,000; 1824, 5,000; 1826, 20,000. By 1837, 50,000 pounds had been issued interest free for the primary use of projects like sea walls, roads, the marketplace, churches, and colleges. This sum more than doubled the island's money supply during this thirteen year period, but there was no inflation. In the year 1914, as the British restricted the expansion of their money supply due to World War I, the people of Guernsey commenced to issue another 142,000 pounds over the next four years and never looked back. By 1958, over 542,000 pounds had been issued, all without inflation.

Guernsey has an income tax, but the tax is relatively low (a "flat" 20 percent), and it is simple and loophole-free. It has no inheritance tax, no capital gains tax, and no federal debt. Commercial banks service private lenders, but the government itself never goes into debt. When it wants to create some public work or service, it just issues the money it needs to pay for the work. The Guernsey government has been issuing its own money for nearly two centuries. During that time, the money supply has mushroomed to about 25 times its original size; yet the economy has not been troubled by price inflation, and it has remained prosperous and stable.

Many other countries have also successfully issued their own money, but Guernsey is one of the few to have stayed under the radar long enough to escape the covert attacks of an international banking cartel bent on monopolizing the money-making market. As we'll see later, governments that have dared to create their own money have generally wound up dealing with a presidential assassination, a coup, a boycott, a war, or a concerted assault on the national currency by international speculators. The American colonists operated successfully on their own sovereign money until British moneylenders leaned on Parliament to halt the practice, prompting the American Revolution. England had a thriving economy that operated on the sovereign money of the king until Oliver Cromwell let the moneylenders inside the gates. After 1700, the right to create money was transferred to the private Bank of England, based on a fraudulent "gold standard" that allowed it to duplicate the gold in its vaults many times over in the form of paper banknotes. Today governments are in the position of the disenfranchised king, having to borrow money created by the banks rather than issuing it themselves.

The Gold Humbug

In 1863, Eleazar Lord, a New York banker, called the gold standard itself a humbug. He wrote:

The so-called specie basis [or gold standard], whenever there is a foreign demand for coin, proves to be a mere fiction, a practical humbug; and whenever, by an excess of imports, this pretended basis is exported to pay foreign debts, the bank-notes are withdrawn from circulation or become worthless, the currency for the time is annihilated, prices fall, business is suspended, debts remain unpaid, panic and distress ensue, men in active business fail, bankruptcy, ruin, and disgrace reign.

The requirement that paper banknotes be backed by a certain weight of gold bullion, Lord said, was a fiction. Banks did not have nearly enough gold to "redeem" all the paper money that was supposed to be based on it, and there was no real reason the nation’s paper money had to be linked to gold at all. The gold standard just put America at the mercy of the foreign financiers who controlled the gold. When national imports exceeded exports, gold bullion left the country to pay the bill; and when gold stores shrank, the supply of paper money "based" on it shrank as well.

The real issue, as Vernon Parrington pointed out, was not what money consisted of but who created it. Whether the medium of exchange was gold or paper or numbers in a ledger, when it was lent into existence by private lenders and was owed back to them with interest, more money would always be owed back than was created in the first place, spiraling the economy into perpetual debt. A dollar borrowed at 10 percent interest, compounded annually, grows in 100 years to be a debt of $13,781. That is true whether the money takes the form of gold or paper or accounting entries. The banks lend the dollar into existence but not the additional $13,780 needed to pay the loan off, forcing the public to go further and further into debt in search of the ephemeral interest due on their money-built-on-debt. Merchants continually have to raise their prices to try to cover this interest tab, producing perpetual price inflation. Like the Tin Woodman whose axe was enchanted by the Witch to chop off parts of his own body, the more people work, the less they seem to have left for themselves. They cannot keep up because their money keeps shrinking, as sellers keep raising their prices in a futile attempt to pay off loans that are collectively impossible to repay.
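The compounding arithmetic can be checked directly. The $13,781 figure corresponds to a 10 percent annual rate; at 6 percent, a dollar grows to only about $339 over the same century:

```python
# Compound growth of a $1 debt over 100 years at annually
# compounded interest: debt = (1 + r) ** years.
for rate in (0.06, 0.10):
    debt = (1 + rate) ** 100
    print(f"$1 at {rate:.0%} for 100 years -> ${debt:,.0f}")
# prints:
#   $1 at 6% for 100 years -> $339
#   $1 at 10% for 100 years -> $13,781
```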

Challenging Corporate Feudalism

…Abraham Lincoln is quoted as saying:

I see in the near future a crisis approaching that unnerves me and causes me to tremble for the safety of my country. Corporations have been enthroned, an era of corruption in high places will follow, and the money power of the country will endeavor to prolong its reign by working upon the prejudices of the people until the wealth is aggregated in the hands of a few and the Republic is destroyed.

Lincoln may not actually have said this. As with many famous quotations, its authorship is disputed. But whoever said it, the insight was prophetic. In a January 2007 article called "Who Rules America?", Professor James Petras wrote, "Today it is said 2% of the households own 80% of the world's assets. Within this small elite, a fraction embedded in financial capital owns and controls the bulk of the world's assets and organizes and facilitates further concentration of conglomerates." Professor Petras observed:

Within the financial ruling class … political leaders come from the public and private equity banks, namely Wall Street - especially Goldman Sachs, Blackstone, the Carlyle Group and others. They organize and fund both major parties and their electoral campaigns. They pressure, negotiate and draw up the most comprehensive and favorable legislation on global strategies (liberalization and deregulation) and sectoral policies…. They pressure the government to "bail out" bankrupt and failed speculative firms and to balance the budget by lowering social expenditures instead of raising taxes on speculative "windfall" profits. …[T]hese private equity banks are involved in every sector of the economy, in every region of the world economy and increasingly speculate in the conglomerates which are acquired. Much of the investment funds now in the hands of US investment banks, hedge funds and other sectors of the financial ruling class originated in profits extracted from workers in the manufacturing and service sector.

It seems that the Tin Man has indeed been stripped of his heart and soul by the Witch of the East — the Wall Street bankers — just as Lincoln, the Greenbackers and the Populists foresaw….

Chapter 13. Witches’ Coven: The Jekyll Island Affair and the Federal Reserve Act of 1913

If the Wall Street bankers were the Wicked Witches of the Gilded Age, the coven where they conjured up their grandest of schemes was on Jekyll Island, a property off the coast of Georgia owned by J. P. Morgan. The coven was hosted in 1910 by Senator Nelson Aldrich of Rhode Island, a business associate of Morgan and the father-in-law of John D. Rockefeller Jr. The Republican "whip" in the Senate, Aldrich was known as the Wall Street Senator, a spokesman for big business and banking.

Although Aldrich hosted the meeting, credit for masterminding it is attributed to a German immigrant named Paul Warburg, who was a partner of Kuhn, Loeb, the Rothschilds' main American banking operation after the Civil War. Other attendees included Benjamin Strong, then head of Morgan's Bankers Trust Company; two other heads of Morgan banks; the Assistant Secretary of the U.S. Treasury; and Frank Vanderlip, president of the National City Bank of New York, then the most powerful New York bank (now called Citibank), which represented William Rockefeller and Kuhn, Loeb. Morgan was the chief driver behind the plan, and the Morgan and Rockefeller factions had long been arch-rivals; but they had come together in this secret rendezvous to devise a banking scheme that would benefit them both. Vanderlip wrote later of the meeting:

We were instructed to come one at a time and as unobtrusively as possible to the railroad terminal... where Senator Aldrich's private car would be in readiness.... Discovery, we knew, simply must not happen.... If it were to be exposed publicly that our particular group had written a banking bill, that bill would have no chance whatever of passage by Congress.... [A]lthough the Aldrich Federal Reserve plan was defeated its essential points were contained in the plan that was finally adopted.

Congressional opposition to the plan was led by William Jennings Bryan and Charles Lindbergh Sr., who were strongly against any bill suggesting a central bank or control by Wall Street money. It took a major bank panic to prompt Congress even to consider such a bill. The panic of 1907 was triggered by rumors that the Knickerbocker Bank and the Trust Company of America were about to become insolvent. Later evidence pointed to the House of Morgan as the source of the rumors. The public, believing the rumors, proceeded to make them come true by staging a run on the banks. Morgan then nobly helped to avert the panic by importing $100 million worth of gold from Europe to stop the bank run. The mesmerized public came to believe that the country needed a central banking system to stop future panics. Robert Owen, a co-author of the Federal Reserve Act, later testified before Congress that the banking industry had conspired to create such financial panics in order to rouse the people to demand "reforms" that served the interests of the financiers. Congressman Lindbergh charged:

The Money Trust caused the 1907 panic.... [T]hose not favorable to the Money Trust could be squeezed out of business and the people frightened into demanding changes in the banking and currency laws which the Money Trust would frame.

The 1907 panic prompted the congressional inquiry headed by Senator Aldrich, and the clandestine Jekyll Island meeting followed. The result was a bill called the Aldrich Plan, but the alert opposition saw through it and soundly defeated it. Bryan said he would not support any bill that resulted in private money being issued by private banks. Federal Reserve Notes must be Treasury currency, issued and guaranteed by the government; and the governing body must be appointed by the President and approved by the Senate.

Morgan's Man in the White House

Morgan had another problem besides the opposition in Congress. He needed a President willing to sign his bill. William Howard Taft, the President in 1910, was not a Morgan man. McKinley had been succeeded by his Vice President Teddy Roosevelt, who was in the Morgan camp and had been responsible for breaking up Rockefeller's Standard Oil. Taft, who followed Roosevelt, was a Republican from Rockefeller's state of Ohio. He took vengeance on Morgan by filing antitrust suits to break up the two leading Morgan trusts, International Harvester and United States Steel. Taft was a shoo-in for reelection in 1912. To break his hold on the Presidency, Morgan deliberately created a new party, the Progressive or Bull Moose Party, and brought Teddy Roosevelt out of retirement to run as its candidate. Roosevelt took enough votes away from Taft to allow Morgan to get his real candidate, Woodrow Wilson, elected on the Democratic ticket in 1912. Roosevelt walked away realizing he had been duped, and the Progressive Party was liquidated soon afterwards. Wilson was surrounded by Morgan men, including "Colonel" Edward Mandell House, who had his own rooms at the White House. Wilson called House his "alter ego."

To get their bill passed, the Morgan faction changed the bill's name from the Aldrich Bill to the Federal Reserve Act and brought it three days before Christmas, when Congress was preoccupied with departure for the holidays. The bill was so obscurely worded that no one really understood its provisions. The Aldrich team knew it would not pass without Bryan's support, so in a spirit of apparent compromise, they made a show of acquiescing to his demands. He said happily, "The right of the government to issue money is not surrendered to the banks; the control over the money so issued is not relinquished by the government...." So he thought; but while the national money supply would be printed by the U.S. Bureau of Engraving and Printing, it would be issued as an obligation or debt of the government, a debt owed back to the private Federal Reserve with interest. And while Congress and the President would have some input in appointing the Federal Reserve Board, the Board would work behind closed doors with the regional bankers, without Congressional oversight or control.

The bill passed on December 22, 1913, and President Wilson signed it into law the next day. Later he regretted what he had done. He is reported to have said before he died, "I have unwittingly ruined my country." Bryan was also disillusioned and soon resigned as Secretary of State, in protest over President Wilson's involvement in Europe's war following the suspect sinking of the Lusitania.

The first chairmanship of the Federal Reserve was offered to Paul Warburg, but he declined. Instead he became vice chairman, a position he held until the end of World War I, when he relinquished it to avoid an apparent conflict of interest. He would have had to negotiate with his brother Max Warburg, who was then financial advisor to the Kaiser and Director of the Reichsbank, Germany's private central bank.

The Incantations of Fedspeak

The Federal Reserve Act of 1913 was a major coup for the international bankers. They had battled for more than a century to establish a private central bank with the exclusive right to "monetize" the government's debt (that is, to print their own money and exchange it for government securities or I.O.U.s). The Act's preamble said that its purposes were "to provide for the establishment of Federal Reserve Banks, to furnish an elastic currency, to afford a means of rediscounting commercial paper, to establish a more effective supervision of banking in the United States, and for other purposes." It was the beginning of Fedspeak, abstract economic language that shrouded the issues in obscurity. "Elastic currency" is credit that can be expanded at will by the banks. "Rediscounting" is a technique by which banks are allowed to magically multiply funds by re-lending them without waiting for outstanding loans to mature. In plain English, the Federal Reserve Act authorized a private central bank to create money out of nothing, lend it to the government at interest, and control the national money supply, expanding or contracting it at will. Representative Lindbergh called the Act "the worst legislative crime of the ages." He warned:

[The Federal Reserve Board] can cause the pendulum of a rising and falling market to swing gently back and forth by slight changes in the discount rate, or cause violent fluctuations by greater rate variation, and in either case it will possess inside information as to financial conditions and advance knowledge of the coming change, either up or down.

This is the strangest, most dangerous advantage ever placed in the hands of a special privilege class by any Government that ever existed.... The financial system has been turned over to ... a purely profiteering group. The system is private, conducted for the sole purpose of obtaining the greatest possible profits from the use of other people's money.

In 1934, in the throes of the Great Depression, Representative Louis McFadden would go further, stating in the Congressional Record:

Some people think that the Federal Reserve Banks are United States Government institutions. They are private monopolies which prey upon the people of these United States for the benefit of themselves and their foreign customers; foreign and domestic speculators and swindlers, and rich and predatory money lenders. In that dark crew of financial pirates there are those who would cut a man's throat to get a dollar out of his pocket; there are those who send money into states to buy votes to control our legislatures; there are those who maintain International propaganda for the purpose of deceiving us into granting of new concessions which will permit them to cover up their past misdeeds and set again in motion their gigantic train of crime.

These twelve private credit monopolies were deceitfully and disloyally foisted upon this Country by the bankers who came here from Europe and repaid us our hospitality by undermining our American institutions.

Who Owns the Federal Reserve?

The "Federal" Reserve is actually an independent, privately-owned corporation. It consists of twelve regional Federal Reserve banks owned by many commercial member banks. The amount of Federal Reserve stock held by each member bank is proportional to its size. The Federal Reserve Bank of New York holds the majority of shares in the Federal Reserve System (53 percent). The largest shareholders of the Federal Reserve Bank of New York are the largest commercial banks in the district of New York.

In 1997, the New York Federal Reserve reported that its three largest member banks were Chase Manhattan Bank, Citibank, and Morgan Guaranty Trust Company. In 2000, JP Morgan and Chase Manhattan merged to become JPMorgan Chase & Co., a bank holding company with combined assets of $668 billion. That made it the third largest bank holding company in the country, after Citigroup (at $791 billion) and Bank of America (at $679 billion). Bank of America was founded in California in 1904 and remains concentrated in the western and southwestern states. Citigroup is the cornerstone of the Rockefeller empire.

The Information Monopoly

By 1983, according to Dean Ben Bagdikian in The Media Monopoly, fifty corporations owned half or more of the media business. By 2000, that number was down to six corporations, with directorates interlocked with each other and with major commercial banks. Howard Zinn observes:

[W]hether you have a Republican or a Democrat in power, the Robber Barons are still there.... Under the Clinton administration, more mergers of huge corporations took place [than] had ever taken place before under any administration.... [W]hether you have Republicans or Democrats in power, big business is the most powerful voice in the halls of Congress and in the ears of the President of the United States.

In The Underground History of American Education, published in 2000, educator John Taylor Gatto traces how Rockefeller, Morgan and other members of the financial elite influenced, guided, funded, and at times forced compulsory schooling into the mainstream of American society. They needed three things for their corporate interests to thrive: (1) compliant employees, (2) a guaranteed and dependent population, and (3) a predictable business environment. It was largely to promote these ends, says Gatto, that modern compulsory schooling was established.

Harnessing the Tax Base

The Robber Barons had succeeded in monopolizing the money spigots, the oil spigots, and the public's access to information; but Morgan wanted more. He wanted to secure the banks' loans to the government with a reliable source of taxes, one that was imposed directly on the incomes of the people. There was just one snag in this plan: a federal income tax had consistently been declared unconstitutional by the U.S. Supreme Court….

Chapter 15. Reaping the Whirlwind: The Great Depression

The Blame Game

Who was to blame for this decade-long cyclone of debt and devastation? Milton Friedman, professor of economics at the University of Chicago and winner of a Nobel Prize in economics, stated:

The Federal Reserve definitely caused the Great Depression by contracting the amount of currency in circulation by one-third from 1929 to 1933.

The Honorable Louis T. McFadden, Chairman of the House Banking and Currency Committee, went further. He charged:

[The depression] was not accidental. It was a carefully contrived occurrence.... The international bankers sought to bring about a condition of despair here so that they might emerge as rulers of us all.

Representative McFadden could not be accused of partisan politics. He had been elected by the citizens of Pennsylvania on both the Democratic and Republican tickets, and he had served as Chairman of the Banking and Currency Committee for more than ten years, putting him in a position to speak with authority on the vast ramifications of the gigantic private credit monopoly of the Federal Reserve. In 1934, he filed a Petition for Articles of Impeachment against the Federal Reserve Board, charging fraud, conspiracy, unlawful conversion and treason. He told Congress:

This evil institution has impoverished and ruined the people of these United States, has bankrupted itself, and has practically bankrupted our Government. It has done this through the defects of the law under which it operates, through the maladministration of that law by the Fed and through the corrupt practices of the moneyed vultures who control it.

... From the Atlantic to the Pacific, our Country has been ravaged and laid waste by the evil practices of the Fed and the interests which control them. At no time in our history, has the general welfare of the people been at a lower level or the minds of the people so full of despair....

Recently in one of our States, 60,000 dwelling houses and farms were brought under the hammer in a single day. 71,000 houses and farms in Oakland County, Michigan, were sold and their erstwhile owners dispossessed. The people who have thus been driven out are the wastage of the Fed. They are the victims of the Fed. Their children are the new slaves of the auction blocks in the revival of the institution of human slavery.

A document called "The Bankers Manifesto of 1934" added weight to these charges. An update of "The Bankers Manifesto of 1892," it was reportedly published in The Civil Servants' Yearbook in January 1934 and in The New American in February 1934, and was circulated privately among leading bankers. It read in part:

Capital must protect itself in every way, through combination [monopoly] and through legislation. Debts must be collected and loans and mortgages foreclosed as soon as possible. When through a process of law, the common people have lost their homes, they will be more tractable and more easily governed by the strong arm of the law applied by the central power of wealth, under control of leading financiers. People without homes will not quarrel with their leaders. This is well known among our principal men now engaged in forming an imperialism of capital to govern the world.

That was the sinister view of the Great Depression. The charitable explanation was that the Fed had simply misjudged. Whatever had happened, the monetary policy of the day had clearly failed. Change was in the wind. Over 2,000 schemes for monetary reform were advanced, and populist organizations again developed large followings.

Return to Oz: Coxey Runs for President

Nearly four decades after he had led the march on Washington that inspired the march on Oz, Jacob Coxey reappeared on the scene to run on the Farmer-Labor Party ticket for President. Coxey, who was nothing if not persistent, actually ran for office thirteen times between 1894 and 1936. He was elected only twice, as mayor of Massillon, Ohio, in 1932 and 1933; but he did succeed in winning a majority in the Ohio presidential primary in 1932. Franklin Roosevelt came from banking and railroad money and had the support of big business along with the general public. He easily won the presidential election. But Coxey maintained that it was his own plan for government-financed public works that was the blueprint for the "New Deal," the program widely credited with pulling the country out of the Depression. It was the same plan Coxey had proposed in the 1890s: Congress could jump-start the economy by "priming the pump" with various public projects that would put the unemployed to work, using government-issued, debt-free money to pay for the labor and materials. Roosevelt adopted the pump-priming part but not the proposal to finance it with debt-free Greenbacks. A bill called the Thomas Amendment was passed during his tenure that actually authorized new issues of government Greenbacks, but no Greenbacks were issued under it. Instead, Roosevelt financed the New Deal with deficit spending and tax increases.

In 1944, Coxey was honored for his work by being allowed to deliver a speech on the Capitol steps, with the formal blessing of the Vice President and the Speaker of the House. It was the same speech he had been barred from giving there half a century before. In 1946, at the age of 92, he published a new plan to avoid unemployment and future wars. He died in 1951, at the age of 97.

Another Aging Populist Returns

Another blast from the past on the presidential campaign trail was William Hope Harvey, author of Coin's Financial School and economic adviser to William Jennings Bryan in the 1890s. Harvey ran for President in 1932 on the Liberty Party ticket. Like Coxey, he was an obscure candidate who was later lost to history; but his insights would prove to be prophetic. Harvey stressed that people who took out loans at a bank were not actually borrowing "money." They were borrowing debt; and the commercial oligarchy to whom it was owed would eventually end up running the country. The workers would live on credit and buy at the company store, becoming wage-slaves who owned nothing of their own.

Harvey considered money to be a direct representation of a man's labor, and usury and debt to be a scheme to put middleman bankers between a man's labor and his property. Even efficient farmers operating on the debt-money system would eventually have some bad years, and some would default on their loans. Every year there would be a certain number of foreclosures and the banks would get the land, which would be sold to the larger farm owners. The country's property would thus gradually become concentrated in fewer and fewer hands. The farms, factories and businesses would wind up owned by a few individuals and corporations that were controlled by the bankers who controlled the money supply. At the heart of the problem, said Harvey, was the Federal Reserve System, which allowed banks to issue debt and pretend it was money. This sleight of hand was what had allowed the bankers to slowly foreclose on the country, moving ownership to the Wall Street banks, brokerage houses and insurance companies. The ultimate culprit was the English banking system, which had infected and corrupted America's banking system. It was the English who had first demonetized silver in 1816, and who had decreased the value of everything else by hoarding gold. Debts to English banks had to be paid in gold, and countries that did not produce gold had to buy it to pay their debts to England. The result was to drive down the value of the goods those countries did produce, indenturing them to the English bankers. In a fictionalized book called A Tale of Two Countries, Harvey wrote of a fat English banker named Baron Rothe, who undertook to corrupt the American economy and government in order to place the reins of the country in the hands of his worldwide banking system.

Harvey's solution was to return the Money Power to the people, something he proposed doing by nationalizing the banks. He would have nationalized other essential industries as well - those that operated on a large scale and produced basic commodities, including public utilities, transportation, and steel. The profits would have gone into the public coffers, replacing taxes, which Harvey thought should be abolished. The Populists of the 1890s had campaigned to expand the money supply by adding silver to the gold that backed paper money, but Harvey now felt that both gold and silver should be demonetized. The national currency did not need precious metal backing. It could be what Franklin and Lincoln said it was - simply a receipt for labor. Paper money could be backed by government services. That is a novel idea today, but it has a familiar precedent: the postage stamp is a kind of money redeemable in government services. One postage stamp represents the amount of government labor required to transport one letter from one place to another. Postage stamps are fungible and can be saved or traded.

Although Harvey and Coxey both failed in their political aspirations, elements of the platforms of both were adopted in the New Deal. The dollar was taken off the gold standard, just as Harvey had advocated; and the economy was jump-started by putting the unemployed to work, just as Coxey had advocated. Roosevelt came from banker money and had the support of big business, but he also had a strong streak of the can-do Populist spirit….

Chapter 16. Oiling the Rusted Joints of the Economy: Roosevelt, Keynes and the New Deal

A farm policy of "parity pricing" was enacted that ensured that the prices received by farmers covered the prices they paid for input plus a reasonable profit. If the farmers could not get the parity price, the government would buy their output, put it into storage, and sell it later. The government actually made a small profit on these transactions; food prices were kept stable; and the family farm system was preserved as the safeguard of the national food supply. With the push for "globalization" in later decades, thousands of family farmers were forced out of the farming business. Farm parity was replaced with farm "subsidies" that favored foods for export and were insufficient to cover rising costs in fuel, feed and fertilizer.

Where did Roosevelt get the money for all the pump-priming programs in his New Deal? Coxey's plan was to issue the money outright, but Roosevelt did not go that far. Even for the government to borrow the money was considered radical at the time. The dogma of the day was that the federal budget must be balanced at all costs. The novel idea that the government could borrow the money it needed was suggested by John Maynard Keynes, a respected British economist, who argued that this was a more sensible course than austerely trying to balance the budget when funds were not to be had. In an open letter in The New York Times, Keynes advised Roosevelt that "only the expenditures of public authority" could reverse the Depression. The government had to spend to get money into circulation.

Keynes has been called an elitist, because he was an intellectual with expensive tastes, wealthy friends and banker affiliations; but like Roosevelt, he had a strong streak of the can-do Populist spirit. At a time when conventional economists were gloomy naysayers maintaining that nothing could be done, Keynes was an optimist who thought like the Wizard of Oz. There was no reason to put up with recession, depression and unemployment. Balancing the budget by cutting jobs, at a time when people were already out of work, he thought was economic folly. The way to get the ball rolling again was just to roll up your sleeves and get busy. It could all be paid for on credit!

But Keynes would not go so far as to advocate that the government should issue the money outright. "Increasing the quantity of money is like trying to get fat by buying a larger belt," he said. It was a colorful analogy but a questionable one. The money supply had just shrunk by a third. The emaciated patient needed to be fattened up with a good infusion of liquidity just to replace the money that had been lost.

Challenging Classical Economic Theory

Roosevelt was slow to go along with Keynes' radical notions, but as the Depression got worse, he decided to give them a try. He told the country in a fireside chat, "We suffer from a failure of consumer demand because of a lack of buying power." When the United States entered World War II, Roosevelt had no choice but to test the limits of the national credit card; and in a dramatic empirical display, the pump-priming theory was proven to work. Unemployment dropped from more than 17 percent to just over 1 percent. The economy grew along with the money supply to double its original size, the fastest growth in U.S. history. The country was pulled out of the Depression by priming the pump with liquidity, funding new production that put new wages in consumers' pockets.

Keynes had turned classical theory on its head. The classical assumption was that output ("supply") was fixed and that prices were flexible. Increasing "demand" (money) would therefore increase prices. Keynes said that prices tended to be fixed and output to be flexible. When the economy was operating at less than full employment, adding money would not increase prices. It would increase productivity. As long as there were idle resources to draw from, watering a liquidity-starved economy with new money would not produce inflation; it would produce abundance.

And that was how it actually worked, for a while; but adding liquidity by borrowing money into existence did not actually create money. It created debt; and to service the debt, the taxpayers had to pay interest compounded annually. Roosevelt's plan put people to work, putting more money in their pockets; but much of this money was taken out again in the form of taxes, which went largely to pay the burgeoning interest tab. From 1933 to 1940, federal taxes tripled. In the New Deal years, the average annual federal budget deficit was about $3 billion out of an entire federal budget of $6 billion to $9 billion -- a greater percentage even than today, when deficit spending has reached record levels. Wholesale endorsement of Keynesian deficit spending caused the federal debt to balloon from $22 billion in 1933 to $8 trillion in 2005, a 364-fold increase in just 72 years. The money supply increased along with the debt. In 1959, when the Fed first began reporting M3, it was a mere $288.8 billion. By 2004, it had reached $9 trillion. In only 45 years, M3 had multiplied by over 30 times. In 2007, the federal debt also hit $9 trillion; and little of this borrowed money goes to improve infrastructure or to increase employment. Jobs are being out-sourced abroad, while taxpayers struggle to make the interest payments on the federal debt.
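The fold increases quoted above are simple ratios of ending to starting amounts; a quick sketch (using only the dollar figures and dates given in the text) confirms them and shows the compound annual growth rates they imply:

```python
# Growth of the federal debt and of M3, using the figures quoted above.
# The fold increase is the ratio of the ending amount to the starting
# amount; the implied annual rate is the compound rate producing it.

def fold_and_annual_rate(start, end, years):
    fold = end / start
    annual = fold ** (1 / years) - 1  # compound annual growth rate
    return fold, annual

# Federal debt: $22 billion (1933) to $8 trillion (2005)
debt_fold, debt_rate = fold_and_annual_rate(22e9, 8e12, 2005 - 1933)
print(f"Debt: {debt_fold:.0f}-fold in 72 years (~{debt_rate:.1%}/year)")

# M3: $288.8 billion (1959) to $9 trillion (2004)
m3_fold, m3_rate = fold_and_annual_rate(288.8e9, 9e12, 2004 - 1959)
print(f"M3: {m3_fold:.0f}-fold in 45 years (~{m3_rate:.1%}/year)")
```

Both series work out to roughly 8 percent compound growth per year, sustained over decades, which is why the curves look parabolic when plotted.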

Prices have gone up in tandem. Many people still remember when ice cream cones and comic books were 25 cents each. Today they are $2.50 or more. What was once a 10 cent cup of coffee is now $1.50 to $2.00. A house that was $30,000 in 1970 is now more than $300,000. In 1970, it could have been bought by a single-breadwinner family. For most families today, both parents have to work outside the home to make the mortgage payments. These parabolic price increases reflect a parabolic increase in the money supply. Where did all this new money come from? No gold was added to the asset base of the country, which went off the gold standard in the 1930s. All of this increase came into existence as accounting-entry bank loans. More specifically, it came from government loans, which never get paid back but just get rolled over from year to year. Under the plan of Coxey and the Greenbackers, rather than borrowing from banks that pulled the money out of an empty hat, Uncle Sam could have pulled the money out of his own tall hat and avoided a mushrooming debt.

Roosevelt in the Middle

Coxey was not alone in urging the Greenback cure for the economy's ills. Some influential federal officials also thought it was the way to reverse the depression. In a congressional address in 1933, Representative Louis McFadden quoted a Hearst newspaper article by Robert Hemphill, credit manager of the Atlanta Federal Reserve, in which Hemphill argued:

We are rapidly approaching a situation where the government must issue additional currency. It will very soon be the only move remaining. It should have been the first step in the recovery program. Immediately upon a revival of the demand that the government increase the supply of currency, we shall again be subjected to a barrage of skillfully designed and cunningly circulated propaganda by means of which a small group of international bankers have been able, for two centuries, to frighten the peoples of the civilized world against issuing their own good money in sufficient quantities to carry on their necessary commerce. By this simple, but amazingly successful device these "money changers" - parasites in a busy world intent on creating and exchanging wealth - have been able to preserve for their private and exclusive right the monopoly of manufacturing an inferior substitute for money which they have hypnotized civilized nations into using, because of their pressing need to exchange goods and services. We shall never recover on credit. Even if it were obtainable, it is uncertain, unreliable, does not expand in accordance with demand, and contracts unexpectedly and for causes unrelated to the needs of commerce and industry.... In our present situation the issue of additional currency is the only way out.

Hemphill said the government needed to issue enough new, debt-free currency to replace what had been lost. Congressman Wright Patman went further: he urged the government to take over ownership and operation of the banks. In an address to Congress on March 13, 1933, he asked rhetorically:

Why is it necessary to have Government ownership and operation of banks? Let us go back to the Constitution of the United States and follow it.... The Constitution of the United States says that Congress shall coin money and regulate its value. That does not mean ... that the Congress of the United States, composed of the duly elected representatives of the people, have a right to farm out the great privilege to the banking system, until today a few powerful bankers control the issuance and distribution of money - something that the Constitution of the United States says Congress shall do.

Flanked on the right by the classical laissez-faire economists who said the money supply and the banking scheme should not be tampered with at all, and on the left by the radical reformers who said that the power to create money and perhaps even the banking system itself should be taken over by the government, Roosevelt took the middle road and opted for the Keynesian deficit spending alternative. He expanded the money supply, but he did it without unseating the private banking cartel.

Instead, Roosevelt tried to regulate the bankers. In 1934, the Federal Reserve System was overhauled to provide additional safeguards for the economy and the money supply. The old Federal Reserve Board was dissolved and replaced by a seven-member Board of Governors, appointed by the U.S. President for 14-year terms. The Board was given greatly increased powers, including the power to appoint the presidents of the 12 Federal Reserve Banks. The Open Market Committee was created, with one representative from each Federal Reserve Bank. It was empowered to inject new money into the economy by using newly-created money to purchase government bonds, and to remove old money from the economy by selling government bonds.

Chapter 17. Wright Patman Exposes the Money Machine

In his role as Chairman of the House Banking and Currency Committee, Patman penetrated the official Fedspeak to expose what was really going on. After a probing investigation of the Federal Reserve, he charged:

The Open Market Committee of the Federal Reserve System ... has the power to obtain, and does obtain, the printed money of the United States -- Federal Reserve Notes -- from the Bureau of Engraving and Printing, and exchanges these printed notes, which of course are not interest bearing, for United States government obligations that are interest bearing. After making the exchange, the interest bearing obligations are retained by the 12 Federal Reserve banks and the interest collected annually on these government obligations goes into the funds of the 12 Federal Reserve banks.... These funds are expended by the system without an adequate accounting to the Congress.

The Open Market Committee was the group formed in 1934 to take charge of "open market operations," the Fed's buying and selling of government securities (the bills, bonds and notes by which the government borrows money). Then as now, the Open Market Committee acquired Federal Reserve Notes from the Bureau of Engraving and Printing, essentially for the cost of printing them. The average cost today is about 4 cents per bill. In deft card-shark fashion, these dollar bills are then swapped for an equivalent stack of notes labeled Treasury securities. Turning Treasury securities (or debt) into "money" (Federal Reserve Notes) is called "monetizing" the debt. The government owes this money back to the Fed, although the Fed has advanced nothing but printed paper to earn it. In a revealing treatise called A Primer on Money, Patman concluded:

The Federal Reserve is a total moneymaking machine. It can issue money or checks. And it never has a problem of making its checks good because it can obtain the $5 and $10 bills necessary to cover its check simply by asking the Treasury Department's Bureau of Engraving to print them.

This statement was confirmed by Marriner Eccles, then Chairman of the Federal Reserve Board, in testimony before the House Banking and Currency Committee in 1935. Eccles acknowledged:

In purchasing offerings of Government bonds, the banking system as a whole creates new money, or bank deposits. When the banks buy a billion dollars of Government bonds as they are offered ... the banks credit the deposit account of the Treasury with a billion dollars. They debit their Government bond account a billion dollars; or they actually create, by a bookkeeping entry, a billion dollars.

Economist John Kenneth Galbraith would later comment, "The process by which banks create money is so simple that the mind is repelled." The mind is repelled because the process is sleight of hand and is completely foreign to what we have been taught. In a phenomenon called "cognitive dissonance," we can read the words and still doubt whether we have read them right. To make sure that we have, then, here is another credible source --

In 1993, National Geographic Magazine published an article by assistant editor Peter White titled "Do Banks Really Create Money Out of Thin Air?" White began by observing that 92 percent of the money supply consists, not of bills or coins, but of checkbook and other non-tangible money. To find out where this money comes from, he asked a Federal Reserve official, who said that every day, the Federal Reserve Bank of New York buys U.S. government securities from major banks and brokerage houses. That's if the Fed wants to expand the money supply. If it wants to contract the money supply, it sells government securities. White wrote:

Say today the Fed buys a hundred million dollars in Treasury bills from those big securities dealers, who keep a stock of them to trade with the public. When the Fed pays the dealers, a hundred million dollars will thereby be added to the country's money supply, because the dealers will be credited that amount by their banks, which now have that much more on deposit. But where did the Fed get that hundred million dollars? "We created it," a Fed official tells me. He means that anytime the central bank writes a check, so to speak, it creates money. "It's money that didn’t exist before," he says. Is there any limit on that? "No limit. Only the good judgement and the conscience of the responsible Federal Reserve people." And where did they get this vast authority? "It was delegated to them in the Federal Reserve Act of 1913, based on the Constitution, Article I, Section 8, 'Congress shall have the power ... to coin money, regulate the value thereof….’”
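The open-market purchase White describes can be sketched as a pair of balance-sheet entries. This is a simplified illustration, not the Fed's actual accounting; the account names are hypothetical:

```python
# Simplified sketch of an open-market purchase: the Fed buys $100 million
# of Treasury securities from a dealer, paying by crediting the dealer's
# bank with new reserves. No pre-existing money funds the purchase; the
# crediting entry itself creates it. (Account names are illustrative.)

fed = {"securities_held": 0, "reserve_liabilities": 0}   # central bank
bank = {"reserves_at_fed": 0, "dealer_deposit": 0}       # dealer's bank

def open_market_purchase(amount):
    fed["securities_held"] += amount        # Fed asset: the T-bills
    fed["reserve_liabilities"] += amount    # Fed liability: new reserves
    bank["reserves_at_fed"] += amount       # bank asset: reserves at Fed
    bank["dealer_deposit"] += amount        # bank liability: dealer's money

open_market_purchase(100_000_000)

# Both balance sheets expand by the same amount; the dealer's deposit
# is new money that did not exist before the purchase.
print(bank["dealer_deposit"])  # 100000000
```

Selling securities runs the same entries in reverse, extinguishing the reserves and deposits it had created, which is the "contracting the money supply" half of the operation.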

Andrew Jackson would probably have said "vipers and thieves!" He stressed that the Constitution gives Congress the power only to coin money; and if "coining" money means "creating" money, it gives that power only to Congress. The Tenth Amendment provides that powers not delegated to the United States or forbidden to the States are reserved to the States or the people. In 1935, the U.S. Supreme Court held that "Congress may not abdicate or transfer to others its legitimate functions." (Schechter Poultry Corp. v. United States, 295 U.S. 495, 55 S.Ct. 837, 842.)

The Real Windfall

After relentless agitation by Patman’s Committee, the Fed finally agreed to rebate most of the interest it received on its government bonds to the U.S. Treasury. Congressman Jerry Voorhis, another early Fed watchdog, said that the agreement was a tacit admission that the Fed wasn’t entitled to interest. It wasn’t entitled to interest because its own money wasn’t being lent. Fed apologists today argue that since the interest, or most of it, is now rebated to the government, no net advantage has accrued to the Fed. But that argument overlooks a far greater windfall to the banks that are the Fed's owners and real constituents. The bonds that have been acquired essentially for free become the basis of the Fed's "reserves" - the phantom money that is advanced many times over by commercial banks in the form of loans.

Virtually all money in circulation today can be traced to government debt that has been "monetized" by the Federal Reserve and the banking system. This money is then multiplied many times over in the form of bank loans. In 2006, M3 (the broadest measure of the money supply) was nearly $10 trillion, and the Treasury securities held by the Federal Reserve came to about one-tenth that sum. Thus the money supply has expanded by a factor of about 10 for every dollar of federal debt monetized by the Federal Reserve, and all of this monetary expansion consists of loans on which the banks have been paid interest. It is this interest, not the interest paid to the Federal Reserve, that is the real windfall to the banks - this and the fact that the banks now have a money-making machine to back them up whenever they get in trouble with their "fractional reserve" lending scheme. The Jekyll Island plan had worked beautifully: the bankers succeeded in creating a secret source of unlimited funds that could be tapped into whenever they were caught short. And to make sure their scheme remained a secret, they concealed this money machine in obscure Fedspeak that made the whole subject seem dull and incomprehensible to the uninitiated, and was misleading even to people who thought they understood it.

In The Creature from Jekyll Island, Ed Griffin writes that "modern money is a grand illusion conjured by the magicians of finance and politics." The function of the Federal Reserve, he says, "is to convert debt into money. It's just that simple." The mechanism may seem complicated at first, but "it is simple if one remembers that the process is not intended to be logical but to confuse and deceive." The process by which the Fed converts debt into money begins after the government's bonds are offered to the public at auction. Griffin explains:

[T]he Fed takes all the government bonds which the public does not buy and writes a check to Congress in exchange for them…. There is no money to back up this check. These fiat dollars are created on the spot for that purpose. By calling these bonds "reserves," the Fed then uses them as the base for creating additional dollars for every dollar created for the bonds themselves. The money created for the bonds is spent by the government, whereas the money created on top of those bonds is the source of all the bank loans made to the nation's businesses and individuals. The result of this process is the same as creating money on a printing press, but the illusion is based on an accounting trick rather than a printing trick.

The result is the same, with this difference: in the minds of most people, printing-press money is created by the government. The accounting trick that generates 99 percent of the U.S. money supply today is the sleight of hand of private banks.

The Magical Multiplying Reserves

The shell game devised by the seventeenth century goldsmiths is now called "fractional reserve" banking. The fraction of a bank's outstanding loans that must be held in "reserve" is called the "reserve requirement," and it is set by the Fed. The website of the Federal Reserve Bank of New York (FRBNY) explains:

Reserve requirements … are computed as percentages of deposits that banks must hold as vault cash or on deposit at a Federal Reserve Bank…. As of December 2006, the reserve requirement was 10% on transaction deposits, and there were zero reserves required for time deposits…. If the reserve requirement is 10%, for example, a bank that receives a $100 deposit may lend out $90 of that deposit. If the borrower then writes a check to someone who deposits the $90, the bank receiving that deposit can lend out $81. As the process continues, the banking system can expand the initial deposit of $100 into a maximum of $1,000 of money ($100 + $90 + $81 + $72.90 + … = $1,000).

It sounds reasonable enough, but let's have a closer look. First, some definitions: a time deposit is a bank deposit that cannot be withdrawn before a date specified at the time of deposit. Transaction deposit is a term used by the Federal Reserve for "checkable" deposits (deposits on which checks can be drawn) and other accounts that can be used directly as cash without withdrawal limits or restrictions. Transaction deposits are also called demand deposits: they can be withdrawn on demand at any time without notice. All checking accounts are demand deposits. Some savings accounts require funds to be kept on deposit for a minimum length of time, but most savings accounts also permit unlimited access to funds. As long as enough money is kept in "reserve" to satisfy depositors who come for their money, "transaction deposits" can be lent many times over. The 90 percent the bank lends is redeposited, and 90 percent of that is relent, in a process that repeats about 20 times, until the $100 becomes $1,000.
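The FRBNY's expansion arithmetic is just a geometric series: each round of relending passes on 90 percent of the previous deposit, and the rounds sum toward the initial deposit divided by the reserve ratio. A minimal sketch of that arithmetic, assuming only the 10 percent figure quoted above:

```python
# Illustrative sketch of the FRBNY deposit-expansion series: with a 10%
# reserve requirement, each redeposit allows 90% to be relent, so the
# chain is 100 + 90 + 81 + 72.90 + ... , converging toward 1,000.

def deposit_expansion(initial_deposit, reserve_ratio, rounds):
    """Sum the chain of deposits created as each loan is redeposited."""
    total, deposit = 0.0, float(initial_deposit)
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)  # fraction available to relend
    return total

print(deposit_expansion(100, 0.10, 200))  # approaches the limit of 1,000
print(100 / 0.10)                         # geometric-series closed form
```

The closed form (initial deposit ÷ reserve ratio) is why a 10 percent requirement implies a maximum tenfold expansion.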

But wait! These funds belong to the depositors and must remain available at all times for their own use. How can the money be available to the depositor and lent out at the same time? Obviously, it can't. The money is basically counterfeited in the form of loans. The 10 percent reserve requirement harkens back to the seventeenth century goldsmiths, who found through trial and error that depositors collectively would not come for more than about 10 percent of their money at one time. The money could therefore be lent 9 times over without anyone being the wiser. Today the scheme gets obscured because many banks are involved, but the collective result is the same: when the banks receive $1 million in deposits, they can "lend" not just $900,000 (90 percent of $1 million) but $9 million in computer-generated funds. As we'll see shortly, "reserves" are being phased out, so the multiple is actually higher than that; but to keep it simple, we'll use that figure. Consider this hypothetical case:

You live in a small town with only one bank. You sell your house for $100,000 and deposit the money into your checking account at the bank. The bank then advances 90 percent of this sum, or $90,000, to Miss White to buy a house from Mr. Black. The bank proceeds to collect from Miss White both the interest and the principal on this loan. Assume the prevailing interest rate is 6.25 percent. Interest at 6.25 percent on $90,000 over the life of a 30-year mortgage comes to $109,490. Miss White thus winds up owing $199,490 in principal and interest on the loan - not to you, whose money it allegedly was in the first place, but to the bank. (In practice, you probably wouldn't keep $100,000 in a checking account that paid no interest; you would invest it somewhere. But when the bank makes loans based on its collective checking account deposits, the result is the same: the bank keeps the interest.) Legally, Miss White has title to the house; but the bank becomes the effective owner until she pays off her mortgage.

Mr. Black now takes the $90,000 Miss White paid him for his house and deposits it into his checking account at the town bank. The bank adds $90,000 to its reserve balance at its Federal Reserve bank and advances 90 percent of this sum, or $81,000, to Mrs. Green, who wants to buy a house from Mr. Gray. Over 30 years, Mrs. Green owes the bank $81,000 in principal plus $98,541 in interest, or $179,541; and the bank has become the effective owner of another house until the loan is paid off.

Mr. Gray then deposits Mrs. Green's money into his checking account. The process continues until the bank has "lent" $900,000, on which it collects $900,000 in principal and $985,410 in interest, for a total of $1,885,410. The bank has thus created $900,000 out of thin air and has acquired effective ownership of a string of houses, at least temporarily, all from an initial $100,000 deposit, and it is owed $985,410 in interest on this loan. The $900,000 principal is extinguished by an entry on the credit side of the ledger when the loans are paid off; but the other half of this conjured $2 million - the interest - remains solidly in the coffers of the bank, and if any of the borrowers should default on their loans, the bank becomes the owner of the mortgaged property.
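The interest figures in this hypothetical follow from the standard fixed-payment amortization formula: payment = P·r / (1 − (1 + r)^−n), where r is the monthly rate and n the number of monthly payments. A sketch of that arithmetic (my own check, not part of the excerpt):

```python
# Verifying the mortgage-interest arithmetic in the hypothetical above:
# total interest on a fixed-rate, fully amortized loan.

def total_interest(principal, annual_rate, years):
    r = annual_rate / 12                         # monthly interest rate
    n = years * 12                               # number of payments
    payment = principal * r / (1 - (1 + r) ** -n)  # standard amortization formula
    return payment * n - principal

interest = total_interest(90_000, 0.0625, 30)
print(round(interest))  # close to the text's $109,490 figure
```

Running the same computation for Mrs. Green's $81,000 loan reproduces the roughly $98,500 interest figure as well.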

Instead of houses, let's try it with the $100 million in Treasury bills bought by the Fed in a single day in the National Geographic example, using $100 million in book-entry money created out of thin air. At a reserve requirement of 10 percent, $100 million can generate $900 million in loans. If the interest rate on these loans is 5 percent, the $900 million will return $45 million in interest to the banks that wrote the loans in the first year alone. At compound interest, then, a $100 million "investment" in money created out of thin air is doubled in about two years!
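The doubling claim is straightforward arithmetic: $900 million lent at 5 percent yields $45 million a year, a 45 percent annual return on the original $100 million, which compounds to a double in just under two years. A quick sketch:

```python
import math

# Sketch of the compounding arithmetic above: $100M of reserves supports
# $900M of loans at a 10% reserve requirement; 5% interest on those loans
# is a 45% annual return on the original $100M.

reserves = 100e6
loans = reserves * 9              # $900M lendable against 10% reserves
annual_interest = loans * 0.05    # $45M per year
ret = annual_interest / reserves  # return on the original "investment"

years_to_double = math.log(2) / math.log(1 + ret)
print(round(ret, 2), round(years_to_double, 1))  # 0.45 return; ~1.9 years
```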

To Audit or Abolish?

The Fed reports that 95 percent of its profits are now returned to the U.S. Treasury. But a review of its balance sheet, which is available on the Internet, shows that it reports as profits only the interest received from the federal securities it holds as reserves. No mention is made of the much greater windfall afforded to the banks that are the Fed's corporate owners, which use the securities as the "reserves" that get multiplied many times over in the form of loans. The Federal Reserve maintains that it is now audited every year by Price Waterhouse and the Government Accountability Office (GAO), an arm of Congress; but some functions remain off limits to the GAO, including its transactions with foreign central banks and its open market operations (the operations by which it creates money with accounting entries). Thus the Fed's most important - and most highly suspect - functions remain beyond public scrutiny.

Wright Patman proposed cleaning up the books by abolishing the Open Market Committee and nationalizing the Federal Reserve, reclaiming it as a truly federal agency under the auspices of Congress. The dollars the Fed created would then be government dollars, issued debt-free without increasing the debt burden of the country. Jerry Voorhis also advocated skipping the middleman and letting the government issue its own money. But neither proposal was passed by Congress. Rather, Patman was removed as head of the House Banking and Currency Committee, after holding that position for twelve years; and Voorhis lost the next California Congressional election to Richard Nixon, after being targeted by an aggressive smear campaign financed by the American Bankers’ Association.

Chapter 18. A Look Inside the Fed’s Playbook

Banks as Traders

Where do these "vast sums of borrowed money" come from? Although investment banks are not allowed to take in deposits or make loans of imaginary money based on "fractional reserves," commercial banks are. Now that the lines between these two forms of banking have become blurred, it is not hard to envision bank traders having ready access to some very favorable loans.

Thornton continues:

[M]any investment banks now do more trading than all but the biggest hedge funds, those lightly regulated investment pools that almost brought down the financial system in 1998 when one of them, Long-Term Capital Management, blew up. What's more, banks are jumping into the realm of private equity, spending billions to buy struggling businesses as far afield as China that they hope to turn around and sell at a profit.

Equity is ownership interest in a corporation, and the equity market is the stock market. These banks are not just investing in short-term Treasury bills on which they collect a modest interest, as commercial banks have traditionally done.  They are buying whole businesses with borrowed money, and they are doing it not to develop the productive potential of the business but just to reap a quick profit on resale.

Leading the attack in this lucrative new field, says Thornton, is the very successful investment bank Goldman Sachs, headed until recently by Henry Paulson Jr. Paulson left the firm to become U.S. Treasury Secretary in June 2006, but neither Goldman nor its cronies, Thornton says, are showing signs of easing up:

With $25 billion of capital under management, Goldman's private equity arm itself is one of the largest buyout firms in the world.... All of them are ramping up teams of so-called proprietary traders who play with the banks' own money.... Banks are paying up, offering some traders $10 million to $20 million a year.

The practice of buying whole corporations in order to bleed them of their profits has been given the less charitable name of "vulture capitalism." Why the term fits was underscored in a January 2006 article by Sean Corrigan called "Speculation in the Late Empire." He writes:

When the buy-out merchants and private equity partnerships can borrow what are effectively limitless sums of cheap, tax-advantaged debt with which to buy out corporate shareholders (not all of them willing sellers, remember); when they can then proceed to ruin the target business' balance sheet in a flash, by ordering payment of special dividends and by weighing it down with junk debt, in order to return their funds at the earliest juncture; when their pecuniary motives are mollified by so little pretense of undertaking any genuine entrepreneurial restructuring with which to enhance economic efficiency; when they can rake in an even greater haul of loot by selling the firm smartly back to the next debt-swollen suckers in line (probably into the little man's sagging pension funds via the inevitable, well-hyped IPO (initial public offering)); when they can scatter fees and commissions (and often political "contributions") liberally along the way - then we're clearly well past the point of reason or endorsement.

Noting the "outrageously skewed" incomes made by bank traders at the top of the field -- including Henry Paulson, who made over $30 million at Goldman Sachs the previous year -- Corrigan asks rhetorically:

Why train to be a farmer or a pharmacologist, when you can join Merrill Lynch and become a millionaire in your mid-20s, using someone else's "capital" and benefiting from being an insider in the great Ponzi scheme in which we live?

All major markets are now thought to be subject to the behind-the-scenes maneuverings of big financial players, and these manipulations are being done largely with what Corrigan calls "phantom money." A June 2006 article in Barron's noted that the bond market today is dominated by banks and government entities, and that they are not buying the bonds for their interest income. Rather, "The reality is that [they] are only interested in currency manipulation and market contrivement."

To understand what is really going on behind the scenes, we need to understand the tools used by Big Money to manipulate markets. In the next chapter, we'll take a look at the investment vehicle known as the "short sale," which underlies many of those more arcane tools known as "derivatives." A massive wave of short selling was blamed for turning the Roaring Twenties into the Great Depression. The same sort of manipulations are going on today under different names….

Chapter 19. Bear Raids and Short Sales: Devouring Capital Markets

The Nefarious, Ubiquitous Naked Short Sale

According to a November 2005 article in Time Magazine:

[N]aked short selling is illegal, barring certain exceptions for brokers trying to maintain an orderly market. In naked short selling, you execute the sale without borrowing the stock. The SEC noted in a report last year the "pervasiveness" of the practice. When not caught, this kind of selling has no limits and allows a seller to drive down a stock.

A May 2004 Dow Jones report confirmed that naked short selling is "a manipulative practice that can drive a company's stock price sharply lower." The exception that has turned the rule into a sham is a July 2005 SEC ruling allowing the practice by "market makers." A market maker is a bank or brokerage that stands ready to buy and sell a particular stock on a continuous basis at a publicly quoted price. The catch is that market makers are the brokers who actually do most of the buying and selling of stock today. Ninety-five percent of short sales are now done by broker-dealers and market makers. Market making is one of the lucrative pursuits of those ten giant U.S. banks called "money center banks," which currently hold almost half the country's total banking assets. (More on this in Chapter 34.)

A story run on FinancialWire in March 2005 underscored the pervasiveness and perniciousness of naked short selling. A man named Robert Simpson purchased all of the outstanding stock of a small company called Global Links Corporation, totaling a little over one million shares. He put all of this stock in his sock drawer, then watched as 60 million of the company's shares traded hands over the next two days. Every outstanding share changed hands nearly 60 times in those two days, although they were safely tucked away in his sock drawer. The incident substantiated allegations that a staggering number of "phantom" shares are being traded around by brokers in naked short sales. Short sellers are expected to "cover" by buying back the stock and returning it to the pool, but Simpson’s 60 million shares were obviously never bought back, since they were not available for purchase; and the same thing is believed to be going on throughout the market.

The role of market makers is supposedly to provide liquidity in the markets, match buyers with sellers, and ensure that there will always be someone to supply stock to buyers or to take stock off sellers' hands. The exception allowing them to engage in naked short selling is justified as being necessary to allow buyers and sellers to execute their orders without having to wait for real counterparties to show up. But if you want potatoes or shoes and your local store runs out, you have to wait for delivery. Why is stock investment different?

It has been argued that a highly liquid stock market is essential to ensure corporate funding and growth. That might be a good argument if the money actually went to the company, but that is not where it goes. The issuing company gets the money only when the stock is sold at an initial public offering (IPO). The stock exchange is a secondary market - investors buying from other stockholders, hoping they can sell the stock for more than they paid for it. Basically, it is gambling. Corporations have an easier time raising money through new IPOs if the buyers know they can turn around and sell their stock quickly; but in today's computerized global markets, real buyers should show up quickly enough without letting brokers sell stock they don't actually have to sell.

Short selling is sometimes justified as being necessary to keep a brake on the "irrational exuberance" that might otherwise drive popular stocks into dangerous "bubbles." But if that were a necessary feature of functioning markets, short selling would also be rampant in the markets for cars, television sets and computers, which it obviously isn't. The reason it isn't is that these goods can't be "hypothecated" or duplicated on a computer screen the way stock shares can. Like fractional reserve lending, short selling is made possible because the brokers are not dealing with physical things but are simply moving numbers around on a computer monitor. Any alleged advantages to a company from the liquidity afforded by short selling are offset by the serious harm this sleight of hand can do to companies targeted for take-down in bear raids.

Financial Weapons of Mass Destruction?

Short selling is the modern version of the counterfeiting scheme used to bring down the Continental in the 1770s. When a currency is sold short, its value is diluted just as it would be if the market were flooded with paper currency. The short sale is the basis of many of those sophisticated trades called "derivatives," which have become weapons for destroying competitor businesses by parasitic mergers and takeovers. Billionaire investor Warren Buffett calls derivatives "financial weapons of mass destruction." The term fits not only because these speculative bets are very risky for investors but because big institutional investors can use them to manipulate markets, cause massive currency devaluations, and force small vulnerable countries to do their bidding. Derivatives have been used to destroy the value of the national currencies of competitor countries, allowing national assets to be picked up at fire sale prices, just as the assets of the American public were snatched up by wealthy insiders after the crash of 1929. Defenders of free markets blame the targeted Third World countries for being unable to manage their economies, when the fault actually lies in a monetary scheme that opens their currencies to manipulation by foreign speculators who have access to a flood of "phantom money" borrowed into existence from foreign banks.

(The term "Third World" is now an anachronism, since there is no longer a "Second World" (the Soviet bloc). But the term is used here because it has a popularly understood meaning and is still widely used, and because the alternatives - "developing world" and "underdeveloped world" - may be misleading. Citizens of ancient Third World civilizations tend to consider their cultures more "developed" than some in the First World.)

To clarify all this, we'll take another short detour into the shady world of "finance capitalism," to shed some light on the obscure topic of derivatives and the hedge funds that largely trade in them.

Chapter 20. Hedge Funds and Derivatives: A Horse of a Different Color

In the 1920s, wealthy investors engaged in "pooling" - combining their assets to influence the markets for their collective benefit. Like trusts and monopolies, pooling was considered to be a form of collusive interference with the normal market forces of supply and demand. Hedge funds are the modern-day variants of this scheme. They are usually run in off-shore banking centers such as the Cayman Islands to avoid regulation. Off-shore funds are exempt from margin requirements that restrict trading on credit, and from uptick rules that limit short sales to assets that are rising in price.

Hedge funds were originally set up to "hedge the bets" of investors, insuring against currency or interest rate fluctuations; but they quickly became instruments for manipulation and control. Many of the largest hedge funds are run by former bank or investment bank dealers, who have left with the blessings of their former employers. The banks' investment money is then placed with the hedge funds, which can operate in a more unregulated environment than the banks can themselves. Hedge funds are now often responsible for over half the daily trading in the equity markets, due to their huge size and the huge amounts of capital funding them. That gives them an enormous amount of control over what the markets will do. In the fall of 2006, 8,282 of the 9,800 hedge funds operating worldwide were registered in the Cayman Islands, a British Overseas Territory with a population of 57,000 people. The Cayman Islands Monetary Authority gives each hedge fund at registration a 100-year exemption from any taxes, shelters the fund's activity behind a wall of official secrecy, allows the fund to self-regulate, and prevents other nations from regulating the funds.

Derivatives are key investment tools of hedge funds. Derivatives are basically side bets that some underlying investment (a stock, commodity, market, etc.) will go up or down. They are not really "investments," because they don’t involve the purchase of an asset. They are outside bets on what the asset will do. All derivatives are variations on futures trading, and all futures trading is inherently speculation or gambling. The more familiar types of derivatives include "puts" (betting the asset will go down) and "calls" (betting the asset will go up). Over 90 percent of the derivatives held by banks today, however, are "over-the-counter" derivatives - investment devices specially tailored to financial institutions, often having exotic and complex features, not traded on standard exchanges. They are not regulated, are hard to trace, and are very hard to understand. Some critics say they are impossible to understand, because they were designed to be so complex and obscure as to mislead investors.
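The two familiar derivative payoffs named above can be sketched in a few lines; the strike and spot prices here are hypothetical illustrations, not figures from the text:

```python
# Minimal illustration of the two basic derivative payoffs: a call profits
# when the asset rises above the strike price, a put profits when it falls
# below. Neither bet requires owning the underlying asset.

def call_payoff(spot, strike):
    """Value at expiry of a bet that the asset goes up."""
    return max(spot - strike, 0.0)

def put_payoff(spot, strike):
    """Value at expiry of a bet that the asset goes down."""
    return max(strike - spot, 0.0)

print(call_payoff(120.0, 100.0))  # 20.0 - the asset rose above the strike
print(put_payoff(80.0, 100.0))    # 20.0 - the asset fell below the strike
```

Exchange-traded puts and calls follow this simple shape; the "over-the-counter" instruments discussed above layer custom terms on top of such bets, which is what makes them hard to trace and value.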

At one time, tough rules regulated speculation of this sort. The Glass-Steagall Act passed during the New Deal separated commercial banking from securities trading; and the Commodities Futures Trading Commission (CFTC) was created in 1974 to regulate commodity futures and option markets and to protect market participants from price manipulation, abusive sales practices, and fraud. But again the speculators have managed to get around the rules. Derivative traders claim they are not dealing in "securities" or "futures" because nothing is being traded; and just to make sure, they induced Congress to empower the head of the CFTC to grant waivers to that effect, and they set up offshore hedge funds that remained small, unregistered and unregulated. They also had the Glass-Steagall Act repealed.

A Bubble on a Ponzi Scheme

Executive Intelligence Review (EIR), The New Federalist and The American Almanac are publications associated with Lyndon LaRouche, a political figure who is personally controversial but whose research staff was described by a former senior staffer of the National Security Council as "one of the best private intelligence services in the world." Their writings on the derivatives crisis are quite colorful and readable. In a 1998 interview, John Hoefle, the banking columnist for EIR, clarified the derivatives phenomenon like this:

During the 1980s, you had the creation of a huge financial bubble. ...[Y]ou could look at that as fleas who set up a trading empire on a dog.... They start pumping more and more blood out of the dog to support their trading, and then at a certain point, the amount of blood that they're trading exceeds what they can pump from the dog, without killing the dog. The dog begins to get very sick. So being clever little critters, what they do, is they switch to trading in blood futures. And since there's no connection - they break the connection between the blood available and the amount you can trade, then you can have a real explosion of trading, and that's what the derivatives market represents. And so now you've had this explosion of trading in blood futures which is going right up to the point that now the dog is on the verge of dying. And that's essentially what the derivatives market is. It's the last gasp of a financial bubble!

What has broken the connection between "the blood available and the amount you can trade" is that derivatives are not assets. They are just bets on what the asset will do, and the bet can be placed with very little "real" money down. Most of the money is borrowed from banks that create it on a computer screen as it is lent. The connection with reality has been severed so completely that the market for over-the-counter derivatives has now reached many times the money supply of the world. Since these private bets are unreported and unregulated, nobody knows exactly how much money is riding on them. However, the Bank for International Settlements (BIS) reported that in the first half of 2006, the "notional value" of derivative trades had soared to a record $370 trillion; and by December 2007, the figure was up to a breathtaking $681 trillion.

The notional value of a derivative is a hypothetical number described as "the number of units of an asset underlying the contract, multiplied by the spot price of the asset." Synonyms for "notional" include "fanciful, not based on fact, dubious, imaginary." Just how fanciful these values are is evident from the numbers: $681 trillion is over 50 times the $13 trillion gross domestic product (GDP) of the entire U.S. economy. In 2006, the total GDP of the world was only $66 trillion - one-tenth the "notional" value of derivative trade in 2007. In a September 2006 article in MarketWatch, Thomas Kostigen wrote:

[I]t's worth wondering how so much extra value can be squeezed out of instruments that are essentially fake.... Wall Street manufactures these products and trades them in a rather shadowy way that keeps the average investor in the dark. You cannot exactly look up the price of an equity derivative in your daily newspaper's stock table.... [I]t wouldn't take all that much to create a domino effect of market mishap. And there is no net. The Securities Investor Protection Corporation, which insures brokerage accounts in the event of a brokerage-firm failure, recently announced its reserves. It has about $1.38 billion. That may sound like a lot. Compared with half a quadrillion, it's a pittance. Scary but true.

How are these astronomical sums even possible? The answer, again, is that derivatives are just bets, and gamblers can bet any amount they want. Gary Novak is a scientist with a website devoted to simplifying complex issues. He writes, "It's like two persons flipping a coin for a trillion dollars, and afterwards someone owes a trillion dollars which never existed." He calls it "funny money." Like the Mississippi Bubble, the derivatives bubble is built on something that doesn't really exist; and when the losers cannot afford to pay up on their futures bets, the scheme must collapse. Either that, or the taxpayers will be saddled with the bill for the largest bailout in history.

In a report presented at the request of the House Committee on Banking, Finance and Urban Affairs in 1994, Christopher White used some other vivid imagery for the derivatives affliction. He wrote:

The derivatives market ... is the greatest bubble in history. It dwarfs the Mississippi Bubble in France and the South Sea Island bubble in England. This bubble, like a cancer, has penetrated and taken over the entirety of our banking and credit system; there is no major commercial bank, investment bank, mutual fund, etc. that is not dependent on derivatives for its existence. These derivatives suck the life's blood out of our economy. Our farms, our factories, our nation's infrastructure, our living standards are being sucked dry to pay off interest payments, dividend yields as well as other earnings on the bubble.

How speculation in derivatives draws much-needed capital away from domestic productivity was explained by White with another analogy:

It would be like going to the horse races to bet, not on the race, but on the size of the pot. Who would care about what's involved with getting the runners to the starting gate?

Since the gamblers don't care who wins, they aren't interested in feeding the horses or hiring stable hands. They are only interested in money making money. Today more money can be had at less risk by speculation in derivatives than by investing in the growth of a business, and this is particularly true if you are a very big bank with the ability to influence the way the bet goes. The Office of the Comptroller of the Currency reported that in mid-2006, there were close to 9,000 commercial and savings banks in the United States; yet 97 percent of U.S. bank-held derivatives were concentrated in the hands of just five banks. Topping the list were JPMorgan Chase and Citibank, the citadels of the Morgan and Rockefeller empires.

How Can a Bank Go Bankrupt?

Individual profiteering aside, however, banks are clearly taking a risk when they extend credit. Bankers will therefore argue that they deserve the interest they get on these loans, even if they did conjure the money out of thin air. Somebody has to create the national money supply. Why not the bankers?

One problem with the current system is that the government itself has been seduced into borrowing money created out of nothing and paying interest on it, when the government could have created the funds itself, debt- and interest-free. In the case of government loans, the banks take virtually no risk, since the government is always good for the interest; and the taxpayers get saddled with a crippling debt that could have been avoided.

Another problem with the fractional reserve system is simply in the math. Since all money except coins comes into existence as a debt to private banks, and the banks create only the principal when they make loans, there is never enough money in the economy to repay principal plus interest on the nation's collective debt. When the money supply was tethered to gold, this problem was resolved through periodic waves of depression and default that wiped the slate clean and started the cycle all over again. Although it was a brutal system for the farmers and laborers who got wiped out, and it allowed a financier class to get progressively richer while the actual producers got poorer, it did succeed in lending a certain stability to the money supply. Today, however, the Fed has taken on the task of preventing depressions, something it does by pumping more and more credit-money into the economy by funding a massive federal debt that no one ever expects to have to repay; and all this credit-money is advanced at interest. At some point, the interest bill alone must exceed the taxpayers' ability to pay it; and according to U.S. Comptroller General David Walker, that day of reckoning is only a few years away. We have reached the end of the line on the debt-money train and will have to consider some sort of paradigm shift if the economy is to survive.

A third problem with the current system is that giant international banks are now major players in global markets, not just as lenders but as investors. Banks have a grossly unfair advantage in this game, because they have access to so much money that they can influence the outcome of their bets. If you, the individual investor, sell a stock short, your modest investment won't do much to influence the stock's price; but a mega-bank and its affiliates can short so much stock that the value plunges. If the bank is one of those lucky institutions considered "too big to fail," it can rest easy even if its bet does go wrong, since the FDIC and the taxpayers will bail it out from its folly. In the case of international loans, the International Monetary Fund will bail it out. In Sean Corrigan's descriptive prose:

[W]hen financiers and traders get paid enough to make Croesus kvetch for taking wholly asymmetric risks with phantom capital - risks underwritten by government institutions like the Fed and the FDIC.... - this is not exactly a fair card game.

For every winner in this game played with phantom capital, there is a loser; and the biggest losers are those Third World countries that have been seduced into opening their financial markets to currency manipulation, allowing them to be targeted in powerful speculative raids that can and have destroyed their currencies and their economies. Lincoln's economist Henry Carey said that the twin weapons used by the British empire to colonize the world were the "gold standard" and "free trade." The gold standard has now become the petrodollar standard, as we'll see in the next chapter; but the game is still basically the same: crack open foreign markets in the name of "free trade," take down the local currency, and put the nation's assets on the block at fire sale prices. The first step in this process is to induce the country to accept foreign loans and investment. The loan money gets dissipated but the loans must be repaid. In the poignant words of Brazilian President Luiz Inacio Lula da Silva:

The Third World War has already started.... The war is tearing down Brazil, Latin America, and practically all the Third World. Instead of soldiers dying, there are children. It is a war over the Third World debt, one which has as its main weapon, interest, a weapon more deadly than the atom bomb, more shattering than a laser beam.

The Third World is fighting back, in a war it thinks was started by the First World; but the governments of the First World are actually victims as well. As Dr. Quigley revealed, the secret of the international bankers' success is that they have managed to control national money systems while letting them appear to be controlled by governments. The U.S. government itself is the puppet of invisible puppeteers….

Chapter 24. Sneering at Doom: Germany Finances a War without Money

The German people were in such desperate straits that they relinquished control of the country to a dictator, and in this they obviously deviated from the "American system," which presupposed a democratically-governed Commonwealth. But autocratic authority did give Adolf Hitler something the American Greenbackers could only dream about - total control of the economy. He was able to test their theories, and he proved that they worked. Like Lincoln, Hitler faced a choice: either submit to total debt slavery or create his own fiat money; and like Lincoln, he chose the fiat solution. He implemented a plan of public works along the lines proposed by Jacob Coxey and the Greenbackers in the 1890s. Projects earmarked for funding included flood control, repair of public buildings and private residences, and construction of new buildings, roads, bridges, canals, and port facilities. The projected cost of the various programs was fixed at one billion units of the national currency. One billion noninflationary bills of exchange, called Labor Treasury Certificates, were then issued against this cost. Millions of people were put to work on these projects, and the workers were paid with the Treasury Certificates. The workers then spent the certificates on goods and services, creating more jobs for more people. The certificates were also referred to as MEFO bills, or sometimes as "Feder money." They were not actually debt-free; they were issued as bonds, and the government paid interest on them. But they circulated as money and were renewable indefinitely, and they avoided the need to borrow from international lenders or to pay off international debts.

Within two years, the unemployment problem had been solved and the country was back on its feet. It had a solid, stable currency and no inflation, at a time when millions of people in the United States and other Western countries were still out of work and living on welfare. Germany even managed to restore foreign trade, although it was denied foreign credit and was faced with an economic boycott abroad. It did this by using a barter system: equipment and commodities were exchanged directly with other countries, circumventing the international banks. This system of direct exchange occurred without debt and without trade deficits. Germany's economic experiment, like Lincoln's, was short-lived; but it left some lasting monuments to its success, including the famous Autobahn, the world’s first extensive superhighway.

According to Stephen Zarlenga, in The Lost Science of Money, Hitler was exposed to the fiat-money solution when he was assigned by German Army intelligence to watch the German Workers' Party after World War I. He attended a meeting that made a deep impression on him, at which the views of Gottfried Feder were propounded:

The basis of Feder's ideas was that the state should create and control its money supply through a nationalized central bank rather than have it created by privately owned banks, to whom interest would have to be paid. From this view derived the conclusion that finance had enslaved the population by usurping the nation's control of money.

Zarlenga traces the idea that the state should create its own money to German theorists who had apparently studied the earlier American Greenback movement. Where Feder and Hitler diverged from the American Greenbackers was in equating the financiers who had enslaved the population with the ethnic race of the prominent bankers of the day. The result was to encourage a wave of anti-Semitism that darkened Germany and blackened its leader's name. The nineteenth-century Greenbackers saw more clearly what the true enemy was - not an ethnic group but a financial scheme, one that transferred the power to create money from the collective body of the people to a private banking elite. The terrible human rights violations Germany fell into could have been avoided by a stricter adherence to the "American system," keeping the reins of power with the people themselves.

While Hitler clearly deserved the opprobrium heaped on him for his later military and racial aggressions, he was enormously popular with the German people, at least for a time. Zarlenga suggests that this was because he temporarily rescued Germany from English economic theory - the theory that money must be borrowed against the gold reserves of a private banking cartel rather than issued outright by the government. Again, the reasons for war are complex; but Zarlenga postulates one that is not found in the history books:

Perhaps [Germany] was expected to borrow gold internationally, and that would have meant external control over her domestic policies. Her decision to use alternatives to gold, would mean that the international financiers would be unable to exercise this control through the international gold standard … and this may have led to controlling Germany through warfare instead.

Dr. Henry Makow, a Canadian researcher, adds some evidence for this theory. He quotes from the 1938 interrogation of C. G. Rakovsky, one of the founders of Soviet Bolshevism and a Trotsky intimate, who was tried in show trials in the USSR under Stalin. Rakovsky maintained that Hitler had actually been funded by the international bankers through their agent Hjalmar Schacht in order to control Stalin, who had usurped power from their agent Trotsky. But Hitler had become an even bigger threat than Stalin when he took the bold step of creating his own money. Rakovsky said:

[Hitler] took over for himself the privilege of manufacturing money and not only physical moneys, but also financial ones; he took over the untouched machinery of falsification and put it to work for the benefit of the state…. Are you capable of imagining what would have come ... if it had infected a number of other states and brought about the creation of a period of autarchy. If you can, then imagine its counterrevolutionary functions….

Autarchy is a national economic policy that aims at achieving self-sufficiency and eliminating the need for imports. Countries that take protectionist measures and try to prevent free trade are sometimes described as autarchical. Rakovsky's statement recalls the editorial attributed to The London Times, warning that if Lincoln's Greenback plan were not destroyed, "that government will furnish its own money without cost. It will pay off debts and be without a debt. It will have all the money necessary to carry on its commerce. It will become prosperous beyond precedent in the history of the civilized governments of the world." Germany was well on its way to achieving those goals. Henry C. K. Liu writes of the country's remarkable transformation:

The Nazis came to power in Germany in 1933, at a time when its economy was in total collapse, with ruinous war-reparation obligations and zero prospects for foreign investment or credit. Yet through an independent monetary policy of sovereign credit and a full-employment public-works program, the Third Reich was able to turn a bankrupt Germany, stripped of overseas colonies it could exploit, into the strongest economy in Europe within four years, even before armament spending began.

In Billions for the Bankers, Debts for the People (1984), Sheldon Emry also credited Germany's startling rise from bankruptcy to a world power to its decision to issue its own money. He wrote:

Germany financed its entire government and war operation from 1935 to 1945 without gold and without debt, and it took the whole Capitalist and Communist world to destroy the German power over Europe and bring Europe back under the heel of the Bankers. Such history of money does not even appear in the textbooks of public (government) schools today.

What does appear in modern textbooks is the disastrous runaway inflation suffered in 1923 by the Weimar Republic (the common name for the republic that governed Germany from 1919 to 1933). The radical devaluation of the German mark is cited as the textbook example of what can go wrong when governments are given the unfettered power to print money. That is what it is cited for; but again, in the complex world of economics, things are not always as they seem….

Another Look at the Weimar Hyperinflation

The Weimar financial crisis began with the crushing reparations payments imposed by the Treaty of Versailles. Hjalmar Schacht, who was currency commissioner for the Republic, complained:

The Treaty of Versailles is a model of ingenious measures for the economic destruction of Germany. ... [T]he Reich could not find any way of holding its head above the water other than by the inflationary expedient of printing bank notes.

That is what he said at first; but Zarlenga writes that Schacht proceeded in his 1967 book The Magic of Money "to let the cat out of the bag, writing in German, with some truly remarkable admissions that shatter the 'accepted wisdom' the financial community has promulgated on the German hyperinflation." Schacht revealed that it was the privately-owned Reichsbank, not the German government, that was pumping new currency into the economy. Like the U.S. Federal Reserve, the Reichsbank was overseen by appointed government officials but was operated for private gain. The mark's dramatic devaluation began soon after the Reichsbank was "privatized," or delivered to private investors. What drove the wartime inflation into hyperinflation, said Schacht, was speculation by foreign investors, who would sell the mark short, betting on its decreasing value. Recall that in a short sale, speculators borrow something they don't own, sell it, then "cover" by buying it back at the lower price. Speculation in the German mark was made possible because the Reichsbank made massive amounts of currency available for borrowing, marks that were created on demand and lent out at interest, at a profit to the bank. When the Reichsbank could not keep up with the voracious demand for marks, other private banks were allowed to create them out of nothing and lend them at interest as well.
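The short-sale mechanics the passage recalls reduce to simple arithmetic. The following is a minimal sketch with invented quantities and prices, purely for illustration (the historical speculation was in borrowed marks rather than shares):

```python
# Minimal sketch of a short sale: borrow an asset, sell it immediately,
# buy it back ("cover") after the price falls, and return the borrowed
# asset, pocketing the difference.

def short_sale_profit(quantity, sell_price, cover_price):
    proceeds = quantity * sell_price        # sale of the borrowed asset now
    cost_to_cover = quantity * cover_price  # repurchase later to repay the loan
    return proceeds - cost_to_cover         # positive if the price fell

# Hypothetical figures: borrow 1,000 units, sell at 25, cover at 10
profit = short_sale_profit(1_000, 25, 10)
# profit == 15000
```

Note that the profit scales with both the position size and the size of the price drop, which is why, as the earlier chapter argued, a speculator large enough to move the price can help bring about the very fall it is betting on.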

According to Schacht, not only was the government not the cause of the Weimar hyperinflation, but it was the government that got the disaster under control. The Reichsbank was put under strict regulation, and prompt corrective measures were taken to eliminate foreign speculation by eliminating easy access to loans of bank-created money. Hitler then got the country back on its feet with his MEFO bills issued by the government.

Schacht actually disapproved of the new government-issued money and wound up getting fired as head of the Reichsbank when he refused to issue it, something that may have saved him at the Nuremberg trials. But he acknowledged in his later memoirs that Feder's theories had worked. Allowing the government to issue the money it needed had not produced the price inflation predicted by classical economic theory. Schacht surmised that this was because factories were sitting idle and people were unemployed. In this he agreed with Keynes: when the resources were available to increase productivity, adding money to the economy did not increase prices; it increased goods and services. Supply and demand increased together, leaving prices unaffected.

These revelations put the notorious hyperinflations of modern history in a different light….

Chapter 25.  Another Look at the Inflation Humbug: Some “Textbook” Hyperinflations Revisited

The Ruble Collapse in Post-Soviet Russia

The usual explanation for the drastic runaway inflation that afflicted Russia and its former satellites following the fall of the Iron Curtain is that their governments resorted to printing money, diluting the money supply and driving up prices. But as William Engdahl shows in A Century of War, this is not what was actually going on. Rather, hyperinflation was a direct and immediate result of letting their currencies float in foreign exchange markets. He writes:

In 1992 the IMF demanded a free float of the Russian ruble as part of its "market-oriented" reform. The ruble float led within a year to an increase in consumer prices of 9,900 per cent, and a collapse in real wages of 84 per cent. For the first time since 1917, at least during peacetime, the majority of Russians were plunged into existential poverty.... Instead of the hoped-for American-style prosperity, two-cars-in-every-garage capitalism, ordinary Russians were driven into economic misery.

After the Berlin Wall came down, the IMF was put in charge of the market reforms that were supposed to bring the former Soviet countries in line with the Western capitalist economies that were dominated by the dollars of the private Federal Reserve and private U.S. banks. The Soviet people acquiesced, lulled by dreams of the sort of prosperity they had seen in the American movies. But Engdahl says it was all a deception:

The aim of Washington's IMF "market reforms" in the former Soviet Union was brutally simple: destroy the economic ties that bound Moscow to each part of the Soviet Union … IMF shock therapy was intended to create weak, unstable economies on the periphery of Russia, dependent on Western capital and on dollar inflows for their survival – a form of neocolonialism.... The Russians were to get the standard Third World treatment ... IMF conditionalities and a plunge into poverty for the population. A tiny elite were allowed to become fabulously rich in dollar terms, and manipulable by Wall Street bankers and investors.

It was an intentional continuation of the Cold War by other means -- entrapping the economic enemy with loans of accounting-entry money. Interest rates would then be raised to unpayable levels, and the IMF would be put in charge of "reforms" that would open the economy to foreign exploitation in exchange for debt relief. Engdahl writes:

The West, above all the United States, clearly wanted a deindustrialized Russia, to permanently break up the economic structure of the old Soviet Union. A major area of the global economy, which had been largely closed to the dollar domain for more than seven decades, was to be brought under its control. ... The new oligarchs were "dollar oligarchs."

The Collapse of Yugoslavia and the Ukraine

Things were even worse in Yugoslavia, which suffered what has been called the worst hyperinflation in history in 1993-94. Again, the textbook explanation is that the government was madly printing money. As one college economics professor put it:

After Tito [the Yugoslavian Communist leader until 1980], the Communist Party pursued progressively more irrational economic policies. These policies and the breakup of Yugoslavia ... led to heavier reliance upon printing or otherwise creating money to finance the operation of the government and the socialist economy. This created the hyperinflation.

That was the conventional view, but Engdahl maintains that the reverse was actually true: the Yugoslav collapse occurred because the IMF prevented the government from obtaining the credit it needed from its own central bank. Without the ability to create money and issue credit, the government was unable to finance social programs and hold its provinces together as one nation. The country's real problem was not that its economy was too weak but that it was too strong. Its "mixed model" combining capitalism and socialism was so successful that it threatened the bankers' IMF/shock therapy model. Engdahl states:

For over 40 years, Washington had quietly supported Yugoslavia, and the Tito model of mixed socialism, as a buffer against the Soviet Union. As Moscow's empire began to fall apart, Washington had no more use for a buffer - especially a nationalist buffer which was economically successful, one that might convince neighboring states in eastern Europe that a middle way other than IMF shock therapy was possible. The Yugoslav model had to be dismantled, for this reason alone, in the eyes of top Washington strategists. The fact that Yugoslavia also lay on a critical path to the potential oil riches of central Asia merely added to the argument.

Yugoslavia was another victim of the Tequila Trap - the lure of wealth and development if it would open its economy to foreign investment and foreign loans. According to a 1984 Radio Free Europe report, Tito had made the mistake of allowing the country the "luxury" of importing more goods than it exported, and of borrowing huge sums of money abroad to construct hundreds of factories that never made a profit. When the dollars were not available to pay back these loans, Yugoslavia had to turn to the IMF for debt relief. The jaws of the whale then opened, and Yugoslavia disappeared within.

As a condition of debt relief, the IMF demanded wholesale privatization of the country's state enterprises. The result was to bankrupt more than 1,100 companies and produce more than 20 percent unemployment. IMF policies caused inflation to rise dramatically, until by 1991 it was over 150 percent. When the government was not able to create the money it needed to hold its provinces together, economic chaos followed, causing each region to fight for its own survival. Engdahl states:

Reacting to this combination of IMF shock therapy and direct Washington destabilization, the Yugoslav president, Serb nationalist Slobodan Milosevic, organized a new Communist Party in November 1990, dedicated to preventing the breakup of the federated Yugoslav Republic. The stage was set for a gruesome series of regional ethnic wars which would last a decade and result in the deaths of more than 200,000 people.

... In 1992 Washington imposed a total economic embargo on Yugoslavia, freezing all trade and plunging the economy into chaos, with hyperinflation and 70 percent unemployment as the result. The Western public, above all in the United States, was told by establishment media that the problems were all the result of a corrupt Belgrade dictatorship.

Similar interventions precipitated runaway inflation in the Ukraine, where the IMF "reforms" began with an order to end state foreign exchange controls in 1994. The result was an immediate collapse of the currency. The price of bread shot up 300 percent; electricity shot up 600 percent; public transportation shot up 900 percent. State industries that were unable to get bank credit were forced into bankruptcy. As a result, says Engdahl:

Foreign speculators were free to pick the jewels among the rubble at dirt-cheap prices.... The result was that Ukraine, once the breadbasket of Europe, was forced to beg food aid from the U.S., which dumped its grain surpluses on Ukraine, further destroying local food self-sufficiency. Russia and the states of the former Soviet Union were being treated like the Congo or Nigeria, as sources of cheap raw materials, perhaps the largest sources in the world.... [T]hose mineral riches were now within the reach of Western multinationals for the first time since 1917.

Chapter 26. Poppy Fields, Opium Wars, and Asian Tigers

The Assault of the Wall Street Speculators

The Japanese-guided market system was so effective and efficient that by the end of the 1980s, Japan was regarded as the leading economic and banking power in the world.  Its Ministry of International Trade and Industry (MITI) played a heavy role in guiding international economic development. The model also proved highly successful in the "Tiger" economies -- South Korea, Malaysia and other East Asian countries. East Asia was built up in the 1970s and 1980s by Japanese state development aid, along with largely private investment and MITI support. When the Soviet Union collapsed, Japan proposed its model for the former communist economies, and many began looking to Japan and South Korea as viable alternatives to the U.S. free-market system. State-guided capitalism provided for the general welfare without destroying capitalist incentive. Engdahl writes:

The Tiger economies were a major embarrassment to the IMF free-market model. Their very success in blending private enterprise with a strong state economic role was a threat to the IMF free-market agenda. So long as the Tigers appeared to succeed with a model based on a strong state role, the former communist states and others could argue against taking the extreme IMF course. In East Asia during the 1980s, economic growth rates of 7-8 per cent per year, rising social security, universal education and a high worker productivity were all backed by state guidance and planning, albeit in a market economy - an Asian form of benevolent paternalism.

High economic growth, rising social security, and universal education in a market economy - it was the sort of "Common Wealth" America's Founding Fathers had endorsed. But the model represented a major threat to the international bankers' system of debt-based money and IMF loans. To defuse the threat, the Bank of Japan was pressured by Washington to take measures that would increase the yen's value against the dollar. The stated rationale was that this revaluation was necessary to reduce Japan's huge capital surplus (excess of exports over imports). The Japanese Ministry of Finance countered that the surplus, far from being a problem, was urgently required by a world needing hundreds of billions of dollars in railroad and other economic infrastructure after the Cold War. But the Washington contingent prevailed, and Japan went along with the program. By 1987, the Bank of Japan had cut interest rates to a low of 2.5 per cent. The result was a flood of "cheap" money that was turned into quick gains on the rising Tokyo stock market, producing an enormous stock market bubble. When the Japanese government cautiously tried to deflate the bubble by raising interest rates, the Wall Street bankers went on the attack, using their new "derivative" tools to sell the market short and bring it crashing down. Engdahl writes:

No sooner did Tokyo act to cool down the speculative fever, than the major Wall Street investment banks, led by Morgan Stanley and Salomon Bros., began using exotic new derivatives and financial instruments. Their intervention turned the orderly decline of the Tokyo market into a near panic sell-off, as the Wall Street bankers made a killing on shorting Tokyo stocks in the process. Within months, Japanese stocks had lost nearly $5 trillion in paper value.

Japan, the "lead goose," had been seriously wounded. Washington officials proclaimed the end of the "Japanese model" and turned their attention to the flock of Tiger economies flying in formation behind.

Taking Down the Tiger Economies: The Asian Crisis of 1997

Until then, the East Asian countries had remained largely debt-free, avoiding reliance on IMF loans or foreign capital except for direct investment in manufacturing plants, usually as part of a long-term national goal. But that was before Washington began demanding that the Tiger economies open their controlled financial markets to free capital flows, supposedly in the interest of "level playing fields." Like Japan, the East Asian countries went along with the program. The institutional speculators then went on the attack, armed with a secret credit line from a group of international banks including Citigroup.

They first targeted Thailand, gambling that it would be forced to devalue its currency and break from its peg to the dollar. Thailand capitulated, its currency was floated, and it was forced to turn to the IMF for help. The other geese then followed one by one. Chalmers Johnson wrote in The Los Angeles Times in June 1999:

The funds easily raped Thailand, Indonesia and South Korea, then turned the shivering survivors over to the IMF, not to help victims, but to insure that no Western bank was stuck with nonperforming loans in the devastated countries.

Mark Weisbrot testified before Congress, "In this case the IMF not only precipitated the financial crisis, it also prescribed policies that sent the regional economy into a tailspin." The IMF had prescribed the removal of capital controls, opening Asian markets to speculation by foreign investors, when what these countries really needed was a supply of foreign exchange reserves to defend themselves against speculative currency raids. At a meeting of regional finance ministers in 1997, the government of Japan proposed an Asian Monetary Fund (AMF) that would provide the needed liquidity with fewer conditions than were imposed by the IMF. But the AMF, which would have directly competed with the IMF of the Western bankers, met with strenuous objection from the U.S. Treasury and failed to materialize. Meanwhile, the IMF failed to provide the necessary reserves, while insisting on very high interest rates and "fiscal austerity." The result was a liquidity crisis (a lack of available money) that became a major regional depression. Weisbrot testified:

The human cost of this depression has been staggering. Years of economic and social progress are being negated, as the unemployed vie for jobs in sweatshops that they would have previously rejected, and the rural poor subsist on leaves, bark, and insects. In Indonesia, the majority of families now have a monthly income less than the amount that they would need to buy a subsistence quantity of rice, and nearly 100 million people - half the population - are being pushed below the poverty line.

In 1997, more than 100 billion dollars of Asia's hard currency reserves were transferred in a matter of months into private financial hands. In the wake of the currency devaluations, real earnings and employment plummeted virtually overnight. The result was mass poverty in countries that had previously been experiencing real economic and social progress. Indonesia was ordered by the IMF to unpeg its currency from the dollar barely three months before the dramatic plunge of the rupiah, its national currency. In an article in Monetary Reform in the winter of 1998-99, Professor Michel Chossudovsky wrote:

This manipulation of market forces by powerful actors constitutes a form of financial and economic warfare. No need to re-colonize lost territory or send in invading armies. In the late twentieth century, the outright "conquest of nations," meaning the control over productive assets, labor, natural resources and institutions, can be carried out in an impersonal fashion from the corporate boardroom: commands are dispatched from a computer terminal, or a cell phone. Relevant data are instantly relayed to major financial markets - often resulting in immediate disruptions in the functioning of national economies. "Financial warfare" also applies complex speculative instruments including the gamut of derivative trade, forward foreign exchange transactions, currency options, hedge funds, index funds, etc. Speculative instruments have been used with the ultimate purpose of capturing financial wealth and acquiring control over productive assets.

Professor Chossudovsky quoted American billionaire Steve Forbes, who asked rhetorically:

Did the IMF help precipitate the crisis? This agency advocates openness and transparency for national economies, yet it rivals the CIA in cloaking its own operations. Did it, for instance, have secret conversations with Thailand, advocating the devaluation that instantly set off the catastrophic chain of events? ... Did IMF prescriptions exacerbate the illness? These countries' monies were knocked down to absurdly low levels.

Chossudovsky warned that the Asian crisis marked the elimination of national economic sovereignty and the dismantling of the Bretton Woods institutions safeguarding the stability of national economies. Nations no longer have the ability to control the creation of their own money, which has been usurped by marauding foreign banks.

Malaysia Fights Back

Most of the Asian geese succumbed to these tactics, but Malaysia stood its ground. Malaysian Prime Minister Mahathir Mohamad said the IMF was using the financial crisis to enable giant international corporations to take over Third World economies. He contended:

They see our troubles as a means to get us to accept certain regimes, to open our market to foreign companies to do business without any conditions. [The IMF] says it will give you money if you open up your economy, but doing so will cause all our banks, companies and industries to belong to foreigners....

They call for reform but this may result in millions thrown out of work. I told the top official of IMF that if companies were to close, workers will be retrenched, but he said this didn't matter as bad companies must be closed. I told him the companies became bad because of external factors, so you can't bankrupt them as it was not their fault. But the IMF wants the companies to go bankrupt.

Mahathir insisted that his government had not failed. Rather, it had been victimized along with the rest of the region by the international system. He blamed the collapse of Asia's currencies on an orchestrated attack by giant international hedge funds. Because they profited from relatively small differences in asset values, the speculators were prepared to create sudden, massive and uncontrollable outflows of capital that would wreck national economies by causing capital flight. He charged, "This deliberate devaluation of the currency of a country by currency traders purely for profit is a serious denial of the rights of independent nations." Mahathir said he had appealed to the international agencies to regulate currency trading to no avail, so he had been forced to take matters into his own hands. He had imposed capital and exchange controls, a policy aimed at shifting the focus from catering to foreign capital to encouraging national development. He fixed the exchange rate of the ringgit (the Malaysian national currency) and ordered that it be traded only in Malaysia. These measures did not affect genuine investors, he said, who could bring in foreign funds, convert them into ringgit for local investment, and apply to the Central Bank to convert their ringgit back into foreign currency as needed.

Western economists waited for the economic disaster they assumed would follow; but capital controls actually helped to stabilize the system. Before controls were imposed, Malaysia's economy had contracted by 7.5 percent. The year afterwards, growth projections went as high as 5 percent. Joseph Stiglitz, chief economist for the World Bank, acknowledged in 1999 that the Bank had been "humbled" by Malaysia's performance. It was a tacit admission that the World Bank's position had been wrong.

David had stood up to Goliath, but the real threat to the international bankers was Malaysia's much more powerful neighbor to the north. The Chinese Dragon was not only still standing; it was breathing fire….

Chapter 27. Waking the Sleeping Giant: Lincoln’s Greenback System Comes to China

The Mystery of Chinese Productivity

In the eighteenth century, Benjamin Franklin surprised his British listeners with tales of the booming economy in the American colonies, something he credited to the new paper fiat money issued debt-free by provincial governments. In a May 2005 article titled "The Mystery of Mr. Wu," Greg Grillot gave a modern-day variant of this story involving a recent visit to China. He said he and a companion named Karim had interviewed a retired architect named Mr. Wu on his standard of living. Mr. Wu was asked through an interpreter, "How has your standard of living changed in the last two decades?" The interpreter responded, "Thirteen years ago, his pension was 250 yuan a month. Now it is 2,500 yuan. He recently had a cash offer to buy his home for US$300,000, which he's lived in for 50 years." Karim remarked to his companion, "Greg, something doesn't add up here. His pension shot up 900% in 13 years while inflation snoozed at 2-5% per annum. How could the government pay him that much more in such a short period of time?" Grillot commented:

[T]he more you look around, the more you notice that no one seems to know, or care, how so many people can produce so much so cheaply ... and sell it below production cost. How does the Chinese miracle work? Are the Chinese playing with economic fire? All over Beijing, you find people selling things for less than they must have cost to make.

... Karim and I looked over the books of a Chinese steel company. Its year-over-year gross sales increased at a fine, steady clip ... but despite these increasing sales, its debt ascended a bit faster than its sales. So its net profits slowly dwindled over time. ... But it also looked like the company never pays down its debt. ... If the Chinese aren't paying their debts... is there any limit to the amount of money the banks can lend? Just who are these banks, anyway?

Could this be the key? In the land of the world's greatest capitalists [meaning China], there's one business that isn't even remotely governed by free markets: the banks. In the simplest terms, the banks and the government are one and the same. Like modern American banks, the Chinese banks (read: the Chinese government) freely loan money to fledgling and huge established businesses alike. But unlike modern American banks (most of them, anyway), the Chinese banks don't expect businesses to pay back the money lent to them.

Evidently the secret of Chinese national banking is that the government banks are not balancing their books! Grillot concluded that it was a dangerous game:

[E]ven if it's a deliberate policy, an economy can't be deliberately inefficient in allocating capital. Things cost money. They cannot, typically, cost less than the value of the raw materials to make them. The whole cannot be worth less than the sum of the parts... Some laws of economics ... can be bent, but not broken ... at least not without consequences.

Benjamin Franklin's English listeners would no doubt have said the same thing about the innovative monetary scheme of the American colonies. Or could Professor Liu be right? Our entire economic world view may need to be reordered, "just as physics was reordered when we realized that the earth is not stationary and is not the center of the universe."

How the Chinese economy can function on credit that never gets repaid may actually be no more mysterious than the workings of the U.S. economy, which carries $9 trillion in federal debt that nobody ever expects to see repaid. The Chinese government can print its own money and doesn't need to go into debt. Before 1981, it had no federal debt at all; but when it opened to Western trade, it made a show of conforming to Western practices. Advances of credit intended for national development were re-characterized as "non-performing loans," rather like the English tallies that were re-characterized as "unfunded debt" at the end of the seventeenth century. As a result, today China does have a federal debt; but it remains substantially smaller than that of the United States. China can therefore afford to let some struggling businesses carry perpetual debt on their books instead.

In both China and the United States, the money supply is continually being inflated; but the Chinese mechanism may be more efficient, because it does a better job of recycling the money. The new money from Chinese loans that may or may not get repaid goes into the pockets of laborers, increasing their wages and their pensions, giving them more money for producing and purchasing goods. As in the early American colonies, China's newly-created money is increasing the overall productivity of its economy and the standard of living of its people, promoting the general welfare by leavening the whole loaf at once. In twenty-first-century America, by contrast, the economy keeps growing mainly from "money making money." The proceeds go into the pockets of investors who already have more than they can spend on consumer goods. American tax relief also tends to go to these non-producing investors, while American workers are heavily taxed. Meanwhile, the Chinese government is cutting the taxes paid by workers and raising their salaries, in an effort to encourage more spending on cars and household appliances. The Chinese government recently eliminated rural taxes altogether.

Chapter 28. Recovering the Jewel of the British Empire: A People’s Movement Takes Back India

Miracles for Investors, Poverty for Workers

Like other Third World countries, India has been caught in the trap of accepting foreign loans and investment, making it vulnerable to sudden capital flows and subjecting it to the whims and wishes of foreign financial powers. Countries that have been lured into this trap have wound up seeking financial assistance from the IMF, which has then imposed "austerity policies" as a condition of debt relief. These austerities include the elimination of food program subsidies, reduction of wages, increases in corporate profits, and privatization of public industry. All sorts of public assets go on the block - power companies, ports, airlines, railways, even social-welfare services. Canadian critic Wayne Ellwood writes of this "privatization trap":

Dozens of countries and scores of public enterprises around the world have been caught up in this frenzy, many with little choice.... [C]ountries forced to the wall by debt have been pushed into the privatization trap by a combination of coercion and blackmail.... How much latitude do poor nations have to reject or shape adjustment policies? Virtually none. The right of governments ... to make sovereign decisions on behalf of their citizens - the bottom line of democracy - is simply jettisoned.

In theory, these structural adjustment programs also benefit local populations by enhancing the efficiency of local production, something that supposedly happens as a result of exposure to international competition in investment and trade. But their real effect has been simply to impose enormous hardships on the people. Food and transportation subsidies, public sector layoffs, curbs on government spending, and higher interest and tax rates all hit the poor disproportionately hard. Helen Caldicott, M.D., co-founder of Physicians for Social Responsibility, writes:

Women tend to bear the brunt of these IMF policies, for they spend more and more of their day digging in the fields by hand to increase the production of luxury crops, with no machinery or modern equipment. It becomes their lot to help reduce the foreign debt, even though they never benefited from the loans in the first place.... Most of the profits from commodity sales in the Third World go to retailers, middlemen, and shareholders in the First World.... UNICEF estimates that half a million children die each year because of the debt crisis.

Countries have been declared "economic miracles" even when their poverty levels have increased. The "miracle" is achieved through a change in statistical measures. The old measure, called the gross national product or GNP, attributed profits to the country that received the money. The GNP included the gross domestic product or GDP (the total value of the output, income and expenditure produced within a country's physical borders) plus income earned from investment or work abroad. The new statistical measure looks simply at GDP. Profits are attributed to the country where the factories, mines, or financial institutions are located, even if the profits do not benefit the country but go to wealthy owners abroad.
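The difference between the two measures can be made concrete with a toy calculation. None of these numbers come from the text; they are invented purely to illustrate how switching from GNP to GDP manufactures a statistical "miracle" when production is foreign-owned.

```python
# Hypothetical illustration of the GNP-vs-GDP distinction described above.
# Country A hosts a foreign-owned mine: the output counts toward GDP
# because it is produced within A's borders, but the repatriated profits
# never benefit Country A. All figures are invented for illustration.

domestic_output = 100           # $M produced inside Country A's borders
profits_to_foreign_owners = 60  # $M repatriated abroad by the mine's owners
income_earned_abroad = 5        # $M Country A residents earn overseas

gdp = domestic_output                                   # where production occurs
gnp = gdp - profits_to_foreign_owners + income_earned_abroad  # who receives the income

print(f"GDP = {gdp}M, GNP = {gnp}M")
```

Reporting GDP rather than GNP credits Country A with the full 100 even though 60 of it flows to owners abroad, which is exactly the substitution the paragraph describes.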

In 1980, median income in the richest 10 percent of countries was 77 times greater than in the poorest 10 percent. By 1999, that gap had grown to 122 times. In December 2006, the United Nations released a report titled "World Distribution of Household Wealth," which concluded that 50 percent of the world's population now owns only 1 percent of its wealth. The richest 1 percent own 40 percent of all global assets; the roughly 37 million adults making up that 1 percent each have a net worth of $500,000 or more. The richest 10 percent of adults own 85 percent of global wealth. Under current conditions, the debts of the poorer nations can never be repaid but will just continue to grow. Today more money is flowing back to the First World in the form of debt service than is flowing out in the form of loans. By 2001, enough money had flowed back from the Third World to First World banks to pay the principal due on the original loans six times over. But interest consumed so much of those payments that the total debt actually quadrupled during the same period.

The WTO and the NWO

The United States is also a member of the WTO [World Trade Organization].  Critics warn that Americans could soon be seeing international troops in their own streets. The “New World Order” that was heralded at the end of the Cold War was supposed to be a harmonious global village without restrictions on trade and with cooperative policing of drug-trafficking, terrorism and arms controls. But to the wary, it is the road to a one-world government headed by transnational corporations, oppressing the public through military means and restricting individual freedoms. Bob Djurdjevic, writing in the paleoconservative journal Chronicles in 1998, compared the NWO to the old British empire:

Parallels between the British Empire and the New World Order Empire are striking. It's just that the British crown relied on brute force to achieve its objectives, while the NWO elite mostly use financial terrorism ... The British Empire was built by colonizing other countries, seizing their natural resources and shipping them to England to feed the British industrialists' factories. In the wake of the "red coats" invasions, local cultures were often trampled and replaced by a "more progressive" British way of life.

The Wall Street-dominated NWO Empire is being built by colonizing other countries with foreign loans or investments. When the fish is firmly on the hook, the NWO financial terrorists pull the plug, leaving the unsuspecting victim high and dry. And begging to be rescued. In comes the International Monetary Fund (IMF). Its bailout recipes - privatization, trade liberalization and other austerity reforms - amount to seizing the target countries' natural and other resources, and turning them over to the NWO elites - just as surely as the British Empire did by using cruder methods.

Americans tend to identify with these Wall Street banks and transnational corporations because they have U.S. addresses, but Djurdjevic warns that the international cartels do not necessarily have our best interests in mind. To the contrary, Main Street America appears to be their next takeover target….

Chapter 29. Breaking the Back of the Tin Man: Debt Serfdom for American Workers

The mighty United States has been in the banking spider's sights for more than two centuries. This ultimate prize too may finally have been captured in the spider's web, choked in debt spun out of thin air. The U.S. has now surpassed even Third World countries in its debt level. By 2004, the debt of the U.S. government had hit $7.6 trillion, more than three times that of all Third World countries combined. Like the bankrupt consumer who stays afloat by making the minimum payment on his credit card, the government has avoided bankruptcy by paying just the interest on its monster debt; but Comptroller General David M. Walker warns that by 2009 the country may not be able to afford even that mounting bill. When the government cannot service its debt, it will have to declare bankruptcy, and the economy will collapse.

Al Martin is a retired naval intelligence officer, former contributor to the Presidential Council of Economic Advisors, and author of a weekly newsletter called "Behind the Scenes in the Beltway." He observed in an April 2005 newsletter that the ratio of total U.S. debt to gross domestic product (GDP) rose from 78 percent in 2000 to 308 percent in April 2005. The International Monetary Fund considers a nation-state with a total debt-to-GDP ratio of 200 percent or more to be a "de-constructed Third World nation-state." Martin wrote:

What "de-constructed" actually means is that a political regime in that country, or series of political regimes, have, through a long period of fraud, abuse, graft, corruption and mismanagement, effectively collapsed the economy of that country.

Other commentators warn that the "shock therapy" tested in Third World countries is the next step planned for the United States. Editorialist Mike Whitney wrote in CounterPunch in April 2005:

[T]he towering national debt coupled with the staggering trade deficits have put the nation on a precipice and a seismic shift in the fortunes of middle-class Americans is looking more likely all the time.... The country has been intentionally plundered and will eventually wind up in the hands of its creditors.... This same Ponzi scheme has been carried out repeatedly by the IMF and World Bank throughout the world.... Bankruptcy is a fairly straightforward way of delivering valuable public assets and resources to collaborative industries, and of annihilating national sovereignty. After a nation is successfully driven to destitution, public policy decisions are made by creditors and not by representatives of the people.... The catastrophe that middle class Americans face is what these elites breezily refer to as "shock therapy"; a sudden jolt, followed by fundamental changes to the system. In the near future we can expect tax reform, fiscal discipline, deregulation, free capital flows, lowered tariffs, reduced public services, and privatization.

Catherine Austin Fitts was formerly the managing director of a Wall Street investment bank and was Assistant Secretary of the Department of Housing and Urban Development (HUD) under President George Bush Sr. She calls what is happening to the economy "a criminal leveraged buyout of America," something she defines as "buying a country for cheap with its own money and then jacking up the rents and fees to steal the rest." She also calls it the "American Tapeworm" model:

[T]he American Tapeworm model is to simply finance the federal deficit through warfare, currency exports, Treasury and federal credit borrowing and cutbacks in domestic "discretionary" spending.... This will then place local municipalities and local leadership in a highly vulnerable position - one that will allow them to be persuaded with bogus but high-minded sounding arguments to further cut resources. Then, to "preserve bond ratings and the rights of creditors," our leaders can be persuaded to sell our water, natural resources and infrastructure assets at significant discounts of their true value to global investors.... This will all be described as a plan to "save America" by recapitalizing it on a sound financial footing. In fact, this process will simply shift more capital continuously from America to other continents and from the lower and middle classes to elites.

The Destruction of the Great American Middle Class

In 1894, Jacob Coxey warned of the destruction of the great American middle class. That prediction is rapidly materializing, as the gap between rich and poor grows ever wider. The Federal Reserve reported in 2004 that:

·       The wealthiest 1 percent of Americans held 33.4 percent of the nation's wealth, up from 30.1 percent in 1989; while the top 5 percent held 55.5 percent of the wealth.

·       The poorest 50 percent of the population held only 2.5 percent of the wealth, down from 3.0 percent in 1989.

·       The very wealthiest 1 percent of Americans owned a bigger piece of the pie (33.4 percent) than the poorest 90 percent (30.4 percent of the pie). They also owned 62.3 percent of the nation's business assets.

·       The wealthiest 5 percent owned 93.7 percent of the value of bonds, 71.7 percent of nonresidential real estate, and 79.1 percent of the nation's stocks.

Forbes Magazine reported that from 1997 to 1999, the wealth of the 400 richest Americans grew by an average of $940 million each, for a daily increase of $1.3 million per person. Note that lists of this sort do not include the world's truly richest families, including the Rothschilds, the Warburgs, and a long list of royal families. Whether because they consider it to be in bad taste or because they fear retribution from the bottom of the wealth pyramid, the super-elite do not make their fortunes public.

Debt Peonage: Eroding the Protection of the Bankruptcy Laws

While the super-rich are amassing fortunes rivaling the economies of small countries, Americans in the lower brackets are struggling with food and medical bills. Personal bankruptcy filings more than doubled from 1995 to 2005. In 2004, more than 1.1 million consumers filed for bankruptcy under Chapter 7. A Chapter 7 bankruptcy stays on the debtor's credit record for ten years from the date of filing, but at least it wipes the slate clean. In 2005, however, even that escape was taken away for many debtors. Under sweeping new provisions to the Bankruptcy Code, many more people are now required to file under Chapter 13, which does not eliminate debts but mandates that they be repaid under a court-ordered payment schedule over a three- to five-year period.

Homestead exemptions have traditionally protected homes from foreclosure in bankruptcy; but not all states have them, and the statutes usually preserve only a fraction of the home's worth. Worse, the new bankruptcy provisions require home ownership for a minimum of 40 months to qualify for the exemption. That means that if you file for bankruptcy within 3.3 years of purchase, your home is no longer off-limits to creditors. In the extreme case, the homeowner could not only lose his home but also owe a "deficiency," or balance due, for whatever the creditor bank failed to get from resale. This balance could be taken from the debtor's paychecks over a five-year period. In some states, "anti-deficiency" laws prevent this, allowing the purchaser to walk away without paying the balance owed. But again, not all states have them, and they apply only to the original mortgage on the home. If the buyer takes out a second mortgage or takes equity out of the home, anti-deficiency laws may not apply. The push to persuade homeowners to take out home equity loans recalls the 1920s campaign to persuade people to borrow against their homes to invest in the stock market. When the stock market crashed, their homes became the property of the banks. Elderly people burdened with medical and drug bills are particularly susceptible to those tactics today.

Another insidious change that has been made in the bankruptcy laws pertains to insolvent corporations. The law originally provided for the appointment of an independent bankruptcy trustee, whose job was to try to keep the business running and preserve the jobs of the workers. In the 1970s, the law was changed so that the plan of bankruptcy reorganization would be designed by the banks that were financing the restructuring. The creditors now came first and the workers had to take what was left. The downsizing of the airline industry, the steel industry, and the auto industry followed, precipitating masses of worker layoffs.

Normally, it would fall to the individual States to provide a safety net for their citizens from personal disasters of this sort, but the States have been driven to the brink of bankruptcy as well. Diversion of State funds to out-of-control federal spending has left States with budget crises that have forced them to take belt-tightening measures like those seen in Third World countries. Social services have been cut for those most in need during an economic downturn, including services for childcare, health insurance, income support, job training programs and education. Social services are "discretionary" budget items, which have been sacrificed to the fixed-interest income of the creditors who are first in line to get paid.

Billionaire philanthropist Warren Buffett has warned that America, rather than being an "ownership society," is fast becoming a "sharecroppers' society." Paul Krugman suggested in a 2005 New York Times editorial that the correct term is "debt peonage" society, the system prevalent in the post-Civil War South, when debtors were forced to work for their creditors. A medical insurance system and other benefits tied to employment assure American corporations of the cheap, immobile labor found in Third World countries. People dare not quit their jobs, however unsatisfactory, for fear of facing medical catastrophes without insurance, particularly now that the escape hatch of bankruptcy has narrowed substantially. Most personal bankruptcies are the result of medical emergencies and other severe misfortunes such as job loss or divorce. The Bankruptcy Reform Act of 2005 eroded the protection the government once provided against these unexpected catastrophes, ensuring that working people are kept on a treadmill of personal debt. Meanwhile, loopholes allowing very wealthy people and corporations to go bankrupt and to shield their assets from creditors remain intact.

Graft and Greed in the Credit Card Business

The 2005 bankruptcy bill was written by and for credit card companies. Credit card debt reached $735 billion by 2003, more than 11 times the tab in 1980. Approximately 60 percent of credit card users do not pay off their monthly balances; and among those users, the average debt carried on their cards is close to $12,000. This "sub-prime" market is actually targeted by banks and credit card companies, which count on the poor, the working poor and the financially strapped to not be able to make their payments. According to a 2003 book titled The Two-Income Trap by Warren and Tyagi:

More than 75 percent of credit card profits come from people who make those low, minimum monthly payments. And who makes minimum monthly payments at 26 percent interest? Who pays late fees, over-balance charges, and cash advance premiums? Families that can barely make ends meet, households precariously balanced between financial survival and complete collapse. These are the families that are singled out by the lending industry, barraged with special offers, personalized advertisements, and home phone calls, all with one objective in mind: get them to borrow more money.
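The minimum-payment treadmill Warren and Tyagi describe can be sketched with a rough amortization loop. The 26 percent rate and the roughly $12,000 average balance appear in the text; the 3%-of-balance minimum-payment formula and $20 floor are assumptions added for illustration, since the text does not specify how minimums are computed.

```python
# Rough sketch of paying only the minimum on a credit card balance.
# From the text: ~$12,000 average balance at 26% APR. The 3% minimum
# payment and $20 floor are assumed figures for illustration.

def minimum_payment_schedule(balance, apr, min_pct=0.03, floor=20.0):
    """Return (months to payoff, total interest paid)."""
    monthly_rate = apr / 12
    months, interest_paid = 0, 0.0
    while balance > 0.005 and months < 1200:  # safety cap at 100 years
        interest = balance * monthly_rate
        payment = max(balance * min_pct, floor)
        payment = min(payment, balance + interest)  # final payoff payment
        balance += interest - payment
        interest_paid += interest
        months += 1
    return months, interest_paid

months, interest_paid = minimum_payment_schedule(12_000, 0.26)
print(f"{months / 12:.0f} years to pay off; ${interest_paid:,.0f} in interest")
```

Under these assumptions the balance takes over three decades to clear, and the interest paid along the way is more than double the original principal, which is the profit stream the quotation describes.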

"Payday" lender operations offering small "paycheck advance" loans have mushroomed. Particularly popular in poor and minority communities, they can carry usurious interest rates as high as 500 percent. The debt crisis has been blamed on the imprudent spending habits of people buying frivolous things; but Warren and Tyagi observe that two-income families are actually spending 21 percent less on clothing, 22 percent less on food, and 44 percent less on appliances than one-income families spent a generation earlier. The reason is that they are spending substantially more on soaring housing prices and medical costs.

In 2003, the average family was spending 69 percent more on home mortgage payments in inflation-adjusted dollars than their parents spent a generation earlier, and 61 percent more on health needs. At the same time, real wages had stagnated or declined. Most people were struggling to get by with less; and in order to get by, many turned to credit cards to pay for basic necessities. Credit card companies and their affiliated banks capitalize on the extremity of poor and working-class people by using high-pressure tactics to sign up borrowers they know can't afford their loans, then jacking up interest rates or forcing customers to buy "insurance" on the loans. People who can make only minimal payments on their credit card bills wind up in "debt peonage" to the banks. The scenario recalls the sinister observation made in the Hazard Circular circulated during the American Civil War:

[S]lavery is but the owning of labor and carries with it the care of the laborers, while the European plan, led by England, is that capital shall control labor by controlling wages. This can be done by controlling the money. The great debt that capitalists will see to it is made out of the war, must be used as a means to control the volume of money.

The slaves kept in the pre-Civil War South had to be fed and cared for. People enslaved by debt must feed and house themselves.

Usurious Loans of Phantom Money

The ostensible justification for allowing lenders to charge whatever interest the market will bear is that it recognizes the time value of money. Lenders are said to be entitled to this fee in return for foregoing the use of their money for a period of time. That argument might have some merit if the lenders actually were lending their own money, but in the case of credit card and other commercial bank debt, they aren't. They aren't even lending their depositors' money. They are lending nothing but the borrower's own credit. We know this because of what the Chicago Fed said in "Modern Money Mechanics":

Of course, [banks] do not really pay out loans from the money they receive as deposits. If they did this, no additional money would be created. What they do when they make loans is to accept promissory notes in exchange for credits to the borrowers' transaction accounts. Loans (assets) and deposits (liabilities) both rise [by the same amount].

Here is how the credit card scheme works: when you sign a merchant's credit card charge slip, you are creating a "negotiable instrument." A negotiable instrument is anything that is signed and convertible into money or that can be used as money. The merchant takes this negotiable instrument and deposits it into his merchant's checking account, a special account required of all businesses that accept credit. The account goes up by the amount on the slip, indicating that the merchant has been paid. The charge slip is forwarded to the credit card company (Visa, MasterCard, etc.), which bundles your charges and sends them to a bank. The bank then sends you a statement, which you pay with a check, causing your transaction account to be debited at your bank. At no point has a bank lent you its money or its depositors' money. Rather, your charge slip (a negotiable instrument) has become an "asset" against which credit has been advanced. The bank has done nothing but monetize your own I.O.U. or promise to repay.

When you lend someone your own money, your assets go down by the amount that the borrower's assets go up. But when a bank lends you money, its assets go up. Its liabilities also go up, since its deposits are counted as liabilities; but the money isn't really there. It is simply a liability - something that is owed back to the depositor. The bank turns your promise to pay into an asset and a liability at the same time, balancing its books without actually transferring any pre-existing money to you.
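The double-entry mechanics described in the passage can be sketched in a few lines. The $10,000 figure is arbitrary; the point is that booking the loan raises the bank's assets and its liabilities by the same amount, with no pre-existing funds transferred to anyone.

```python
# Minimal sketch of the bookkeeping described above: a bank "loan"
# books the borrower's promissory note as a new asset and credits a
# matching new deposit as a liability. The $10,000 amount is arbitrary.

def book_loan(bank, amount):
    bank["assets"]["promissory_notes"] += amount  # borrower's I.O.U.
    bank["liabilities"]["deposits"] += amount     # newly credited deposit
    return bank

bank = {"assets": {"promissory_notes": 0, "reserves": 0},
        "liabilities": {"deposits": 0}}

book_loan(bank, 10_000)

# The books balance, yet no depositor's money moved to the borrower.
assert sum(bank["assets"].values()) == sum(bank["liabilities"].values())
print(bank)
```

Contrast this with person-to-person lending, where the lender's own assets would fall by exactly the amount the borrower's rise.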

The spiraling debt trap that has subjected financially-strapped people to usurious interest charges for the use of something the lenders never had to lend is a fraud on the borrowers. In 2006, profits to lenders from interest charges and late fees on U.S. credit card debt came to $90 billion. An alternative for retaining the benefits of the credit card system without feeding a parasitic class of unnecessary middlemen is suggested in Chapter 41.

Chapter 30. The Lure In The Consumer Debt Trap

Baiting the Trap: Seductively Low Interest Rates and “Teaser Rates”

For the first five years of a thirty-year home mortgage, most of the buyer's monthly payments consist of interest. For ARMs (adjustable-rate mortgages), the loans may be structured so that the first five years' payments consist only of interest, with a variable-rate loan thereafter. Since most homes change hands within five years, the average buyer who thinks he owns his own home finds on resale that most if not all of the equity still belongs to the lender. If interest rates have gone up in the meantime, home values will drop, and the buyer will be locked into higher payments for a less valuable house. If he has taken out a home loan for "equity" that has subsequently disappeared, he may have to pay the difference on sale of the home. And if he can't afford that balloon payment, he will be reduced to house serfdom, strapped in a home he can't afford, working to make his payments to the bank. William Hope Harvey's dire prediction that workers would become wage-slaves has come to pass.
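The claim about the first five years can be checked with the standard level-payment amortization formula. The thirty-year term and five-year window come from the text; the $300,000 principal and 6% rate are hypothetical figures chosen for illustration.

```python
# What share of the first five years' mortgage payments is interest?
# The 30-year term and 5-year window are from the text; the $300,000
# principal and 6% rate are hypothetical figures for illustration.

def interest_share(principal, annual_rate, years, first_months):
    r = annual_rate / 12
    n = years * 12
    payment = principal * r / (1 - (1 + r) ** -n)  # level monthly payment
    balance, interest_paid = principal, 0.0
    for _ in range(first_months):
        interest = balance * r
        balance -= payment - interest  # only the remainder reduces principal
        interest_paid += interest
    return interest_paid / (payment * first_months)

share = interest_share(300_000, 0.06, 30, 60)
print(f"{share:.0%} of the first five years' payments is interest")
```

Under these terms roughly four-fifths of the first sixty payments goes to interest, consistent with the passage's point that a buyer reselling within five years has built little equity.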

The Homestead Laws that gave settlers their own plot of land have been largely eroded by 150 years of the "business cycle," in which bankers have periodically raised interest rates and called in loans, creating successive waves of defaults and foreclosures. For most families, the days of inheriting the family home free and clear are a thing of the past. Some individual homeowners have made out well from the housing boom, but the overall effect has been to put the average family on the hook for a substantially more expensive mortgage than it would have had a decade ago. Again the real winners have been the banks. As market commentator Craig Harris explained in a March 2004 article:

Essentially what has happened is that there was a sort of stealth transfer of net worth from the public to the banks to help save the system. The public took on the risk, went further into debt, spent a lot of money ... and the banks' new properties have appreciated substantially.... They created the money and lent it to you, you spent the money to prop up the economy, and now they own the real property and you're on the hook to pay them back an inflated price [for] that property.... They gave you a better rate but you paid more for the property which they now own until you pay them back.

The Impending Tsunami of Sub-prime Mortgage Defaults

The larger a pyramid scheme grows, the greater the number of investors who need to be brought in to support the pyramid. When the "prime" market was exhausted, lenders had to resort to the riskier "sub-prime" market for new borrowers. Risk was off-loaded by slicing up these mortgages and selling them to investors as "mortgage-backed securities." "Securitizing" mortgages and selling them to investors was touted as "spreading the risk," but the device backfired. It wound up spreading risk like a contagion, infecting investment pools ranging from hedge funds to pension funds to money market funds.

In a November 2005 article called "Surreal Estate on the San Andreas Fault," Gary North estimated that loans related to the housing market had grown to 80 percent of bank lending, and that much of this growth was in the sub-prime market, which had been hooked with ARMs that were quite risky not only for the borrowers but for the lenders. North said prophetically:

... Even without a recession, the [housing] boom will falter because of ARMs.... These time bombs are about to blow, contract by contract.

If nothing changes -- if short-term rates do not rise -- monthly mortgage payments are going to rise by 60% when the readjustment kicks in. Yet buyers are marginal, people who could not qualify for a 30-year mortgage. This will force "For Sale" signs to flower like dandelions in spring....

If you remember the S&L [savings and loan association] crisis of the mid-1980s, you have some indication of what is coming. The S&L crisis in Texas put a squeeze on the economy in Texas. Banks got nasty. They stopped making new loans. Yet the S&Ls were legally not banks. They were a second capital market. Today, the banks have become S&Ls. They have tied their loan portfolios to the housing market.

I think a squeeze is coming that will affect the entire banking system. The madness of bankers has become unprecedented....

Banks will wind up sitting on top of bad loans of all kinds because the American economy is now housing-sale driven.
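North's 60 percent figure can be reproduced with the standard payment formula. The sketch below assumes a hypothetical $300,000 interest-only ARM with a 5 percent teaser rate that resets to a fully amortizing 6.25 percent loan over the remaining 25 years (all figures are illustrative assumptions, not from North's article):

```python
def amortizing_payment(principal, annual_rate, years):
    """Standard fixed-rate mortgage payment formula."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

principal = 300_000
teaser = principal * 0.05 / 12                      # interest-only payment at the teaser rate
reset = amortizing_payment(principal, 0.0625, 25)   # fully amortizing after the reset
jump = reset / teaser - 1
print(f"payment rises from ${teaser:,.0f} to ${reset:,.0f} (+{jump:.0%})")
```

Under these assumptions the monthly payment jumps by roughly 58 percent, on the order of the readjustment North projected.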

As with the Crash of 1929, the finger of responsibility is being pointed at the Federal Reserve, which blew up the housing bubble with “easy” credit, then put a pin in it by making credit much harder to get. Whitney writes:

[The Fed] kept the printing presses whirring along at full-tilt while the banks and mortgage lenders devised every scam imaginable to put greenbacks into the hands of unqualified borrowers. ARMs, "interest-only" or "no down payment" loans etc. were all part of the creative financing boondoggle which kept the economy sputtering along after the "dot.com" crackup in 2000.

... Now, many of those same buyers are stuck with enormous loans that are about to reset at drastically higher rates while their homes have already depreciated 10% to 20% in value. This phenomenon of being shackled to a "negative equity mortgage" is what economist Michael Hudson calls the "New Road to Serfdom": paying off a mortgage that is significantly larger than the current value of the house. The sheer magnitude of the problem is staggering.

The ability to adjust interest rates is considered a necessary and proper tool of the Fed in managing the money supply, but it is also a form of arbitrary manipulation that can be used to benefit one group over another. The very notion that we have a "free market" is belied by the fact that investors, advisers and market analysts wait with bated breath to hear what the Fed is going to do to interest rates from month to month. The market is responding not to supply and demand but to top-down dictatorial control. That would not be so bad if it actually worked, but a sinking economy can't be kept afloat merely by adjusting interest rates. The problem has been compared to "pushing on a string": when issuing more debt is the only way to keep money in the money supply, once borrowers are "all borrowed up" and lenders have reached their lending limits, no amount of lowering interest rates will get more debt-money into the economy. The only solution to this conundrum is to get "real" money into the system -- real, interest-free, debt-free, government-issued legal tender of the sort first devised by the American colonists.

By 2005, financial weather forecasters could see two economic storm fronts forming on the horizon, and both were being blamed on the market manipulations of the Fed….

Chapter 31. The Perfect Financial Storm

Fannie and Freddie: Compounding the Housing Crisis with Derivatives and Mortgage-Backed Securities

In a June 2002 article titled "Fannie and Freddie Were Lenders," Richard Freeman warned that the housing bubble was the largest bubble in history, dwarfing anything that had gone before; and that it had been pumped up to its gargantuan size by Fannie Mae (the Federal National Mortgage Association) and Freddie Mac (the Federal Home Loan Mortgage Corporation), twin volcanoes that were about to blow. Fannie and Freddie have dramatically expanded the ways money can be created by mortgage lending, allowing the banks to issue many more loans than would otherwise have been possible; but it all adds up to another Ponzi scheme, and it has reached its mathematical limits. Focusing on the larger of these two institutional cousins, Fannie Mae, Freeman noted that if it were a bank, it would be the third largest bank in the world; and that it makes enormous amounts of money in the real estate market for its private owners. Contrary to popular belief, Fannie Mae is not actually a government agency. It began that way under Roosevelt's New Deal, but it was later transformed into a totally private corporation. It issued stock that was bought by private investors, and eventually it was listed on the stock exchange. Like the Federal Reserve, it is now "federal" only in name.

Before the late 1970s, there were two principal forms of mortgage lending. The lender could issue a mortgage loan and keep it, or the lender could sell the loan to Fannie Mae and use the cash to make a second loan, which could also be sold to Fannie Mae, allowing the bank to make a third loan, and so on. Freeman gives the example of a mortgage-lending financial institution that makes five successive loans in this way for $150,000 each, all from an initial investment of $150,000. It sells the first four loans to Fannie Mae, which buys them with money made from the issuance of its own bonds. The lender keeps the fifth loan. At the end of the process, the mortgage-lending institution still has only one loan for $150,000 on its books, and Fannie Mae has loans totaling $600,000 on its books.
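Freeman's example can be traced in a few lines. The sketch below follows one pot of $150,000 through five successive loans, with the first four sold to Fannie Mae and the proceeds lent out again each time:

```python
def sell_and_relend(capital, loans):
    """A lender with one pot of capital issues a loan, sells it to
    Fannie Mae for cash, and lends the same cash out again,
    keeping only the final loan on its own books."""
    fannie_books = 0
    for _ in range(loans - 1):
        fannie_books += capital   # loan sold to Fannie Mae for cash
    lender_books = capital        # final loan kept by the lender
    return lender_books, fannie_books

lender, fannie = sell_and_relend(150_000, 5)
print(lender, fannie)  # lender holds $150,000; Fannie holds $600,000
```

The same $150,000 of capital has generated $750,000 in outstanding mortgages, which is Freeman's point.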

In 1979-81, however, policy changes were made that would flood the housing market with even more new money. Fannie Mae gathered its purchased mortgages from different mortgage-lending institutions and pooled them together, producing a type of lending vehicle called a Mortgage-Backed Security (MBS). Fannie might, for example, bundle one thousand 30-year fixed-interest mortgages, each worth roughly $100,000, and pool them into a $100 million MBS. It would put a loan guarantee on the MBS, for which it would earn a fee, guaranteeing that in the event of default it would pay the interest and principal due on the loans "fully and in a timely fashion." The MBS would then be sold as securities in denominations of $1,000 or more to outside investors, including mutual funds, pension funds, and insurance companies. The investors would become the owners of the MBS and would have a claim on the underlying principal and interest stream of the mortgage; but if anything went wrong, Fannie Mae was still responsible. The MBS succeeded in extending the sources of funds that could be tapped into for mortgage lending far into U.S. and international financial markets. It also substantially increased Fannie Mae's risk.
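The mechanics of the pooling example reduce to simple arithmetic. In the sketch below, the pool size and $1,000 denominations follow the example in the text, while the 25-basis-point guarantee fee is an illustrative assumption:

```python
mortgages = [100_000] * 1_000      # one thousand 30-year mortgages of ~$100,000 each
pool = sum(mortgages)              # bundled into a single $100 million MBS
certificates = pool // 1_000       # sold to investors in $1,000 denominations
annual_fee = pool * 0.0025         # hypothetical 25-basis-point guarantee fee
print(pool, certificates, annual_fee)
```

One pool thus yields 100,000 saleable certificates, with Fannie collecting a recurring fee for guaranteeing timely payment on all of them.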

Then Fannie devised a fourth way of extracting money from the markets. It took the securities and pooled them again, this time into an instrument called a Real Estate Mortgage Investment Conduit or REMIC (also known as a "restructured MBS" or collateralized mortgage obligation). REMICs are very complex derivatives. Freeman wrote, "They are pure bets, sold to institutional investors, and individuals, to draw money into the housing bubble." Roughly half of Fannie Mae's Mortgage-Backed Securities have been transformed into these highly speculative REMIC derivative instruments. "Thus," said Freeman, "what started out as a simple home mortgage has been transmogrified into something one would expect to find at a Las Vegas gambling casino. Yet the housing bubble now depends on precisely these instruments as sources of funds."

Only the first of these devices is an "asset," something on which Fannie Mae can collect a steady stream of principal and interest. The others represent very risky obligations. These investment vehicles have fed the housing bubble and have fed off it, but at some point, said Freeman, a wave of mortgage defaults is inevitable; and when that happens, the riskier mortgage-related obligations will amplify the crisis. They are particularly risky because they involve leveraging (making multiple investments with borrowed money). That means that when the bet goes wrong, many losses have to be paid instead of one.

In 2002, Fannie Mae's bonds made up over $700 billion of its outstanding debt total of $764 billion. Only one source of income was available to pay the interest and principal on these bonds, the money Fannie collected on the mortgages it owned. If a substantial number of mortgages were to go into default, Fannie would not have the cash to pay its bondholders. Freeman observed that no company in America has ever defaulted on as much as $50 billion in bonds, and Fannie Mae has over $700 billion - at least ten times more than any other corporation in America. A default on a bonded debt of that size, he said, could end the U.S. financial system virtually overnight.

Like those banking institutions considered "too big to fail," Fannie Mae has tentacles reaching into so much of the financial system that if it goes, it could take the economy down with it. A wave of home mortgage defaults would not alone have been enough to bring down the whole housing market, said Freeman; but adding the possibility of default on Fannie's riskier obligations, totaling over $2 trillion in 2002, the chance of a system-wide default has been raised to "radioactive" levels. If a crisis in the housing mortgage market were to produce a wave of loan defaults, Fannie would not be able to meet the terms of the guarantees it put on $859 billion in Mortgage-Backed Securities, and the pension funds and other investors buying the MBS would suffer tens of billions of dollars in losses. Fannie's derivative obligations, which totaled $533 billion in 2002, could also go into default. These hedges are supposed to protect investors from risks, but the hedges themselves are very risky ventures. Fannie Mae has taken extraordinary measures to roll over shaky mortgages in order to obscure the level of default currently threatening the system; but as households with declining real standards of living are increasingly unable to pay rising home prices and the demands of ever larger mortgages and higher interest payments, mortgage defaults will rise. The leverage that has been built into the housing market could then unwind like a rubber band, rapidly de-leveraging the entire market.

In 2003, Freddie Mac was embroiled in a $5 billion accounting scandal, in which it was caught "cooking" the books to make things look rosier than they were. In 2004, Fannie Mae was caught in a similar scandal. In 2006, Fannie agreed to pay $400 million for its misbehavior ($50 million to the U.S. government and $350 million to defrauded shareholders), and to try to straighten out its books. But investigators said the accounting could be beyond repair, since some $16 billion had simply disappeared from the books.

Meanwhile, after blowing the housing bubble to perilous heights with a 1 percent federal funds rate, the Fed proceeded to let the air back out with a succession of interest rate hikes. By 2006, the housing boom was losing steam. Nervous investors wondered who would be shouldering the risk when the mortgages bundled into MBS slid into default. As one colorful blogger put it:

So let me get this straight.... Is the following scenario below actually playing out?

For starters ma 'n' pa computer programmer buy a 500K house in Ballard using a neg-am/i-o [negative amortization interest-only mortgage] sold to them by a dodgy local fly-by-night lender. That lender immediately sells it off to some middle-man for a period of time. The middlemen take their cut and then sell that loan upstream to Fannie Mae/ Freddie Mac before it becomes totally toxic and reaches critical mass. At which point FM/FM bundle that loan into a mortgage backed security and sell it to pension funds, foreign banks, etc. etc.

What happens when those loans go into their inevitable default? Who owns the property at that point and is left holding the bag?

Nobody on the blog seemed to know; but according to Freeman, Fannie Mae will be holding the bag, since it guaranteed payment of interest and principal in the event of default.  When Fannie Mae can’t pay, the pension funds and other institutions investing in its MBS will be left holding the bag; and it is these pension funds that manage the investments on which the retirements of American workers depend.  When that happens, comfortable retirements could indeed be things of the past.

Chapter 32. In the Eye of the Cyclone: How the Derivatives Crisis Has Gridlocked the Banking System

The looming derivatives crisis is another phenomenon often described with weather imagery. "The grey clouds are getting darker," wrote financial consultant Colt Bagley in 2004; "the winds only need to kick up and we'll have one heck of a financial cyclone in the making." A decade earlier, Christopher White told Congress:

Taken as a whole, the financial derivatives market, orchestrated by financiers, operates with the vortical properties of a powerful hurricane. It is so huge and packs such a large momentum, that it sucks up the overwhelming majority of the capital and cash that enters or already exists in the economy. It makes a mockery of the idea that a nation exercises sovereign control over its credit policy.

Martin Weiss, writing in a November 2006 investment newsletter, called the derivatives crisis "a global Vesuvius that could erupt at almost any time, instantly throwing the world's financial markets into turmoil ... bankrupting major banks ... sinking big-name insurance companies ... scrambling the investments of hedge funds ... overturning the portfolios of millions of average investors."

Derivatives 101

Gary Novak, whose issue-simplifying website was quoted earlier, explains that the banking system has become gridlocked because its pretended "derivative" assets are fake, and the fake assets have swallowed up the real ones. It all began with the deregulation of the 1980s, when government regulation was considered an irrational scheme from which business had to be freed. But regulations are criminal codes, and eliminating them meant turning business over to thieves. The Enron and WorldCom defendants were able to argue in court that their procedures were legal, because the laws making them illegal had been wiped off the books. Government regulation had prevented the creation of "funny money" with no real value behind it; when the regulations were eliminated, funny money became the order of the day. It manifested in a variety of very complex vehicles lumped together under the label of derivatives, which were often made intentionally obscure and confusing.

"Physicists were hired to write equations for derivatives which business administrators could not understand," Novak says. Derivatives are just bets, but they have been sold as if they were something of value, until the sales have reached astronomical sums that are far beyond anything of real value in existence. Pension funds and trust funds have bought into the Ponzi scheme, only to see their money disappear down the derivatives hole. Universities have been forced to charge huge tuitions although they are financed with huge trust funds, because their money has been tied up in investments that are basically worthless. But the administrators are holding onto their bets, which are "given a pretended value, because heads roll when the truth comes to light." Nobody dares to sell and nobody can collect. The result is a shortage of available funds in global financial institutions. The very thing derivatives were designed to create - market liquidity - has been frozen to immobility in a gridlocked game.

The author of a blog called "World Vision Portal" simplifies the derivatives problem in another way. He writes:

Anyone who has been to Las Vegas or at the casino on a cruise ship can understand it perfectly. A bank gambles and bets on certain pre-determined odds, like playing the casino dealer in a game of poker (banks call this "hedging their risks with derivative contracts"). When they have to show their cards at the end of the play, they either win or lose their bet; either the bank wins or the house wins (this is the end of the derivative contract term).

For us small-time players, we might lose $10 or $20, but the big-time banks are betting hundreds of millions on each card hand. The worst part is that they have a gambling addiction and can’t stop betting money that isn't theirs to bet with....

Winners always leave the gambling table with a big smile and you can see the chips in their hand to know they won more than they had bet. But losers always walk away quietly and don't talk about how much they lost. If a bank makes a good profit (won their bet), they would be telling everyone that their derivative contracts have paid off and they're sitting pretty. In reality, the big-time gambling banks are not talking and won’t tell anyone how much they gambled or how much they lost.

We've been hoodwinked and the game is pretty much over.

The irony is that derivative bets are sold as a form of insurance against something catastrophic going wrong. But if something catastrophic does go wrong, the counterparties (the parties on the other side of the bet, typically hedge funds) are liable to fold their cards and drop out of the game. The "insured" are left with losses both from the disaster itself and from the loss of the premium paid for the bet. To avoid that result, the Federal Reserve, along with other central banks, a fraternity of big private banks, and the U.S. Treasury itself, have gotten into the habit of covertly bailing out losing counterparties. This was done when the giant hedge fund Long Term Capital Management went bankrupt in 1998. It was also evidently done in 2005, but very quietly….

A Derivatives Crisis Orders of Magnitude Beyond LTCM?

Rumors of a derivatives crisis dwarfing even the LTCM debacle surfaced in May 2005, following the downgrading of the debts of General Motors and Ford Motor Company to "junk" (bonds having a credit rating below investment grade). Severe problems had apparently occurred at several large hedge funds directly linked to these downgradings. In an article in Executive Intelligence Review in May 2005, Lothar Komp wrote:

The stocks of the same large banks that participated in the 1998 LTCM bailout, and which are known for their giant derivatives portfolios - including Citigroup, JP Morgan Chase, Goldman Sachs, and Deutsche Bank - were hit by panic selling on May 10. Behind this panic was the knowledge that not only have these banks engaged in dangerous derivatives speculation on their own accounts, but, ever desperate for cash to cover their own deteriorating positions, they also turned to the even more speculative hedge funds, placing money with existing funds, or even setting up their own, to engage in activities they didn't care to put on their own books. The combination of financial desperation, the Fed's liquidity binge, and the usury-limiting effects of low interest rates, triggered an explosion in the number of hedge funds in recent years, as everyone chased higher, and riskier, returns. There can be no doubt that some of these banks, not only their hedge fund offspring, are in trouble right now.

Dire warnings ensued of a derivatives crisis "orders of magnitude beyond LTCM." But reports of a major derivative blow-out were being publicly denied, says Komp, since any bank or hedge fund that admitted such losses without first working out a bail-out scheme would instantly collapse. An insider in the international banking community said that "there is no doubt that the Fed and other central banks are pouring liquidity into the system, covertly. This would not become public until early April [2006], at which point the Fed and other central banks will have to report on the money supply."

We've seen that when the Fed "pours liquidity into the system," it does it by "open market operations" that create money with accounting entries and lend this money into the money supply, "monetizing" government debt. If it became widely known that the Fed were printing dollars wholesale, however, alarm bells would sound. Investors would rush to cash in their dollar holdings, crashing the dollar and the stock market, following the familiar pattern seen in Third World countries. What to do? The Fed apparently chose to muffle the alarm bells. It announced that in March 2006, it would no longer be reporting M3. M3 has been the main staple of money supply measurement and transparent disclosure for the last half-century, the figure on which the world has relied in determining the soundness of the dollar. In a December 2005 article called "The Grand Illusion," financial analyst Rob Kirby wrote:

On March 23, 2006, the Board of Governors of the Federal Reserve System will cease publication of the M3 monetary aggregate. The Board will also cease publishing the following components: large-denomination time deposits, repurchase agreements (RPs), and Eurodollars.... [These securities] are exactly where one would expect to find the "capture" of any large scale monetization effort that the Fed would embark upon - should the need occur.

It might be clever, if it really were the American government buying back its own securities; but it isn't. It is the private Federal Reserve and private banks. If dollars are to be printed wholesale and federal securities are to be redeemed with them, why not let Congress do the job itself and avoid a massive unnecessary debt to financial middlemen? Arguably, as we'll see later, if the government were to buy back its own bonds and take them out of circulation, it could not only escape a massive federal debt but could do this without producing inflation. Government securities are already traded around the world just as if they were money. They would just be turned into cash, leaving the overall money supply unchanged. When the Federal Reserve buys up government bonds with newly-issued money, on the other hand, the bonds aren't taken out of circulation. Instead, they become the basis for generating many times their value in new loans; and that result is highly inflationary. But that is getting ahead of our story….

The Parasite’s Challenge: How to Feed on the Host without Destroying It

Henry C K Liu draws an analogy from physics:

Whenever credit is issued, money is created.  The issuing of credit creates debt on the part of the counterparty, but debt is not money; credit is. If anything, debt is negative money, a form of financial antimatter. Physicists understand the relationship between matter and antimatter.... The collision of matter and antimatter produces annihilation that returns matter and antimatter to pure energy. The same is true with credit and debt, which are related but opposite.... The collision of credit and debt will produce an annihilation and return the resultant union to pure financial energy unharnessed for human benefit.

Credit and debt cancel each other out and merge back into the great zero-point field from whence they came. To avoid that result and keep "money" in the economy, new debt must continually be created. When commercial borrowers aren't creating enough money by borrowing it into existence, the government must take over that function by spending money it doesn't have, justifying its loans in any way it can. Keeping the economy alive means continually finding ways to pump newly-created loan money into the system, while concealing the fact that this "money" has been spun out of thin air. These new loans don't necessarily have to be paid back. New money just has to be circulated, providing a source of funds to pay the extra interest that wasn't lent into existence by the original loans. A variety of alternatives for pumping liquidity into the system have been resorted to by governments and central banks, including:

1. Drastically lowering interest rates, encouraging borrowers to expand the money supply by going further and further into debt.

2. Instituting tax cuts and rebates that put money into people's pockets. The resulting budget shortfall is made up later with new issues of U.S. bonds, which are "bought" by the Federal Reserve with dollars printed up for the occasion.

3. Authorizing public works, space exploration, military research, and other projects that will justify massive government borrowing that never gets paid back.

4. Engaging in war as a pretext for borrowing, preferably a war that will drag on. People are willing in times of emergency to allow the government to engage heavily in deficit spending to defend the homeland.

5. Lending to Third World countries. If necessary, some of these impossible-to-repay loans can be quietly forgiven later without repayment.

6. Periodic foreclosures on the loan collateral, transferring the collateral back to the banks, where it can be resold to new borrowers, creating new debt-money. The result is the "business cycle" - periodic waves of depression that flush away debt with massive defaults and foreclosures, causing the progressive transfer of wealth from debtors to the banks.

7. Manipulation (or "rigging") of financial markets, including the stock market, in order to keep investor confidence high and encourage further borrowing, until savings are heavily invested and real estate is heavily mortgaged, when the default phase of the business cycle can begin again.
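The arithmetic driving this need for perpetual new lending can be sketched directly. The model below is a deliberately crude illustration of the book's claim: if all money is lent into existence at interest, the interest owed is never created alongside the principal, so total debt outruns the money supply unless fresh loans keep adding money:

```python
def debt_vs_money(principal, rate, years, new_lending_per_year=0.0):
    """Track money in circulation versus total debt owed when all
    money originates as interest-bearing loans."""
    money = principal                  # money lent into circulation
    debt = principal                   # principal owed back, before interest
    for _ in range(years):
        debt *= 1 + rate               # interest accrues on the whole debt
        money += new_lending_per_year  # only fresh loans add new money
        debt += new_lending_per_year   # ...but each adds matching debt
    return money, debt

money, debt = debt_vs_money(100.0, 0.05, 10)
print(f"money in circulation: {money:.1f}, debt owed: {debt:.1f}")
```

With no new lending, $100 of money faces about $163 of debt after ten years at 5 percent; new loans add money to service the gap, but each loan adds matching new debt, so the gap is deferred rather than closed.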

Rigging the stock market? At one time, writes New York Post columnist John Crudele, just mentioning that possibility got a person branded as a "conspiracy nut":

This country, the critics would say, never interferes with its free capital markets. Sure, there's intervention in the currencies markets. And, yes, the Federal Reserve does manipulate the bond market and interest rates through word and deed. But never, ever would such action be taken at the core of capitalism - the equity markets, which for better or worse must operate without interference. That's the way the standoff stayed until 1997 when - at the height of the Last of the Great Bubbles - someone in government decided it wanted the world to know that there was someone actually paying attention in case Wall Street could not handle its own problems. The Working Group on Financial Markets - affectionately known as the Plunge Protection Team - suddenly came out of the closet.

Chapter 33. Maintaining the Illusion: Rigging Financial Markets

[George] Stephanopoulos [President Clinton’s former senior adviser on policy and strategy] blurted out on “Good Morning America” on September 17, 2001:

"[T]he Fed in 1989 created what is called the Plunge Protection Team, which is the Federal Reserve, big major banks, representatives of the New York Stock Exchange and the other exchanges, and there - they have been meeting informally so far, and they have kind of an informal agreement among major banks to come in and start to buy stock if there appears to be a problem.

"They have, in the past, acted more formally.

"I don't know if you remember, but in 1998, there was a crisis called the Long Term Capital crisis. It was a major currency trader and there was a global currency crisis. And they, at the guidance of the Fed, all of the banks got together when that started to collapse and propped up the currency markets. And they have plans in place to consider that if the stock markets start to fall."

The Plunge Protection Team (PPT) is formally called the Working Group on Financial Markets (WGFM). Created by President Reagan’s Executive Order 12631 in 1988 in response to the October 1987 stock market crash, the WGFM includes the President, the Secretary of the Treasury, the Chairman of the Federal Reserve, the Chairman of the Securities and Exchange Commission, and the Chairman of the Commodity Futures Trading Commission. Its stated purpose is to enhance "the integrity, efficiency, orderliness, and competitiveness of our Nation's financial markets and [maintain] investor confidence." According to the Order:

To the extent permitted by law and subject to the availability of funds therefor, the Department of the Treasury shall provide the Working Group with such administrative and support services as may be necessary for the performance of its functions.

In plain English, taxpayer money is being used to make the markets look healthier than they are. Treasury funds are made available, but the WGFM is not accountable to Congress and can act from behind closed doors. It not only can but it must, since if investors were to realize what was going on, they would not fall for the bait. "Maintaining investor confidence" means keeping investors in the dark about how shaky the market really is.

The money used to manipulate the market is "Monopoly" money, funds created from nothing and given for nothing, just to prop up the market. Not only is the Dow propped up but the gold market is held down, since gold is considered a key indicator of inflation. If the gold price were to soar, the Fed would have to increase interest rates to tighten the money supply, collapsing the housing bubble and forcing the government to raise inflation-adjusted payments for Social Security. Most traders who see this manipulation going on don't complain, because they think the Fed is rigging the market to their advantage. But gold investors have routinely been fleeced; and the PPT's secret manipulations have created a stock market bubble that will take everyone's savings down when it bursts, as bubbles invariably do. Unwary investors are being induced to place risky bets on a nag on its last legs. The people become complacent and accept bad leadership, bad policies and bad laws, because they think it is all "working" economically.

The findings of GATA (the Gold Anti-Trust Action Committee) were largely ignored until they were confirmed in a carefully researched report released by John Embry of Sprott Asset Management of Toronto in August 2004. An update of the report published in The Asia Times in 2005 included an introductory comment that warned, "the secrecy and growing involvement of private-sector actors threatens to foster enormous moral hazards." Moral hazard is the risk that the existence of a contract will change the way the parties act in the future; for example, a firm insured for fire may take fewer fire precautions. In this case, the hazard is that banks are taking undue investment and lending risks, believing they will be bailed out from their folly because they always have been in the past. The comment continued:

Major financial institutions may be acting as de facto agencies of the state, and thus not competing on a level playing field. There are signs that repeated intervention in recent years has corrupted the system.

In a June 2006 article titled "Plunge Protection or Enormous Hidden Tax Revenues," Chuck Augustin was more blunt, writing:

…Today the markets are, without doubt, manipulated on a daily basis by the PPT. Government controlled "front companies" such as Goldman-Sachs, JP Morgan and many others collect incredible revenues through market manipulation. Much of this money is probably returned to government coffers, however, enormous sums of money are undoubtedly skimmed by participating companies and individuals.

The operation is similar to the Mafia-controlled gambling operations in Las Vegas during the 50's and 60's but much more effective and beneficial to all involved. Unlike the Mafia, the PPT has enormous advantages. The operation is immune to investigation or prosecution, there [are] unlimited funds available through the Treasury and Federal Reserve, it has the ultimate insider trading advantages, and it fully incorporates the spin and disinformation of government controlled media to sway markets in the desired direction.... Any investor can imagine the riches they could obtain if they knew what direction stocks, commodities and currencies would move in a single day, especially if they could obtain unlimited funds with which to invest! ... [T]he PPT not only cheats investors out of trillions of dollars, it also eliminates competition that refuses to be "bought" through mergers. Very soon now, only global companies and corporations owned and controlled by the NWO elite will exist.

Is the Spider Losing Its Grip?

What has been good for Rockefeller, has been a curse for the United States. Its citizens, government and country indebted to the hilt, enslaved to his banks.... The country's industrial force lost to overseas in consequence of strong dollar policies.... A strong dollar pursued purely in the interest of the banking empire and not for the best of the country. The USA, now degraded to a service and consumer nation....

The Cracking Economic Egg

The magnitude of the banking crisis and the desperate attempts to cover it up became apparent in June 2007, when two hedge funds belonging to Bear Stearns Company went bankrupt over derivatives bets involving subprime mortgages gone wrong. The parties were being leaned on to settle quietly, to avoid revealing that their derivatives were worth far less than claimed. But as Adrian Douglas observed in a June 30 article called "Derivatives" in LeMetropoleCafe.com (http://www.lemetropolecafe.com):

This is not just an ugly, non-malignant tumor that can be conveniently cut off. This massive financial activity that bets on the outcome of the pricing of the underlying assets has corrupted the system such that those who would be responsible for paying out orders of magnitude more money than they have if the bets go against them are sucked into a black hole of moral and ethical destitution as they have no other choice but to manipulate the price of the underlying assets to prevent financial ruin.

While derivatives may appear to be complex instruments, Douglas says the concept is actually simple: they are insurance contracts against something happening, such as interest rates going up or the stock market going down. Unlike with ordinary insurance policies, however, these are not catastrophic risks that happen infrequently. They will happen eventually. And if a payout event is triggered, "unlike when a house burns down, there will not be just a handful of claims on any one day, payouts will be due in the trillions of dollars on the same day. It is the financial equivalent of a hurricane Katrina hitting every US city on the same day!" Douglas observes:

Instead of stopping this idiotic sham business from growing to galactic proportions, all the authorities, and all the banks, and all the major financial institutions around the world have heralded it as the best thing since sliced bread. But now all these players are complicit in the crime. They are all on the hook. The stakes are now too high. They must manipulate the underlying assets on a daily basis to prevent triggering the payout of a major derivative event.

Derivatives are a bet against volatility. Guess what has happened? Surprise, surprise! Volatility has vanished. The VIX [the Chicago Board Options Exchange Volatility Index] looks like an ECG when the patient has died! Gold has an unofficial $6 rule. The Dow is not allowed to drop more than 200 points and it must rally the following day. Interest rates must not rise, if they do the Fed must issue more of their now secret M3, ship it offshore to the Caribbean and pretend that an unknown foreign bank is buying US Treasuries like crazy.

But the sham is coming unglued because the huge excess liquidity that has been injected into the system to prevent it from imploding is showing up as asset bubbles all over the place and shortages of raw materials are everywhere. There is massive inflation going on. There is no major economy in the world not inflating their money supply by less than 10% annually.

When the derivative buyers realize what is going on and quit paying premiums for insurance that doesn't exist, says Douglas, "there will be a whole new definition of volatility!" And that brings us back to the parasite's challenge. When the bubble collapses, the banking empire that has been built on it must collapse as well….

Chapter 34. Meltdown: The Secret Bankruptcy of the Banks

The Shady World of Investment Banking

As banks have lost profits in the competitive commercial lending business, they have had to expand into investment banking to remain profitable. That expansion was facilitated in 1999, when the Gramm-Leach-Bliley Act repealed the provisions of the Glass-Steagall Act that had barred commercial and investment banking in the same institution. Investment banking includes corporate fund-raising, mergers and acquisitions, brokering trades, and trading for the bank's own account. Despite this merger of banking functions, however, profits continued to falter. According to a 2002 publication called "Growing Profits Under Pressure" by the Boston Consulting Group:

As the effects of the economic downturn continue to erode corporate profits, large commercial banks - both global and regional - face growing pressures on their corporate- and investment-banking businesses.... From the outside, commercial banks confront increasing competition - particularly from global investment banks ... that are competing more vigorously for commercial banks' traditional corporate transactions. In addition, commercial banks are finding that their corporate clients are increasingly becoming their rivals.... [C]ompanies today ... meet more of their own banking needs themselves.... In recent years, many commercial banks have acquired investment banks, hoping to gain access to new clients.... But ... investment-banking revenues have suffered with the decline in mergers and acquisitions, equity capital markets, and trading activities. All too often, costs have continued to rise.

An article in the June 2006 Economist reported that even with the success of bank trading departments, the overall share values of investment banks were falling. Evidently this was because investors suspected that the banks' returns had been souped up by trading with borrowed money, and they feared the risks involved.

Meanwhile, banking as a public service has been lost to the all-consuming quest for profits….

The Systemic Bankruptcy of the Banks

Before 1913, if too many of a bank's depositors came for their money at one time, the bank would have come up short and would have had to close its doors. That was true until the Federal Reserve Act shored up the system by allowing troubled banks to "borrow" money from the Federal Reserve, which could create it on the spot by selling government securities to a select group of banks that created the money as bookkeeping entries on their books. By rights, Rothbard said, the banks should be put into bankruptcy and the bankers should be jailed as embezzlers, just as they would have been before they succeeded in getting laws passed that protected their swindling. Instead, big banks are assured of being bailed out from their folly, encouraging them to take huge risks because they are confident of being rescued if things go amiss. This "moral hazard" has now been built into the decision-making process. But small businesses don't get bailed out when they make risky decisions that put them under water. Why should big banks have that luxury? In a "free" market, big banks should be free to fail like any other business. It would be different if they actually were indispensable to the economy, as they claim; but these global mega-banks spend most of their time and resources making profits for themselves, at the expense of the small consumer, the small investor, and small countries.

There are more efficient ways to get the banking services we need than by continually feeding and maintaining the parasitic banking machine we have now. It may be time to cut the mega-banks loose from the Fed's apron strings and let them deal with the free market forces they purport to believe in. Without the collusion of the Plunge Protection Team, the CRMPG (Counterparty Risk Management Policy Group) and the Federal Reserve, some major banks could soon wind up in bankruptcy. The Federal Deposit Insurance Corporation (FDIC) deals with bankrupt banks by putting them into receivership, a form of bankruptcy in which a company can avoid liquidation by reorganizing with the help of a court-appointed trustee. When a bank is put into receivership, the trustee is the FDIC, an agency of the federal government. In return for bailing the bank out, the FDIC has the option of retaining the bank as a public asset. Why this might not be the disaster for the larger community that has been predicted, and might even work out to the public's benefit, is discussed in Section VI.

Chapter 35. Stepping from Scarcity into Technicolor Abundance

One of the most dramatic scenes in the MGM version of The Wizard of Oz comes when Dorothy's cyclone-tossed house falls from the sky. The world transforms as she opens the door and steps from the black-and-white barrenness of a Kansas farmhouse into the Technicolor wonderland of Oz. The world transforms again when Dorothy and her companions don green-colored glasses as they enter the Emerald City. In the Wizard's world, reality can be changed just by looking at things differently. Historian David Parker wrote of Baum's fairytale:

[T]he book emphasized an aspect of theosophy that Norman Vincent Peale would later call "the power of positive thinking": theosophy led to "a new upbeat and positive psychology" that "opposed all kinds of negative thinking - especially fear, worry and anxiety." It was through this positive thinking, and not through any magic of the Wizard, that Dorothy and her companions (as well as everyone else in Oz) got what they wanted.

It would become a popular Hollywood theme - Dumbo's magic feather, Pollyanna's irrepressible positive thinking, the Music Man's "think system" for making beautiful music, the "Unsinkable" Molly Brown. Thinking positively was not just the stuff of children's fantasies but was deeply ingrained in the American psyche. "I have learned," said Henry David Thoreau, "that if one advances confidently in the direction of his dreams, and endeavors to live the life he has imagined, he will meet with a success unexpected in common hours." William James, another nineteenth-century American philosopher, said, "The greatest discovery of my generation is that a human being can alter his life by altering his attitudes of mind." Franklin Roosevelt broadcast this upbeat message in his Depression-era "fireside chats," in which he entered people's homes through that exciting new medium, the radio, and galvanized the country with encouraging words. "The only thing we have to fear is fear itself," he said in 1933, when the "enemy" was poverty and unemployment. Andrew Carnegie, one of the multimillionaire Robber Barons, was another firm believer in achievement through positive thinking. "It is the mind that makes the body rich," he maintained. Believing that financial success could be reduced to a simple formula, he commissioned a newspaper reporter named Napoleon Hill to interview over 500 millionaires to discover the common threads of their success. Hill then memorialized the results in his bestselling book Think and Grow Rich.

Thinking positively was a trait of the Robber Barons themselves, who for all their mischief were a characteristically American phenomenon. They thought big. If there was a criminal element to their thinking, it was a crime the law had not yet codified. The Wild West, the Gold Rush, the Gilded Age, the Roaring Twenties — all were part of the wild and reckless youth of the nation. The Robber Barons were a product of the American capitalist spirit, the spirit of believing in what you want and making it happen. An aspect of a "free" market is the freedom to steal, which is why economics must be tempered with the Constitution and the law. That was the fatal flaw in the laissez-faire free market economics of the nineteenth century: it allowed opportunists to infiltrate and monopolize industry.

America's Founding Fathers saw the necessity of designing a government that would protect the inalienable rights of the people from the power grabs of the unscrupulous. Today we generally think we want less government, not more; but our forebears had a different view of the function of government. The Declaration of Independence declared:

[W]e hold these Truths to be self evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the Pursuit of Happiness – That to secure these Rights, Governments are instituted among Men, deriving their just Powers from the Consent of the Governed.

The capitalist spirit of achieving one’s dreams needed to operate within an infrastructure that ensured and supported a fair race. Naming the villains and locking them up could fix things only temporarily; to create a millennial utopia, the legal edifice itself had to be secured.

Waking Up from the Spell

The race between Utopia and Oblivion reflects two different visions of reality. One sees a world capable of providing for all. The other sees a world that is too small for its inhabitants, requiring the annihilation of large segments of the population if the rest are to survive. The prevailing scarcity mentality focuses on shortages of oil, water and food. But the real shortage, as Benjamin Franklin explained to his English listeners in the eighteenth century, is in the medium of exchange. If sufficient money could be made available to develop alternative sources of energy, alternative means of extracting water from the environment, and more efficient ways of growing food, there could be abundance for all. The notion that the government could simply print the money it needs is considered unrealistically utopian and inflationary; yet banks create money all the time. The chief reason the U.S. government can't do it is that a private banking cartel already has a monopoly on the practice.

Growth in M3 is no longer officially being reported, but by 2007, reliable private sources put it at 11 percent per year. That means that over one trillion dollars are now being added to the economy annually. Where does this new money come from? It couldn't have come from new infusions of gold, since the country went off the gold standard in 1933. All of this additional money must have been created by banks as loans. As soon as the loans are paid off, the money has to be borrowed all over again, just to keep money in the system; and it is here that we find the real cause of global scarcity: somebody is paying interest on most of the money in the world all of the time. A dollar accruing interest at 5 percent, compounded annually, becomes two dollars in about 14 years. At that rate, banks siphon off as much money in interest every 14 years as there was in the entire world 14 years earlier. (This assumes that the debt is not paid but just keeps compounding, but in the system as a whole, that would be true. When old loans get paid off, debt-money is extinguished, so new loans must continually be taken out just to keep the money supply at its current level. And since banks create the principal but not the interest necessary to pay off their loans, someone somewhere has to continually be taking out new loans to create the money to cover the interest due on this collective debt. Interest then continually accrues on these new loans, compounding the interest due on the whole.) That explains why M3 has increased by 100 percent or more every 14 years since the Federal Reserve first started tracking it in 1959. According to a Fed chart titled "M3 Money Stock," M3 was about $300 billion in 1959. In 1973, 14 years later, it had grown to $900 billion. In 1987, 14 years after that, it was $3,500 billion; and in 2001, 14 years after that, it was $7,200 billion.
To meet the huge interest burden required to service all this money-built-on-debt, the money supply must continually expand; and for that to happen, borrowers must continually go deeper into debt, merchants must continually raise their prices, and the odd men out in the bankers' game of musical chairs must continue to lose their property to the banks. Wars, competition and strife are the inevitable results of this scarcity-driven system.
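The doubling arithmetic quoted above is easy to verify. The following is only a sketch of the text's own figures (5 percent interest, roughly 14-year doubling), not a model of M3 itself:

```python
# One dollar compounding at 5% annually for 14 years -- the text's
# "becomes two dollars in about 14 years".
rate = 0.05
value_after_14_years = (1 + rate) ** 14
print(round(value_after_14_years, 2))  # ~1.98

# The rule-of-72 shortcut gives the same ballpark doubling time:
doubling_time_years = 72 / (rate * 100)
print(doubling_time_years)  # 14.4
```

The same arithmetic applied to the cited M3 figures ($300 billion in 1959, $900 billion in 1973, $3,500 billion in 1987, $7,200 billion in 2001) shows the money stock at least doubling in each 14-year interval, consistent with the compounding argument.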

The obvious solution is to eliminate the parasitic banking scheme that is feeding on the world's prosperity. But how? The Witches of Wall Street are not likely to release their vice-like grip without some sort of revolution; and a violent revolution would probably fail, because the world's most feared military machine is already in the hands of the money cartel. Violent revolution would just furnish them with an excuse to test their equipment. The first American Revolution was fought before tasers, lasers, tear gas, armored tanks, and depleted uranium weapons.

Fortunately or unfortunately, in the eye of today's economic cyclone, we may have to do no more than watch and wait, as the global pyramid scheme collapses of its own weight. In the end, what is likely to bring the house of cards down is that the Robber Barons have lost control of the propaganda machine. Their intellectual foe is the Internet, that last bastion of free speech, where even the common blogger can find a voice. As President John Adams is quoted as saying of the revolution of his day:

The Revolution was effected before the war commenced. The Revolution was in the hearts and minds of the people.... This radical change in the principles, opinions, sentiments, and affections of the people, was the real American Revolution.

Today the corporate media are gradually losing control of public opinion; but the Money Machine remains shrouded in mystery, largely because the subject is so complex and forbidding. Richard Russell is a respected financial analyst who has been publishing The Dow Theory Letter for over half a century. He observes:

The creation of money is a total mystery to probably 99 percent of the US population, and that most definitely includes the Congress and the Senate. The takeover of US money creation by the Fed is one of the most mysterious and ominous acts in US history.... The legality of the Federal Reserve has never been "tried" before the US Supreme Court.

We the people could try bringing suit before the Supreme Court, but the courts, like the major media, are now largely under the spell of the financial/corporate cartel. There are honest and committed judges, congresspersons and reporters who could be approached; but to make a real impact will take a vigorous movement from an awakened and aroused populace ready to be heard and make a difference, a popular force too strong to be ignored. When a certain critical mass of people has awakened, the curtain can be thrown aside and the Wizard's hand can be exposed. But before we can build a movement, we need to be ready with an action plan, an ark that will keep us afloat when the flood hits. What sort of ark might that be? We'll begin by looking at a number of alternative models that have been developed around the world.

Perpetual Christmas in Guardiagrele, Italy

One interesting experiment in alternative financing was reported in the October 7, 2000, Wall Street Journal. It was the brainchild of Professor Giacinto Auriti, a wealthy local academic in Guardiagrele, Italy. According to the Journal:

Prof. Auriti ... hopes to convince the world that central bankers are the biggest con artists in modern history. His main thesis: For centuries, central banks have been robbing the common man by the way they put new money in circulation. Rather than divide the new cash among the people, they lend it through the banking system, at interest. This practice, he argues, makes the central banks the money's owners and makes everyone else their debtors. He goes on to conclude that this debt-based money has roughly half the purchasing power it would have if it were issued directly to the populace, free.

To prove his thesis, Professor Auriti printed up and issued his own debt-free bills, called simec. He agreed to trade simec for lire, and to redeem each simec for two lire from local merchants. The result:

Armed with their simec, the townsfolk -- and later their neighbors elsewhere in central Italy's Abruzzo region -- stormed participating stores to snap up smoked prosciutto, designer shoes and other goods at just half the lire price.

"At first, people thought this can't be true, there must be a rip-off hidden somewhere," says Antonella Di Cocco, a guide at a local museum. "But once people realized that the shopkeepers were the only ones taking the risk, they just ran to buy all these extravagant things they never really needed." Often, they raided their savings accounts in the process.

The participating shopkeepers, some of whom barely eked out a living before the simec bonanza, couldn't have been happier. "Every day was Christmas," Pietro Ricci recalls from behind the counter of his cavernous haberdashery.

Neither Mr. Ricci nor his fellow merchants were stuck with their simec for long. Once a week, they turned them in to Prof. Auriti, recouping the full price of their goods.

"We doubled the money in people's pockets, injecting blood into a lifeless body," says Prof. Auriti. "People were so happy, they thought they were dreaming."

Non-participating stores, meanwhile, remained empty week after week.... By mid-August, says the professor, a total of about 2.5 billion simec had circulated.

The professor had primed the pump by doubling the town's money supply. As a result, goods that had been sitting on the shelves for lack of purchasing power started to move. The professor himself lost money on the deal, since he was redeeming the simec at twice what he had charged for them, but the local merchants liked the result so much that they eventually took over the project. When there were enough simec in circulation for the system to work without new money, the professor was relieved of having to put his own money into the venture. The obvious limitation of his system is that it requires a wealthy local benefactor to get it going. Ideally, the benefactor would be the government itself, issuing permanent money in the form of the national currency.
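The economics of the simec scheme can be sketched in a few lines, assuming (as the Journal account implies) that shoppers obtained simec one-for-one against lire while merchants redeemed them from the professor at two lire each; the 1:1 purchase rate is an inference, not stated explicitly above:

```python
# A hypothetical 100-lire purchase under the simec scheme.
item_price_lire = 100

simec_spent = item_price_lire // 2        # goods sold at "half the lire price"
shopper_cost_lire = simec_spent * 1       # assumed: simec bought at 1 lira each
merchant_receives_lire = simec_spent * 2  # Prof. Auriti redeemed at 2 lire each

professor_subsidy = merchant_receives_lire - shopper_cost_lire
print(shopper_cost_lire, merchant_receives_lire, professor_subsidy)  # 50 100 50
```

The merchant is made whole at the full lire price, the shopper pays half, and the professor's redemption covers the difference, which is why the scheme needed a wealthy benefactor to prime it until enough simec were in circulation.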

Chapter 37. The Money Question: Goldbugs and Greenbackers Debate

The Greenbackers responded that gold's scarcity, far from being a virtue, was actually its major drawback as a medium of exchange. Gold coins might be "honest money," but their scarcity had led governments to condone dishonest money, the sleight of hand known as "fractional reserve" banking. Governments that were barred from creating their own paper money would just borrow it from banks that created it and then demanded it back with interest. As Stephen Zarlenga noted in The Lost Science of Money:

[A]ll of the plausible sounding gold standard theory could not change or hide the fact that, in order to function, the system had to mix paper credits with gold in domestic economies. Even after this addition, the mixed gold and credit standard could not properly service the growing economies. They periodically broke down with dire domestic and international results. [In] the worst such breakdown, the Great Crash and Depression of 1929-33 … it was widely noted that those countries did best that left the gold standard soonest.

The reason gold has to be mixed with paper credits is evident from the math. As noted earlier, a dollar lent at 6 percent interest, compounded annually, becomes 10 dollars in 40 years. That means that if the money supply were 100 percent gold, and if bankers lent out 10 percent of it at 6 percent interest compounded annually (continually rolling over principal and interest into new loans), in 40 years the bankers would own all the gold. To avoid that result, either the money supply needs to be able to expand, which means allowing fiat money, or interest needs to be banned as it was in the Middle Ages.

The debate between the Goldbugs and the Greenbackers still rages, but today the Goldbugs are not the bankers. Rather, they are in the money reform camp along with the Greenbackers. Both factions are opposed to the current banking system, but they disagree on how to fix it. That is one reason the modern money reform movement hasn't made much headway politically. As Machiavelli said in the sixteenth century, "He who introduces a new order of things has all those who profit from the old order as his enemies, and he has only lukewarm allies in all those who might profit from the new." Maverick reformers continue to argue among themselves while the bankers and their hired economists march in lockstep, fortified by media they have purchased and laws they have gotten passed with the powerful leverage of their bank-created money.

Chapter 38. The Federal Debt: A Case of Disorganized Thinking

In the 1930s, economist Alvin Hansen told President Roosevelt that plunging the country into debt did not matter, because the public debt was owed to the people themselves and never had to be paid back. But even if that were true in the 1930s (which is highly debatable), it is clearly not true today. Nearly half the public portion of the federal debt is now owed to foreign investors, who are not likely to be so sanguine about continually refinancing it, particularly when the dollar is rapidly shrinking in value. Al Martin cites a study authorized by the U.S. Treasury in 2001, which found that for the government to keep servicing its debt as it has been doing, it would have to raise the personal income tax rate to 65 percent by 2013. And that's just to pay the interest on the national debt. When the government can't pay the interest, it will be forced to declare bankruptcy, and the economy will collapse. Martin writes:

The economy of the rest of the planet would collapse five days later.... The only way the government can maintain control in a post-economically collapsed environment is through currency and through military might, or internal military power.... And that's what U.S. citizens are left with ... supersized bubbles and really scary economic numbers.

Compounding the problem, Iran and other oil producers are now moving from dollars to other currencies for their oil trades. If oil no longer has to be traded in dollars, a major incentive for foreign central banks to hold U.S. government bonds will disappear. British journalist John Pilger, writing in The New Statesman in February 2006, suggested that the real reason for the aggressive saber-rattling with Iran is not Iran's nuclear ambitions but the effect of the world's fourth-biggest oil producer and trader breaking the dollar monopoly. He noted that Iraqi President Saddam Hussein had done the same thing before he was attacked. In an April 2005 article in CounterPunch, Mike Whitney warned of the dire consequences that are liable to follow when the "petrodollar" standard is abandoned:

This is much more serious than a simple decline in the value of the dollar. If the major oil producers convert from the dollar to the euro, the American economy will sink almost overnight. If oil is traded in euros then central banks around the world would be compelled to follow and America will be required to pay off its enormous $8 trillion debt. That, of course, would be doomsday for the American economy.... If there's a quick fix, I have no idea what it might be.

The quick fix! It was the Wizard's stock in trade. He might have suggested fixing the problem by changing the rules by which the game is played. In 1933, Franklin Roosevelt pronounced the country officially bankrupt, exercised his special emergency powers, waved the royal Presidential fiat, and ordered the promise to pay in gold removed from the dollar bill. The dollar was instantly transformed from a promise to pay in legal tender into legal tender itself. Seventy years later, Congress could again acknowledge that the country is officially bankrupt, propose a plan of reorganization, and turn its debts into "legal tender." Alexander Hamilton showed two centuries ago that Congress could dispose of the federal debt by "monetizing" it, but Congress made the mistake of delegating that function to a private banking system. Congress just needs to rectify its error and monetize the debt itself, by buying back its own bonds with newly-issued U.S. Notes.

If that sounds like a radical solution, consider that it is actually what is being done right now -- not by the government but by the private Federal Reserve. The difference is that when the Fed buys back the government's bonds with newly-issued Federal Reserve Notes, it doesn't take the bonds out of circulation. Two sets of securities (the bonds and the cash) are produced where before there was only one. This highly inflationary result could be avoided by allowing the government to buy back its own bonds and simply voiding them out.

Chapter 40. “Helicopter” Money: The Fed’s New Hot Air Balloon

Duncan explained that the shortage of money was not actually in Japan.  It was in the United States, where the threat of deflation had appeared for the first time since the Great Depression.  The technology bubble of the late 1990s had popped in 2000, leading to a serious global economic slowdown in 2001. Before that, the Fed had been bent on curbing inflation; but now it had suddenly switched gears and was focusing on reflation - the intentional reversal of deflation through government intervention. Duncan wrote:

Deflation is a central bank’s worst nightmare. When prices begin to fall, interest rates follow them down. Once interest rates fall to zero, as is the case in Japan at present, central banks become powerless to provide any further stimulus to the economy through conventional means and monetary policy becomes powerless. The extent of the US Federal Reserve's concern over the threat of deflation is demonstrated in Fed staff research papers and the speeches delivered by Fed governors at that time. For example, in June 2002, the Board of Governors of the Federal Reserve System published a Discussion Paper entitled, "Preventing Deflation: Lessons from Japan's Experience in the 1990s." The abstract of that paper concluded "... we draw the general lesson from Japan's experience that when inflation and interest rates have fallen close to zero, and the risk of deflation is high, stimulus - both monetary and fiscal - should go beyond the levels conventionally implied by baseline forecasts of future inflation and economic activity."

Just how far beyond the conventional the Federal Reserve was prepared to go was demonstrated in the Japanese experiment, in which the Bank of Japan created 35 trillion yen over the course of the following year. The yen were then traded with the government's Ministry of Finance (MOF) for Japanese government securities, which paid virtually no interest. The MOF used the yen to buy approximately $320 billion in U.S. dollars from private parties, which were then used to buy U.S. government bonds.
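The two figures in this passage imply an exchange rate, which is easy to back out; this is only a rough consistency check of the numbers quoted above, not official data:

```python
# Yen created by the Bank of Japan versus dollars bought by the MOF,
# per the figures in the text.
yen_created = 35e12       # 35 trillion yen
dollars_bought = 320e9    # "approximately $320 billion"

implied_rate = yen_created / dollars_bought
print(round(implied_rate, 1))  # ~109.4 yen per dollar
```

A rate near 110 yen per dollar is plausible for the period described, which suggests the two quoted figures are mutually consistent.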

The Fear of Giving Big Government Even More Power

The usual objections to returning the power to create money to Congress are that (a) it would be inflationary, and (b) it would give a corrupt government even more power. But as will be detailed in Chapter 44, government-issued money would actually be less inflationary than the system we have now; and it is precisely because power and money corrupt that money creation needs to be done by a public body, exercised in full view and with full accountability. We can watch our congresspersons deliberating every day on C-SPAN. If the people's money isn't being spent for the benefit of the people, we can vote our representatives out.

What has allowed government to become corrupted today is that it is actually run by the money cartel. Big Business holds all the cards, because its affiliated banks have monopolized the business of issuing and lending the national money supply, a function the Constitution delegated solely to Congress. What hides behind the banner of "free enterprise" today is a system in which giant corporate monopolies have used their affiliated banking trusts to generate unlimited funds to buy up competitors, the media, and the government itself, forcing truly independent private enterprise out. Big private banks are allowed to create money out of nothing, lend it at interest, foreclose on the collateral, and determine who gets credit and who doesn't. They can advance massive loans to their affiliated corporations and hedge funds, which use the money to raid competitors and manipulate markets.

If some players have the power to create money and others don't, the playing field is not "level" but allows some favored players to dominate and coerce others. These giant cartels can be brought to heel only by cutting off their source of power and returning it to its rightful sovereign owners, the people themselves. Private enterprise needs publicly-operated police, courts and laws to keep corporate predators at bay. It also needs a system of truly national banks, in which the power to create the money and advance the credit of the people is retained by the people. We trust government with sweeping powers to declare and conduct wars, provide for the general welfare, and establish and enforce laws. Why not trust it to create the national money supply in all its forms?

The bottom line is that somebody has to have the power to create money. We've seen that gold is too scarce and too inelastic to be the national money supply, at least without an expandable fiat-money system to back it up; and somebody has to create that fiat system. There are only three choices for the job: a private banking cartel, local communities acting separately, or the collective body of the people acting through their representative government. Today we are operating with option #1, a private banking cartel, and it has brought the system to the brink of collapse. The privately-controlled Federal Reserve, which was chartered specifically to "maintain a stable currency," has allowed the money supply to balloon out of control. The Fed manipulates the money supply and regulates its value behind closed doors, in blatant violation of the Constitution and the antitrust laws. Yet it not only can't be held to account; it doesn't even have to explain its rationale or reveal what is going on.

Option #2, the local community fiat alternative, is basically the national fiat currency alternative on a smaller scale. As one commentator put it, what would you have more confidence in - the full faith and credit of Ithaca, New York (population 30,000), or the full faith and credit of the United States? The fiat currency of the national community has the full force of the nation behind it. And even if the politicians in charge of managing it turn out to be no less corrupt than private bankers, the money created by the government will be debt-free. Shifting the power to create money to Congress can relieve future generations of the burden of perpetual interest payments to an elite class of financial oligarchs who have advanced nothing of their own to earn it. The banking spider that has the country trapped in its debt web could be decapitated, returning national sovereignty to the people themselves.

Chapter 41. Restoring National Sovereignty with a Truly National Banking System

William Jennings Bryan, widely identified as the model for the Cowardly Lion of The Wizard of Oz, proved his courage by challenging the banking cartel's right to create the national money supply. He said in the speech that won him the Democratic nomination in 1896:

[We] believe that the right to coin money and issue money is a function of government. ... Those who are opposed to this proposition tell us that the issue of paper money is a function of the bank and that the government ought to go out of the banking business. I stand with Jefferson ... and tell them, as he did, that the issue of money is a function of the government and that the banks should go out of the governing business.... [W]hen we have restored the money of the Constitution, all other necessary reforms will be possible, and ... until that is done there is no reform that can be accomplished.

The "money of the Constitution" was money created by the people rather than the banks. Technically, the Constitution gave Congress the exclusive power only to "coin" money; but the Constitution was drafted in the eighteenth century, when most forms of money in use today either did not exist or were not recognized as money. Thomas Jefferson said that constitutions needed to be updated to suit the times. A contemporary version of the Constitutional provision that "Congress shall have the power to coin money" would give Congress the exclusive power to create the national currency in all its forms.

That would mean either abolishing the Federal Reserve or making it what most people think it now is - a truly federal agency. If the Federal Reserve were an arm of the U.S. government, the dollars it generated could go directly into the U.S. Treasury, without the need to add to a crippling federal debt by "funding" them with bonds. That would take care of 3 percent of the U.S. money supply, but what about the other 97 percent that is now created as commercial loans? Would giving Congress the exclusive power to create money mean the government would have to go into the commercial lending business?

Perhaps, but why not? As Bryan said, banking is the government's business, by Constitutional mandate. At least, that part of banking is the government's business that involves creating new money. The rest of the lending business could continue to be conducted privately, just as it is now. Banks would just join those non-bank lending institutions that do not create the money they lend but merely recycle pre-existing funds, including finance companies, pension funds, mutual funds, insurance companies, and securities dealers. Banks would do what most people think they do now -- borrow money at a low interest rate and lend it at a higher rate.

Returning the power to create money to the government would be more equitable and more Constitutional than the current system, but what would it do to bank profits? That was the concern of government officials who reviewed such a proposal recently in England….

The Banking System Is Already Bankrupt

To the charge that imposing a 100 percent reserve requirement could bankrupt the banks, the Wizard's retort might be that the banking system is already bankrupt. The 300-year fractional-reserve Ponzi scheme has reached its mathematical end-point. The bankers' chickens have come home to roost, and only a radical overhaul will save the system. Nouriel Roubini, Professor of Economics at New York University and a former advisor to the U.S. Treasury, gave this bleak assessment in a November 2007 newsletter:

I now see the risk of a severe and worsening liquidity and credit crunch leading to a generalized meltdown of the financial system of a severity and magnitude like we have never observed before. In this extreme scenario whose likelihood is increasing we could see a generalized run on some banks; and runs on a couple of weaker (non-bank) broker dealers that may go bankrupt with severe and systemic ripple effects on a mass of highly leveraged derivative instruments that will lead to a seizure of the derivatives markets ... ; massive losses on money market funds ... ; ever growing defaults and losses ($500 billion plus) in subprime, near prime and prime mortgages ... ; severe problems and losses in commercial real estate ... ; the drying up of liquidity and credit in a variety of asset backed securities putting the entire model of securitization at risk; runs on hedge funds and other financial institutions that do not have access to the Fed's lender of last resort support; [and] a sharp increase in corporate defaults and credit spreads….

The private banking system can no longer be saved with a stream of accounting-entry "reserves" to support an expanding pyramid of "fractional reserve" lending. If private banks are going to salvage their role in the economy, they are going to have to move into some other line of work. Chris Cook is a British market consultant who was formerly director of the International Petroleum Exchange. He observes that the true role of banks is to serve as guarantors and facilitators of deals. The seller wants his money now, but the buyer doesn't have it; he wants to pay over time. So the bank steps in and advances "credit" by creating a deposit from which the borrower can pay the seller. The bank then collects the buyer's payments over time, adding interest as its compensation for assuming the risk that the buyer won't pay. The glitch in this model is that the banks don't create the interest, so larger and larger debt-bubbles have to be created to service the collective debt. A mathematically neater way to achieve this result is through "investment banking" or "Islamic banking" -- bringing investors together with projects in need of funds. The money already exists; the bank just arranges the deal and the issuance of stock. The arrangement is a joint venture rather than a creditor-debtor relationship. The investor makes money only if the company makes money, and the company makes money only if it produces goods and services that add value to the economy. The parasite becomes a partner.

Businesses and individuals do need a ready source of credit, and that credit could be created from nothing and advanced to borrowers under a 100 percent reserve system, just as is done now. The difference would be that the credit would originate with the government, which alone has the sovereign right to create money; and the interest on it would be returned to the public purse. In effect, the government would just be serving as a "credit clearing exchange," or as the accountant in a community system of credits and debits. (More on this later.)

A System of National Bank Branches to Service Basic Public Banking Needs?

Hummel points out that if private banks could no longer lend their deposits many times over, they would have little incentive to service the depository needs of the public. Depository functions are basically clerical and offer little opportunity for income except fees for service. Who would service the public's banking needs if the banks lost interest in that business? In How Credit-Money Shapes the Economy, Professor Guttman notes that our basic banking needs are fairly simple. We need a safe place to keep our money and a practical way to transfer it to others. These services could be performed by a government agency on the model of the now-defunct U.S. Postal Savings System, which operated successfully from 1911 to 1967, providing a safe and efficient place for customers to save and transfer funds. It issued U.S. Postal Savings Bonds in various denominations that paid annual interest, as well as Postal Savings Certificates and domestic money orders. The U.S. Postal Savings System was set up to get money out of hiding, attract the savings of immigrants accustomed to saving at post offices in their native countries, provide safe depositories for people who had lost confidence in private banks, and furnish more convenient depositories for working people than were provided by private banks. (Post offices then had longer hours than banks, being open from 8 a.m. to 6 p.m. six days a week.) The postal system paid two percent interest on deposits annually. The minimum deposit was $1 and the maximum was $2,500. Savings in the system spurted to $1.2 billion during the 1930s and jumped again during World War II, peaking in 1947 at almost $3.4 billion. The U.S. Postal Savings System was shut down in 1967, not because it was inefficient but because it became unnecessary after private banks raised their interest rates and offered the same governmental guarantees that the postal savings system had.

The services offered by a modern system of federally-operated bank branches would have to be modified to reflect today's conditions, but the point is that the government has done these things before and could do them again. Indeed, if "fractional reserve" banking were eliminated, those functions could fall to the government by default. Hummel suggests that it would make sense to simplify the banking business by transferring the depository role to a system of bank branches acting as one entity under the Federal Reserve. Among other advantages, he says:

Since all deposits would be entries in a common computer network, determining balances and clearing checks could be done instantly, thereby eliminating checking system float and its logistic complexities....

With the Fed operating as the sole depository, payments would only involve the transfer of deposits between accounts within a single bank. This would allow for instant clearing, eliminate the nuisance of checking system float, and significantly reduce associated costs. Additional advantages include the elimination of any need for deposit insurance, and ending overnight sweeps and other sterile games that banks play to get around the fractional reserve requirement.

(Float: the time that elapses between when a check is deposited and the funds are available to the depositor, during which the bank is collecting payment from the payer's bank. The overnight sweep is a tactic for maximizing interest by "sweeping" funds not being immediately used in a low-interest account into a high-interest account, where they remain until the balance in the low-interest account drops below a certain minimum.)

In Hummel’s model, the Fed would be the sole depository and only its branches would be called "banks." Institutions formerly called banks would have to close down their depository operations and would become "private financial institutions" (PFIs), along with finance companies, pension funds, mutual funds, insurance companies and the like. Some banks would probably sell out to existing PFIs. PFIs could borrow from the Fed just as banks do now, but the interest rate would be set high enough to discourage them from engaging in "purely speculative games in the financial markets." Without the depository role, banks would no longer need the same number of branch offices. The Fed would probably offer to buy them in setting up its own depository branch offices. Hummel suggests that a logical way to proceed would be to gradually increase the reserve ratio requirement on existing depositories until it reached 100 percent.

The National Credit Card

A system of publicly-owned bank branches could also solve the credit card problem. Hummel notes that imposing a 100 percent reserve requirement on the banks would mean the end of the private credit card business. Recall that when a bank issues credit against a customer's charge slip, the charge slip is considered a "negotiable instrument" that becomes an "asset" against which the bank creates a "liability" in the form of a deposit. The bank balances its books without debiting its own assets or anyone else's account. The bank is thus creating new money, something private banks could no longer do under a 100 percent reserve system. But the ability to get ready credit against the borrower's promise to pay is an important service that would be sorely missed if banks could no longer engage in it. If your ability to use your credit card were contingent on your bank's ability to obtain scarce funds in a competitive market, you might find, when you went to pay your restaurant bill, that credit had been denied because your bank was out of lendable funds.
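The double-entry bookkeeping described above can be sketched in a few lines of code. This is a minimal illustration of the accounting logic only; the names and the $50 amount are hypothetical, not from the text:

```python
# Minimal double-entry sketch (illustrative names and amounts): the signed
# charge slip becomes a bank asset, matched by a newly created deposit
# liability, so the books balance without debiting any existing account.
bank = {"assets": {}, "liabilities": {}}

def issue_credit_against_slip(bank, borrower, merchant, amount):
    """Book the borrower's promise to pay as an asset and create a
    matching deposit for the merchant -- new money, in accounting terms."""
    bank["assets"]["receivable:" + borrower] = amount
    bank["liabilities"]["deposit:" + merchant] = amount

issue_credit_against_slip(bank, "diner", "restaurant", 50.0)
# The balance sheet balances, yet no pre-existing funds were touched.
assert sum(bank["assets"].values()) == sum(bank["liabilities"].values())
```

The point of the sketch is that both sides of the ledger spring into existence together; nothing is transferred from any other depositor's account.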

The notion that money has to "be there" before it can be borrowed is based on the old commodity theory of money. Theorists from Cotton Mather to Benjamin Franklin to Michael Linton (who designed the LETS system) have all defined "money" in other terms: it is simply "credit," an advance against the borrower's promise to repay. Credit originates with that promise, not with someone else's deposit of something valuable in the bank. Credit is not dependent on someone else having given up his rights to an asset for a period of time, and "reserves" are not necessary for advancing it. What is wrong with the current system is not that money is advanced as a credit against the borrower's promise to repay but that the interest on this advance accrues to private banks that gave up nothing of their own to earn it. This problem could be rectified by turning the extension of credit over to a system of truly national banks, which would be authorized to advance the "full faith and credit of the United States" as agents of Congress, which is authorized to create the national money supply under the Constitution.

Credit card services actually are an extension of the depository functions of banks. The link with bank deposits is particularly obvious in the case of those debit cards that can be used to trigger ATMs to dispense twenty-dollar bills. When you make a transfer or withdrawal on your debit card, the money is immediately transferred out of your account, just as if you had written a check. When you use your credit card, the link is not quite so obvious, since the money doesn't come out of your account until later; but it is still your money that is being advanced, not someone else's. Again, your promise to pay becomes an asset and a liability of the bank at the same time, without bringing any of the bank's or any other depositor's money into the deal. The natural agency for handling this sort of transaction would be an institution that is authorized both to deal with deposits and to create credit-money with accounting entries, something a truly "national" bank could do as an agent of Congress. A government banking agency would not be driven by the profit motive to gouge desperate people with exorbitant interest charges. Credit could be extended at interest rates that were reasonable, predictable and fixed. In appropriate circumstances, credit might even be extended interest-free.

Chapter 42. The Question of Interest: Ben Franklin Solves the Impossible Contract Problem

A money supply created by banks was never sufficient, because the bankers created only the principal and not the interest needed to pay back their loans. A government, on the other hand, could not only lend but spend money into the economy, covering the interest shortfall and keeping the money supply in balance. In an article titled "A Monetary System for the New Millennium," Canadian money reform advocate Roger Langrick explains this concept in contemporary terms. He begins by illustrating the mathematical impossibility inherent in a system of bank-created money lent at interest:

[I]magine the first bank which prints and lends out $100. For its efforts it asks for the borrower to return $110 in one year; that is it asks for 10% interest. Unwittingly, or maybe wittingly, the bank has created a mathematically impossible situation. The only way in which the borrower can return 110 of the bank's notes is if the bank prints, and lends, $10 more at 10% interest....

The result of creating 100 and demanding 110 in return, is that the collective borrowers of a nation are forever chasing a phantom which can never be caught; the mythical $10 that were never created. The debt in fact is unrepayable. Each time $100 is created for the nation, the nation's overall indebtedness to the system is increased by $110. The only solution at present is increased borrowing to cover the principal plus the interest of what has been borrowed.
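The arithmetic of the quoted example can be verified with a short simulation, assuming the 10 percent rate Langrick uses. When only the principal is created, the interest can be paid only by fresh borrowing, so both the money supply and the debt grow geometrically:

```python
# Check of Langrick's illustration: a bank creates the principal of its
# loans but not the interest, so borrowers must borrow the shortfall,
# and money and debt both compound at the loan rate.
rate = 0.10
money_created = 100.0                  # principal in circulation
debt = money_created * (1 + rate)      # $110 owed, but only $100 exists

for year in range(1, 6):
    shortfall = debt - money_created   # interest that was never created
    money_created += shortfall         # the bank lends the shortfall too
    debt = money_created * (1 + rate)  # the new loans accrue interest as well
    print(f"year {year}: in circulation {money_created:.2f}, owed {debt:.2f}")
```

After five rounds the money supply has grown from $100 to about $161 merely to keep servicing interest, and the total owed is always 10 percent more than exists, just as the quoted passage describes.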

The better solution, says Langrick, is to allow the government to issue enough new debt-free Greenbacks to cover the interest charges not created by the banks as loans:

Instead of taxes, government would be empowered to create money for its own expenses up to the balance of the debt shortfall. Thus, if the banking industry created $100 in a year, the government would create $10 which it would use for its own expenses. Abraham Lincoln used this successfully when he created $500 million of "greenbacks" to fight the Civil War.

In Langrick's example, a private banking industry pockets the interest, which must be replaced every year by a 10 percent issue of new Greenbacks; but there is another possibility. The loans could be advanced by the government itself. The interest would then return to the government and could be spent back into the economy in a circular flow, without the need to continually issue more money to cover the interest shortfall. Government as the only interest-charging lender might not be a practical solution today, but it is a theoretical extreme that can be contrasted with the existing system to clarify the issues. Compare these two hypothetical models:

Bad Witch/Good Witch Scenarios

The Wicked Witch of the West rules over a dark fiefdom with a single private bank owned by the Witch. The bank issues and lends all the money in the realm, charging an interest rate of 10 percent. The Witch prints 100 witch-dollars, lends them to her constituents, and demands 110 back. The people don't have the extra 10, so the Witch creates 10 more on her books and lends them as well. The money supply must continually increase to cover the interest, which winds up in the Witch's private coffers. She gets progressively richer, as the people slip further into debt. She uses her accumulated profits to buy things she wants. She is particularly fond of little thatched houses and shops, of which she has an increasingly large collection. To fund the operations of her fiefdom, she taxes the people heavily, adding to their financial burdens.

Glinda the Good Witch of the South runs her realm in a more people-friendly way. All of the money in the land is issued and lent by a "people's bank" operated for their benefit. She begins by creating 110 people's-dollars. She lends 100 of these dollars at 10 percent interest and spends the extra 10 dollars into the community on programs designed to improve the general welfare - things such as pensions for retirees, social services, infrastructure, education, research and development. The $110 circulates in the community and comes back to the people's bank as principal and interest on its loans. Glinda again lends $100 of this money into the community and spends the other $10 on public programs, supplying the interest for the next round of loans while providing the people with jobs and benefits.

For many years, she just recycles the same $110, without creating new money. Then one year, a cyclone comes up that destroys many of the charming little thatched houses. The people ask for extra money to rebuild. No problem, says Glinda; she will just print more people's-dollars and use them to pay for the necessary labor and materials. Inflation is avoided, because supply increases along with demand. Best of all, taxes are unknown in the realm.
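The Glinda scenario above can be sketched in the same way. In this model the public bank spends the interest back into circulation, so the same $110 services the loans year after year with no growth in debt (the figures are the fairytale's, not real-world data):

```python
# Sketch of the "Glinda" model: a public bank lends $100 at 10% and spends
# the $10 of interest money into the community, so the money supply is
# constant and repayment is always possible.
money_supply = 110.0
rate = 0.10

for year in range(10):
    loans = 100.0
    interest_spent_back = money_supply - loans  # $10 on public programs
    repayment = loans + loans * rate            # $110 returns as P + I
    # the community holds exactly enough to repay principal plus interest
    assert abs(repayment - (loans + interest_spent_back)) < 1e-9
    money_supply = repayment                    # recycled for the next round

print(f"money supply after 10 years: {money_supply:.2f}")
```

The contrast with the Wicked Witch model is that here the interest returns to circulation instead of accumulating in private coffers, so no new money is needed to close the gap.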

A Practical Real-world Model

It sounds good in a fairytale, in a land with a benevolent queen and only one bank; but things are a bit different in the real world. For one thing, enlightened benevolent queens are hard to come by. For another thing, returning all the interest collected on loans to the government would require nationalizing not only the whole banking system but every other form of private lending at interest, an alternative that is too radical for current Western thinking. A more realistic model would be a dual lending system, semi-private and semi-public. The government would be the initial issuer and lender of funds, and private financial institutions would recycle this money as loans. Private lenders would still be siphoning interest into their own coffers, just not as much. The money supply would therefore still need to expand to cover interest charges, just not by as much. The actual amount by which it would need to expand and how this could be achieved without creating dangerous price inflation are addressed in Chapter 44.

Interest and Islam

Instituting a system of government-owned banks may sound radical in the United States, but some countries have already done it; and some other countries are ripe for radical reform. Rodney Shakespeare, author of The Modern Universal Paradigm (2007), suggests that significant monetary reform may come first in the Islamic community. Islamic reformers are keenly aware of the limitations of the current Western system and are actively seeking change, and oil-rich Islamic countries may have the clout to pull it off.

As noted earlier, Western lenders got around the religious proscription against "usury" (taking a fee for the use of money) by redefining the term to mean taking "excessive" interest; but Islamic purists still hold to the older interpretation. The Islamic Republic of Iran has a state-owned central bank and has led the way in adopting the principles of the Koran as state government policy, including interest-free lending. In September 2007, Iran's President advocated returning to an interest-free system and appointed a new central bank governor who would further those objectives. The governor said that banks should generate income by charging fees for their services rather than making a profit by receiving interest on loans.

That could be a covert factor in the persistent drumbeats for war against Iran, despite a December 2007 National Intelligence Estimate finding that the country was not developing the nuclear weapons that had been the asserted justification for a very aggressive stance against it. We've seen that a global web of debt spun from compound interest is key to maintaining the "full-spectrum dominance" of the private banking monopoly currently controlling international markets. A paper titled "Rebuilding America's Defenses," released in September 2000 by a politically influential neoconservative think tank called the Project for the New American Century, linked America's "national defense" to suppressing economic rivals. The policy goals it urged included "ensuring economic domination of the world, while strangling any potential 'rival' or viable alternative to America's vision of a 'free market' economy." We've seen that alternative models threatening the dominance of the prevailing financial establishment have consistently been targeted for takedown, either by speculative attack, economic sanctions or war. Iran has repeatedly been hit with economic sanctions that could strangle it economically.

How a Truly Interest-free Banking System Might Work

While the threat of a viable interest-free banking system could be a covert factor in the continual war-posturing against Iran, today that threat remains largely hypothetical. Islamic banks typically charge "fees" on loans that are little different from interest. A common arrangement is to finance real estate purchases by buying property and selling it to clients at a higher price, to be paid in installments over time. Skeptical Islamic scholars maintain that these arrangements merely amount to interest-bearing loans by other names. They use terms such as "the usury of deception" and "the jurisprudence of legal tricks."

One problem for banks attempting to follow an interest-free model is that they are normally private institutions that have to compete with other private banks, and they have little incentive to engage in commercial lending if they are taking risks without earning a corresponding profit. In Sweden and Denmark, however, interest-free savings and loan associations have been operating successfully for decades. These banks are cooperatively owned and are not designed to return a profit to their owners. They merely provide a service, facilitating borrowing and lending among their members. Costs are covered by service charges and fees.

Interest-free lending would be particularly feasible if it were done by banks owned by a government with the power to create money, since credit could be extended without the need to make a profit or the risk of bankruptcy from bad loans. As in China, a government that was not saddled with servicing a $9 trillion federal debt could afford to carry a few bad private loans on its books without upsetting the economy. A community or government banking service providing interest-free credit would just be a credit clearing agency, an intermediary that allowed people to "monetize" their own promises to repay. People would become sovereign issuers of their own money, not just collectively but individually, with each person determining for himself how much "money" he wanted to create by drawing it from the online service where credit transactions were recorded.

That is what actually happens today when purchases are made with a credit card. Your signature turns the credit slip into a negotiable instrument, which the merchant accepts because the credit card company stands behind it and will pursue legal remedies if you don't pay. But the bank doesn't actually lend you anything. It just facilitates and guarantees the deal. (See Chapter 29.) You create the "money" yourself; and if you pay your bill in full every month, you are creating money interest-free. Credit could be extended interest-free for longer periods on the same model. To assure that advances of the national credit got repaid, national banks would have the same remedies lenders have now, including foreclosure on real estate and other collateral, garnishment of wages, and the threat of a bad credit rating for defaulters; while borrowers would still have the safety net of filing for bankruptcy if they could not pay. But they would have an easier time meeting their obligations, since their interest-free loans would be far less onerous than the 18 percent credit card charges prevalent today.

Interest charges are incorporated into every stage of producing a product, from pulling raw materials out of the earth to putting the goods on store shelves. These cumulative charges have been estimated to make up about half the cost of everything we buy. That means that if interest charges were eliminated, prices might be slashed in half. Interest-free loans would be particularly appropriate for funding state and local infrastructure projects. (See Chapter 44.) Among other happy results, taxes could be reduced; infrastructure and sustainable energy development might pay for themselves; affordable housing for everyone would be a real possibility; and the inflation resulting from the spiral of ever-increasing debt might be eliminated.
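The "about half" estimate can be made plausible with assumed numbers. The six stages and 12 percent financing margin below are illustrative assumptions, not figures from the text; the point is only that modest per-stage charges compound through a production chain:

```python
# Illustrative only: assume each of six production stages (raw materials
# through retail shelf) adds a 12% financing margin to its selling price.
base_cost = 100.0
margin = 0.12
stages = 6

price = base_cost * (1 + margin) ** stages
interest_share = (price - base_cost) / price
print(f"final price: {price:.2f}; embedded financing share: {interest_share:.0%}")
```

With these assumptions the $100 of underlying cost reaches the shelf at about $197, of which roughly half is cumulative financing charges; different stage counts and margins would of course give different shares.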

On the downside, interest-free loans could create another massive housing bubble if not properly monitored. The current housing bubble resulted when monthly house payments were artificially lowered to the point where nearly anyone could get a mortgage, regardless of assets. This problem could be avoided by reinstating substantial down-payment and income requirements, and by shortening payout periods. A home that formerly cost $3,000 per month would still cost $3,000 per month; the mortgage would just be shorter.
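The claim that the payment stays the same while the mortgage shortens can be checked with the standard annuity formula. The 6 percent rate and 30-year term below are assumptions for illustration, not figures from the text:

```python
# How much principal a $3,000 monthly payment supports at an assumed 6%
# over 30 years, and how fast the same payment retires it interest-free.
payment = 3000.0
r = 0.06 / 12          # assumed monthly mortgage rate
n = 360                # 30-year term

# Present value of the payment stream (standard annuity formula)
principal = payment * (1 - (1 + r) ** -n) / r
months_interest_free = principal / payment

print(f"principal financed: ${principal:,.0f}")
print(f"interest-free payoff: {months_interest_free:.1f} months "
      f"(about {months_interest_free / 12:.1f} years)")
```

Under these assumptions a $3,000 payment carries roughly $500,000 of principal; interest-free, the same payment retires that principal in about fourteen years instead of thirty, which is the sense in which the mortgage "would just be shorter."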

Another hazard of unregulated interest-free lending is that it could produce the sort of speculative carry trade that developed in Japan after it made interest-free or nearly interest-free loans available to all. Investors borrowing at zero or very low interest have used the money to buy bonds paying higher interest, pocketing the difference; and these trades have often been highly leveraged, hugely inflating the money supply and magnifying risk. As the dollar has lost value relative to the yen, investors have had to scramble to repay their yen loans in an increasingly illiquid credit market, forcing them to sell other assets and contributing to systemic market failure. One solution to this problem might be a version of the "real bills" doctrine: interest-free credit would be available only for real things traded in the economy -- no speculation, investing on margin, or shorting. (See Chapter 37.)

What would prudent savers rely on for their retirement years if interest were eliminated from the financial scheme? As in Islamic and Old English systems, money could still be invested for a profit. It would just need to be done as "profit-sharing" -- sharing not only in the profits but in the losses. In a compound-interest arrangement, the lender gets his interest no matter what. In fact, he does better if the borrower fails, since the strapped borrower provides him with a steady income stream at higher rates of interest than otherwise. In today's market, profit-sharing basically means that savers would move their money out of bonds and into stocks. Alternatives for taking the risk out of retirement are explored in Chapter 44.

Chapter 43. Bailout, Buyout, or Corporate Takeover? Beating the Robber Barons at Their Own Game

In the happy ending to our economic fairytale, the drought of debt to a private banking monopoly is destroyed with the water of a freely-flowing public money supply. Among other salubrious results, we the people never have to pay income taxes again. That possibility is not just the fantasy of utopian dreamers but is the conclusion of some respected modern financial analysts. One is Richard Russell, the investment adviser quoted earlier, whose Dow Theory Letter has been in publication for nearly fifty years. In his April 2005 newsletter, Russell observed that the creation of money is a total mystery to probably 99 percent of the U.S. population. Then he proceeded to unravel the mystery in a few sentences:

To simplify, when the US government needs money, it either collects it in taxes or it issues bonds. These bonds are sold to the Fed, and the Fed, in turn, makes book entry deposits. This "debt money" created out of thin air is then made available to the US government. But if the US government can issue Treasury bills, notes and bonds, it can also issue currency, as it did prior to the formation of the Federal Reserve. If the US issued its own money, that money could cover all its expenses, and the income tax wouldn't be needed. So what's the objection to getting rid of the Fed and letting the US government issue its own currency? Easy, it cuts out the bankers and it eliminates the income tax.

In a February 2005 article titled "The Death of Banking and Macro Politics," Hans Schicht reached similar conclusions. He wrote:

If prime ministers and presidents would only be blessed with the most basic knowledge of the perversity of banking, they would not go onto their knees to the Central Banker and ask His Highness for loans.... With a little bit of brains they would expropriate all banking institutions.... Expropriation would bring enough money into the national treasuries for the people not to have to pay taxes for years to come.

"Expropriation," however, means "to deprive of property," and that is not the American way. At least, it isn't in principle. The Robber Barons routinely deprived their competitors of property, but they did it by following accepted business practices: they purchased the property on the open market in a takeover bid. Their sleight of hand was in the funding used for the purchases. They had their own affiliated banks, which could "lend" money into existence with accounting entries.

If the banking cartels can do it, so can the federal government. Commercial bank ownership is held as stock shares, and the shares are listed on public stock exchanges. The government could regain control of the national money supply by simply buying up some prime bank stock at its fair market price. Buying out the entire banking industry would not be necessary, since the depository and credit needs of consumers could be served by a much smaller banking force than is prowling the capital markets right now. The recycling of funds as loans could be left to private banks and those non-bank financial institutions that are already serving a major portion of the loan market. Although buying out the whole industry would not be necessary, it might be the equitable thing to do, since if the government were to take back the power to create money from the banks, bank stock could plummet. Indeed, if commercial banks could no longer make loans with accounting entries, the banks' shareholders would probably vote to be bought out if given the choice.

Bailout, Buyout, or FDIC Receivership

…Insolvent banks are dealt with by the FDIC, which can proceed in one of three ways. It can order a payout, in which the bank is liquidated and ceases to exist. It can arrange for a purchase and assumption, in which another bank buys the failed bank and assumes its liabilities. Or it can take the bridge bank option, in which the FDIC replaces the board of directors and provides the capital to get the bank running again in exchange for an equity stake in it. An "equity stake" means an ownership interest: the bank's stock becomes the property of the government.

Time for an Audit of the Banks and a Tax on Derivatives?

Dean Baker of the Center for Economic and Policy Research in Washington is another advocate of a tax on derivatives. He points out that financial transactions taxes have been successfully implemented in the past and have often raised substantial revenue. Until recently, every industrialized nation imposed taxes on trades in its stock markets; and several still do. Until 1966, the United States placed a tax of 0.1 percent on shares of stock when they were first issued, and a tax of 0.04 percent when they were traded. A tax of 0.003 percent is still imposed on stock trades to finance SEC operations.

Baker notes that the vast majority of stock trades and other financial transactions are done by short-term traders who hold assets for less than a year and often for less than a day. Unlike long-term stock investment, these trades are essentially a form of gambling. He writes, "When an investor buys a share of stock in a company that she has researched and holds it for ten years, this is not gambling. But when a day trader buys a stock at 2:00 P.M. and sells it at 3:00 P.M., this is gambling. Similarly, the huge bets made by hedge funds on small changes in interest rates or currency prices is a form of gambling." When poor and middle-income people gamble, they usually engage in one of the heavily taxed forms such as buying lottery tickets or going to the race track; but wealthier people who gamble in the stock market escape taxation. Baker argues that a tax on derivative trades would only be fair, equalizing the rules of the game:

Insofar as possible, taxes should be shifted away from productive activity and onto unproductive activity. In recognition of this basic economic principle, the government ... already taxes most forms of gambling quite heavily. For example, gambling on horse races is taxed at between 3.0 and 10.0 percent. Casino gambling in the states where it is allowed is taxed at rates between 6.25 and 20.0 percent. State lotteries are taxed at a rate of close to 40 percent. Stock market trading is the only form of gambling that largely escapes taxation. This is doubly inefficient. The government has no reason to favor one form of gambling over others, and it is far better economically to tax unproductive activities than productive ones.

... From an economic standpoint, the nation is certainly no better off if people do their gambling on Wall Street rather than in Atlantic City or Las Vegas. In fact, there are reasons to believe that the nation is better off if people gamble in Las Vegas, since gambling on Wall Street can destabilize the functioning of financial markets. Many economists have argued that speculators cause the price of stocks and other assets to diverge from their fundamental values.

A tax on short-term trades would impose a significant tax on speculators while leaving long-term investors largely unaffected. According to Baker, a tax of as little as 0.25 percent imposed on each purchase or sale of a share of stock, along with a comparable tax on the transfer of other assets such as bonds, options, futures, and foreign currency, could easily have netted the Treasury $120 billion in 2000. By December 2007, according to the Bank for International Settlements, derivatives tallied in at $681 trillion. A tax of 0.25 percent on that sum would have added $1.7 trillion to the government's coffers.
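The revenue figure just cited follows directly from the numbers in the text; a quick arithmetic check (using the BIS notional total the paragraph quotes):

```python
notional = 681e12   # BIS derivatives total, December 2007 (from the text)
tax_rate = 0.0025   # proposed 0.25 percent transaction tax
revenue = notional * tax_rate
print(f"${revenue / 1e12:.2f} trillion")  # → $1.70 trillion
```

Note that this treats the tax as applying once to the full notional sum, as the paragraph does; actual receipts would depend on how often positions turned over and on how trading volume responded to the tax.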

Solving the Derivatives Crisis

A derivatives tax might do more than just raise money for the government. Hoefle maintains that it could actually kill the derivatives business, since even a very small tax leveraged over many trades would make them unprofitable. Killing the derivatives business, in turn, could propel some very big banks into bankruptcy; but the fleas' loss could be the dog's gain. The handful of banks in which 97 percent of U.S. bank-held derivatives are concentrated are the same banks that are engaging in vulture capitalism, bear raids through collusive short selling, and a massive derivatives scheme that allows them to manipulate markets and destroy businesses. A tax on derivatives could expose these corrupt practices and bring both the schemes and the culpable banks under public control.

Chapter 45. Government with Heart: Solving the Problem of Third World Debt

In the nineteenth century, the corporation was given the legal status of a "person" although it was a person without heart, incapable of love and charity. Its sole legal motive was to make money for its stockholders, ignoring such "external" costs as environmental destruction and human oppression. The U.S. government, by contrast, was designed to be a social organism with heart. The Founding Fathers stated as their guiding principles that all men are created equal; that they are endowed with certain inalienable rights, including life, liberty and the pursuit of happiness; and that the function of government is to "provide for the general welfare."

If the major corporate banking entities that are now in control of the nation's money supply were made agencies of the U.S. government, they could incorporate some of these humanitarian standards into their business models; and one important humanitarian step these public banks would be empowered to take would be to forgive unfair and extortionate Third World debt. Most Third World debt today is held by U.S.-based international banks. If those banks were made federal agencies (either by purchasing their stock or by acquiring them in receivership), the U.S. government could declare a "Day of Jubilee" -- a day when oppressive Third World debts were forgiven across the board. The term comes from the Biblical Book of Leviticus, in which Jehovah Himself, evidently recognizing the mathematical impossibility of continually collecting debts at interest compounded annually, declared a day to be held every 49 years, when debts would be forgiven and the dispossessed could return to their homes.

Unlike when Jehovah did it, however, a Day of Jubilee declared by the U.S. government would not be an entirely selfless act. If the United States is going to pay off its international debts with new Greenbacks, it is going to need the goodwill of the world. Forgiving the debts of our neighbors could encourage them to forgive ours. Other countries have no more interest in seeing the international economy collapse than we do; but if they are "spooked" by the market, they could rush to dump their dollars along with everyone else, bringing the whole shaky debt edifice down. Forgiving Third World debt could show our good intentions, quell market jitters, and get everyone on the same page. Our shiny new monetary scheme, rather than appearing to be more sleight of hand, could unveil itself as a millennial model for showering abundance everywhere.

Forgiving Third World debt could have a number of other important benefits, including a reduction in terrorism. In a 2004 book called The Debt Threat: How Debt Is Destroying the Developing World and Threatening Us All, Noreena Hertz notes that "career terrorists" are signing up for that radical employment because it pays a salary when no other jobs are available. Relieving Third World debt would also help protect the global environment, which is being destroyed piece by piece to pay off international lenders; and it could help prevent the spread of diseases that are being bred in impoverished conditions abroad.

Chapter 47. Over the Rainbow: Government without Taxes or Debt

Commentators suggested that the Dow fell by only 500 points because of the behind-the-scenes maneuverings of the Plunge Protection Team, the Counterparty Risk Management Policy Group and the Federal Reserve. But it was all just window-dressing, a dog and pony show to keep investors lulled into complacency, inducing them to keep betting on a stock market nag on its last legs. The same pattern has been repeated since, with assorted manipulations to keep the band playing on; but the iceberg has struck and the economic Titanic is sinking.

As at the end of the Roaring Twenties, we are again looking down the trough of the "business cycle," mortgaged up to the gills and at risk of losing it all. We own nothing that can't be taken away. The housing market could go into a tailspin and so could the stock market. The dollar could collapse and so could our savings. Even social security and pensions could soon be things of the past. Before the economy collapses and our savings and security go with it, we need to reverse the sleight of hand that created the bankers' Ponzi scheme. The Constitutional provision that "Congress shall have the power to coin money" needs to be updated so that it covers the national currency in all its forms, including the 97 percent now created with accounting entries by private commercial banks. That modest change could transform the dollar from a vise for wringing the lifeblood out of a nation of sharecroppers into a bell for ringing in the millennial abundance envisioned by our forefathers. The government could actually eliminate taxes and the federal debt while expanding the services it provides.

The Puzzle Assembled

The pieces of the monetary puzzle have been concealed by layers of deception built up over 400 years, and it has taken some time to unravel them; but the picture has now come clear, and we are ready to recap what we have found. The global debt web has been spun from a string of frauds, deceits and sleights of hand, including:

·       "Fractional reserve" banking. Formalized in 1694 with the charter for the Bank of England, the modern banking system involves credit issued by private bankers that is ostensibly backed by "reserves." At one time, these reserves consisted of gold; but today they are merely government securities (promises to pay). The banking system lends these securities many times over, essentially counterfeiting them.

·       The "gold standard." In the nineteenth century, the government was admonished not to issue paper fiat money on the ground that it would produce dangerous inflation. The bankers insisted that paper money had to be backed by gold. What they failed to disclose was that there was not nearly enough gold in their own vaults to back the privately-issued paper notes laying claim to it. The bankers themselves were dangerously inflating the money supply based on a fictitious "gold standard" that allowed them to issue loans many times over on the same gold reserves, collecting interest each time.

·       The "Federal" Reserve. Established in 1913 to create a national money supply, the Federal Reserve is not federal, and today it keeps nothing in "reserve" except government bonds or I.O.U.s. It is a private banking corporation authorized to print and sell its own Federal Reserve Notes to the government in return for government bonds, putting the taxpayers in perpetual debt for money created privately with accounting entries. Except for coins, which make up only about one one-thousandth of the money supply, the entire U.S. money supply is now created by the private Federal Reserve and private banks, by extending loans to the government and to individuals and businesses.

·       The federal debt and the money supply. The United States went off the gold standard in the 1930s, but the "fractional reserve" system continued, backed by "reserves" of government bonds. The federal debt these securities represent is never paid off but is continually rolled over, forming the basis of the national money supply. As a result of this highly inflationary scheme, by January 2007 the federal debt had mushroomed to $8.679 trillion and was approaching the point at which the interest alone would be more than the public could afford to pay.

·       The federal income tax. Considered unconstitutional for over a century, the federal income tax was ostensibly legalized in 1913 by the Sixteenth Amendment to the Constitution. It was instituted primarily to secure a reliable source of money to pay the interest due to the bankers on the government's securities, and that continues to be its principal use today.

·       The Federal Deposit Insurance Corporation and the International Monetary Fund. A principal function of the Federal Reserve was to bail out banks that got over-extended in the fractional-reserve shell game, using money created in "open market" operations by the Fed. When the Federal Reserve failed in that backup function, the FDIC and then the IMF were instituted, ensuring that mega-banks considered "too big to fail" would get bailed out no matter what unwarranted risks they took.

·       The "free market." The theory that businesses in America prosper or fail due to "free market forces" is a myth. While smaller corporations and individuals who miscalculate their risks may be left to their fate in the market, mega-banks and corporations considered too big to fail are protected by a form of federal welfare available only to the rich and powerful. Other distortions in free market forces result from the covert manipulations of a variety of powerful entities. Virtually every market is now manipulated, whether by federal mandate or by institutional speculators, hedge funds, and large multinational banks colluding on trades.

·       The Plunge Protection Team and the Counterparty Risk Management Policy Group (CRMPG). Federal manipulation is done by the Working Group on Financial Markets, also known as the Plunge Protection Team (PPT). The PPT is authorized to use U.S. Treasury funds to rig markets in order to "maintain investor confidence," keeping up the appearance that all is well. Manipulation is also effected by a private fraternity of big New York banks and investment houses known as the CRMPG, which was set up to bail its members out of financial difficulty by colluding to influence markets, again with the blessings of the government and to the detriment of the small investors on the other side of these orchestrated trades.

·       The "floating" exchange rate. Manipulation and collusion also occur in international currency markets. Rampant currency speculation was unleashed in 1971, when the United States defaulted on its promise to redeem its dollars in gold internationally. National currencies were left to "float" against each other, trading as if they were commodities rather than receipts for fixed units of value. The result was to remove the yardstick for measuring value, leaving currencies vulnerable to attack by international speculators prowling in these dangerous commercial waters.

·       The short sale. To bring down competitor currencies, speculators use a device called the "short sale" - the sale of currency the speculator does not own but has theoretically "borrowed" just for purposes of sale. Like "fractional reserve" lending, the short sale is actually a form of counterfeiting. When speculators sell a currency short in massive quantities, its value is artificially forced down, forcing down the value of goods traded in it.

·       "Globalization" and "free trade." Before a currency can be brought down by speculative assault, the country must be induced to open its economy to "free trade" and to make its currency freely convertible into other currencies. The currency can then be attacked and devalued, allowing national assets to be picked up at fire sale prices and forcing the country into bankruptcy. The bankrupt country must then borrow from international banks and the IMF, which impose as a condition of debt relief that the national government may not issue its own money. If the government tries to protect its resources or its banks by nationalizing them for the benefit of its own citizens, it is branded "communist," "socialist" or "terrorist" and is replaced by one that is friendlier to "free enterprise." Locals who fight back are termed "terrorists" or "insurgents."

·       Inflation myths. The runaway inflation suffered by Third World countries has been blamed on irresponsible governments running the money printing presses, when in fact these disasters have usually been caused by speculative attacks on the national currency. Devaluing the currency forces prices to shoot up overnight. "Creeping inflation" like that seen in the United States today is also blamed on government’s irresponsibly printing money, when it is actually caused by private banks inflating the money supply with debt. Banks advance new money as loans that must be repaid with interest, but the banks don't create the interest necessary to service the loans. New loans must continually be taken out to obtain the money to pay the interest, forcing prices up in an attempt to cover this new cost, spiraling the economy into perpetual price inflation.

·       The "business cycle." As long as banks keep making low-interest loans, the money supply expands and business booms; but when the credit bubble gets too large, the central bank goes into action to deflate it. Interest rates are raised, loans are reduced, and the money supply shrinks, forcing debtors into foreclosure, delivering their homes to the banks. This is called the "business cycle," as if it were a natural condition like the weather. In fact, it is a natural characteristic only of a monetary scheme in which money comes into existence as a debt to private banks for "reserves" of something lent many times over.

·       The home mortgage boondoggle. A major portion of the money created by banks today has originated with the "monetization" of home mortgages. The borrower thinks he is borrowing pre-existing funds, when the bank is just turning his promise to repay into an "asset" secured by real property. By the time the mortgage is paid off, the borrower has usually paid the bank more in interest than was owed on the original loan; and if he defaults, the bank winds up with the house, although the money advanced to purchase it was created out of thin air.

·       The housing bubble. The Fed pushed interest rates to very low levels after the stock market collapsed in 2000, significantly shrinking the money supply. "Easy" credit pumped the money supply back up and saved the market investments of the Fed's member banks, but it also led to a housing bubble that will again send the economy to the trough of the "business cycle" as it collapses.

·       The Adjustable Rate Mortgage or ARM. The housing bubble was fanned into a blaze through a series of high-risk changes in mortgage instruments, including variable rate loans that allowed nearly anyone who would take the bait to qualify for a home loan. By 2005, about half of all U.S. mortgages were at "adjustable" interest rates. Purchasers were lulled by "teaser" rates into believing they could afford mortgages that were liable to propel them into inextricable debt if not into bankruptcy. Payments could increase by 50 percent after 6 years just by their terms, and could increase by 100 percent if interest rates went up by a mere 2 percent in 6 years.

·       "Securitization" of debt and the credit crisis. The banks moved risky loans off their books by selling them to unwary investors as "mortgage-backed securities," allowing the banks to meet capital requirements to make yet more loans. But when the investors discovered that the securities were infected with "toxic" subprime debt they quit buying them, leaving the banks scrambling for funds.

·       The secret insolvency of the banks. The Wall Street banks are themselves heavily invested in these mortgage-backed securities, as well as in very risky investments known as "derivatives," which are basically side bets that some asset will go up or down. Outstanding derivatives are now counted in the hundreds of trillions of dollars, many times the money supply of the world. Banks have been led into these dangerous waters because traditional commercial banking has proven to be an unprofitable venture. While banks have the power to create money as loans, they also have the obligation to balance their books; and when borrowers default, the losses must be made up from the banks' profits. Faced with a wave of bad debts and lost business, banks have kept afloat by branching out into the economically destructive derivatives business, by "churning" loans, and by engaging in highly leveraged market trading. Today their books may look like Enron's, with a veneer of "creative accounting" concealing bankruptcy.

·       "Vulture capitalism" and the derivatives cancer. At one time, banks served the community by providing loans to developing businesses; but today this essential credit function is being replaced by a form of "vulture capitalism," in which bank investment departments and affiliated hedge funds are buying out shareholders and bleeding businesses of their profits, using loans of "phantom money" created on a computer screen. Banks are also underwriting speculative derivative bets, in which money that should be going into economic productivity is merely gambled on money making money in the casino of the markets.

·       Moral hazard. Both the housing bubble and the derivatives bubble are showing clear signs of imploding; and when they do, banks considered too big to fail will expect to be bailed out from the consequences of their risky ventures just as they have been in the past….
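The arithmetic behind the "home mortgage boondoggle" bullet above is easy to verify with the standard fixed-rate amortization formula. The figures here are illustrative assumptions (a 30-year loan of $300,000 at 6 percent), not numbers from the text; under them, total interest paid over the life of the loan does exceed the original principal:

```python
def monthly_payment(principal, annual_rate, years):
    """Fixed monthly payment on a fully amortizing loan."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

principal = 300_000
pmt = monthly_payment(principal, 0.06, 30)
total_interest = pmt * 360 - principal
# roughly $1,800 per month, with lifetime interest greater than the principal
print(round(pmt, 2), round(total_interest, 2))
```

At higher historical rates the effect is stronger still; at 10 percent, lifetime interest on the same hypothetical loan would exceed twice the principal.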

Waking Up in Kansas

It is at this point in our story, if it is to have a happy ending, that we the people must snap ourselves awake, stand up, and say "Enough!" The bankers' extremity is our opportunity. We can be kept indebted and enslaved only if we continue to underwrite bank profligacy. As Mike Whitney wrote in March 2007, "The Federal Reserve will keep greasing the printing presses and diddling the interest rates until someone takes away the punch bowl and the party comes to an end." It is up to us, an awakened and informed populace, to take away the punch bowl. Private commercial banking as we know it is obsolete, and the vulture capitalist investment banking that has come to dominate the banking business is a parasite on productivity, serving its own interests at the expense of the public's. Rather than propping up a bankrupt banking system, Congress could and should put insolvent banks into receivership, claim them as public assets, and operate them as agencies serving the depository and credit needs of the people.

Besides the imploding banking system, a second tower is now poised to fall. The U.S. federal debt is approaching the point at which just the interest on it will be more than the taxpayers can afford to pay; and just when foreign investors are most needed to support this debt, China and other creditors are threatening to demand not only the interest but the principal back on their hefty loans. The Ponzi scheme has reached its mathematical limits, forcing another paradigm shift if the economy is to survive. Will the collapse of the debt-based house of cards be the end of the world as we know it? Or will it be the way through the looking glass, a clarion call for change? We can step out of the tornado into debtors' prison, or we can step into the Technicolor cornucopia of a money system based on the ingenuity and productivity that are the true wealth of a nation and its people.

Home at Last

In the happy ending to our modern monetary fairytale, Congress takes back the power to create money in all its forms, including the money created with accounting entries by private banks. Highlights of this satisfying ending include:

·       Elimination of personal income taxes, allowing workers to keep their wages, putting spending money in people's pockets, stimulating economic growth.

·       Elimination of a mounting federal debt that must otherwise burden and bind future generations.

·       The availability of funds for a whole range of government services that have always been needed but could not be afforded under the "fractional reserve" system, including improved education, environmental cleanup and preservation, universal health care, restoration of infrastructure, independent medical research, and development of alternative energy sources.

·       A social security system that is sufficiently funded to support retirees, replacing private pensions that keep workers chained to unfulfilling jobs and keep employers unable to compete in international markets.

·       Elimination of the depressions of the "business cycle" that have resulted when interest rates and reserve requirements have been manipulated by the Fed to rein in out-of-control debt bubbles.

·       The availability of loans at interest rates that are not subject to unpredictable manipulation by a private central bank but remain modest and fixed, something borrowers can rely on in making their business decisions and in calculating their risks.

·       Elimination of the aggressive currency devaluations and economic warfare necessary to sustain a money supply built on debt. Exchange rates become stable, the U.S. dollar becomes self-sustaining, and the United States and other countries become self-reliant, trading freely with their neighbors without being dependent on foreign creditors or having to dominate and control other countries and markets.

This happy ending is well within the realm of possibility, but it won't happen unless we the people get our boots on and start marching. We have become conditioned by our television sets to expect some hero politician to save the day, but the hero never appears, because both sides dominating the debate are controlled by the banking/industrial cartel. Nothing will happen until we wake up, get organized, and form a plan. What sort of plan? The platform of a revamped Populist/Greenback/American Nationalist/Whig Party might include:

1. A bill to update the Constitutional provision that "Congress shall have the power to coin money" so that it reads, "Congress shall have the power to create the national currency in all its forms, including not only coins and paper dollars but the nation's credit issued as commercial loans."

2. A call for an independent audit of the Federal Reserve and the giant banks that own it, including an investigation of:

·       The creation of money through "open market operations,"

·       The market manipulations of the Plunge Protection Team and the CRMPG,

·       The massive derivatives positions of a small handful of mega-banks and their use to rig markets, and

·       The use of "creative accounting" to mask bank insolvency.

Any banks found to be insolvent would be delivered into FDIC receivership and to the disposal of Congress.

3. Repeal of the Sixteenth Amendment to the Constitution, construed as authorizing a federal income tax.

4. Either repeal of the Federal Reserve Act as in violation of the Constitution, or amendment of the Act to make the Federal Reserve a truly federal agency, administered by the U.S. Treasury.

5. Public acquisition of a network of banks to serve as local bank branches of the newly-federalized banking system, either by FDIC takeover of insolvent banks or by the purchase of viable banks with newly-issued U.S. currency. Besides serving depository banking functions, these national banks would be authorized to service the credit needs of the public by advancing the "full faith and credit of the United States" as loans. Any interest charged on advances of the national credit would be returned to the Treasury, to be used in place of taxes.

6. Elimination of money creation by private "fractional reserve" lending. Private lending would be limited either to recycling existing funds or to lending new funds borrowed from the newly-federalized Federal Reserve.

7. Authorization for the Treasury to buy back and retire all of its outstanding federal debt, using newly-issued U.S. Notes or Federal Reserve Notes. This could be done gradually over a period of years as the securities came due. In most cases it could be done online, without physical paper transfers.

8. Advances of interest-free credit to state and local governments for rebuilding infrastructure and other public projects. Congress might also consider authorizing interest-free credit to private parties for properly monitored purposes involving the production of real goods and services (no speculation or shorting).

9. Authorization for Congress, acting through the Treasury, to issue new currency annually to be spent on programs that promoted the general welfare. To prevent inflation, the new currency could be spent only on programs that contributed new goods and services to the economy, keeping supply in balance with demand; and issues of new currency would be capped by some ceiling -- the unused productive capacity of the national work force, or the difference between the Gross Domestic Product and the nation's purchasing power (wages and spendable income). Computer models might be run first to determine how rapidly the new money could safely be infused into the economy.
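The two ceilings proposed in this item lend themselves to a simple calculation: new issuance would be capped at the lesser of the unused productive capacity of the work force and the gap between Gross Domestic Product and the nation's purchasing power. The sketch below is purely illustrative; the function name and all figures are invented placeholders, not drawn from the proposal itself:

```python
# Hypothetical illustration of the issuance ceiling described above.
# All numbers are invented placeholders, not real economic data.

def issuance_ceiling(gdp: float, purchasing_power: float,
                     unused_capacity: float) -> float:
    """Cap annual new-currency issuance at the smaller of the two
    proposed ceilings: the unused productive capacity of the work
    force, or the gap between GDP and purchasing power."""
    gap = max(gdp - purchasing_power, 0.0)  # no issuance if demand already meets supply
    return min(unused_capacity, gap)

# Example with made-up figures (trillions of dollars):
cap = issuance_ceiling(gdp=14.0, purchasing_power=12.5, unused_capacity=2.0)
print(cap)  # 1.5
```

On these assumed figures, the GDP–purchasing-power gap (1.5) is the binding ceiling, since it is smaller than the assumed unused capacity (2.0).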

10. Authorization for Congress to fund programs that would return money to the Treasury in place of taxes, including the development of cheap effective energy alternatives (wind, solar, ocean wave, etc.) that could be sold to the public for a fee, and affordable public housing that returned rents to the government.

11. Regulation and control of the exploding derivatives crisis, either by imposing a modest 0.25 percent tax on all derivative trades in order to track and regulate them, or by imposing an outright ban on derivatives trading. If the handful of banks responsible for 97 percent of all derivative trades were found after audit to be insolvent, they could be put into receivership and their derivative trades could be unwound by the FDIC as receiver.

12. Initiation of a new round of international agreements modeled on the Bretton Woods Accords, addressing the following monetary issues, among others:

·       The pegging of national currency exchange rates to the value either of an agreed-upon standardized price index or an agreed-upon "basket" of commodities;

·       International regulation of, or elimination of, speculation in derivatives, short sales, and other forms of trading that are used to manipulate markets;

·       Interest-free loans of a global currency issued Greenback-style by a truly democratic international congress, on the model of the Special Drawing Rights of the IMF; and

·       The elimination of burdensome and unfair international debts. This could be done by simply writing the debts off the books of the issuing banks, reversing the sleight of hand by which the loan money was created in the first place.

13. Other domestic reforms that might be addressed include publicly-financed elections, verifiable paper trails for all voting machines, media reform to break up monopoly ownership, lobby reform, sustainable energy development, basic universal health coverage, reinstating farm parity pricing, and reinstating and strengthening the securities laws.

Like the earlier Greenback and Populist Parties, this grassroots political party might not win any major elections; but it could raise awareness, and when the deluge hit, it could provide an ark. We need to spark a revolution in the popular understanding of money and banking while free speech is still available on the Internet, in independent media and in books. New ideas and alternatives need to be communicated and put into action before the door to our debtors' prison slams shut. The place to begin is in the neighborhood, with brainstorming sessions in living rooms in the Populist tradition.

The Populists were the people, and what they sought was a people's currency. Reviving the "American system" of government-issued money would not represent a radical departure from the American tradition. It would represent a radical return. Like Dorothy, we the people would finally have come home.


Appendix P. Excerpts from John Robb’s Brave New War (John Wiley & Sons, 2007)

This book is about rapid, chaotic, and unexpected events, such as those that we witnessed over that fateful forty-eight hours [referring to 9/11].  I call events like these black swans – events so different from what we know, so unpredictable and hidden by uncertainty, that they are impossible to predict with accuracy.  While we’re busy working to protect ourselves against the previous attack, we can expect more black swans, because they are being manufactured by our foes at an increasing frequency.

The reasons for this are twofold.  The first is that we now live in an extremely complex global system.  It is too complex for any single state, or group of states, to keep under control.  As a result, most of the systems that we have built over the last several centuries to dampen the excesses of instability – enabled by markets, travel, communications, and other global systems – are now ineffectual.

We have now entered the age of the faceless, agile enemy.  From London to Madrid to Nigeria to Russia, stateless terrorist groups have emerged to score blow after blow against us.  Driven by cultural fragmentation, schooled in the most sophisticated technologies, and fueled by transnational crime, these groups are forcing corporations and individuals to develop new ways of defending themselves.

The end result of this struggle will be a new, more resilient approach to national security, one built not around the state but around private citizens and companies.

From a security perspective, the most disturbing aspect of 9/11 wasn’t the horrible destruction, but that the men who attacked us on that day didn’t even factor the opposition of the US military into their planning.  Despite tens of trillions of dollars spent on defense over the last decades, this military force proved ineffectual as a deterrent at the point when we needed it most.

Worse yet, nothing has changed since then.  The US military, in budget after budget since 9/11, has continued to plan, build, and fund forces dedicated to fighting a great power war – with an increasing emphasis on China and to a lesser extent on Iran.  Even the guerrilla war in Iraq hasn’t forced any substantive changes to our defense structure.  This isn’t due to a nefarious plot at the highest levels of government.  It is due to the fundamental inability of the nation-state to conceptualize a role that makes sense in fighting and deterring an emerging threat.

The rise of superempowered groups is part of a larger historical trend.  This trend is the process of putting ever-more-powerful technological tools and the knowledge of how to use them into an ever-increasing number of hands.  Economically, this is fantastic news.  This transfer of technological leverage means faster productivity growth and improvements in incomes.  Within the context of war, however, this is dire news, because this trend dictates that technology will leverage the ability of individuals and small groups to wage war with equal alacrity.

Within this larger context, the conflict that we are currently engaged in is merely a waypoint on this trend line.  The threshold necessary for small groups to conduct warfare has finally been breached, and we are only starting to feel its effects.  Over time, perhaps in as little as twenty years, and as the leverage provided by technology increases, this threshold will finally reach its culmination – with the ability of one man to declare war on the world and win.

In aggregate, states have extended their reach to control the economy, personal rights, borders, resources, security, laws, infrastructure, education and health of their citizens.  States as a group, despite numerous internecine wars, have been in strict control of the world’s destiny either through direct control or through colonization for at least two centuries.  In this process of ascent, they have crushed all opposition, from empires to tribal confederations.

That control is coming to an end.

The culprits are globalization and the Internet.  This new environment is sweeping aside state power in ways that no army could.  States are losing control of their borders, economies, finances, people and communications.  They are so intertwined that no independent action can be taken without serious repercussions on multiple levels.  To further complicate matters, a new competitive force is emerging in this vacuum of state power.  Nonstate actors in the form of terrorists, crime syndicates, gangs, and networked tribes are stepping into the breach to lay claim to areas once in the sole control of states.  It is this conflict, the war between states and nonstates, that is the basis for the first epochal or long war of this century.

The ability of nonstate groups to fight states and win isn’t new.  States have been fighting nonstate guerrillas, led by figures from George Washington to Mao Tse-tung, for hundreds of years.  These wars were over the control of the state, however.  Successful guerrillas fought governments that were too corrupt, distant, or illegitimate to function.  Successful guerrillas used efficiency, discipline, and greater legitimacy to take control of the apparatus of governance; in other words, they were conducting a coup d’état.  In each case, the guerrilla movement served the function of evolutionary renewal within the state’s life cycle.  Failed states were picked off by new, more efficient (though not necessarily ideal) replacements.

The threat posed by al-Qaeda and other emerging groups is different.  It is not at war with us over the replacement of the state but over who controls the power a state exercises.  Al-Qaeda doesn’t want to govern Iraq or Saudi Arabia.  It wants to collapse them and exercise power through feudal relationships in the vacuum created by their failure.

In most cases, capitalist democracies are still stuck within the confines of borders, bureaucracies, and nationalism.  Furthermore, the services that they currently offer their citizens are in broad decline.  The environment has changed, but the states have not.  This has set the stage for the development of nonstate groups that represent the needs of minorities (or at a minimum members of the group) that aren’t being served by the states to which they belong.  Unfortunately for those of us who have done well under the developed world’s rules, these groups are now developing a means of warfare that will allow them to not only survive but also thrive at the expense of states.

To summarize, guerrilla warfare as it has been practiced throughout the last 250 years has been used as either a means of proxy warfare between states or a means of replacing the state with a more organized and efficient alternative (part of the evolution of the state – as evidenced by the guerrilla wars of national liberation in the United States, Russia, China, and others).  Ultimately, by default guerrilla warfare became the dominant form of warfare in the world – nothing else since the advent of intercontinental ballistic missiles in the early 1950s is acceptable as a form of global conflict.

…the essence of many global guerrilla wars won’t be to replace or break away from the state; rather, it will be to hollow it out.

…it is easy to engage in hindsight bias.  This is the tendency to believe that the event was predictable based on knowledge gained after the event occurred.  In effect, people unknowingly substitute current knowledge of outcomes into gaps of knowledge that were present when building earlier expectations of potential events.

Taleb’s analysis reflects the tone of the postmortems that were done after 9/11.  There was an incredible focus on the details at the expense of the overall picture.  Talk of overlooked memos prevented a deeper level of analysis.  There was also a great deal of focus on organizational failures at the Federal Bureau of Investigation (FBI), the Central Intelligence Agency (CIA), and other departments.  The problem with all of this, despite how good it feels to point fingers (and it does feel good despite the fact that nobody lost his or her job because of 9/11), is that engaging in hindsight bias doesn’t prepare us for the next black swan, the next blip on the horizon that threatens catastrophe.

A second approach that is along the wrong path seeks to roll back the external threat – the sources of terrorism and extremism – through the use of preemptive war followed by aggressive nation-building.  Interestingly, this approach is currently (in bastardized form) the grand strategy of the United States (the Bush doctrine).

…States are no longer singular nation-states, but rather (meta?) organizations in competition within a globe-spanning marketplace.  This shift is already in the process of changing the character of the state’s constitutional order (its source of legitimacy), even though foreign and security policy are still caught in the nation-state phase of thinking.


Appendix Q. Excerpts from James Fallows’ article, “Declaring Victory,” in the September 2006 issue of Atlantic Monthly

The larger and more important surprise was the implicit optimism about the U.S. situation that came through in these accounts – not on Iraq but on the fight against al-Qaeda and the numerous imitators it has spawned. For the past five years the United States has assumed itself to be locked in “asymmetric warfare,” with the advantages on the other side. Any of the tens of millions of foreigners entering the country each year could, in theory, be an enemy operative – to say nothing of the millions of potential recruits already here. Any of the dozens of ports, the scores of natural-gas plants and nuclear facilities, the hundreds of important bridges and tunnels, or the thousands of shopping malls, office towers, or sporting facilities could be the next target of attack. It is impossible to protect them all, and even trying could ruin America’s social fabric and public finances. The worst part of the situation is helplessness, as America’s officials and its public wait for an attack they know they cannot prevent.

Viewing the world from al-Qaeda’s perspective, though, reveals the underappreciated advantage on America’s side. The struggle does remain asymmetric, but it may have evolved in a way that gives target countries, especially the United States, more leverage and control than we have assumed. Yes, there could be another attack tomorrow, and most authorities assume that some attempts to blow up trains, bridges, buildings, or airplanes in America will eventually succeed. No modern nation is immune to politically inspired violence, and even the best-executed antiterrorism strategy will not be airtight.

But the overall prospect looks better than many Americans believe, and better than nearly all political rhetoric asserts. The essence of the change is this: because of al-Qaeda’s own mistakes, and because of the things the United States and its allies have done right, al-Qaeda’s ability to inflict direct damage in America or on Americans has been sharply reduced. Its successor groups in Europe, the Middle East, and elsewhere will continue to pose dangers. But its hopes for fundamentally harming the United States now rest less on what it can do itself than on what it can trick, tempt, or goad us into doing. Its destiny is no longer in its own hands.

“Does al-Qaeda still constitute an ‘existential’ threat?” asks David Kilcullen, who has written several influential papers on the need for a new strategy against Islamic insurgents. Kilcullen, who as an Australian army officer commanded counter-insurgency units in East Timor, recently served as an adviser in the Pentagon and is now a senior adviser on counterterrorism at the State Department. He was referring to the argument about whether the terrorism of the twenty-first century endangers the very existence of the United States and its allies, as the Soviet Union’s nuclear weapons did throughout the Cold War (and as the remnants of that arsenal still might).

“I think it does, but not for the obvious reasons,” Kilcullen told me. He said the most useful analogy was the menace posed by European anarchists in the nineteenth century. “If you add up everyone they personally killed, it came to maybe 2,000 people, which is not an existential threat. But one of their number assassinated Archduke Franz Ferdinand and his wife. The act itself took the lives of two people. The unthinking response of European governments in effect started World War I. So because of the reaction they provoked, they were able to kill millions of people and destroy a civilization.

“It is not the people al-Qaeda might kill that is the threat,” he concluded. “Our reaction is what can cause the damage. It’s al-Qaeda plus our response that creates the existential danger.”

Since 9/11, this equation has worked in al-Qaeda’s favor. That can be reversed.

Over the past five years Americans have heard about “asymmetric war,” the “long war,” and “fourth-generation war.” Here is an important but underdiscussed difference between all of these and “regular war.”

In its past military encounters, the United States was mainly concerned about the damage an enemy could do directly – the Soviet Union with nuclear missiles, Axis-era Germany or Japan with shock troops. In the modern brand of terrorist warfare, what an enemy can do directly is limited. The most dangerous thing it can do is to provoke you into hurting yourself.

This is what David Kilcullen meant in saying that the response to terrorism was potentially far more destructive than the deed itself. And it is why most people I spoke with said that three kinds of American reaction – the war in Iraq, the economic consequences of willy-nilly spending on security, and the erosion of America’s moral authority – were responsible for such strength as al-Qaeda now maintained.

“You only have to look at the Iraq War to see how much damage you can do to yourself by your response,” Kilcullen told me. He is another of those who supported the war and consider it important to fight toward some kind of victory, but who recognize the ways in which this conflict has helped al-Qaeda. So far the war in Iraq has advanced the jihadist cause because it generates a steady supply of Islamic victims, or martyrs; because it seems to prove Osama bin Laden’s contention that America lusts to occupy Islam’s sacred sites, abuse Muslim people, and steal Muslim resources; and because it raises the tantalizing possibility that humble Muslim insurgents, with cheap, primitive weapons, can once more hobble and ultimately destroy a superpower, as they believe they did to the Soviet Union in Afghanistan twenty years ago. The United States also played a large role in thwarting the Soviets, but that doesn’t matter. For mythic purposes, mujahideen brought down one anti-Islamic army and can bring down another.

Higher-priced oil has hurt America, but what has hurt more is the economic reaction bin Laden didn’t fully foresee. This is the systematic drag on public and private resources created by the undifferentiated need to be “secure.”

The effect is most obvious on the public level. “The economy as a whole took six months or so to recover from the effects of 9/11,” Richard Clarke told me. “The federal budget never recovered. The federal budget is in a permanent mess, to a large degree because of 9/11.” At the start of 2001, the federal budget was $125 billion in surplus. Now it is $300 billion in deficit.

A total of five people died from anthrax spores sent through the mail shortly after 9/11. In Devils and Duct Tape, his forthcoming book, John Mueller points out that the U.S. Postal Service will eventually spend about $5 billion on protective screening equipment and other measures in response to the anthrax threat, or about $1 billion per fatality. Each new security guard, each extra checkpoint or biometric measure, is both a direct cost and an indirect drag on economic flexibility.

If bin Laden hadn’t fully anticipated this effect, he certainly recognized it after it occurred. In his statement just before the 2004 election, he quoted the finding of the Royal Institute of International Affairs (!) to the effect that the total cost, direct and indirect, to America of the 9/11 attacks was at least $500 billion. Bin Laden gleefully pointed out that the attacks had cost al-Qaeda about $500,000, for a million-to-one payoff ratio. America’s deficit spending for Iraq and homeland security was, he said, “evidence of the success of the bleed-until-bankruptcy plan, with Allah’s permission.”

But the deeper and more discouraging prospect – that the United States is doomed to spend decades cowering defensively – need not come true. How can the United States regain the initiative against terrorists, as opposed to living in a permanent crouch? By recognizing the point that I heard from so many military strategists: that terrorists, through their own efforts, can damage but not destroy us. Their real destructive power, again, lies in what they can provoke us to do. While the United States can never completely control what violent groups intend and sometimes achieve, it can determine its own response. That we have this power should come as good and important news, because it switches the strategic advantage to our side.

So far, the United States has been as predictable in its responses as al-Qaeda could have dreamed.

In the interview, al-Faqih said that for nearly a decade, bin Laden and al-Zawahiri had followed a powerful grand strategy for confronting the United States. Their approach boiled down to “superpower baiting” (as John Robb, of the Global Guerrillas blog, put it in an article about the interview). The most predictable thing about Americans, in this view, was that they would rise to the bait of a challenge or provocation. “Zawahiri impressed upon bin Laden the importance of understanding the American mentality,” al-Faqih said. He said he believed that al-Zawahiri had at some point told bin Laden something like this:

The American mentality is a cowboy mentality – if you confront them … they will react in an extreme manner. In other words, America with all its resources and establishments will shrink into a cowboy when irritated successfully. They will then elevate you, and this will satisfy the Muslim longing for a leader who can successfully challenge the West.

The United States is immeasurably stronger than al-Qaeda, but against jujitsu forms of attack its strength has been its disadvantage. The predictability of the U.S. response has allowed opponents to turn our bulk and momentum against us. Al-Qaeda can do more harm to the United States than to, say, Italy because the self-damaging potential of an uncontrolled American reaction is so vast.

How can the United States escape this trap? Very simply: by declaring that the “global war on terror” is over, and that we have won. “The wartime approach made sense for a while,” Dearlove says. “But as time passes and the situation changes, so must the strategy.”

As a general principle, a standing state of war can be justified for several reasons. It might be the only way to concentrate the nation’s resources where they are needed. It might explain why people are being inconvenienced or asked to sacrifice. It might symbolize that the entire nation’s effort is directed toward one goal.

But none of those applies to modern America in its effort to defend itself against terrorist attack. The federal budget reveals no discipline at all about resources: the spending for antiterrorism activities has gone up, but so has the spending for nearly everything else. There is no expectation that Americans in general will share the inconveniences and sacrifice of the 1 percent of the population in uniform (going through airport screening lines does not count). Occasional speeches about the transcendent importance of the “long war” can’t conceal the many other goals that day by day take political precedence.

And while a standing state of war no longer offers any advantages for the United States, it creates several problems. It cheapens the concept of war, making the word a synonym for effort or goal. It predisposes us toward overreactions, of the kind that have already proved so harmful. The detentions at Guantánamo Bay were justified as a wartime emergency. But unlike Abraham Lincoln’s declaration of martial law, they have no natural end point.

A state of war encourages a state of fear. “The War on Terror does not reduce public anxieties by thwarting terrorists poised to strike,” writes Ian Lustick, of the University of Pennsylvania, in his forthcoming book, Trapped in the War on Terror. “Rather, in myriad ways, conducting the antiterror effort as a ‘war’ fuels those anxieties.” John Mueller writes in his book that because “the creation of insecurity, fear, anxiety, hysteria, and overreaction is central for terrorists,” they can be defeated simply by a refusal to overreact. This approach is harder in time of war.

Perhaps worst of all, an open-ended war is an open-ended invitation to defeat. Sometime there will be more bombings, shootings, poisonings, and other disruptions in the United States. They will happen in the future because they have happened in the past (Oklahoma City; the Unabomber; the Tylenol poisonings; the Washington, D.C.-area snipers; the still-unsolved anthrax mailings; the countless shootings at schools; and so on). These previous episodes were not caused by Islamic extremists; future ones may well be. In all cases they represent a failure of the government to protect its people. But if they occur while the war is still on, they are enemy “victories,” not misfortunes of the sort that great nations suffer. They are also powerful provocations to another round of hasty reactions.

War implies emergency, and the upshot of most of what I heard was that the United States needs to shift its operations to a long-term, nonemergency basis. “De-escalation of the rhetoric is the first step,” John Robb told me. “It is hard for insurgents to handle de-escalation.” War encourages a simple classification of the world into ally or enemy. This polarization gives dispersed terrorist groups a unity they might not have on their own. Last year, in a widely circulated paper for the Journal of Strategic Studies, David Kilcullen argued that Islamic extremists from around the world yearn to constitute themselves as a global jihad. Therefore, he said, Western countries should do everything possible to treat terrorist groups individually, rather than “lumping together all terrorism, all rogue or failed states, and all strategic competitors who might potentially oppose U.S. objectives.” The friend-or-foe categorization of war makes lumping together more likely.

The United States can declare victory by saying that what is controllable has been controlled: Al-Qaeda Central has been broken up. Then the country can move to its real work. It will happen on three levels: domestic protection, worldwide harassment and pursuit of al-Qaeda, and an all-fronts diplomatic campaign.


Appendix R. Excerpts from Thomas Hammes’ The Sling and the Stone (Zenith Press, 2006)

[From the cover…]  Not only is 4GW the only kind of war America has ever lost, we have done so three times: Vietnam, Lebanon, and Somalia.  It has also defeated the Soviet Union (Afghanistan, Chechnya) and the French (Vietnam, Algeria).  Arguably, 4GW has been the most successful form of war for the last 50 years.  First defined by Mao, 4GW has evolved as each practitioner learned from his predecessors or co-combatants and refined its techniques.  [End of cover note.]

In a later essay, “Through a Glass Darkly,” van Creveld expands on this idea to point out how the last fifty years have led to a fundamental erosion of the state’s monopoly on the use of force:

The roughly three-hundred-year period which was associated primarily with the type of political organization known as the state – first in Europe, and then, with its expansion, in other parts of the globe as well – seems to be coming to an end.  If the last fifty years or so provide any guide, future wars will be overwhelmingly of the type known, however inaccurately, as “low intensity.”

Van Creveld clearly sees warfare as evolving with the political, social and economic structures of its time.

Politically, there have been extensive changes since the end of World War II.  The most obvious is the exponential increase in the number of players on the international stage.  Prior to the war, the nation-state was the only significant player on the international scene.  Immediately after the war, both the political and economic spheres began changing rapidly, and each added numerous and varied players to the political stage.

On the subnational level, we have numerous nations that lack states.  Many of these groups fall either within a single state or straddle various states as a result of the artificial boundaries that evolved from the colonial era.  Although these are not powerful organizations, they can play notable roles on the international scene.  One only has to consider the impact of the Kurds, the Serbs, the Croats, the Palestinians, or the Irish Republican Army on recent events to see that, although relatively minor players, these subnational organizations can and do have impact.

Although on the surface the markets seem to be purely economic, their impact is that of a powerful political player.  This player can dictate trade policies, influence elections, determine interest rates, place limits on national social policy, decide acceptable banking practices, and drive many other activities of nations.

The cumulative effect of this proliferation of players on the international scene is a distinct reduction in the power and freedom of action of nations.

This unrest, combined with the artificial nature of the boundaries of many states, has resulted in the severe breakdown of order within many of these postcolonial “nations.”  Often it has led to the effective, if not the official, dissolution of many of these creations of the colonial powers.  The result has been a reversion to much earlier social organizations – tribal, clan, or gang – and a major change in whom we might fight and how they view a fight.  In the last hundred years or so, Western nations have become accustomed to fighting disciplined, uniformed soldiers of another nation.  Now we are faced with fighting warrior or clan societies.  The difference between a soldier and a warrior is essential.

Soldiers are disciplined members of a specific profession.  As such, they are under the control of a political entity and do not have specific financial or social benefits from continuing to fight.  Although there is increased prestige and opportunity for promotion during war, most professional soldiers will at least pay lip service to a preference for peace.

In contrast, a warrior society thrives on and exists for war.  Often, the young warrior has everything to lose (except his life) if he stops fighting.  Consider the young clansman in Somalia.  As a member of a fighting clan, he has prestige and income.  They combine to give him access to money, food, property and women.  If he puts his weapons down, he loses that prestige and the income – and with them everything else.  Although the risk of death from fighting is always present, it is actually less than the risk of death from starvation if he stops fighting.

Unfortunately, most of these warrior societies’ mechanisms for keeping violence to a manageable level are based on traditional systems.  For instance, in Somalia, clan elders would meet and determine fines imposed on an individual or family who killed another during a camel raid.  However, the advent of powerful new weapons has escalated the killing beyond the control of the old social systems.  The young warriors have learned new techniques to employ the new weapons.  Like all human organizations, they have adapted.

This creates a major problem for Western soldiers facing such a warrior society.  These societies have learned that pushing women and children to the front, even in close combat, will often neutralize the superior firepower of Western soldiers.  Sometimes the women and children are armed, sometimes not.  Further, the presence of women and children at the front shows that the entire society has mobilized against a perceived threat to its livelihood, territory, or customs.  Even when the confrontation does not include weapons, warrior societies have learned that Western soldiers have trouble dealing with large numbers of women and children – and have added them as a tactical tool when doing so creates an advantage.

In sum, there has been an enormous social change from what Western forces faced at the beginning of World War II.  The societies of rich nations have fragmented and are beginning to align by interests rather than by nationality.  Many poor nations have failed completely, with their populations breaking up into the tribes or clans that preceded the nation-state imposed by the colonial powers.  Unfortunately, the tribal organizations were never designed to deal with the challenges inherent in a failed nation.  Thus, many of the poor face little hope.  In short, social changes since World War II have been extensive and wide-ranging.

It is difficult for a despot to effectively use 4GW as a strategic approach.  Although many of the tactics and techniques of 4GW can be effective even for a dictator, the fundamental strength of 4GW lies in the idea or message that is the heart of the concept.

Rather than dealing with the complex political, economic, and social aspects of the conflicts we are currently fighting, they focus on technological solutions to problems at the tactical level of war.  If there is one thing we should have learned from watching the Germans execute 3GW, it is that strategic victory is not the sum of incredible tactical victories.  Both Germany and Japan failed to understand the strategic context of the war – and despite exceptional tactical- and operational-level victories, failed abysmally strategically.  They could win battles but not wars.  In many ways, the United States mirrors this misunderstanding today.  We continue to focus on technological solutions at the tactical and operational levels without a serious discussion of the strategic imperatives or the nature of the war we are fighting.

At the strategic level, the combination of our perceived technological superiority and our bureaucratic organization sets us up for a major failure against a more agile, intellectually prepared enemy.

In the case of the United States, we have huge sunk costs in conventional forces.  Not only do we have the resources funding them, we have the culture, promotion system, schools, staffs, procurement systems, and so on.  In short, we have an entire culture and industry built around second- and third-generation warfare.  As a result, we have convinced ourselves that applying our technology to these older generations of war gives us a unique and virtually unassailable lead, through our ability to conduct precision attacks that surgically remove an enemy’s ability to fight.  Our official documents state that we have the sensors, processing systems, secure communications, and precision weapons that will allow us to dominate the battlespace via precision strike.

In fact, small, moderately well funded organizations have the same ability to perform all the steps necessary to conduct precision strikes.  Their sensors are human intelligence contacts, open-source reporting, Internet mining, and commercially available imagery.  They process information through the most subtle, sophisticated, and capable system in existence: the human mind.  They have secure, worldwide communications through the use of the Internet and basic tradecraft.   Finally, they have precision weapons, in the form of humans willing to ride the ordnance to the target.

No nation or group will give up its right to exist as the result of 4GW techniques.  The techniques can only weaken the enemy’s will and reduce his resources to the point that a conventional military campaign can defeat him entirely.

Therefore, although major constituencies in the United States have an interest in perfecting high-technology war, none of our most dangerous enemies sees it as a viable way to compete against the United States.  We simply have too great a lead, and the weapons cost too much.  We have to give up our cyberwar fantasy and understand that our most dangerous enemies are those who are preparing to fight a 4GW conflict.

To date, 4GW organizations have fared well against previous generations, but the converse is not true.  If we do not transition to 4GW, we will not do well against those threats either.

To achieve success, we must be prepared to fight across the spectrum of political, economic, social, and military spheres.  We not only have to win battles, we have to fill the vacuum behind them – starting with rapidly establishing security.  This means not only police and security forces but also the court system and prison system to support them.  We have to establish banking, currency, customs, public health organizations, public sanitation, air traffic control, business regulation, a system of taxation, and every other process needed for running a modern society.  And all of these must be done in conjunction with the people of that nation.  We know that solutions imposed from outside rarely remain in effect once the occupying power leaves.

It is interesting that DOD is reluctant to incur risk by changing personnel policies yet never seems to fear risk when fielding advanced technology weapons.  This is more puzzling given that history proves advanced technology does not ensure an organization can function in a new generation of war.  All the technological progress in the world cannot drive an organization to move to the next generation of war – only its people can do that.  The real risk lies in not changing our personnel systems.

…There is a huge constituency not [willing] to change.  Not only is this constituency huge, it is wide-ranging and influential.  It includes entire organizations within DOD, major segments of the defense industry, and entire regions of the country that rely heavily on defense spending.  Each represents major political constituencies and works hard through its congressional representatives to protect its purpose for existence or livelihood.

In short, our nation has a huge investment in weapons, training, production capability, intellectual property, and so on, focused on defeating a 2GW or 3GW conventional enemy.  Obviously, many people have a great deal to lose to an honest study that points out that those generations of war have passed and that the fourth generation needs different skills and tools.

Despite the conflicting views within the United States, any thinking enemy has realized that only 4GW has a chance of success against our overwhelming conventional power.  They know that reliance on the tactics, techniques, and equipment of earlier generations of war will lead to their inevitable defeat.  In fact, they have watched such forces suffer battlefield defeat in living color on TV – not once, but twice.

Yet, for good reason, some of our potential enemies still retain large 2GW forces.  They require those forces to maintain internal order, both by providing a politically reliable armed force to suppress their own citizens and by providing employment to many young men while focusing their frustrations against an external target rather than against the regime.  In addition, the same bureaucratic inertia that affects the United States also afflicts our nation-state enemies.

Further reinforcing the shift away from conventional war is our exceptionally poor record in unconventional conflicts, such as Vietnam, Lebanon, Somalia, Iraq, and Afghanistan.  Although we must be prepared for the rise of a new power, we should be looking for a 4GW enemy, not one that uses older generations of war.

This first type of 4GW enemy, the insurgent, will be unpredictable and capable of causing great destruction and death – even within the United States.  It will require great resources, effort, and time to protect ourselves from the worst of their attacks.  It is essentially impossible to provide complete protection against this type of enemy.

A further challenge is the accelerating rate of change.  Although change is constant, the rate of change is not.  It has been accelerating almost exponentially.  We can see that acceleration in the rate of development and adoption of new thoughts, processes, and technologies in all areas of human endeavor.  It took decades for electric power to penetrate the far reaches of America but less than twenty years for the personal computer and less than ten years for the Internet to spread to the same regions.  Instant messaging took less than five years.

In warfare, it took centuries to change from medieval warfare to 1GW but only decades to reach 2GW.  Each new generation of war has developed and been disseminated in less time than the previous generation.  We have to assume this trend will continue and prepare accordingly.  We have to assume fifth-generation warfare is out there.

It is incredibly important that the unpredictable nature of war be the starting point for any discussion concerning the future of war.

…Fourth-generation war has been around for more than seventy years; no doubt a fifth generation is evolving even as we attempt to deal with its predecessor.  We may not recognize it as it evolves around us.  Or we may look at several alternative futures and see each as fifth-generation war.


Appendix S. Excerpts from Martin van Creveld’s The Changing Face of War (Ballantine Books, 2006)

In Thailand, in Indonesia, in the Philippines, in a dozen other countries, regular armed forces are engaged in so-called counterinsurgency operations.  In terms of sheer military power, all are far stronger than their enemies.  None, however, seems to be making any considerable headway, and most will probably end up in defeat.

What is known, though, is that attempts by post-1945 armed forces to suppress guerrillas and terrorists have constituted a long, almost unbroken record of failure – a record that, as events in Iraq testify, continues to the present day.

The very first failure, which took place in 1947 when the British withdrew from the Indian subcontinent, may perhaps be excused by the fact that the British hardly tried to make a military stand.  That, however, was not true in the piece of land known as Palestine.  A relatively small country whose borders are not too difficult to seal, in 1946 Palestine was inhabited by an estimated 1.3 million people, fewer than half of whom were Jewish.  The number of armed terrorists probably never exceeded a few hundred at any one time; the heaviest weapons to which they had access were homemade bombs and submachine guns.

To oppose the terrorists, the British had no fewer than one hundred thousand troops in country although, as was also to be the case with subsequent counterinsurgents, the number of those actually available for the conduct of active operations was much smaller.  Finally, these were the years immediately following the greatest war of all time.  As a result, heavy weapons such as armored cars, tanks, aircraft, and warships (useful to seal off the Mediterranean shore and prevent illegal immigrants from coming in) were available in huge numbers.

The three stages Mao describes fit the war he himself waged in China as well as a few other so-called struggles of national liberation, particularly the one in Vietnam that, by 1975, had indeed developed into a conventional war.  They do not, however, fit the great majority of post-1945 conflicts.

[The three stages of guerrilla war that Mao defined are: (1) small-scale violence (terrorist attacks); (2) assumption of control of parts of the country, which are used as bases; and (3) conventional warfare.  See Hammes for discussion of these, or Mao Tse Tung’s On Guerrilla Warfare.  These stages are useful in describing Mao’s conquest of China, and do not constitute a good general description of guerrilla warfare.]

Perhaps the person who put it most incisively was Henry Kissinger.  The forces of order, he once said, lose as long as they do not win; insurgents win as long as they do not lose.  To a very large extent, all this applies whether the perpetrators are white or black, traditional or modern, capitalist or socialist, or whatever.  It also applies regardless of whether they are God-fearing Americans or atheistic communists.  It even applies to arrogant, bloodthirsty, racist Nazis.

Starting at least as early as the 1950s, the literature on counterinsurgency is so enormous that, had it been put aboard the Titanic, it would have sunk the ship without any help from the iceberg.  However, the astonishing fact is that almost all of it has been written by the losers.  It is as if we should have Saddam Hussein, sitting in his prison cell, instructing the next US commander on how to fight the next Gulf War.  Worse still, very often the authors’ real objective is not to enlighten but to provide excuses by shifting the blame onto everything and everybody except themselves; hence the frequent emphasis on lack of “political direction,” “coordination,” and “cooperation.”  Even when that is not the case, many of the books and articles, being written in modern social science jargon, are simply incomprehensible.  What, for example, is one to make of advice such as “seek synergy minus one interventions” or “decrease negative entropy / increase entropy”?

Among the few counterinsurgency campaigns that have been successful, perhaps the most interesting is the British experience in Northern Ireland….

First, unlike President Bush in 2001, the British did not declare war, which would have removed a whole series of legal constraints and put the entire conflict on a new footing.  Instead, from beginning to end the problem was treated as a criminal one, meaning that responsibility for it rested with the fifty or so police forces (which were backed up by the army where needed) and the court system….

Looking back, his [Hafez Assad, with reference to his destruction of the Syrian city of Hama] strategy may be condensed into five simple rules.  All of them are as old as history; and all of them can already be found in Machiavelli’s The Prince (1512-13).  The following paragraphs are merely an attempt to expand on and systematize what the Italian thinker had to say.

First, there are situations in which it is necessary to resort to cruelty.  If, in such situations, you are not prepared to apply it, then you are a traitor to the people who put you where you are in order that you may safeguard their liberty, their property, and their lives; at most, you are fit to be prime minister of Disneyland.  Should you reach the point at which you have no choice left but to resort to cruelty, then the blow should be sudden.  The more like a thunderbolt out of a clear sky it comes, the greater the effect: both because your victim will be unable to prepare, and in consideration of the all-important psychological point of view.  Never threaten, and never announce what you plan to do in advance.  Instead, talk softly, feign weakness, and use secrecy and deceit to hide your preparations as much as you can for as long as you can.

Second, once you have made up your mind to strike, you cannot strike hard enough. Tenderhearted people believe it is better to kill too few than too many. However applicable this may be to the justice system of a law-abiding, liberal democracy, if the alternative is civil war and, perhaps, the disintegration of the community, it is misguided.  Better to kill too many people than too few. Strike so hard as to ensure you don't have to strike twice; otherwise, by showing that there is life after death, the fact that you have to do so will weaken the impact of your original blow. Besides, you must consider the effect that numerous repetitions will have on your own troops. Loyal, well trained, and hard-bitten they may well be. Still, if they are made to commit one atrocity after another (and very likely resort to alcohol or drugs in order to muster the necessary will), it will only be a matter of time before they lose their edge and, by so doing, turn into a danger to themselves and to you.

Third, considering that everything is relative, timing is absolutely vital. Everything else being equal, the earlier the blow is launched, the better – just as a blister must be lanced before it can fester and lead to gangrene. Suppose, for the sake of argument, that Rabin had understood what was going on when the First Intifada broke out in Gaza in December 1987. In that case, instead of flying to the United States to haggle about the price of some F-16 fighter-bombers Israel was about to buy, he could have ordered his troops to kill (say) five hundred Palestinians more or less on the spot and also blow up some object of high symbolic value. Assuming the troops had obeyed him, the outcome could have been, if not peace, at any rate quiet. Instead, a prolonged struggle ensued in which immense physical damage was done and several thousand Palestinian lives were lost. To the extent that it could have been avoided, Rabin might even have done his enemies a favor. To quote a Hebrew proverb: He who shows mercy to the cruel will end up being cruel to those who deserve mercy.

Of course, getting in one's blow as early as necessary is easier said than done. Inertia governs counterinsurgency operations as much as, if not more than, it does other human affairs; having watched the pot simmer for a long time, people are reluctant to admit it will end up exploding. Hence most blows are probably launched too late, not too early, in the struggle. By the time they are finally launched, people will have gotten so used to the killing that they will have little effect. Conversely, the longer you wait, the more barbaric the means you will have to use.

Many, perhaps most, of insurgents' activities take place undercover. Others are not concentrated in a single place but are widely scattered inside a country's borders and, often enough, outside them as well. Hence it is an illusion to think that you can ever ‘get’ all or even most of your opponents at once – something that not even Saddam Hussein, using gas against the Kurds, succeeded in doing.  Even if you do, chances are that, like the mythological hydra, the organization will reconstitute itself….

In other words, the true objective of your strike is not to kill people per se. Rather, it is to display your ruthlessness and your willingness to go to any lengths to achieve your objective – a war for hearts and minds, only by different means. Clausewitz once wrote that war is a moral and physical contest by means of the latter. The same is even more true of the massacre that accompanies a war: If you do it right, it may even prevent a war that has not yet broken out. It is therefore essential that careful consideration should be given to the means….

Fourth, once you have started, do what you have to do openly.  The media you control, you can control.  The rest are your enemy.  While you are busy carrying out your plans, they will gather like vultures around carrion; once you are done, they will summon their righteous wrath and denounce you as a bloodthirsty monster. Still, that does not mean you cannot harness them to your chariot. At any cost, prevent the media from messing with your operations while they are going on, both to prevent them from getting in the way and to increase the effect by giving people's imagination free play. Once you are done, though, do not try to hide your operations or explain them away. In fact, you should do exactly the opposite. Let there be no apologies, no kvetching about collateral damage caused by mistake, innocent lives regrettably lost, "excesses" that will be investigated and brought to trial, and similar signs of weakness. Instead, make sure that as many people as possible can see, hear, smell, and touch the results; if they can also taste them, such as by inhaling the smoke from a burning city, then so much the better. Invite journalists to admire the headless corpses rolling in the streets, film them, and write about them to their hearts' contents. Do, however, make sure they do not talk to any of the survivors so as not to arouse sympathy.

Last but not least, do not command the strike yourself. Instead, have a Rif'at [Assad’s brother, whom Assad placed in charge of the Hama massacre] do it for you – if at all possible, without ever giving him written orders that he may later produce to implicate you. This method has the advantage that, if your designated commander succeeds, you can take the credit. Presenting him to the world, you will make sure he looks as grim as possible, perhaps by sticking him into a blackened uniform and putting a dirty bandage on his forehead. Presenting yourself to the world, you will offer no regrets and shed no tears over the victims. Instead, you will explain why it absolutely had to be done and make sure everybody understands you are ready to do it again at a moment's notice. But what if, for one reason or another, your designated henchman fails in his mission, and resistance, instead of being broken, increases? In that case, you can always disown him and try another course such as negotiation.

…from 1945 on, almost all attempts to deal with insurgencies have ended in failure.  In 2006, the United States spent more than four hundred billion dollars on its armed forces, of which one hundred billion went to Iraq, much of it to fight the insurgency there.  Yet the most important terrorists remained at large.

Does all this imply that we must resign ourselves to a world where insurgencies will typically gain their objectives? The answer is, by no means. The first, and absolutely indispensable, thing to do is to throw overboard 99 percent of the literature on counterinsurgency, counterguerrilla, counterterrorism, and the like. Since most of it has been written by the losing side, it is of little value.

Next, we should focus on the relatively small number of cases where counterinsurgency operations actually succeeded. Even disregarding small, isolated outbreaks such as the ones in Germany and Italy in the 1970s, there are such cases – campaigns in which a counterinsurgent force did succeed in coping with its enemies, bringing them to heel, and imposing something like peace. The methods used have been outlined in chapter 6. However, since counterinsurgency, and not major war, is the most important military problem facing humanity in the present and the foreseeable future, I shall take the liberty to spell them out once again.

By definition, guerrillas and terrorists are weak. By definition, their opponents are much stronger. Contrary to the accepted wisdom, and barring small movements with no popular support such as the Baader-Meinhof Gang, most guerrillas and terrorists won their struggles precisely because they were weak. It was their weakness that enabled them to hide; even more important, it was their weakness that permitted them to do what they wanted to do and what had to be done. Most, remaining weak, won their struggles long before they reached Mao's third stage of open warfare. Of those that were defeated, some suffered that fate because, emerging from hiding at too early a stage, they exposed themselves to their opponents; one very good example of this is provided by the Greek Civil War. All this proves, if proof were needed, that the core of the difficulty is neither military nor political, but moral.

In principle, two methods suggest themselves. The first depends on excellent intelligence. Terrorists do not identify themselves by wearing uniforms, but rather use the natural and artificial environment to operate undercover. When everything is said and done, it is intelligence, obtained by personnel who are as familiar with the environment as the terrorists themselves, that proves decisive. Intelligence, though, is not enough. It must be backed up by the solid professionalism and iron discipline that alone make discrimination and self-restraint possible. This, in turn, will accomplish two things. First, it will prevent many more people from joining the insurgency. Second, by postponing, or perhaps even forestalling, the day when the counterinsurgents wake up, look into the mirror, and reflect on what they have become, it will also help them retain their fighting edge.

The other method will have to be used in case good intelligence is not available and discrimination is therefore impossible and, in case things reach the point where they threaten to run completely out of control. The first rule is to make your preparations in secret or, if that is not feasible, to use guile and deceit to disguise your plans. The second is to get your timing right; other things being equal, the sooner you act, the fewer people you must kill. The third is to strike as hard as possible within the shortest possible time; better to strike too hard than not hard enough. The fourth is to explain why your actions were absolutely necessary without, however, providing any apology for them. The fifth is to operate in such a way that, in case your blow fails to deliver the results you expect and need, you will still have some other cards up your sleeve.

Each method, in its own way, demands tremendous courage and nerve – the former, if anything, even more than the latter. Clearly such nerve and such courage are not commodities every leader, every army, and every people possess. Without them, they tend to move from one extreme to another; "a sharp shift from killing to kindness," as The Washington Post, referring to US troops in Iraq, put it. Now they use firepower to slaughter their enemies en masse, now they embrace them. Now they demolish, now they rebuild. Now they kill innocent people, now they pay compensation (an Iraqi life is worth twenty-five hundred dollars). Now they use torture to elicit information, now they accuse the torturer of acting without orders and put him on trial; since punishments are seldom very harsh, however, the only effect is to add yet another layer of doubt and cynicism. Very often different military, intelligence, and aid organizations are in charge of the different activities. Lack of coordination ensues; the left hand does not know what the right is doing. Each time the policy shifts, the population, instead of feeling either terrified or friendly, becomes more and more puzzled at what may come next. Each time it shifts, the terrorists, interpreting the change as a proof of weakness, take heart. Over the last sixty years or so, the results have spoken for themselves.

This, of course, will not do. The attacks on the World Trade Center and the Pentagon, as well as other coups mounted by al-Qaeda and its like, can leave us in no doubt: terrorism is spreading into the developed world. Precisely because so many of its proponents are able to operate inside that world, it represents a far greater threat than do any number of third-rate dictators – including, let it be added, those who have acquired, or are about to acquire, nuclear weapons. Either the developed world, with the United States at its head, shakes off its lethargy, realizes the nature of the problem (which is not the same as studying it to death), and learns to deal with terrorism, or as sure as night follows day, terrorism will deal with it. The choice, as always, is ours.


Appendix T. Excerpts from Rupert Smith’s The Utility of Force (Vintage Books / Random House, 2005, 2007)

Nevertheless, war as cognitively known to most non-combatants, war as a massive deciding event in a dispute in international affairs: such war no longer exists.

War amongst the people is both a graphic description of modern warlike situations, and also a conceptual framework: it reflects the hard fact that there is no secluded battlefield upon which armies engage, nor are there necessarily armies, definitely not on all sides.  To be clear: this is not asymmetric warfare, a phrase I dislike, invented to explain a situation in which conventional states were threatened by unconventional powers but in which conventional military power in some formulation would be capable of both deterring the threat and responding to it.  War amongst the people is different: it is the reality in which the people in the streets and houses and fields – all the people, anywhere – are the battlefield.  Military engagements can take place anywhere: in the presence of civilians, against civilians, in defense of civilians.  Civilians are the targets, objectives to be won, as much as an opposing force.  Moreover, the label “asymmetric warfare” is itself a classic example of disinterest in the change of paradigms.  The practice of war, indeed its “art,” is to achieve an asymmetry over the opponent.  Labeling wars as asymmetric is to me something of a euphemism to avoid acknowledging that my opponent is not playing to my strengths and I am not winning.  In which case perhaps the model of war, rather than its name, is no longer relevant: the paradigm has changed.

In all these cases [the 1991 Gulf war, Chechnya, Kosovo] military force may have achieved a local military success, but frequently this success failed to produce its political promise: there was no decisive victory.  In other words, throughout these fifteen years statesmen, politicians, diplomats, admirals, generals and air marshals have had difficulty both in applying military force to advantage and in explaining their intentions and actions.

Military force when employed has only two immediate effects: it kills people and destroys things.  Whether or not this death and destruction serve to achieve the overarching or political purpose the force was intended to achieve depends on the choice of targets or objectives, all within the broader context of the operation.  That is the true measure of its utility.

Finally, we come to the level that links the tactical to the strategic: the theatre, or operational, level.  In our modern circumstances I think that theatre is a better description, and in the main I shall use it from now on – largely because of the widespread use of the term “operational” for a variety of activities in the military and civilian worlds.

…in our modern circumstances most fights do not go to the strategic level: war amongst the people is mostly a tactical event, with occasional forays into the theatre level – yet we persist in thinking of them as wars which will deliver decisive victories and solutions.

Military force does not have an absolute utility, other than its basic purposes of killing and destroying.

War amongst the people is not a better paradigm than interstate industrial war, it is simply different…

…we are living in a world of confrontations and conflicts rather than one of war and peace; one in which the clear categories of security and defense – the basic purposes for which force is used – have merged.

…As such, it is no longer practical for the politicians and the diplomats to expect the military to solve the problem by force, nor is it practical for the military to plan and execute a purely military campaign, or in many cases take tactical action, without placing it within the political context and planning accordingly throughout the operation as the situation evolves.

…the opponent will not present a formed, coherent structure to be attacked.

This reality also reflects upon another problem, which is that at present the tendency is to analyze situations in terms of industrial war, and when the circumstances do not fit to declare a case of asymmetry or asymmetric war.  As I noted at the start of this book, I have never cared for this description, since I think that the essence of the practice of war is to achieve an asymmetric advantage over one’s opponent; an advantage in any terms, not just technological.  If your opponent has found a way to negate your industrial and technological advantage, and for whatever reason you are unable or unwilling to change your own parameters so as to regain the advantage, then you must fight on the battlefield that he has set and on his terms.  And on the whole, it is this outcome that we are watching in many hotspots around the world, such as Afghanistan, the Israeli-occupied territories, Lebanon, and Iraq.

Finally, there is the tendency to assume that a situation put to the military is automatically one of defense, since security is dealt with by other arms.  However, since the events of 11 September 2001, it has become clear that the worlds of security and defense have become closely intertwined – to a point where it is no longer possible to simply divide activities between military and other services, such as the police, for example, neither within a state nor when it deploys force outside its borders.

…I have argued that the strategic object cannot now be achieved through the singular use of massive military force alone; in most cases military force can only achieve tactical results, and to have more than passing value these must be stitched into a greater plan.

The new approach to analysis must therefore incorporate a clear understanding of the desired outcome and the utility of force in achieving it.

In addition, one must always revert to the basic point made throughout Part Three of this book: that the people are not the enemy.  The enemy is amongst the people, and the purpose of any use of military force and other power is to differentiate between the enemy and the people, and to win the latter over to you….

Nonetheless, war as cognitively known to most non-combatants, war as battle in a field between men and machinery, war as a massive deciding event in a dispute in international affairs, industrial war – such war no longer exists.  We are now engaged, constantly and in many permutations, in war amongst the people.  We must adapt our approach and organize our institutions to this overwhelming reality if we are to triumph in the confrontations and conflicts that we face.


Appendix U. Excerpts from the US Army / Marine Corps Counterinsurgency Field Manual (University of Chicago Press, 2007)

[This field manual may be downloaded from www.fas.org/irp/doddir/army .  LtC Nagl’s Introduction may be downloaded from http://www.press.uchicago.edu/Misc/Chicago/841519foreword.html .]

[From the Foreword to the University of Chicago Press Edition, by Lieutenant Colonel John A. Nagl]

Doctrine is “the concise expression of how Army forces contribute to unified action in campaigns, major operations, battles, and engagements…Army doctrine provides a common language and a common understanding of how Army forces conduct operations.”  (from Army Field Manual 3-0, Operations).

…All survived, and a draft version of this Field Manual in your hands was produced in just two months.

Population security is the first requirement of success in counterinsurgency, but it is not sufficient. Economic development, good governance, and the provision of essential services, all occurring within a matrix of effective information operations, must all improve simultaneously and steadily over a long period of time if America’s determined insurgent enemies are to be defeated. All elements of the United States government – and those of her allies in this Long War that has been well described as a “Global Counterinsurgency” campaign – must be integrated into the effort to build stable and secure societies that can secure their own borders and do not provide safe haven for terrorists. Recognizing this fact – a recognition spurred by the development of the Counterinsurgency Field Manual – the Department of State hosted an interagency counterinsurgency conference in Washington, D.C., in September 2006. That conference in turn built a consensus behind the need for an interagency counterinsurgency manual. It promises to result in significant changes to the Department of State, the U.S. Agency for International Development, and the other agencies of the U.S. government that have such an important role to play in stabilizing troubled countries around the globe.

[End of Nagl excerpt.]

[From the Introduction to the University of Chicago Press Edition, by Sarah Sewall]

…Today, counterinsurgency and counterterrorism operations are often conflated in official US statements, and COIN [counterinsurgency] has become an increasingly common conceptual framework for the global struggle against terrorism.  But the missions may involve different actors – from specialized military forces to intelligence agencies – with conflicting operational approaches.

Counterinsurgents seek to expand their efforts along the rights continuum, beyond physical security toward economic, social, civil, and political rights (though doing so is complicated, as we shall later see).  Achieving a more holistic form of human security is important for overall mission success.

There are some advantages in having the military essentially do everything in COIN.  It would strengthen unity of command, always the military’s preferred practice.  The manual wistfully notes that having a single person in charge of military and civilian decisions in COIN may not be practical.

…COIN requires significant, effective, and civilian-led efforts to strengthen economies, local political and administrative institutions, and social infrastructure and services for sustained periods of time.

This is the third radical notion: that civilian actors and agencies would become centrally engaged in the field alongside combat forces, and that risks and costs of counterinsurgency would be spread across the US government.

Not everyone accepts the manual’s underlying premise – the legitimacy of counterinsurgency, or even of war itself.  For the pacifist, even “perfect” doctrine enables immoral purpose.  But most of us, however reluctantly, accept war as necessary.  This is consistent with the lengthy tradition of Western moral reasoning about war, embodied in the concept of the “Just War.”  Rather than eliminate armed conflict, Just War theory aims to bound it.  The ethical framework applies both to decisions to wage war (jus ad bellum) and to the conduct of war (jus in bello), although it offers separate criteria to assess each aspect of war – choice and combat – independently.

To critique the manual on its own merits as military doctrine, one must first enter the jus in bello frame.  Some critics cannot take the full measure of this step.  They may reject the very notion that revolutionary change should be suppressed, a larger argument this essay will not address.  They may warn against counterinsurgency as a genre of war because they believe it inexorably descends into depravity.  Such logic is appealing, but only rewards insurgents’ intolerable behavior.  If the decision to go to war is “just,” enemy misconduct cannot make it morally impermissible to fight it.

Within the jus in bello frame, there are two principal critiques of the new COIN doctrine.  One is that the manual is a naïve and even dangerous approach to fighting today’s enemies.  Its restraint and political correctness threaten to emasculate American military power.  From a competing perspective, the manual smells like a suspect marketing campaign for an inherently inhumane concept of war.  Both critiques reflect a basic truth: counterinsurgency is difficult to fight well and win.  Accordingly, careful consideration of these objections is worthwhile.

…During Vietnam, the US spoke of winning hearts and minds even as it carpet bombed rural areas and rained napalm on village streets.

In truth, nothing prevents the field manual’s prescriptions from being ignored or even used to mask conduct that is counter to its precepts.

A prescription for US forces, then, would be to unleash US military power without regard to its broader consequences.  In this view, morality is not useful or even relevant.

US unwillingness to govern other nations is, in this account, a fatal national flaw.  The field manual stresses the importance of effectively employing nonmilitary power.

…Many contemporary insurgent movements are, by Western standards, conservative or regressive – seeking to restore the social structures and practices threatened by the modern state (or its failure).  Others are based on ethnic or sectarian claims that reject equality of human rights, pluralism, or notions of nationality that are synonymous with state boundaries.  Under these circumstances, the COIN dictum of enhancing host nation legitimacy may require severely compromising Western conceptions of justice.

…Because counterinsurgency is predominantly political, military doctrine should flow from a broader framework.

During the 1990s, the US was bedeviled by an analogous problem.  The Clinton administration wanted to build capacity for multilateral peace operations.  At the same time, the military didn’t want the job, and much of Congress opposed “foreign policy as social work.”  As a result, the US failed to develop critical nation-building capabilities that could have proved crucial in Iraq.

Lack of clarity about when and why the United States will conduct counterinsurgency operations undermines the likelihood that the US will ever do it well.  Alas, military doctrine cannot solve this problem.  Doctrine focuses on how, while national policy focuses on what.  We don’t want military leaders telling the nation which wars it will fight, nor do we want political leaders dictating details of how the military will fight the nation’s wars.

As a leading power in a fragmenting international order, the United States’ strategic challenge is stabilization.  It must do more than simply buttress a government in order to legitimize a state.  It must buttress multiple failing state structures to legitimize the interstate system.  As this requires helping governments control internal threats, it can support efforts to defeat terrorism.  Conversely, antiterrorism efforts can support the stabilization and governance aspects of a counterinsurgency campaign.  But success may also require creating sub- or supra-state authority to secure “ungoverned spaces” around the globe.

[End of Sarah Sewall excerpt.]

The primary audience for this manual is leaders and planners at the battalion level and above.

All insurgencies are different; however, broad historical trends underlie the factors motivating insurgents.  Most insurgencies follow a similar course of development.  The tactics used to successfully defeat them are likewise similar in most cases.  Similarly, history shows that some tactics that are usually successful against conventional forces may fail against insurgents.

The military forces that successfully defeat insurgencies are usually those able to overcome the institutional inclination to wage conventional war against insurgents.  They learn how to practice COIN and apply that knowledge.  This publication can help to compress the learning curve.  It is a tool for planners, trainers and field commanders.

This learning cycle should repeat continuously as US counterinsurgents seek to learn faster than the insurgent enemy.  The side that learns faster and adapts more rapidly wins.

Taxing a mass base usually yields low returns.  In contrast, kidnapping, extortion, bank robbery and drug trafficking – four favorite insurgent activities – are very lucrative.

Throughout history, many insurgencies have degenerated into criminality. This occurred as the primary movements disintegrated and the remaining elements were cast adrift. Such disintegration is desirable; it replaces a dangerous, ideologically inspired body of disaffiliated individuals with a less dangerous but more diverse body, normally of very uneven character. The first is a security threat, the second a law-and-order concern. This should not be interpreted, of course, as denigrating the armed capacity of a law-and-order threat. Successful counterinsurgents are prepared to address this disintegration. They also recognize that the ideal approach eliminates both the insurgency and any criminal threats its elimination produces.

Twenty counterinsurgents per 1,000 residents is often considered the minimum troop density required for effective COIN operations; however, as with any fixed ratio, such calculations remain very dependent upon the situation.
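[Editorial aside, not part of the manual: the troop-density rule of thumb is simple arithmetic, as the following sketch illustrates.  The function name and default are illustrative only; the manual gives just the 20-per-1,000 figure and the caveat that it is situation-dependent.]

```python
def min_coin_force(population, ratio_per_1000=20):
    """Rule-of-thumb minimum counterinsurgent force size.

    ratio_per_1000: counterinsurgents per 1,000 residents.  The default of 20
    is the oft-cited minimum from the field manual; it is a planning heuristic,
    not a fixed law, and remains very dependent upon the situation.
    """
    return population * ratio_per_1000 // 1000

# For an area of 5,000,000 residents, the rule of thumb suggests
# a minimum force of 100,000 counterinsurgents.
print(min_coin_force(5_000_000))  # → 100000
```

[The point of the arithmetic is its scale: securing even a mid-sized country by this heuristic requires forces in the hundreds of thousands, which is why the manual stresses building host-nation security forces rather than relying on foreign troops alone.]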

Effective analysis of an insurgency requires identifying its strategic, operational, and tactical objectives.  The strategic objective is the insurgents’ desired end state. Operational objectives are those that insurgents pursue to destroy government legitimacy and progressively establish their desired end state.  Tactical objectives are the immediate aims of insurgent acts. Objectives can be psychological or physical.  One example of a psychological objective is discouraging support for the government by assassinating local officials. An example of a physical objective is the disruption of government services by damaging or seizing a key facility. These tactical acts are often linked to higher purposes; in fact, tactical actions by both insurgents and counterinsurgents frequently have strategic effects.

The primary objective of any COIN operation is to foster development of effective governance by a legitimate government. Counterinsurgents achieve this objective by the balanced application of both military and nonmilitary means. All governments rule through a combination of consent and coercion.  Governments described as “legitimate” rule primarily with the consent of the governed; those described as “illegitimate” tend to rely mainly or entirely on coercion. Citizens of the latter obey the state for fear of the consequences of doing otherwise, rather than because they voluntarily accept its rule. A government that derives its powers from the governed tends to be accepted by its citizens as legitimate. It still uses coercion – for example, against criminals – but most of its citizens voluntarily accept its governance.

Six possible indicators of legitimacy that can be used to analyze threats to stability include the following:

·       The ability to provide security for the populace (including protection from internal and external threats).

·       Selection of leaders at a frequency and in a manner considered just and fair by a substantial majority of the populace.

·       A high level of popular participation in or support for political processes.

·       A culturally acceptable level of corruption.

·       A culturally acceptable level and rate of political, economic, and social development.

·       A high level of regime acceptance by major social institutions.

It is easier to separate an insurgency from its resources and let it die than to kill every insurgent.  Clearly, killing or capturing insurgents will be necessary, especially when an insurgency is based in religious or ideological extremism. However, killing every insurgent is normally impossible. Attempting to do so can also be counterproductive in some cases; it risks generating popular resentment, creating martyrs that motivate new recruits, and producing cycles of revenge.

As the HN [host-nation] government increases its legitimacy, the populace begins to assist it more actively.  Eventually, the people marginalize and stigmatize insurgents to the point that the insurgency’s claim to legitimacy is destroyed. However, victory is gained not when this isolation is achieved, but when the victory is permanently maintained by and with the people’s active support and when insurgent forces have been defeated.

Table I-1.  Successful and unsuccessful counterinsurgency operational practices

Successful practices:

·       Emphasize intelligence.

·       Focus on the population, its needs, and its security.

·       Establish and expand secure areas.

·       Isolate insurgents from the populace (population control).

·       Conduct effective, pervasive, and continuous information operations.

·       Provide amnesty and rehabilitation for those willing to support the new government.

·       Place host-nation police in the lead with military support as soon as the security situation permits.

·       Expand and diversify the host-nation police force.

·       Train military forces to conduct counterinsurgency operations.

·       Embed quality advisors and special forces with host-nation forces.

·       Deny sanctuary to insurgents.

·       Encourage strong political and military cooperation and information sharing.

·       Secure host-nation borders.

·       Protect key infrastructure.

Unsuccessful practices:

·       Overemphasize killing and capturing the enemy rather than securing and engaging the populace.

·       Conduct large-scale operations as the norm.

·       Concentrate military forces in large bases for protection.

·       Focus special forces primarily on raiding.

·       Place low priority on assigning quality advisors to host-nation forces.

·       Build and train host-nation security forces in the U.S. military’s image.

·       Ignore peacetime government processes, including legal procedures.

·       Allow open borders, airspace, and coastlines.

In counterinsurgencies, warfighting and policing are dynamically linked. The moral purpose of combat operations is to secure peace. The moral purpose of policing is to maintain the peace. In COIN operations, military forces defeat enemies to establish civil security; then, having done so, these same forces preserve it until host-nation (HN) police forces can assume responsibility for maintaining the civil order. When combatants conduct stability operations in a way that undermines civil security, they undermine the moral and practical purposes they serve. There is a clear difference between warfighting and policing.  COIN operations require that every unit be adept at both and capable of moving rapidly between one and the other.


Appendix V. Excerpts from the Tamil Nation Website

Posted at http://www.tamilnation.org

International Relations in an Asymmetric Multilateral World

War & Armed Conflict

"War is the exercise of force for the attainment of a political object, unrestrained by any law save that of expediency…" Carl von  Clausewitz

Guernica (painting by Picasso) is modern art's most powerful antiwar statement. There is no doubt that Guernica challenges our notions of warfare as heroic and exposes it as a brutal act of self-destruction. Speculations as to the exact meaning of the jumble of tortured images are as numerous and varied as the people who have viewed the painting. But it is a hallmark of Picasso's art that any symbol can hold many, often contradictory meanings, and the precise significance of the imagery in Guernica remains ambiguous. When asked to explain his symbolism, Picasso remarked, "It isn't up to the painter to define the symbols. Otherwise it would be better if he wrote them out in so many words! The public who look at the picture must interpret the symbols as they understand them." (Source: Guernica: Testimony of War)

Extracts from Clausewitz's On War, from an Instructor's Guide to teaching Clausewitz at the US National War College, Washington, D.C.

- War is fighting and operates in a peculiar element -- danger. But war is served by many activities quite different from it, all of which concern the maintenance of the fighting forces. These preparatory activities are excluded from the narrower meaning of the art of war -- the actual conduct of war, because they are concerned only with the creation, training, and maintenance of the fighting forces. The theory of war proper, on the other hand, is concerned with the use of these means, once they have been developed, for the purposes of the war.

- "Tactics teaches the use of armed forces in the engagement; strategy, the use of engagements for the object of the war."

- "In tactics the means are the fighting forces . . . the end is victory."

"The original means of strategy is victory -- that is, tactical success; its ends . . . are those objects which will lead directly to peace. Strategy . . . confers a special significance . . . on the engagement: it assigns a particular aim to it."

- The activities characteristic of war may be split into two main categories: those that are merely preparations for war, and war proper.

- Earlier theorists aimed to equip the conduct of war with principles, rules, or even systems, and thus considered only factors that could be mathematically calculated (e.g., numerical superiority; supply; the base; interior lines). All these attempts are objectionable, however, because they aim at fixed values. In war everything is uncertain and variable, intertwined with psychological forces and effects, and the product of a continuous interaction of opposites.

- Theory becomes infinitely more difficult as soon as it touches the realm of moral values.

- Thus it is easier to use theory to organize, plan, and conduct an engagement than it is to use it in determining the engagement’s purpose.

- Theory then becomes a guide to anyone who wants to learn about war from books; it will light his way, ease his progress, train his judgment, and help him to avoid pitfalls.

- Theory need not be a positive doctrine, a sort of manual for action. . . . It is an analytical investigation leading to a close acquaintance with the subject.

- Fighting is the central military act. . . . Engagements mean fighting. The object of fighting is the destruction or defeat of the enemy.

- What do we mean by the defeat of the enemy? Simply the destruction of his forces, whether by death, injury, or any other means -- either completely or enough to make him stop fighting.

- The complete or partial destruction of the enemy must be regarded as the sole object of all engagements. . . . Direct annihilation of the enemy's forces must always be the dominant consideration.

- Although the concept of defense is parrying a blow and its characteristic feature is awaiting the blow, if we are really waging war, we must return the enemy's blows. . . . Thus a defensive campaign can be fought with offensive battles. . . The defensive form of war is not a simple shield, but a shield made up of well-directed blows.

- The object of defense is preservation; and since it is easier to hold ground than to take it, defense is easier than attack. But defense has a passive purpose: preservation; and attack a positive one: conquest. . . . If defense is the stronger form of war, yet has a negative object, it follows that it should be used only so long as weakness compels, and be abandoned as soon as we are strong enough to pursue a positive object.

- Defense is the stronger form of waging war.

- In the defense of a theater, "the importance of possessing the country increases, the less a decision is actively sought by the belligerents." When the war is governed by the urge for a decision, however, "such a decision may be made up of a single battle or a series of major engagements." This likelihood "should be enough to call for the utmost possible concentration of strength. . . . A major battle in a theater of operations is a collision between two centers of gravity; the more forces we can concentrate in our center of gravity, the more certain and massive the effect will be."

- “No one starts a war--or rather, no one in his senses ought to do so--without first being clear in his mind what he intends to achieve by that war and how he intends to conduct it.”

- "The natural aim of military operations is the enemy's overthrow. . . . Since both belligerents hold that view, it would follow that military operations could not be suspended . . . until one or other side were finally defeated." But that theoretical concept is not borne out in practice because of a "vast array of factors, forces, and conditions in national affairs that are affected by war."

-- "The degree of force that must be used against the enemy depends on the scale of political demands on either side. . . . But they seldom are fully known. Since in war too small an effort can result not just in failure, but in positive harm, each side is driven to outdo the other, which sets up an interaction."

- The aim of war should be the defeat of the enemy. But what constitutes defeat? The conquest of his whole territory is not always necessary, and total occupation of his territory may not be enough.

- Out of the dominant characteristics of both belligerents "a certain center of gravity develops, the hub of all power and movement, on which everything depends. That is the point against which all our energies should be directed."

- "The acts we consider most important for the defeat of the enemy are…

--- Destruction of his army, if it is at all significant

--- Seizure of his capital if it is not only the center of administration but also that of social, professional, and political activity

--- Delivery of an effective blow against his principal ally if that ally is more powerful than he."

- "Time . . . is less likely to bring favor to the victor than to the vanquished. . . An offensive war requires above all a quick, irresistible decision. . . . Any kind of interruption, pause, or suspension of activity is inconsistent with the nature of offensive war."

- “A defender must always seek to change over to the attack as soon as he has gained the benefit of the defense.”

- "The defeat of the enemy . . . . presuppose[s] great physical or moral superiority or else an extremely enterprising spirit. . . . When neither of these is present, the object of military activity can only be one of two kinds: seizing a small or larger piece of enemy territory, or holding one's own until things take a better turn." Thus "two kinds of limited war are possible: offensive war with a limited aim, and defensive war."

- "It is of course well known that the only source of war is politics -- the intercourse of governments and peoples. . . . We maintain . . . that war is simply a continuation of political intercourse, with the addition of other means.

- "If war is part of policy, policy will determine its character. As policy becomes more ambitious and vigorous, so will war, and this may reach the point where war attains its absolute form. . . . Policy is the guiding intelligence and war only the instrument, not vice versa."

- "No major proposal required for war can be worked out in ignorance of political factors. . . . [Likewise,] if war is to be fully consonant with political objectives, and policy suited to the means available for war, . . . the only sound expedient is to make the commander-in-chief a member of the cabinet."

- In limited war, we can achieve a positive aim by seizing and occupying a part of the enemy's territory. However, this effort is burdened with the defense of other points not covered by our limited offensive. Often the cost of this additional defense negates or even outweighs the advantages of our limited offensive.

- We can also undertake a limited defensive war, of which there are two distinct kinds. In the first, we aim to keep our territory inviolate and hold it as long as possible, hoping time will change the external situation and relieve the pressure against us. In the second, we adopt the defensive to help create the conditions for a counteroffensive and the pursuit of a positive aim.

- "Two basic principles . . . underlie all strategic planning. . . .

--- The first principle is: act with the utmost concentration [trace the ultimate substance of enemy strength to the fewest possible sources; compress the attack on these sources to the fewest possible actions; and subordinate minor actions as much as possible].

--- The second principle is: act with the utmost speed [every unnecessary expenditure of time and every unnecessary detour is a waste of strength; take the shortest possible road to the goal]."

--- The first task, then, in planning for a war is to identify the enemy’s center of gravity, and if possible trace it back to a single one.

--- The second task is to ensure that the forces to be used against that point are concentrated for a main offensive.

Pp. 75-89.

- "War is . . . an act of force to compel our enemy to do our will."

- Because war is an act of force, committed against a living, reacting opponent, it produces three interactions that, in theory, lead to three extremes: maximum use of force; total disarmament of the enemy; and maximum exertion of strength.

--- However, war never achieves its absolute nature because: "war is never an isolated act;" "war does not consist of a single short blow;" and "in war the result is never final."

--- "Once the extreme is no longer feared or aimed at, it becomes a matter of judgment what degree of effort should be made; and this can only be based on . . . the laws of probability."

--- "War is also interrupted (or moderated), and thus made even more a gamble, by: the superiority of defense over offense; imperfect knowledge of the situation; and the element of chance."

- "As this law [of extremes] begins to lose its force and as this determination wanes, the political aim will reassert itself. . . . The political object -- the original motive for the war -- will thus determine both the military objective to be reached and the amount of effort it requires."

--- "War is not a mere act of policy but a true political instrument, a continuation of political activity by other means."

--- "The more powerful and inspiring the motives for war . . . the closer will war approach its abstract concept. . . . The less intense the motives, the less will the military element's natural tendency to violence coincide with political directives."

--- "The first, the supreme, the most far-reaching act of judgment that the statesman and commander have to make is to establish . . . the kind of war on which they are embarking."

- "As a total phenomenon its dominant tendencies always make war a remarkable trinity -- composed of primordial violence, hatred, and enmity . . . of the play of chance and probability . . . and of its element of subordination, as an instrument of policy."

"If . . . we consider the pure concept of war . . . . its aim would have always and solely to be to overcome the enemy and disarm him." This encompasses "three broad objectives, which between them cover everything: destroying the enemy's armed forces; occupying his country; and breaking his will to continue the struggle.

"But the aim of disarming the enemy (the object of war in the abstract . .) is in fact not always encountered in reality, and need not be fully achieved as a condition of peace."

"Inability to carry on the struggle can, in practice, be replaced by two other grounds for making peace: the first is the improbability of victory; the second is its unacceptable cost."

We may demonstrate to the enemy the improbability of his victory by: obtaining a single victory; by seizing a province; or by conducting operations to produce direct political repercussions.

We may demonstrate to the enemy the unacceptable cost of his struggle by: invading his territory; conducting operations to increase his suffering; or by wearing down the enemy.

There is only one means in war: combat.

"Whenever armed forces . . . are used, the idea of combat must be present. . . . The end for which a soldier is recruited, clothed, armed, and trained, the whole object of his sleeping, eating, drinking, and marching is simply that he should fight at the right place and the right time."

"If the idea of fighting underlies every use of the fighting forces, then their employment means simply the planning and organizing of a series of engagements. . . The destruction of the enemy's forces is always the means by which the purpose of the engagement is achieved."

"When one force is a great deal stronger than the other, an estimate may be enough. There will be no fighting: the weaker side will yield at once. . . Even if no actual fighting occurs . . . the outcome rests on the assumption that if it came to fighting, the enemy would be destroyed."

"When we speak of destroying the enemy's forces we must emphasize that nothing obliges us to limit this idea to physical forces: the moral element must also be considered."

"Destruction of the enemy forces is always the superior, more effective means, with which others cannot compete. . . . The commander who wishes to adopt different means can reasonably do so only if he assumes his opponent to be equally unwilling to resort to major battles."

"Genius refers to a very highly developed mental aptitude for a particular occupation. . . . The essence of military genius . . . . consists in a harmonious combination of elements."

"War is the realm of danger; therefore courage is the soldier's first requirement"

"War is the realm of physical exertion and suffering. . . . Birth or training must provide us with a certain strength of body and soul."

"We come now to the region dominated by the powers of intellect. War is the realm of uncertainty . . . . War is the realm of chance. . . . Two qualities are indispensable: first, an intellect that, even in the darkest hour, retains some glimmerings of the inner light which leads to truth; and second, the courage to follow this faint light wherever it may lead. The first of these qualities is described by the French term, coup d'oeil; the second is determination."

"War's climate of danger, exertion, uncertainty, and chance also demands other intellectual qualities.

"Presence of mind . . . is nothing but an increased capacity of dealing with the unexpected."

"Energy in action varies in proportion to the strength of its motive." Of all the passions none is more powerful than ambition.

"Staunchness indicates the will's resistance to a single blow; endurance refers to prolonged resistance."

"Strength of mind or of character" is "the ability to keep one's head at times of exceptional stress and violent emotion."

"Firmness cannot show itself, of course, if a man keeps changing his mind." It demands sticking to one's convictions.

The relationship between warfare and terrain demands "the faculty of quickly and accurately grasping the topography of any area."

"If we then ask what sort of mind is likeliest to display the qualities of military genius . . . it is the inquiring rather than the creative mind, the comprehensive rather than the specialized approach, the calm rather than the excitable head."

"We have identified danger, physical exertion, intelligence, and friction as the elements that coalesce to form the atmosphere of war, and turn it into a medium that impedes activity."

"The novice cannot pass through these layers of increasing intensity of danger without sensing that here ideas are governed by other factors, that the light of reason is refracted in a quite different from that which is normal in academic speculation."

"If no one had the right to give his views on military operations except when he is frozen, or faint from heat and thirst, or depressed from privation and fatigue, objective and accurate views would be even rarer than they are."

"Many intelligence reports in war are contradictory; even more are false, and most are uncertain."

"Everything in war is very simple, but the simplest thing is difficult. The difficulties accumulate and end by producing a kind of friction. . . . This tremendous friction . . . is everywhere in contact with chance, and brings about effects that cannot be measured, just because they are largely due to chance. . . . Moreover, every war is rich in unique episodes."

"The good general must know friction in order to overcome it whenever possible, and in order not to expect a standard of achievement in his operations which this very friction makes impossible."

"Is there any lubricant that will reduce this abrasion? Only one . . . combat experience."

"Strategy is the use of the engagement for the purpose of the war. The strategist must therefore define an aim for the entire operational side of the war that will be in accordance with its purpose. . . . The aim will determine the series of actions intended to achieve it."

"Results are of two kinds: direct and indirect. . . . The possession of provinces, cities, fortresses, roads, bridges, munitions dumps, etc., may be the immediate object of an engagement, but can never be the final one."

"If we do not learn to regard a war, and the separate campaigns of which it is composed, as a chain of linked engagements each leading to the next, but instead succumb to the idea that the capture of certain geographical points or the seizure of undefended provinces are of value in themselves, we are liable to regard them as windfall profits."

"The strategic elements that affect the use of engagements may be classified into various types: moral, physical, mathematical, geographical, and statistical."

"The moral elements [everything that is created by intellectual and psychological qualities and influences] are among the most important in war. Unfortunately, they will not yield to academic wisdom. They cannot be classified or counted. . . . The effects of physical and psychological factors form an organic whole. In formulating any rule concerning physical factors, the theorist must bear in mind the part that moral factors may play in it."

"The principal moral elements . . . are: the skill of the commander, the experience and courage of the troops, and their patriotic spirit."

"An army that maintains its cohesion; . . that cannot be shaken by fears . . ; [that] will not lose the strength to obey orders and its respect and trust for its officers . . ; [that] has been steeled by training in privation and effort; . . that is mindful of the honor of its arms -- such an army is imbued with the true military spirit."

"There are only two sources for this spirit. . . . The first is a series of victorious wars; the second, frequent exertions of the army to the utmost limits of its strength."

"In what field of human activity is boldness more at home than in war? . . . It must be granted a certain power over and above successful calculations involving space, time, and magnitude of forces."

"In war more than anywhere else things do not turn out as we expect. . . . Perseverance in the chosen course is the essential counterweight."

A universal desire is to take the enemy by surprise as a means to gain superiority. But "it is equally true that by its very nature surprise can rarely be outstandingly successful. . . . In strategy surprise becomes more feasible the closer it occurs to the tactical realm, and more difficult, the more it approaches the higher levels of policy."

"Cunning implies secret purpose. . . . It is itself a form of deceit. . . . No human characteristic appears so suited to the task of directing and inspiring strategy. . . . [Yet] the fact remains that these qualities do not figure prominently in the history of war."

"Superiority of numbers is the most common element in victory. . . . Superiority . . . can obviously reach the point where it is overwhelming. . . . It thus follows that as many troops as possible should be brought into the engagement at the decisive point.

"The best strategy is always to be very strong; first in general, and then at the decisive point. . . . There is no higher and simpler law of strategy than that of keeping one's forces concentrated."


Appendix W. Excerpt from Sheldon Richman’s War Is a Government Program

War Is a Government Program

by Sheldon Richman, Posted November 19, 2007

It is always amusing to hear conservatives complain — as they are complaining now and used to complain during the Vietnam War — that if it weren’t for the politicians, the generals could win America’s wars. Those with this mindset believe the politicians are always getting in the way by subordinating military considerations to — ugh! — political considerations. Politicians, leave those generals alone!

This is amusing for a couple of reasons. First, these same conservatives claim to worship the U.S. Constitution, which, the last time I read it, subordinated the military to civilian authority.

Second, those who make this complaint seem willfully blind to the nature of war. At its most fundamental level, war is no more a military phenomenon than it is a scientific phenomenon. True, militaries fight wars, and military tactics is a meaningful discipline. But war also requires weapons that make use of the principles of physics. Does that mean wars are fundamentally the province of scientists? No, and neither are they fundamentally the province of generals.

Wars are political phenomena. You’d think the armchair generals and word-processor pilots would know that. It’s been 175 years since the publication of Karl von Clausewitz’s posthumous book, On War, which stated,

[War] is not merely a political act, but also a real political instrument, a continuation of political commerce, a carrying out of the same by other means.

War is politics.

Unless they are also heads of state, generals don’t start wars. Politicians start wars. In fact, generals have been known to oppose wars, having a more realistic sense than politicians of what wars really entail.

Politicians start wars for political reasons. (This is not to imply that economic reasons aren't involved.) They may seek to control resources or a foreign population. Or they may seek to secure existing interests that could be at risk without the war. The mark of a global empire is that nothing can happen anywhere in the world without its potentially involving the interests of the imperial power and requiring, under the appropriate circumstances, war to protect those interests. This well describes the United States for the last half-century at least. The military is the means to a political end. The politicians cannot be concerned with military matters exclusively because that might cause them to ignore important political considerations, both domestic and foreign.

An instrument of tyranny

War always has a domestic side. Ruling classes hold power so that they may live off the toil of the domestic population. And because those ruled always far outnumber the rulers, ideology and propaganda are necessary to maintain the allegiance of the subject population. War is useful in keeping the population in a state of fear and therefore trustful of their rulers. H.L. Mencken said it best:

The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.

War and putative external threats are used to justify conscription, higher taxes, regimentation, suspension of civil liberties, lucrative contracts for cronies, and the like.

War, then, is always a government program, sharing certain features characteristic of all government programs. Politicians are not like entrepreneurs in a marketplace, dealing exclusively through consent to earn profits by pleasing consumers. Rather, politicians operate an apparatus of force, deception, and exploitation — the state — in pursuit of their own objectives. All the perversities we have come to expect from government’s domestic programs are present in war: hubris, corruption, self-interestedness disguised as public service, insulation from accountability, the inability to calculate true costs. But war is more to be feared than other government programs, and not just for the obvious reason—mass murder. It is only in matters of war and foreign affairs that the politicians can demand secrecy. If they refused to discuss Social Security because the information was classified, they would be ridiculed. Yet they routinely get away with denying the public information in military matters. This provides all the more scope for the horrors that war entails.

Thus anyone with even a scintilla of suspicion of state power ought to be wary of the state’s power to make war, for this power is the root of so many other evils. As James Madison said,

Of all the enemies of true liberty, war is, perhaps, the most to be dreaded, because it comprises and develops the germ of every other. War is the parent of armies; from these proceed debts and taxes; and armies, and debts, and taxes are the known instruments for bringing the many under the domination of the few.… No nation can preserve its freedom in the midst of continual warfare.

It is maintained by conservatives and others that these concerns may be relevant to other countries but not to the United States. Here American exceptionalism comes into play with full force. What this says is that the U.S. government is not subject to the same laws of politics that other governments are subject to. Why not? Because the Declaration of Independence proclaimed noble principles? But this is a non sequitur. One can grant the nobility of the Declaration’s philosophy and still demand proof that this noble character carries over to the government. Well, what about the Constitution? Again, this argument does not work, for even if we overlook the flaws in the Constitution, U.S. governments from 1789 on have evaded its limitations whenever possible.

The power to declare war

The war power is illustrative. Article I, Section 8, reserves to Congress the power to declare war. Yet presidents have invaded and occupied countries without congressional declarations for many years. Even when Congress has “authorized” a president to commence military operations in another country, as in Iraq in 2002, it has done so in ways that do not resemble the declarations approved before U.S. entry in World War I and World War II. In those cases, Presidents Wilson and Roosevelt asked Congress to exercise its constitutional power to declare war on grounds that a “state of war” had already existed with Germany in 1917 and Japan in 1941. In contrast, the authorization for military force in Iraq did not demonstrate that a state of war existed between that country and the United States or declare a state of war. Rather, it authorized the president to use the armed forces to

(1)                defend the national security of the United States against the continuing threat posed by Iraq; and

(2)                enforce all relevant United Nations Security Council resolutions regarding Iraq. (Emphasis added.)

Iraq had not attacked or overtly threatened the United States or any American citizen. Leaving aside the questions about the intelligence on weapons of mass destruction, the “threat” alluded to in the authorization was not based on any acts taken by the Iraqi government against this country. Thus Congress could not have found that a state of war existed between the two countries. Instead, Congress improperly delegated to the president the power to decide when and if to create a state of war. The resolution required only that he “certify” that diplomatic efforts had failed before he used force. Indeed, House Minority Leader Richard Gephardt confirmed that Congress was not declaring war when he said, “[We] should deal with it [the Iraqi problem] diplomatically if we can, militarily if we must. And I think this resolution does that.” The document amounted to a blank check with which the president could go to war or not as he saw fit.

This is not to say the Constitution’s war-power provision is an adequate safeguard against presidential adventurism. American history teaches that presidents are fully able to get declarations out of Congress if they want them badly enough. There was no good reason for the United States to intervene in World War I, since the German attacks on American ships came as a result of provocations by Woodrow Wilson. Yet Wilson was able to use those responses to his provocations to obtain a declaration. Similarly, Franklin Roosevelt conducted years of (undeclared) economic warfare against Japan before the attack on Pearl Harbor. He attempted to bait Germany into attacking U.S. ships but failed. Had he succeeded, Congress would have declared war on Germany. The point is that a president can get a declaration of war if he wants one.

Thus, even strictly construed, the Constitution cannot preclude foreign intervention. This was seen clearly by Randolph Bourne, who broke with his “progressive” erstwhile colleagues and opposed U.S. intervention into World War I. In his essay “The State,” which he left uncompleted at his death in 1918, Bourne wrote,

The Government, with no mandate from the people, without consultation of the people, conducts all the negotiations, the backing and filling, the menaces and explanations, which slowly bring it into collision with some other Government, and gently and irresistibly slides the country into war. For the benefit of proud and haughty citizens, it is fortified with a list of the intolerable insults which have been hurled toward us by the other nations; for the benefit of the liberal and beneficent, it has a convincing set of moral purposes which our going to war will achieve; for the ambitious and aggressive classes, it can gently whisper of a bigger role in the destiny of the world. The result is that, even in those countries where the business of declaring war is theoretically in the hands of representatives of the people, no legislature has ever been known to decline the request of an Executive, which has conducted all foreign affairs in utter privacy and irresponsibility, that it order the nation into battle. [Emphasis added.]

Hermann Goering during the Nuremberg trials put it succinctly:

[Of] course the people don’t want war.... But after all, it is the leaders of the country who determine the policy, and it is always a simple matter to drag the people along, whether it is a democracy or a fascist dictatorship or a Parliament, or a Communist dictatorship.

[Voice] or no voice, the people can always be brought to the bidding of the leaders. That is easy. All you have to do is tell them they are being attacked and denounce the pacifists for lack of patriotism and exposing the country to greater danger.

I’d rather quote Mencken:

Wars are seldom caused by spontaneous hatreds between people, for peoples in general are too ignorant of one another to have grievances and too indifferent to what goes on beyond their borders to plan conquests. They must be urged to the slaughter by politicians who know how to alarm them.

Sheldon Richman is senior fellow at The Future of Freedom Foundation, author of Tethered Citizens: Time to Repeal the Welfare State, and editor of The Freeman magazine. Visit his blog “Free Association” or send him email.

This article originally appeared in the August 2007 edition of Freedom Daily. Subscribe to the print or email version of Freedom Daily.
