Best Laid Plans
The Tyranny of Unintended Consequences and How to Avoid Them
William A. Sherden
Copyright 2011 by William A. Sherden

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, except for the inclusion of brief quotations in a review, without prior permission in writing from the publisher.

Library of Congress Cataloging-in-Publication Data
Sherden, William A.
Best laid plans : the tyranny of unintended consequences and how to avoid them / William A. Sherden.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-313-38531-5 — ISBN 978-0-313-38532-2 (ebook)
1. Decision making. 2. Statics and dynamics (Social sciences) I. Title.
BF448.S44 2011
302'.1—dc22
2010043607

ISBN: 978-0-313-38531-5
EISBN: 978-0-313-38532-2

15 14 13 12 11    1 2 3 4 5
This book is also available on the World Wide Web as an eBook. Visit www.abc-clio.com for details.

Praeger
An Imprint of ABC-CLIO, LLC

ABC-CLIO, LLC
130 Cremona Drive, P.O. Box 1911
Santa Barbara, California 93116-1911

This book is printed on acid-free paper
Manufactured in the United States of America
To my wife Molly for the many invaluable roles she played in helping create this book.
Contents

1. The Tyranny of Unintended Consequences  1
2. The Web of Life  11
3. The Domino Effect  29
4. The Vicious Cycle  49
5. The Bandwagon Effect  67
6. The Balance of Nature  87
7. Perverse Adaptations  103
8. Coming into Being  121
9. Breaching the Peace  143
10. Thinking through the Maze  165
Notes  177
Bibliography  191
Index  201
Chapter 1
The Tyranny of Unintended Consequences

One of life's most pervasive dilemmas is the unintended consequences of our actions. Historian Daniel Boorstin observed that "the unintended consequences of man's enterprises have and will always be more potent, more widespread, and more influential than those he intended."1 Supporting Boorstin's assertion are the nearly two million Internet sites that deal with the unintended consequences of everything from social policy to international relations to technology. There is even a site called The Museum of Unintended Consequences, which chronicles the myriad instances of how best laid plans went awry. Examining and understanding how unintended consequences occur can help to avoid life's many pitfalls. This book seeks to provide this knowledge by introducing eight social mechanisms that are the root cause of unintended consequences.

Sometimes unintended consequences are catastrophic, sometimes beneficial. Occasionally, their impacts are imperceptible; other times, colossal. Large events frequently have a number of unintended consequences, but even small events can trigger them. There are numerous instances of purposeful deeds completely backfiring, causing the exact opposite of what was intended.

Major events like wars and large government programs almost always have unintended outcomes. Conservatives have frequently employed the prospect of unintended consequences as a foil to big government by arguing that massive programs fail to solve intended problems and create a host of unexpected, deleterious side effects. The late conservative economist and Nobel laureate Milton Friedman, for example, argued, "The only way to control unanticipated events is to have Washington do as little as possible."2 Invariably, large social programs
attract people seeking to exploit the system for personal gain, which creates unanticipated results. Some opponents of big government have argued, for example, that welfare promotes poverty because it creates an incentive that keeps some people from seeking employment. Similarly, people have argued that Aid to Families with Dependent Children—whose original intent was to help widows with children—promoted fatherless households, because assistance to unwed mothers created an incentive not to get married. A perpetual political debate continues to rage over whether the benefits of these large social programs outweigh their negative unintended consequences.

Unintended consequences can also be beneficial, as in the case of the Sarbanes-Oxley Act, which was passed in 2002 to combat fraud by requiring corporations to produce accurate financial reports. Although corporate executives initially viewed the legislation as governmental interference, the act had the unintended consequences of equipping them with better managerial information and of making their firms more efficient. When the U.S. Congress agreed to add educational and mortgage benefits to the GI Bill after World War II, it did not realize that the opportunities the bill created for veterans would transform U.S. society, giving rise to the formation of the suburbs and of an educated middle class. The Pentagon's efforts to create a bombproof military communications network and accurate satellite system for guiding missiles unintentionally spawned the highly popular civilian use of the Internet and the Global Positioning System (GPS).

The best laid plans of politicians of all parties and all ideologies, including proponents of laissez-faire policies, are subject to unintended consequences. According to Ronald Brownstein and Daniel Balz, political editors of the Los Angeles Times and the Washington Post, respectively, "There is no reason to believe it is easier to predict the consequences of retrenching government than enlarging it."3 The dismantling of decades-old financial regulations in the United States, for example, was a cause of the recession that began in 2007. In 1999, Congress repealed the Glass-Steagall Act, which restricted commercial banks from participating in investment banking. Thereafter, for the first time since the Great Depression, commercial banks engaged in the risky underwriting of derivatives and other securities and amassed such massive losses that some of the bigger institutions had to be bailed out by the U.S. government. Former chairman of the Federal Reserve Alan Greenspan—a free market advocate and a major player in the deregulation of the U.S. financial system—testified at an October 2008 congressional hearing that "those of us who have looked to the self-interest of lending institutions to protect shareholders' equity, myself included, are in a state of shocked disbelief."4
Congress passed legislation in 1980 and 1982 to deregulate savings and loan banks (S&Ls), which led to the savings and loan crisis in the early 1990s. Banking regulations had previously restricted S&Ls' activities to providing only savings accounts and home mortgages. Legislation passed in the early 1980s permitted them to make other types of loans and to operate with less regulatory oversight. Losses from making risky loans and from fraud caused the failure of 747 S&Ls, which required a massive $160.1 billion bailout by the U.S. government, according to the U.S. General Accounting Office (now called the Government Accountability Office). In another misguided effort at deregulation, the state of California deregulated power utilities in 1996 to increase the supply of energy. This action contributed to the California electricity crisis, which led to rolling blackouts in 2000 and 2001 as well as the bankruptcy of the Pacific Gas and Electric Company in 2001.

Even small events can trigger unintended consequences. Small altercations can lead to major feuds between individuals, countries, and organizations. The accusation of the theft of a pig, for example, triggered the notorious Hatfield and McCoy feud in 1878 in rural West Virginia and Kentucky that ultimately resulted in the death of nearly two dozen people. Both World War I and the Spanish-American War were started by small events: the former by the 1914 assassination of Austria's Archduke Franz Ferdinand, the latter by the unexplained 1898 sinking of the USS Maine in Havana Harbor.

Some acts create strange, unexpected side effects that are completely unrelated to what was originally intended. The mandated addition of ethanol to gasoline to promote energy independence and reduce air pollution has increased world hunger and fomented global political unrest by diverting U.S. corn from exportation for food to making ethanol. In a different context, U.S. regulations requiring fertility clinics to report their success rates for in vitro fertilization procedures to help inform the decisions of prospective parents have unintentionally caused a troubling increase in the number of premature births.

Most intriguing are the many instances where deeds have totally backfired by creating the opposite effect of what was intended. In fact, the phenomenon is so prevalent that the CIA has a name for it: blowback. Many examples of blowback exist, and this book discusses the following:

• Trillions of dollars spent in foreign aid to the world's most impoverished countries have hurt the aid recipients' economies and made them less able to climb out of poverty.
• The deportation of illegal immigrants committing crimes in Los Angeles has spawned one of the world's most dangerous crime gangs.
• Efforts in the United States to limit the influence of wealthy individuals and organizations via campaign finance reform have devolved into a situation where political groups can now spend unlimited funds and shape the outcomes of elections.
• Consumers who purchased sport utility vehicles for safety purposes have experienced much higher driving fatalities than those with standard automobiles.
• The use of GPS to enhance safety on land and sea has caused accidents by creating new hazards as users become overreliant on these devices.
• Organizations seeking to control expenses by enacting hiring freezes have experienced cost increases by relying on overtime work and other costly resources.
• Busing programs intended to desegregate schools have created school systems that are more segregated.
• Programs to protect endangered species have hastened the species' demise.
• The introduction of the Scholastic Aptitude Test to make college admissions more equitable for middle- and low-income students has given affluent students an added advantage.
• The acquisition of new weapons to increase nations' security has put these countries at greater risk.
• The imposition of trade barriers to protect industries has resulted in the decline of those industries.
• Programs designed to improve neighborhoods have created slums.
• Dominant parties, by wielding too much power, have lost their dominance.
• Efforts to prevent riots have precipitated them.
• Financial assistance to families has created more fatherless homes.
Even ventures deemed to be successful have generated a host of unexpected and unwanted side effects. Federal Reserve chairman Paul Volcker successfully curtailed inflation in the 1980s while causing the Third World debt crisis and a severe recession. The introduction of the labor-saving cotton gin increased slavery, because it became profitable to extend cotton farming to new lands. The Clean Air Act improved local air quality around power plants but created acid rain that fell hundreds of miles away. The banning of the insecticide DDT preserved many endangered species while increasing mosquitoes and malaria in developing countries.

Although history is rife with examples of best laid plans going awry, there has been little scholarly or journalistic mention of unintended consequences until fairly recently. Historically, the little that has been written dates back to Adam Smith's 1776 book, The Wealth of Nations, in which he
made the brilliant observation that the greedy behavior of buyers and sellers of goods in a marketplace unintentionally created a working economy where the supplies of goods met their demands with fair and stable prices. Smith noted, "It is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own self interest."5 Thus, Smith recognized the existence of unintended consequences, although he did not label them as such.

Nineteenth-century French economic journalist Frédéric Bastiat was among the first to note that human actions frequently give rise to multiple outcomes, some of which are foreseen while others are not. Bastiat blamed unintended consequences on the inability to anticipate the unforeseen outcomes of our acts:

In the economic sphere an act, a habit, an institution, a law produces not only one effect, but a series of effects. Of these effects, the first alone is immediate; it appears simultaneously with its cause; it is seen. The other effects emerge only subsequently; they are not seen; we are fortunate if we foresee them. There is only one difference between a bad economist and a good one: the bad economist confines himself to the visible effect; the good economist takes into account both the effect that can be seen and those effects that must be foreseen. Yet this difference is tremendous; for it almost always happens that when the immediate consequence is favorable, the later consequences are disastrous, and vice versa.6
Sociologist Robert Merton attempted to explain the causes of unintended consequences in his famous 1936 article, "The Unanticipated Consequences of Purposive Social Action." Merton blamed unintended consequences largely on ignorance, error in analysis, and the overzealousness of decision makers too wedded to their plans to consider potential outcomes other than those they intended.7

Merton might be considered prescient in explaining the Iraq War. President George W. Bush's ill-fated 2003 invasion of Iraq over its supposed possession of weapons of mass destruction was driven by ignorance, error, and a love affair with his original plan. Ignorance was evident in the fact that there were no weapons of mass destruction; intelligence reports showed this to be the case. The president had so little knowledge about Iraqi society that he was unaware of the difference between the Sunni and Shiite Islamic sects that had been killing each other since the day Mohammad died in 632 c.e. Error was evident in declaring victory before securing peace in Iraq. That he was wedded to his original plan was evident in Bush's continual plea to "stay the course" years after conditions in Iraq called for a new approach.
Whether because of beliefs in superstition or randomness, many people attribute unintended consequences to luck. As philosophy professor Nicholas Rescher from the University of Pittsburgh noted, "Belief in fortune and luck, good and evil, is one of the most widespread and persistent human beliefs."8 Rescher defines luck as a situation in which "as far as the affected person is concerned, the outcome came about 'by accident.' "9 Eighteenth-century French philosopher Voltaire conversely argued that "there is no such thing as an accident."10

This book explains how unintended consequences occur beyond consideration of Merton's human failings and the pervasive belief in misfortune. In the spirit of Voltaire and Bastiat, I describe eight social mechanisms—too often unforeseen—that complicate matters and cause our best laid plans to go awry. The purpose of this book is to help readers avoid unintended consequences by explaining how an understanding of each of these eight social mechanisms helps to foresee the potential outcomes of decisions. The eight social mechanisms are as follows; a brief numerical sketch of the two feedback mechanisms follows the list:

• Organized Complexity: a system of many components interacting in nonrandom ways;
• Chain Reactions: a sequence of reactions, the results of which can cause other reactions to occur in a way that amplifies events;
• Reinforcing Feedback: events that reinforce themselves, resulting in their growth or decline;
• Balancing Forces: forces that oppose changes in a system, thereby keeping it in balance;
• Lock-In: the continual amplification of events that grow out of proportion and lead to irrational outcomes;
• Adaptation: a process whereby a person or social organization becomes better suited to its environment;
• Emergence: phenomena that, under the right circumstances, can unexpectedly come into existence;
• Disturbed Equilibria: the disruption of a stable, well-established system that can unleash unpredictable outcomes.
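Reinforcing feedback and balancing forces lend themselves to a quick numerical illustration. The following sketch is an editorial addition, not the author's: a minimal Python simulation whose growth rate, target, and adjustment rate are invented purely for illustration.

```python
# Toy illustration of two of the eight mechanisms (invented rates only).

def reinforcing(value, rate=0.10, steps=10):
    """Reinforcing feedback: each period's change is proportional to the
    current value, so change compounds on itself."""
    history = [round(value, 2)]
    for _ in range(steps):
        value += rate * value
        history.append(round(value, 2))
    return history

def balancing(value, target=100.0, adjustment=0.5, steps=10):
    """Balancing force: each period closes part of the gap between the
    current value and the norm, damping any disturbance."""
    history = [round(value, 2)]
    for _ in range(steps):
        value += adjustment * (target - value)
        history.append(round(value, 2))
    return history

print(reinforcing(1.0))   # 1.0 compounds to about 2.59: small events mushroom
print(balancing(150.0))   # a 50-point disturbance decays back toward 100
```

Run as written, the first loop grows without limit while the second converges, which is the qualitative difference the later chapters build on: lock-in is reinforcing feedback left unabated, while balancing forces restore the norm.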
These concepts have disparate origins. Each mechanism is well established in scholarly works and popular literature, but this is the first book to explain how all eight of them cause unintended consequences.

Organized Complexity: Charles Darwin and other 19th-century biologists recognized the complex interconnections among the flora and fauna in ecosystems and how tinkering with a single species could trigger changes in seemingly unrelated parts of the ecosystem. In 1948, U.S. scientist Warren
Weaver coined the phrase "organized complexity" to describe biological and social systems as having "a sizable number of factors which are interrelated into an organic whole."11

Chain Reactions: Hungarian scientist Leo Szilard first conceived the notion of a nuclear chain reaction in 1934. Social scientists call chain reactions among humans collective behavior—colloquially referred to as the herd instinct—and use this concept to explain mass hysteria, fads, and panics. French sociologist Gustave Le Bon was one of the first to describe collective behavior in his 1896 book, The Crowd: A Study of the Popular Mind, and the phrase was coined decades later by Robert Park from the University of Chicago.

Reinforcing Feedback and Balancing Forces: German American psychologist Kurt Lewin introduced the notion of reinforcing feedback and balancing forces in social organizations in his 1945 book, Field Theory in Social Science. In the late 1950s, MIT professor Jay Forrester introduced a new management science field called system dynamics, based on reinforcing feedback and balancing forces, which he popularized in the 1961 best-selling book Industrial Dynamics. Peter Senge, also from MIT, reintroduced Forrester's work in a popular 1990 book, The Fifth Discipline.

Acknowledgment of the existence of balancing forces to maintain order in our biological and social worlds has a long history in biology and economics. In 1865, French physiologist Claude Bernard made the observation that animals had self-regulating bodies. U.S. physiologist Walter Cannon built on Bernard's work in his 1932 book, The Wisdom of the Body, and used the word homeostasis (which, in Greek, means to remain the same) to describe the phenomenon. The discovery of the economic balancing force called declining marginal utility may date back to the fourth-century b.c.e. Greek philosopher Aristotle. The concept of declining marginal utility means that the more one consumes a particular good, the less value it has to that consumer. For example, the first serving of ice cream may be particularly gratifying, but each subsequent one in a row becomes less so. Aristotle wrote of such abundant goods that "too much of them they must either do harm, or at any rate be of no use."12

Lock-In: The concept of lock-in dates to early 20th-century British economist Alfred Marshall's theory of increasing returns, the success-breeding-success-to-excess mechanism that causes lock-in to occur. The subject of increasing returns and lock-in resurfaced in the 1980s with economists Paul Krugman from Princeton University and Paul David from Stanford University. Brian Arthur, formerly an economist from Stanford University, popularized the phenomenon of lock-in with his 1994 book, Increasing Returns
and Path Dependence in the Economy, in which he described how reinforcing feedback taken to an extreme can cause inferior products to become the industry standard.

Adaptation: The notion of adaptation originated at least as early as Adam Smith's 1776 book, The Wealth of Nations, in which he wrote about how people and societies adapt to changing economic conditions. He described, for example, how people adapt to the rising price of a good in short supply by making more of it in hopes of making a profit. In his famous 1859 book, On the Origin of Species, Charles Darwin explained how species evolved by adapting to their environments.

Emergence: Adam Smith was among the first to introduce the notion of emergence, or things coming into being and taking on a life of their own. In The Wealth of Nations, Smith explains how a working economy can come into being all by itself as if "led by an invisible hand." Emergence is a central concept in modern biology. The notion that ecosystems and other biological phenomena emerge on their own without a central control mechanism dates back at least to former Harvard entomologist William Morton Wheeler's popular 1924 article, "Emergent Evolution and the Development of Societies."

Disturbed Equilibria: A common example of disturbed equilibria is the introduction of an invasive alien species to a healthy ecosystem, which can result in a natural disaster. Some of the better-known examples of the destructive impact of alien species include the invasive weed kudzu, fire ants, killer bees, gypsy moths, and sparrows. Examples of disturbing social equilibria date back at least to Jane Jacobs's 1961 book, The Death and Life of Great American Cities, in which she documented how ill-conceived urban renewal efforts destroyed well-established older cities in the United States.

All of these concepts must be considered in examining and attempting to understand the origins of unintended consequences. The goal of this book is to improve readers' decision-making abilities in their professional and personal lives by explaining how these eight social mechanisms cause unintended consequences. The explanation begins in chapter 2 with a discussion of how the sheer complexity of interconnections in modern societies can mask the impacts that acts may have across time, place, and sector. The following chapters describe how these complex interconnections give rise to strange and unexpected behaviors.

Chapters 3 and 4 introduce two social mechanisms that can cause small events to unexpectedly mushroom out of control. The first is the chain reaction, which can cause outcomes to spread through society just like an epidemic infects an ever-increasing number of people. One infected person
transmits the disease to several others, who in turn infect many others, and so on. Chapter 4 introduces the reinforcing feedback mechanism, where success reinforces itself to produce more success and unexpected outcomes. Chapter 5 introduces the lock-in mechanism and illustrates how reinforcing feedback, when unabated, can unexpectedly cause inferior ideas and products to become de facto industry standards. Chapter 6 introduces the balancing mechanism that restores order when events deviate from the norm and, in the process, can create unexpected results. The chapter explains, for example, how a dominant party wielding too much power can lose its dominance. Chapter 7 introduces adaptation, explaining how humans adjust their behavior to changing conditions. For example, people respond to new incentives by gaming the system to maximize their benefit with unintended results. Chapter 8 introduces emergence and how phenomena can unexpectedly come into being under the right circumstances. It describes, for example, how the deportation of illegal immigrants who committed crimes spawned one of the most dangerous gangs in the world. Chapter 9 introduces the disturbed equilibria mechanism, showing how meddling with established biological and social systems can have unimagined disastrous outcomes. Each chapter ends with suggestions on how to deal with each social mechanism to avoid unintended consequences. Chapter 10 concludes with a general framework on how to intervene in social systems without causing too many things to go awry.
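The epidemic analogy for chain reactions above can be made concrete with a toy branching process. This sketch is an editorial addition, not the author's; the "spread" factor is an invented reproduction number, and real collective behavior is nowhere near this uniform.

```python
# Toy branching process: how a chain reaction amplifies a single event.
# The spread factor (how many new people each person reaches) is invented.

def chain_reaction(spread=3, generations=8):
    newly_affected = 1                # the initial event or infected person
    total = newly_affected
    for g in range(1, generations + 1):
        newly_affected *= spread      # each person reaches `spread` others
        total += newly_affected
        print(f"generation {g}: {newly_affected:>5} new, {total:>6} total")

chain_reaction()  # with spread=3, one person reaches 9,841 people in 8 steps
```

The same arithmetic runs in reverse: with a spread factor below one, each generation shrinks and the chain fizzles out, which is why small differences in how contagious an idea or panic is can separate a non-event from a stampede.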
Chapter 2
The Web of Life

The Ethanol Enterprise

Few government programs have been as self-defeating as the U.S. Energy Policy Act of 2005, which mandated the addition of ethanol to gasoline to promote energy independence and reduce air pollution. Despite these laudable goals, the complexity of the ethanol program has caused it to backfire with many deleterious side effects. The act has not only failed to decrease reliance on imported oil but in fact has wasted energy, raised the cost of fuel at the pump, increased air and water pollution, exacerbated global warming and smog, and depleted scarce fresh water. Tragically, it has also increased world hunger and has fomented political unrest in developing countries worldwide.

The U.S. Congress passed the Energy Policy Act of 2005, mandating the use of ethanol as a gasoline supplement and calling for the use of 4 billion gallons of ethanol in 2007 and 35 billion gallons by 2017. Scientists, however, have questioned whether there are any net energy savings from using ethanol as a gasoline supplement—the savings may be negligible or even negative—considering all of the energy consumed in making and shipping it. Cornell University professor of ecology and agriculture David Pimentel conducted a study on the energy efficiency of ethanol production and concluded, "there is just no energy benefit to using plant biomass for liquid fuel. These strategies are not sustainable."1

Ethanol is primarily made by distilling fermented corn. This process requires considerable energy in producing fertilizers and pesticides; operating tractors, harvesters, and irrigation pumps; fermenting and extracting ethanol; and disposing of waste water. Ethanol gets contaminated with moisture when it is transported via the low-cost pipelines used for gasoline, so it must instead be shipped by petroleum-consuming trucks, trains, and barges. Pimentel estimated that the entire process for making and shipping ethanol consumes 29 percent more energy than it yields. Furthermore, economists project that, by 2012, ethanol will replace only about 1 percent of the total amount of gasoline consumed in
the United States and, thus, will have a negligible impact on U.S. dependence on imported oil. Ethanol has also increased the cost of fuel for motorists, because it is more costly to make than gasoline and has only two-thirds of its energy content.

The reduction in carbon dioxide from automobile exhaust from the use of ethanol in fuel has been offset by the carbon dioxide emitted in making and transporting ethanol. Worse still, farmers seeking to grow more acres of corn have cleared forested and fallow lands, releasing the carbon stored in the trees and the biomass living in the soil, which otherwise would have consumed carbon dioxide as part of their life cycle. Furthermore, ethanol increases the emission of poisonous nitrous oxide, which is 296 times more damaging as a greenhouse gas than carbon dioxide. A 2008 study published in Science indicated that the use of ethanol will increase greenhouse gases twofold in the next 30 years.

Corn also consumes vast quantities of nitrogen fertilizer, which flows off fields into streams, rivers, bays, estuaries, and oceans, promoting the growth of oxygen-depleting algae that kill fish and other wildlife. This is an acute problem in the agricultural Midwest, where tons of nitrogen fertilizer run off farms into the Mississippi River and down to the Gulf of Mexico, where it has created a dead zone the size of New Jersey. Making a single gallon of ethanol consumes four gallons of increasingly scarce fresh water. Farmers have also begun planting corn in arid lands that require irrigation. Furthermore, the fermentation of corn creates tons of waste water that contaminates fresh water resources.

The mandated use of ethanol in gasoline has made corn and oil substitutes for each other as automobile fuel for the first time in history. The result has been disastrous for food consumption worldwide. Before Congress passed the Energy Policy Act in 2005, corn sold for two dollars per bushel; two years later, its price rose to five dollars per bushel as production of ethanol consumed one-fourth of the total corn yield in the United States. With mandates to increase the ethanol content in gasoline, experts predict that ethanol will consume one-half of total corn production in the United States.

Ethanol has also increased prices for noncorn food. High corn prices prompted farmers to divert to corn production lands previously used for soybeans, wheat, rice, and other crops, thereby decreasing the supply and increasing the price of noncorn products. Farmers in 2007, for example, diverted 20 million acres from soybean to corn production. During the first six months of 2008, the prices of soybeans, wheat, and rice increased by 76, 54, and 104 percent, respectively. Ethanol has also increased the price of meat, poultry, dairy, and other animal products as farmers passed along the higher
costs of grains to feed their cattle, pigs, and chickens. A Purdue University study found that, in 2007, ethanol production had increased the cost of food prices in the United States by $15 billion, or about $130 per household.

Unfortunately, the mandated use of ethanol has increased world hunger. As the world's breadbasket, the United States is the leading producer and exporter of corn, accounting for 40 percent of the world's total corn production and over half of its total exports. World Bank president Robert Zoellick noted that demand for ethanol and other biofuels is a significant contributor to soaring food prices around the world. Lester Brown, president of the Earth Policy Institute, noted that "the world's breadbasket is fast becoming the U.S. fuel tank."2 To make the amount of ethanol needed to fill a sport utility vehicle's 25-gallon tank requires 450 pounds of corn, which would be sufficient to feed one person for a year.

The surge in grain prices has been especially devastating for the estimated 3.7 billion malnourished people in the world who subsist on grain diets. The World Bank estimated that, in 2001, 2.7 billion of these people lived on less than two dollars a day, 50 to 80 percent of which was spent on food. People in Yemen, for example, spend 25 percent of their income on bread, the price of which has recently doubled.

Ethanol-induced inflation in corn prices has been especially punishing in the 20 countries where people subsist on corn. Half of Mexico's 107 million people, for example, live substantially on corn tortillas, and 80 percent of the corn consumed in Mexico is imported from the United States. In 2006, the price of tortilla flour in Mexico doubled because the price of imported corn from the United States rose from $2.80 to $4.20 a bushel. Although Mexicans make tortillas from domestically grown white corn, the dramatic increase in U.S. import prices caused Mexican importers of U.S. yellow corn to switch to cheaper domestic white corn, thereby driving up the prices for tortilla corn. Ethanol's impact on food prices has caused riots in Mexico, Egypt, Haiti, Cameroon, Guinea, Indonesia, Mauritania, Morocco, the Philippines, Senegal, and Yemen.

The U.S. government's ethanol program illustrates how unintended consequences can arise from the sheer complexity of and inextricable interconnections in life. Everything about using ethanol as a gasoline substitute is complex: the agricultural impacts in growing the vast amount of corn needed to make it; the production impacts of making and shipping ethanol and disposing of its by-products; and the chemistry of ethanol pollution. The ethanol program also illustrates how subtle linkages across time, sector, and place can create unexpected results. The oil embargoes of the 1970s prompted the adoption of ethanol as an alternative fuel decades later. The
use of ethanol as fuel has affected the food sector by substantially increasing food prices, which has exacerbated hunger and political instability thousands of miles away in impoverished developing countries.

Since the 19th century, scientists have understood that the living world consists of complex systems of interdependent plants and animals. The extinction of one species in an ecosystem, for example, could cause the unexpected extinction of others. Charles Darwin used the following story in his 1859 book On the Origin of Species to illustrate the highly interconnected web of life:

Bumble-bees are almost indispensable to the fertilization of the heartsease [a flower] . . . as other bees cannot reach the nectar. . . . The number of bumblebees in any district depends in great measure on the number of field-mice, which destroy their combs and nests. . . . Now the number of mice is largely dependent, as everyone knows, on the number of cats. . . . Hence . . . the presence of feline animals in large numbers might determine . . . the frequency of certain flowers in that district!3
By the early 20th century, social scientists began to envision societies as complex systems consisting of webs of human interaction via family, friends, employers, associations, religions, political parties, government organizations, and numerous other organizations and institutions.

U.S. sociologist Stanley Milgram conducted a famous study on human interconnectivity in 1967. He sent 160 packages to randomly selected people in Omaha, Nebraska, asking them to forward their package to an acquaintance who would be best suited to deliver it to a named stockbroker in Boston, Massachusetts. Milgram found that it took on average six handoffs from acquaintance to acquaintance to get the packages delivered to the Bostonian, and so he concluded that people are separated from each other by only six contacts. Milgram published his findings in the popular Psychology Today article, "The Small World Problem."4 Although his findings have been heavily criticized, more extensive research around 2001 using the Internet has generally supported them. Milgram's conclusion is generally known as "six degrees of separation," a phrase coined by U.S. playwright John Guare, who wrote a play of that name on human interconnectedness.

Social complexity can be a major cause of unintended consequences. As the late physician and popular author Lewis Thomas cautioned, "You cannot meddle with one part of a complex system . . . without the almost certain risk of setting off disastrous events that you hadn't counted on in other remote parts."5 Robert Jervis, a political scientist from Columbia University, further argued that "interconnections can defeat purposive behavior. Not only can
actions call up counteractions, but multiple parties and stages permit many paths to unanticipated consequences. With so many forces responding to each other, unintended consequences abound and the direct path to a goal often takes quite a different direction.”6
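Milgram's six-handoff result is at least arithmetically plausible. The back-of-envelope sketch below is an editorial addition, not the author's: the acquaintance count is an invented round number, and the calculation ignores the heavy overlap among real social circles.

```python
# Back-of-envelope check on "six degrees of separation."
# Assumes 50 acquaintances per person (an invented figure) and,
# unrealistically, that no two acquaintance circles overlap.

ACQUAINTANCES = 50
WORLD_POPULATION = 6.7e9  # rough world population around the mid-2000s

reach = 1
for handoffs in range(1, 7):
    reach *= ACQUAINTANCES
    print(f"{handoffs} handoffs: about {reach:,.0f} people reachable")

print("six handoffs cover the world:", reach > WORLD_POPULATION)  # True
```

Because 50 to the sixth power is about 15.6 billion, six links suffice in this idealized picture; in real networks, overlapping circles waste many of those links, which is why Milgram's empirical six is more surprising than the naive arithmetic suggests.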
The Late Great Recession

Failure to regulate the U.S. financial system when it was rapidly becoming too complex to manage had the unintended consequence of helping convert a housing bubble that burst in 2007 into a huge recession. This was the worst economic downturn since the Great Depression. The stock market dropped by half, causing consumers to lose trillions of dollars. The U.S. economy shrank 6 percent in a single quarter, and unemployment reached double digits. With increasing interlinkages among the world's financial institutions and economies, the recession spread globally.

The fact that there is so little agreement on what ultimately caused the recession illustrates how complexity can defeat our best intentions. Boston Globe staff writer Drake Bennett lyrically described the complex linkages in the failing economy as "the dense, invisible lattice connecting house prices to insurance companies to job losses to car sales, the inscrutability of the financial instruments that helped to spread the poison, the sense that the rating agencies and regulatory bodies were overmatched by events, [and] the wild gyrations of the stock market."7 Bennett goes on to note that "an entire field of experts dedicated to studying the behavior of markets failed to anticipate what may prove to be the biggest economic collapse of our lifetime."8

Economists generally agree that the housing bubble led to a banking crisis, which triggered the recession. Bubbles are frequent economic events: the U.S. economy had just recovered from the dot-com bubble that burst in 2000, and real estate has been notorious for recurring bubbles. Escalating home prices spawned a speculative frenzy of housing development, subprime lending, securitization of mortgages, and international investment. The bubble burst in 2007 with precipitously declining prices that prompted borrowers to default when they owed the banks more money than their homes were worth.

This is where agreement on the causes of the recession generally ends, and there is heated debate over what actually caused the housing bubble. Some economists blame Federal Reserve chairman Alan Greenspan for keeping interest rates too low, thereby flooding the market with easy money. Others blame a surge in global savings from China and oil-rich countries that poured too much money into the housing market. Right-leaning pundits,
economists, and politicians blame the housing bubble on the Community Reinvestment Act (CRA) of 1977, passed during President Carter's term and amended under President Clinton's in 1995 to encourage banks to offer mortgages to low-income urbanites. They see the CRA as ill-conceived social engineering that backfired with the massive mortgage defaults by low-income people who should never have been given loans in the first place. "This law," according to the Wall Street Journal, "compels banks to make loans to poor borrowers who often cannot repay them."9

Conversely, other economists argue that it was not possible for the 1977 CRA to have caused the housing bubble 30 years later, because mortgage defaults would have spiked much earlier. Federal Reserve economists Neil Bhutta and Glenn B. Canner conducted a study of mortgage defaults and originations and found that only 6 percent of the problematic subprime loans were originated by banks under the CRA regulation and that many more came from mortgage and finance companies and credit unions exempt from CRA regulation. They also found that mortgages in low-income, urban neighborhoods—the focus of the CRA—performed as well as those from higher-income neighborhoods and concluded that "the available evidence seems to run counter to the contention that the CRA contributed in any substantive way to the current mortgage crisis."10

Further complicating matters was a study released on May 5, 2010, by researchers Edward Glaeser and Joshua Gottlieb from Harvard University and Joseph Gyourko from the University of Pennsylvania, which found that neither low interest rates nor lax lending standards was a major cause of the housing bubble. Their study concluded that only 10 percent of the 70 percent increase in housing prices during the bubble could be attributed to low interest rates and that credit standards had not appreciably changed before, during, or after the housing bubble. The researchers, however, did not identify the actual cause of the housing bubble and stated that their study "aims to show that there are no easy answers to what went wrong and refute the easy but comforting notion that we understand what happened."11

Explanations of how the housing bubble caused the near collapse of the financial system are even murkier. Root causes variously include complex financial products, deregulation, excessive leverage, off–balance sheet lending, easy money, fraud, failed risk management models, "shadow banks," and feckless financial executives with risk-promoting bonuses and too-big-to-fail mentalities. Even in hindsight, there seem to be as many explanations for the recession as there are economists proffering them.

The complexity of the U.S. and global financial systems had escalated beyond control, especially with the creation of derivatives, which are bonds
derived from packaging pools of traditional assets to be sold to investors as securities. The mortgage-backed security is a derivative created by pooling mortgages originated by banks and other lending institutions and securitizing them for sale to investors. These derivatives transformed mortgage lending from a system in which local banks invested their own deposits in mortgages that they retained on their books to one where banks sell off the mortgages they originate to investment banks, which then securitize them and sell them to investors worldwide. Mortgage products had also become more complex than the traditional 30-year, fixed-rate loan, with adjustable rates, zero-money-down terms, and other default-promoting features.

Furthermore, investors seeking higher returns on investment in mortgage-backed securities prompted investment bankers to add high-yielding subprime mortgages to their packages, making it difficult to determine the risk of the mortgage-backed securities they sold. Wall Street created more sophisticated "opaque securities," including the credit default swaps that insured against default risk and were sold along with mortgage-backed securities and other derivatives to make them more attractive. The selling of swaps guaranteeing billions of dollars of subprime mortgages caused the failure of AIG, an otherwise creditworthy and profitable insurance company. In his 2002 letter to Berkshire Hathaway investors, Warren Buffett called derivatives "time bombs, both for the people who deal with them and the economic system."12 The complexity of these derivatives and other exotic securities caused rating agencies to underestimate their risks until their prices plunged.

While the U.S. financial system was becoming immensely complex, Federal Reserve chairman Alan Greenspan and other financial regulators promoted deregulation and lax oversight, believing in the soundness of the free market. Nobel Prize–winning economists Paul Krugman and Joseph Stiglitz blame the recession on the Gramm-Leach-Bliley Act of 1999, which repealed the Glass-Steagall Act, passed in 1933 to prohibit commercial banks from engaging in investment banking. Stiglitz noted that "the culture of investment banks was conveyed to commercial banks and everyone got involved in the high-risk gambling mentality. That mentality is the core to the problem we're facing now."13 In the depths of the financial crisis in December 2009, Senators John McCain and Maria Cantwell proposed reenacting Glass-Steagall.

In December 2000, Congress passed the Commodity Futures Modernization Act, which exempted derivatives and swaps from federal regulation. A heated debate had broken out two years earlier, when Brooksley Born, chair of the Commodity Futures Trading Commission, drafted a proposal to regulate
derivatives. As members of Bill Clinton's Presidential Working Group on Financial Markets, Alan Greenspan, Treasury secretary Robert Rubin, and Securities and Exchange Commission chairman Arthur Levitt issued a counterproposal to keep derivatives and swaps unregulated. Senators Phil Gramm and Richard Lugar and Representative Thomas Ewing then submitted the Commodity Futures Modernization Act to Congress as a rider to an 11,000-page appropriations act, and it passed with little discussion.

Federal Reserve chairman Ben Bernanke summed up the financial crisis as "financial innovation + inadequate regulation = recipe for disaster."14 Jeremy Grantham, cofounder of investment manager Grantham Mayo, noted in his quarterly newsletter that "the current financial system is too large and complicated for the ordinary people attempting to control it; and most members of Congress know that they hardly understand the financial system at all."15 While running for president, for example, Senator John McCain confessed that "the issue of economics is not something I've understood as well as I should."16

Managers of financial firms similarly had limited understanding of the risks they had assumed. Grantham observed that "many of the banks individually are both too big and so complicated that none of their own bosses clearly understand their own complexity and risk taking."17 Citigroup's former director and chairman, Robert Rubin, confessed in a November 2008 interview with Newsweek that "he did not even know what a CDO liquidity put was,"18 the derivative that cost his bank $25 billion in losses and necessitated a federal bailout. Not until AIG's near bankruptcy in September 2008 did senior management recognize the massive risks its small, London-based financial products division had created by selling credit default swaps guaranteeing billions of dollars of subprime mortgages.

The Federal Reserve, the Securities and Exchange Commission, and other financial regulators failed to anticipate or comprehend the emerging problems in the complex economy. Even leading economists question their understanding of the recession. Harvard economist David Laibson noted in 2008 that "everyone that I know in economics, and particularly in the worlds of academic finance and academic macroeconomics, is going back to the drawing board."19

Likewise, the Great Depression remains a calamitous event that historians and economists have yet to fully explain despite decades of scholarly research. Massachusetts Institute of Technology economist Peter Temin noted that, "given the magnitude and importance of this event, it is surprising how little we know about its causes." Some economists attributed the Great Depression to the 1919 Treaty of Versailles's punishing war reparations that made it difficult for Germany and Europe to recover from World War I. Other
economists claim that the stock market crash of 1929 led to the onset of the Great Depression. Still others believe that the U.S. economy might have recovered but for the destructive trade war that erupted after the U.S. Congress passed the Smoot-Hawley Tariff Act of 1930. Some monetarists, like Nobel Prize–winning economist Milton Friedman, claim that the Great Depression was caused by a contraction in the money supply, while Keynesian economists attribute it to a decline in consumer, business, and government spending.
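The earlier point about mortgage pools deserves a numerical illustration of why their risk was so easy to misjudge. The simulation below is an editorial addition, not the author's or any rating agency's model: the pool size, default probabilities, and the single shared "housing shock" are all invented, and real mortgage-backed securities were far more elaborate.

```python
import random

# Toy model of why pooled mortgages looked safer than they were.
# All parameters are invented for illustration.

POOL_SIZE = 1000   # mortgages per security
TRIALS = 2000      # simulated pools

def share_of_bad_pools(correlated, threshold=0.10):
    """Fraction of simulated pools in which more than `threshold`
    of the loans default."""
    bad_pools = 0
    for _ in range(TRIALS):
        if correlated:
            # One shared housing-market shock: in a bad year (1 in 10),
            # every loan's default probability jumps at once.
            p = 0.30 if random.random() < 0.10 else 0.022
        else:
            p = 0.05  # same ~5% average default rate, but independent
        defaults = sum(random.random() < p for _ in range(POOL_SIZE))
        if defaults > threshold * POOL_SIZE:
            bad_pools += 1
    return bad_pools / TRIALS

print("independent loans:", share_of_bad_pools(False))  # ~0.0: looks safe
print("correlated loans: ", share_of_bad_pools(True))   # ~0.1: tail risk
```

Both versions have the same average default rate of about 5 percent, so a model that checks only averages cannot tell them apart; only the correlated version can lose a tenth of the pool at once, which is roughly the blind spot behind the mispriced subprime-laden securities described above.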
Robbing Peter to Pay Paul

The U.S. government's attempt to protect its declining steel industry had the unintended result of damaging other industries. At the urging of steel industry lobbyists during the summer of 2001, President George W. Bush petitioned the International Trade Commission to impose tariffs on low-cost imported steel. In March 2002, the U.S. government set tariffs of up to 30 percent on 13 million tons of imported steel. The Bush administration hoped that the new tariffs would appeal to voters in Pennsylvania, West Virginia, Illinois, and other steel-producing states.

The steel tariffs quickly backfired both economically and politically. The tariffs immediately made construction firms, automobile manufacturers, appliance makers, and other manufacturers that used imported steel less competitive with foreign competitors. Jobs saved in the steel industry were lost in many other sectors of the U.S. economy. A study conducted by the Consuming Industries Trade Action Coalition found that the higher steel prices caused by the tariffs cost 200,000 U.S. jobs, which was more than the total employment in the U.S. steel industry.

Foreign governments quickly retaliated with their own trade protections. Within weeks of enactment of the steel tariffs, the European Union threatened to impose tariffs on important imports from the United States—Harley-Davidson motorcycles and orange juice, for example—while Brazil, China, Korea, and other countries threatened to impose tariffs on U.S. cars, airplanes, computers, and other products that accounted for $200 billion in U.S. exports and 2.5 million U.S. jobs.

The tariffs also encouraged consumers of steel products to seek cheaper materials with more stable prices. Lester Trilla, president of Trilla Steel Drum Corp., testified before the House Small Business Committee on how the tariffs posed a threat to his business, which makes steel drums to store waste:

The price of the domestic steel that we now must buy has increased by over 54 percent since the imposition of the steel tariffs. Our customers are starting to look for lower-cost plastic and bulk packaging. More significantly, if this
situation continues for any length of time, some of our larger customers will choose to fill drums overseas. This would dramatically reduce production and jobs at Trilla and other American drum manufacturers.20
The steel tariff backfired economically because it damaged the broader U.S. economy. Efforts to save the U.S. steel industry ironically caused its market to shrink. The tariff backfired politically because it threatened to eliminate more jobs across many U.S. industries than it saved among U.S. steel producers, enraging voters throughout the country. Domestic and foreign political pressures and a November 2003 ruling by the World Trade Organization that the steel tariffs violated international treaties eventually forced President Bush to repeal them in December 2003.

President Bush's steel tariff shows how the complex social interconnectivity among people and organizations can cause plans to go astray. No act is ever performed in isolation. Every act sends ripples of change throughout innumerable pathways in society, making it difficult to anticipate the many possible outcomes. Efforts to protect one industry invariably travel along the immense web of international trade with the unintended consequence of harming others.
Connections across Time

Sometimes deeds have unintended impacts that occur much later. The U.S. Supreme Court legalized abortion in January 1973 to give women the choice to avoid harm and the "distress, for all concerned, associated with the unwanted child."21 Nevertheless, John Donohue III from Stanford Law School and economist Steven Levitt from the University of Chicago theorized that the legalization of abortion likely caused a dramatic drop in crime two decades later. In their compelling and controversial article "The Impact of Legalized Abortion on Crime,"22 the authors claim to be neither for nor against abortion.

The authors begin with the observation that there are no probable explanations for the dramatic decrease in crime from 1991 to 2000, when murder rates dropped 40 percent and property crime and violent crime plummeted 30 percent. Neither increased policing nor improved economic conditions, for example, can account for these extraordinary statistics. There had, however, been a dramatic increase in the number of abortions 18 years earlier, when they became legal in 1973. In that year, the Supreme Court ruled that it was unconstitutional for states to outlaw abortions. Thereafter, the number
of documented abortions rose from 750,000 in 1973 to 1.6 million in 1980, which was about one abortion for every three live births.

The authors theorized that the drop in crime 18 years after the legalization of abortion could be explained by the fact that most crimes are committed by people who are between 18 and 24 years of age. This means that the surge in abortions in the early 1970s would decrease the number of people in their peak crime ages 18 to 24 years later. The authors offered a deeper analysis by noting that abortions reduce the number of unwanted children. They cited a number of studies indicating that teenage, unmarried, and economically poor women are the ones most likely to seek abortion and are also the ones most likely to give birth to unwanted children. In turn, these children may be neglected or provided with inadequate parenting and, thus, may be more prone to commit crimes when they reach their peak crime years. Fifty-seven percent of prisoners, for example, grew up with one or no parents, versus 27 percent of the general population. The authors further observed that crime rates had dropped a few years earlier in the five states that had legalized abortion a few years before the Supreme Court decision. Moreover, the states with the highest rates of abortion had the most substantial drops in crime years later. The theorized impact of abortions on crime committed some 20 years later illustrates how time lags in causality add to life's complexity.

Another case involving substantial time lags is how the decision in the early 19th century to use wagons as the first railroad cars inadvertently led to the current use of inefficient railroad gauges in the United States. The U.S. Railroad Gauge Standard specifies that all U.S. rails must be spaced four feet, eight and a half inches apart. This standard gauge is too narrow for high-speed travel; wider gauges of six feet or more would provide greater stability. We are stuck with the 4/8.5 gauge today, because changing to a broader-gauge track would require rebuilding the entire U.S. railway system, which would be prohibitively expensive.

Decisions made many decades ago regarding rail gauges have persisted to this day. By the time of the Civil War, major U.S. railroads employed seven different rail gauges, ranging from four feet, three inches to six feet. The use of multiple track gauges seriously impeded the development of a national rail system, because goods had to be loaded and unloaded to and from trains using different track gauges. In the 1860s, the U.S. government decided to pick a single gauge for all U.S. railways, prompting a "battle of the gauges" among railroad companies. Although President Lincoln favored the use of broader gauges, the U.S. government ultimately picked the 4/8.5 gauge, because 53 percent of the total track mileage in the United States at the time
used the narrower gauge. Changing to a different gauge would have involved the re-laying of thousands of miles of tracks and the widening of tunnels and ledges through mountain passes.

The 4/8.5 gauge became popular by 1860 because the U.S. railway industry had imported British-made locomotives, which were designed to run on Britain's standard 4/8.5 gauge track. In the 1840s, Britain had its own battle of the gauges that pitted the 4/8.5 gauge against another track system that used a gauge of just over seven feet. Initially, Britain's Parliament selected the seven-foot gauge, but the 4/8.5 gauge ultimately won, because it had the largest share of tracks. In 1846, Britain passed the Gauge Act, requiring all rail construction to conform to the narrower 4/8.5 gauge.

The reason that the 4/8.5 gauge had become the standard in the early days of British railroads dates back to around 1560, when horse-drawn wagons were used to haul coal from mines on wooden tracks. The use of tracks reduced friction and enabled horses to pull much heavier loads. In 1813, the owners of the British Killingworth coal mine asked engineer George Stephenson to develop a steam locomotive to haul coal. Stephenson determined that iron tracks would be needed to support his heavy locomotive, and in laying the new iron rails, he maintained the same gauge as the original wooden rails, which were spaced four feet, eight inches apart. In 1821, the British Stockton & Darlington Railway company appointed Stephenson to build a railway to transport coal from the mines to the British coast, and, in 1825, he opened the first public steam-powered railway system. In laying the tracks, Stephenson used the same rail gauge that he had previously used for the Killingworth mine railway, adding a half-inch to reduce friction.

Ironically, the U.S. standard railway gauge of 4/8.5 is based on the wheel spacing of horse-drawn wagons in 16th-century Britain, the unintended consequence of which has been to make modern high-speed trains less stable than if a broader gauge had been chosen.
Paved with Good Intentions

Sometimes our deeds have unintended impacts in faraway places. Few Americans, for example, are aware that donating their used clothing to charitable organizations like Goodwill contributes to impoverishment in sub-Saharan Africa. Americans donate two and a half billion pounds of used clothing each year to places like Goodwill Industries and the Salvation Army, an amount that greatly exceeds what these charitable organizations can sell locally. These charitable organizations sell the bulk of the donated clothing to
wholesalers for a few pennies per pound, who in turn pack the used clothes in bales and ship them to Africa. Approximately 80 percent of all clothing donated to charitable organizations gets exported to Africa; in fact, about 16 percent of all goods shipped to Africa are used clothing. African merchants purchase the bales for about $180 and resell individual clothing items for a few dollars, which is cheaper than the cost to produce such an item by local textile manufacturers. The used clothing tends to be of higher quality than what can be produced locally and is often valued by Africans as fashionably appealing.

Although Africans have become more fashionable by Western standards, this has come at the expense of their local textile industries and overall economic development. The decline of the textile industry in Zambia is a good example. A small landlocked country of about 10 million people, Zambia gained its independence in 1964 after being a British colony for 75 years. It began building its economy on the mining of copper and other minerals and on a thriving textile industry. Shielded from foreign competition by import tariffs, Zambia's textile industry by 1991 grew to 140 companies that employed about 34,000 people.

When copper prices plunged in the early 1990s, Zambia sought loans from the World Bank, which, in granting them, encouraged the country to adopt a free-market economy. Zambia, in response, repealed its import taxes on textiles in 1992. Almost immediately, importers flooded the country with used clothing from the United States. Zambians believed that the used clothing came from deceased Americans and called them "dyed in America." Howard Gatchyell, chairman of the Chamber of Commerce in Ndola, Zambia, noted that "you can walk for miles at a time here and not see anyone wearing anything remotely resembling African clothing."23

The popularity of imported used clothing caused the demise of Zambia's textile industry. By 2000, only 8 of the original 140 textile firms remained, causing a drop in textile employment from 34,000 to 4,000 jobs. The World Bank subsequently called the collapse of Zambia's textile industry an "unintended and regrettable consequence of the free-market policies promoted by the organization."24

Some economists specializing in developing countries claim that the importation of cheap used clothing from the United States and other wealthy countries has greatly impeded Africa's overall economic progress. Textile industries are one of the first steps toward industrial development. Basic textile manufacturing requires little capital or technology, and, because it is labor intensive, it is a big employer. As a percent of Africa's total manufacturing
output, textile production declined from 3.1 percent in 1970 to 1.6 percent in the 1990s. University of Toronto economist Garth Frazer states, "African progress has been pre-empted for thirty years by the donations of used clothing to charitable thrift shops throughout the industrialized world."25

Another instance of actions in one part of the world unintentionally affecting other parts of the world thousands of miles away is how the development of the drug Viagra by U.S. pharmaceutical firm Pfizer in 1998 has helped save endangered species from around the world whose body parts are traditionally used to enhance virility in Asia. Approximately 150 million men in the world suffer from erectile dysfunction (ED), a figure that is expected to double in the next 15 years due to an aging population. Asian men have traditionally sought to cure ED by consuming potions made from animals, including seal and deer penises, deer antler velvet, green turtle eggs, sea cucumbers, geckos, and sea horses. Some of these animals, like the green sea turtle, are endangered, and their consumption as part of traditional Chinese medicine (TCM) threatens their dwindling populations.

Frank von Hippel, a biologist from the University of Alaska, and his brother William, a psychologist at the University of New South Wales in Australia, have contended that the introduction of Viagra should eventually supplant TCM and help save endangered species, because the drug is highly effective with immediate results and is cheaper than traditional drugs made from animals. Because black market trade involving endangered species was too difficult to study, the two researchers focused on Viagra's impact on the harvesting of harp seals, whose penises are used to make medicinal potions for ED in Asia. In a series of articles beginning in 1998, they cited as evidence the fact that the drop in the number of harp seals harvested in Canada coincided with the introduction of Viagra: sealers harvested 244,000 (the maximum allowable) harp seals in 1999 and about 92,000 in 2000. The von Hippels also reference the sharp drop in the market for harp seal penises from 40,000 organs sold in 1996 to 20,000 in 1998 and the decline in the price for harp seal penises from $100 in 1996 to $15 in 1999.

Critics have challenged the von Hippels' theory, citing the lack of direct evidence that Viagra was supplanting TCM in Asia. They note that the harvest of harp seals rebounded in 2001 and that the previous decline in the harvest was more likely due to the economic downturn in Asia at the time. They further argue that Asians were skeptical of Western medicine, especially drugs such as Viagra that had occasional side effects. The critics cite as an example the fact that the introduction of aspirin had not curtailed the use of ground tiger bones in Asia to relieve pain.
In 2004, the von Hippels sought to address these challenges by surveying 256 men over 50 years of age at a Hong Kong clinic to determine the use of Viagra versus TCM. They found that the men seeking a cure for ED were more likely to be using Western drugs like Viagra and that two-thirds of previous TCM users for ED had switched to Western drugs, with only a third continuing to rely on TCM. Furthermore, none of the men who had switched to Western drugs ever switched back to TCM. The development of Viagra in the United States seems to be reducing the use of TCM in Asia and thereby reducing the slaughter of harp seals in Canada—at least for their penises. The von Hippels found that Asians still preferred TCM for other disorders.

A third example of how actions in one part of the world have unintended remote impacts is NASA's finding in 2009 that the Clean Air Act's success in dramatically reducing airborne sulfate emissions in the United States may have significantly contributed to the warming of the Arctic. Originally passed in 1963, the Clean Air Act was amended in 1970 to reduce airborne contaminants that were known to be harmful to humans. It was amended again in 1990 with the intent to reduce acid rain by cutting emissions of pollutants like sulfur dioxide and nitrogen oxides. The act has successfully reduced sulfate emissions by 50 percent in the United States over the past three decades, dramatically improving air quality and public health. This result prompted the Economist magazine to hail the Clean Air Act as "probably the greatest green success story of the past decade."26 Europe has passed similar air quality laws and has experienced a 60 percent drop in sulfate concentrations in its atmosphere.

Although global warming is usually associated with carbon dioxide and other greenhouse gases, NASA scientist Drew Shindell reported in April 2009 that his research indicated that changes in tiny airborne particles called aerosols may "account for 45 percent or more of the warming that has occurred in the Arctic during the last three decades."27 There are two types of aerosols that have been known to affect climate change: carbon black and sulfates. The unclean industrial burning of diesel oil, primarily in Asia, generates carbon black, which absorbs solar radiation and has a warming effect on the climate. Conversely, sulfur dioxide and other sulfates come from power plants burning coal and oil and have a cooling effect because they block solar radiation from reaching the Earth's surface.

Europe and the United States have dramatically reduced these cooling sulfates in the middle and upper latitudes of the Northern Hemisphere, including the nearby Arctic (Antarctica, being remote from Europe and the United States, is much less affected). Shindell's model indicated that
sulfates and other aerosols had a huge impact on global warming. Moreover, he discovered that the regions of the Earth that were most sensitive to changes in aerosols were those like the Arctic, which had in fact witnessed the greatest increases in temperature. NASA claims that, "since the 1890s, surface temperatures have risen faster in the Arctic than in other regions of the world. . . . Clean air regulations passed in the 1970s, for example, have likely accelerated warming by diminishing the cooling effect of sulfates." Fast Company journalist Anya Kamenetz said that NASA's research showing that the Clean Air Act is a major contributor to global warming "must rank as the mother of all unintended consequences."28

Research reported in 2008 by Europe's Institute for Atmospheric and Climate Science supports NASA's findings. The institute's researchers sampled aerosol concentrations from six northern European locations from 1986 to 2005 and found that their concentrations had plummeted 60 percent during the 19 years. They also found that solar radiation reaching Earth's surface during the same period had increased by about 9 percent. The evidence suggests that the reduction in sulfates and other aerosols in Europe was permitting more sunlight to penetrate the atmosphere. The study's coauthor Rolf Philipona reported, "The decrease in aerosols probably accounts for at least half of the warming over Europe in the last thirty years."29

The complexity of global warming and the other cases in this section poignantly illustrate how our actions can have unexpected impacts thousands of miles away.
6 ÷ 2 = 3 Degrees of Separation

A discussion on how the complexity of human society can cause unintended consequences would be incomplete without the following point: our social world is rapidly getting much more complex. Recent advances in communications technology and social media surely have made humans globally more interconnected. The Internet makes it easy to reach people around the world, a feat that would have been unthinkable not too many years ago. Social media like Facebook and Twitter promote networks of people who frequently stay in touch with each other.

In fact, British mobile phone carrier O2 issued a press release in August 2008 titled "The Six Degrees of Separation Is Now Three," claiming that, due to new communications technologies, "the conventional notion of six degrees of separation is out of date."30 O2 commissioned a study to repeat sociologist Stanley Milgram's 1967 experiment by asking participants to make contact with randomly selected unknown persons from around the world using their personal connections. The O2 press release claimed that its research showed
that it now only took three steps to reach an unknown person. The O2 study, however, differed from Milgram's in that it studied degrees of separation within a "shared interest network" such as clubs, hobbies, and religion, which would be expected to produce a lower number than six. Nevertheless, the O2 study included interviews with people from a wide age range that showed that respondents now felt much more interconnected using the Internet, cell phones, texting, and social media than even five years ago.

Greater interconnectivity means more complexity and an increased likelihood that our deeds will have unexpected outcomes. The impacts of our acts will cascade faster, farther, and more broadly through networks of people than they would have in the past. This growing complex interconnectivity among people, organizations, and institutions will increasingly cloud our ability to anticipate the multiple outcomes of our deeds, some of which will be unexpected and undesired. The complex interconnectivity among people sets the foundation for the more complicated social mechanisms, addressed in the following chapters, that give rise to unintended consequences. The next chapter, for example, shows how human interconnections can lead to chain reactions of fear, hysteria, and other forms of herd behavior that can unintentionally be triggered by small events.
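Before leaving the O2 claim, a rough back-of-the-envelope calculation — an illustration added here, not a figure from the O2 study — shows just how much connectivity "three degrees" implies. Suppose each person has k acquaintances and, simplifying greatly, that no two people's circles of contacts overlap. Reaching all N people on Earth within d steps then requires

$$k^d \geq N, \quad \text{that is,} \quad k \geq N^{1/d}.$$

With N of roughly 6.7 billion, six degrees requires only about 44 acquaintances per person, whereas three degrees requires about 1,900. Whether or not the O2 figure holds up, the arithmetic makes the larger point: a world connected in three steps is a vastly denser web than one connected in six, and denser webs let small events cascade into large, unintended outcomes.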
Chapter 3
The Domino Effect

Thousands of Rapes

In December 2003, Ilyas Kuncak detonated a deadly car bomb in downtown Istanbul, killing himself and 12 others and destroying the Turkish headquarters of the HSBC bank. Kuncak's son Nurullah reported that his father committed the terrorist act to avenge the thousands of rapes committed by U.S. soldiers in Iraq. Nurullah said: "Didn't you see, the American soldiers raped Iraqi women. My father talked to me about it. . . . Thousands of rapes are in the records. Can you imagine how many are still secret?"1 Influential Islamic journals had reported the rapes, and Turkish politicians and religious leaders castigated the United States for the incident.

The mass rapes, however, never happened. They were a false rumor unintentionally started by California sex therapist Susan Block. Block has a PhD in philosophy, calls herself a sex educator, has written a book called The Ten Commandments of Pleasure, hosts an erotic television program, and maintains a pornographic Web site. On April 15, 2003, Block posted an antiwar article titled "Smile and Enjoy It: The Rape of Iraq" on the popular Web site counterpunch.org under the seemingly authoritative name of "Dr. Susan M. Block." It made no mention of U.S. soldiers raping Iraqi women but drew metaphorical parallels—indeed, very racy ones—between the act of rape and the damage to Iraq inflicted by the U.S. invasion.

The leading Islamist journal, Yeni Safak, started the false rumor on October 22, 2003, when it published a front-page story about the rape of Iraqi women by U.S. soldiers. It stated, "In addition to the occupation and despoliation, thousands of Iraqi women are being raped by American soldiers. There are more than 4,000 rape events on the record."2 The author cited Block's article "The Rape of Iraq" as its source, falsely claiming that her article documented incidents of mass rapes.

The Yeni Safak article launched a rumor mill that rapidly spread until the stories about the rape of Iraqi women pervaded Turkish society as an established fact. Middle East columnist Charles Radin reported, "The allegations
can be heard almost everywhere in Turkey now, from farmers' wives eating in humble kebab shops, in influential journals, and from erudite political leaders: American troops have raped thousands of Iraqi women and young girls since ousting Dictator Saddam Hussein."3

The Iraqi rape story illustrates how a small deed can trigger a chain reaction of rumors that unintentionally caused a suicide terrorist attack and further diminished sympathy for the United States' war on terrorism. Reflecting on the incident, Block said, "I am a sex therapist and I use sexual terminology for political commentary. . . . I am appalled to be misquoted and even more appalled that the story inspired someone to such violence."4

The cascading fall of dominoes caused by tipping a single piece is an apt metaphor for social chain reactions. Social chain reactions work like epidemics. If, for illustration, one sick person infects three others per day, who in turn each infect three others, and so on, then the initial sick person would cause about 59,000 others to become infected 10 days later (a compounding process made explicit in the short calculation below). This happened between 1918 and 1919, when an influenza pandemic infected one-third of the world's population and killed an estimated 50 million people.

The spreading of rumors is a simple social chain reaction. The initiator of a rumor tells a few people, each of whom tells a few others, and so on, causing the rumor to spread rapidly through society like an epidemic. As Roman poet Virgil noted, "Rumor flies." The instigation of rumors gives rise to unexpected outcomes, especially given that rumors distort messages as they get passed along.

Faced with the uncertainties of combat, crowded conditions, and limited information, military organizations are breeding grounds for rumors. One soldier catching a tidbit of sensitive information could set off a chain reaction of gossip that might ultimately aid the enemy; ergo, the World War II slogan, "Loose lips sink ships." George Washington understood the perils of rumor and shared his battle plans with only a few trusted officers. Even as his troops mobilized, they had no advance warning before they evacuated Long Island, New York, to escape from the British or before they crossed the Delaware River to attack the German mercenaries in Trenton, New Jersey.

Another instance of a rumor with huge consequences began with a fictitious letter written in 1831 that unintentionally triggered a massive migration to the U.S. West. A Methodist journal featured an apocryphal letter from a Native American in the Oregon Territory stating his desire to convert to Christianity. The letter prompted a number of Christian missionaries to head west to convert the pagan tribes. Although the Indians resisted conversion, the missionaries wrote to friends back home extolling Oregon's beauty, inducing thousands of Easterners to set forth on the Oregon Trail.
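To make the epidemic arithmetic above explicit — a minimal sketch that assumes each newly infected person infects exactly three others the next day and that no one is infected twice — the number of new infections on day d is simply 3 raised to the power d:

$$3^{10} = 59{,}049 \approx 59{,}000.$$

More generally, if each carrier passes a disease or a rumor along to R new people per round, the new cases after d rounds grow as R to the power d, which is why a chain reaction that starts with a single person can engulf a whole population within days.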
Instigating rumors is an age-old weapon for ruining political opponents' reputations. Rumors that John Adams was a monarchist helped Thomas Jefferson defeat him in the United States' fourth presidential election. Some historians have claimed, as did Adams, that Jefferson started the rumor. Rumors helped George W. Bush win two presidential elections. Rumors that Senator John McCain had fathered an illegitimate African American child helped Bush defeat McCain in the 2000 Republican nomination. Rumors that Senator John Kerry had exaggerated his military achievements during the Vietnam War helped Bush defeat him in the 2004 presidential election.

The advent of the Internet makes social chain reactions all the more powerful, as illustrated by the experience of Massachusetts Institute of Technology graduate student Jonah Peretti, who unwittingly became a celebrity in the global labor movement. In January 2001, Peretti decided to order a pair of running shoes from Nike's Web site and was intrigued by the option to print a personal message on the shoes. Seeking to protest Nike's use of foreign labor, Peretti requested that Nike imprint his shoes with the word sweatshop. Nike sent back an e-mail refusing Peretti's request, saying, "Your personal ID contains profanity or inappropriate slang."5 Peretti countered that sweatshop was neither profane nor slang and that he chose the word so he could "remember the toil and labor of the children who made my shoes."6

As a lark, Peretti forwarded a copy of his e-mail exchanges with Nike to his mother and a few friends, who, unbeknownst to him, forwarded it on to their contacts, who in turn forwarded it on to their contacts, until the e-mails reached labor activists. Soon, Peretti began receiving 20 to 30 e-mails per day. Liberal publications like the Village Voice began publishing Peretti's e-mail exchange with Nike, which induced mainstream publications like the Wall Street Journal and Business Week to cover the story. Within weeks, Peretti's one-man protest mushroomed to the point where he began receiving 500 e-mails each day from teachers, unions, labor activists, and church groups around the world, including everything from hostile notes to marriage proposals.
Herd Mentality

Speculative manias, panics, crazes, riots, witch hunts, and other forms of mass hysteria are chain reactions that occur when the behavior of one or a few persons catches on and rapidly spreads throughout society. Sociologists call these tumultuous events collective behaviors, which they define as "spontaneous, unorganized, and unpredictable social behaviors."7 Such
behaviors can be triggered by small events and can create unintended outcomes. People behave differently in crowds. Crowds distort reasoning, loosen inhibitions, and intensify emotions. During a lynching, for example, otherwise peaceful people commit barbaric deeds. Over 100 years ago, French scientist and philosopher Gustave Le Bon observed, "Crowds exert a profound and inherently negative influence on people."8 He also noted, "Behavior of people in crowds where individual behavior deviates from normal, tends towards unpredictable and potentially explosive behavior."9

Mass panic, for example, occurred in August 2005, when thousands of Iraqis were crossing a bridge to a sacred Shiite shrine in Baghdad. Terrorist suicide bombers had recently attacked such large gatherings of Shiites, and the crowd was tense. Someone in the crowd started a rumor that a suicide bomber was in their midst, which created a stampede that killed more than 960 people, who were crushed to death or drowned in the Tigris River when a railing gave way. Controversy remains over whether the rumor was instigated to induce mass hysteria.

One of the most famous instances of unintended mass panic occurred on the night before Halloween in 1938, when actor Orson Welles broadcast on CBS radio a dramatization of The War of the Worlds based on H. G. Wells's popular novel. He began his program by pretending to interrupt a popular CBS music program with pressing news about a Martian invasion, which listeners mistook for a real crisis. Welles crafted the radio program to sound exactly like a live news broadcast, replete with fake news flashes such as "A huge flaming object has just dropped in Grover's Mill, New Jersey."10 Thousands of terrified Americans fled their homes to seek safety from invading Martians, while doctors and nurses called the police offering to care for the injured.

A similar dramatization of The War of the Worlds produced by a Chilean radio station in 1944 drove people into the streets of Santiago, prompting a provincial governor to mobilize troops to fight the invading Martians. The same thing happened in 1949 in Quito, Ecuador, where a radio dramatization of The War of the Worlds caused tens of thousands of panicked citizens to run down the streets in their nightclothes.

The Martian attack hoax shows that "you need only fool a relatively small portion of people for a short period to create large-scale disruptions to society,"11 noted psychologist Robert Bartholomew. The ease with which popular media can cause mass panic has, in fact, prompted the U.S. Federal Communications Commission to fine media broadcasters $250,000 for knowingly broadcasting erroneous information.
In 1974, another mass panic broke out in Nebraska and South Dakota involving the mysterious mutilation of cattle. A rancher claimed that some of his cattle had been killed, partially dismembered, drained of blood, and had their sex organs, tongues, and ears removed. Within a few months, local authorities received nearly 100 reports of cattle mutilation caused by a hairy, humanlike beast. Inspections of the dead cattle later revealed that they had died of natural causes and that scavengers feasting on the carcasses had caused them to appear mutilated. Again, this incident shows that one individual can put into motion an unfounded rumor that affects many others.

Retreat during combat is another form of mass panic whereby a single retreating soldier can unintentionally foment a chain reaction of fear, causing an entire army to flee. Retreats become routs as fleeing soldiers become highly vulnerable to attack from behind and enemies become emboldened by the chase. "Everyone rates the enemy's bravery lower once his back is turned and takes much greater risks in pursuit than while being pursued," noted Prussian military philosopher Carl von Clausewitz.12 To avoid such devastating routs, Roman generals condemned 1 out of every 10 soldiers in divisions with one or more retreating soldiers to death by beating, stoning, or flogging.

Stock market crashes, bank runs, and other panics are chain reactions of fear of financial ruin that can be unintentionally triggered by small events, as in the infamous case of the Babson Break. The sponsors of the Annual National Business Conference held on September 5, 1929, had no idea that its guest speaker, investment advisor Roger Babson, would trigger the crash of 1929, one of the worst stock market crashes in U.S. history. Babson gave his perennial "the crash is coming" speech at the conference's luncheon, which the audience ignored because he had cried wolf one too many times. Unfortunately, however, Babson's speech was broadcast live over the Dow Jones financial news tape universally used by Wall Street brokers, and it started a selling frenzy that drove the Dow Jones Industrial Average down 3 percent by the end of the day.

Although the market recovered over the following few days, the Babson Break set the stage for more serious price declines during October 1929, which were fueled by panic and margin calls. By the end of 1929, the Dow Jones Industrial Average dropped 35 percent, causing investors to lose $25 billion, or approximately $319 billion in 2008 dollars. Former millionaires went bankrupt and some committed suicide. Middle-class speculators had to sell their homes. "Nothing could have been more ingeniously designed to maximize the suffering, and also to insure that as few as possible
escaped the common misfortune," noted Harvard economist John Kenneth Galbraith.13

Bank runs are panics driven by rumors that a bank will fail and that depositors will lose their life's savings. The epidemic of fear can easily infect other banks—even healthy ones—which caused the demise of the Bank of United States. Founded in 1923, the Bank of United States grew rapidly by acquiring and merging with other banks and, by 1930, had 62 branches in New York City and 400,000 depositors, the largest number of depositors in the United States. Although the bank had successfully weathered the crash of 1929, rumors spread throughout New York City that the bank was financially troubled. The New York Times claimed that disgruntled investors had intentionally spread the false rumor to damage the bank—a rumor that had the unintended consequence of causing many other banks to fail.

The rumors induced 2,500 depositors on December 10, 1930, to wait in line for hours to withdraw their money from a small branch in the Bronx borough of New York City. This event attracted 20,000 onlookers. The next day, a bank run was in full swing at the Bank of United States' other branches, forcing New York's superintendent of banks to immediately close it. The failure of the Bank of United States was the biggest bank failure in U.S. history, and the financial panic it precipitated caused 325 other banks to fail by 1933. The fact that some depositors mistook the Bank of United States for the official bank of the U.S. government helped fuel the banking crisis.

Business executives also suffer from mass panics. News of a possible economic slump frequently causes executives to slash discretionary expenditures, freeze hiring, and lay off workers, with the unintended consequence of exacerbating or even causing a recession. As the year 2000 approached, a millennium computer panic spread among business executives who feared a failure of core systems that were not programmed to handle 21st-century dates. Despite the widespread fears, the turn of the century at midnight was a nonevent.

In the late 1990s, the overly hyped New Economy based on the Internet and e-commerce spread panic among business executives fearing that Internet-based dot-coms would quickly put them out of business. Proponents of e-commerce claimed that time in the New Economy would run 7 to 10 times faster than business as usual and that anyone who had not dominated a space with e-commerce by 1999 would be a goner by 2000. The New Economy panic caused executives to needlessly spend vast sums adapting their firms to e-commerce. By 2000, for example, 76 percent of major U.S. corporations had hired e-consultants to help them develop e-commerce
versions of their older business models. According to Fortune magazine in 2000, "Jack Welch mobilized 340,000 employees to make General Electric an e-company immediately. Suddenly thousands of organizations realized they were late to the party and needed to get there fast."14

Ironically, the first success stories in e-commerce were large, established firms that merely augmented their traditional businesses with Internet features, while the vast majority of the much-vaunted dot-coms vanished.
Speculative Mania and Bubbles

The rapid escalation of U.S. housing prices and their sudden collapse in 2007 was a calamitous event that thrust much of the world into the worst economic decline since the Great Depression. Housing prices in the United States had experienced modest increases before 1998; annual appreciation then rose to around 7 percent in 1998 and up to about 20 percent by 2004. Housing prices began to fall in 2007 and plunged 20 percent during each of the following two years, the largest decline in housing prices in U.S. history. The collapse of the U.S. real estate market in 2007 was what the Economist magazine called the "biggest bubble in history."15

Bubbles are periodic economic events that begin when a sudden increase in the price of an asset like real estate triggers a chain reaction of greed, infecting an ever-expanding web of speculators. Eventually, the bubble becomes poised to burst when the price of the asset far exceeds its true economic value, at which point the smallest of events can unleash a financial panic, causing prices to go into a free fall as speculators bail out at any price they can get.

The ultimate causes of the 2007 housing bubble are complex and include low interest rates, lax lending practices, and exotic mortgage instruments that lured home buyers with initial low payments and caused them to default when they had to make higher payments. An underlying theme, however, was that lenders and borrowers falsely assumed that buying a home was a good investment and that rapidly rising housing prices before 2005 would continue—a common feature of all bubbles.

At a June 2010 congressional hearing, investment guru Warren Buffett claimed that the surge in housing prices was the "greatest bubble I've seen in my life," and he too was fooled into thinking prices would continue to rise. "Look at me, I was wrong on it too."16 Federal Reserve chairman Ben Bernanke noted at a meeting in January 2010 that "Both lenders and borrowers became convinced that house prices would only go up. Borrowers chose, and were extended, mortgages that they could not be expected to service in the longer term. They were provided these loans on the expectation that accumulating home equity
would soon allow refinancing into more sustainable mortgages. For a time, rising house prices became a self-fulfilling prophecy."17

Over the past several hundred years, scores of speculative bubbles have involved precious metals, sugar, coffee, cotton, wheat, canals, mines, building sites, foreign exchange, and, of course, the stock market. Some economists claim that bubbles are hard to discern until they collapse and in hindsight seem to have been obvious. The dot-com bubble was especially bewildering and occurred just a decade before the housing bubble in 2007.

The media and the investment community unintentionally launched the dot-com bubble that thrust the U.S. economy into a recession after the bubble burst in 2000. A few entrepreneurs amassed a fortune overnight by creating Internet businesses like Amazon.com. In December 1998, Merrill Lynch predicted that Amazon.com's stock, then trading at $243 per share, would reach $400 in three weeks—despite the fact that Amazon.com's prospectus at the time said it would "continue to incur substantial losses for the foreseeable future." Almost like clockwork, Amazon.com's stock hit $400 a share, and its founder, Jeff Bezos, became Time magazine's Man of the Year.

Eager to get a piece of the Internet gold rush, venture capitalists provided start-up capital to just about anyone with an e-commerce business plan. Fund managers quickly snapped up dot-com shares when they went public, making dot-com founders instantly wealthy—at least on paper. (Initial public offerings frequently restricted founders from quickly selling their shares.)

A chain reaction of greed spawned the founding of hundreds of Internet firms like pets.com selling everything imaginable, from plants to pornography. Speculation in dot-com shares drove the NASDAQ stock index up 119 percent in little over a year—more than 10 times the long-term average annual return on the stock market. Amazon.com became worth $561 million. eToys became worth more than Mattel and Hasbro combined. Garden.com became worth $186 million, despite the fact that it only had $2.6 million in sales and was years from turning a profit.

During the spring of 2000, the whole world seethed with get-rich Internet schemes. Talented people abandoned high-paying, prestigious jobs with major corporations, law firms, consulting firms, and investment banks to join Internet companies in hopes of becoming very rich. Top consulting firms lost 40 percent of their staffs to dot-com startups, and leading law firms had to dramatically boost salaries to retain junior associates.

Burning through millions of investors' dollars had cachet. "It's not about profits" became the mantra, and many dot-coms lost money on every sale they made. Kozmo.com, for example, promised to deliver any item worth
60 cents or more free of charge within 30 minutes any day of the week, 24 hours a day. "We will deliver a pack of gum for free," boasted Joseph Park, who, at age 26, founded Kozmo in 1997.18 Typical of such dot-com ventures that seemed too good to be true, Kozmo failed in April 2001.

The dot-com craze was poised to burst in the spring of 2000, when investors lost confidence in the future profitability of these ventures, and rumors of their imminent demise triggered a global contagion of fear. Widespread panic erupted, causing the NASDAQ index to drop 53 percent in nine months, ushering in the Internet winter. (To put the plunge into perspective, the Dow Jones Industrial Average dropped 35 percent during the crash of 1929.) Speculators lost their investments, dot-com entrepreneurs lost their paper fortunes, and thousands lost their jobs.

The New Economy myth burst along with the dot-coms. "Companies are souring on the hot management ideas of the New Economy with unusual speed,"19 reported a Bain & Company study. A Boston Consulting Group survey showed that most dot-coms failed because they had poor business concepts, had no competitive advantage, and offered few consumer benefits—a damning indictment of the New Economy. For the thousands of would-be Internet tycoons who lost their jobs, B-to-B and B-to-C, which used to refer to business-to-business and business-to-consumer, came to mean back to banking and back to consulting. In the aftermath of the dot-com crash, investment manager David Schafer concluded that "The Internet bubble was a mania like we've never seen before."20

Schafer perhaps had never heard about tulipomania. In the 1630s, an eccentric, speculative mania erupted in Holland called tulipomania. The Dutch have been fond of tulips since they were introduced from Turkey in 1550. One hundred years later, merchants, hoping to get rich by purchasing large stocks of bulbs, unintentionally launched a speculative mania. Bulb prices surged with these large purchases, and rampant speculation gripped people of all professions and classes, including middle-class merchants, noblemen, washerwomen, seamen, mechanics, and chimney sweeps.

In 1634, speculators mortgaged their homes and sold their jewels, land, belongings, and businesses to invest in tulip bulbs. The speculative mania drove prices up to where a single bulb would sell for around 5,000 florins, which at the time would have bought 160 fat swine or 40,000 pounds of cheese. The speculative frenzy peaked in January 1637, when prices for tulip bulbs increased 20-fold. At the height of the speculative craze, people traded their entire farms for a single tulip bulb. Princeton economist Burton Malkiel called tulipomania "one of the most spectacular get-rich-quick binges in history."21
The tulip bubble burst in 1637 with a selling frenzy that drove bulb prices so low that they became worth as much as a single onion. The crash ruined many Dutch families and thrust the Dutch economy into a deep depression. According to historian Charles Mackay, "Many who, for a brief season, had emerged from the humbler walks of life, were cast back into their original obscurity. Substantial merchants were reduced almost to beggary, and many a representative of a noble line saw the fortunes of his house ruined beyond redemption."22

The 1849 gold rush in California was another spectacular bubble that had the unintended consequence of transforming the American West. Native Americans were the primary inhabitants of California when the United States acquired it from Mexico in 1848. The same year, mill builder James Marshall stumbled upon a few nuggets of gold, the news of which unleashed the California gold rush that drew 250,000 speculators to California. As a result, the population of Native Americans shrank 83 percent in eight years, from 150,000 in 1848 to 25,000 in 1856.
Fads

Fads are chain reactions of conformity. "Through fads, people can experience a sense of identity and unity with groups to which they aspire or belong," noted David Miller from Western Illinois University.23 Fads begin when a few influential people adopt a behavior, which induces others to copy it. Once a fad infects a critical mass of people, a chain reaction of copycat behavior erupts. Fads quickly peak and precipitously decline as they become "the mark of the boor," according to Miller.24

With its explosive and short-lived popularity, the Hula-hoop has become the icon of fads. Copied from a bamboo ring used in Australian gym classes, the Hula-hoop was a plastic hoop named after the Hawaiian hula dance and the centuries-old European practice of spinning hoops around one's waist called hooping. U.S. toy maker Wham-O introduced the Hula-hoop in the United States in 1958 and within two years sold 100 million of them—enough plastic to span the world five times. Incredibly, the widespread popularity of Hula-hoops abruptly ended in 1959.

Hula-hoops, of course, are just the tip of the iceberg. Fads are a common fixture of modern society, as is evident from the many fads shown in Table 3.1. Although most fads are harmless, ones involving management can have serious unintended consequences by wasting resources and diverting organizations' attention in pursuit of false hopes.
TABLE 3.1. Some 20th-Century Fads
• Beanie Babies (1990s)
• Bungee Balls (2003)
• Cabbage Patch dolls (1980s)
• Calculator wristwatches (1980s)
• Cargo pants (late 1990s)
• Celebrity air (2005)
• Chia Pets (1980s)
• Clackers (1980s)
• Coonskin caps (1950s)
• Designer jeans (1980s)
• Droodles (1950s)
• Energy drinks (2009)
• Fitness boot camps (2009)
• Flash mobs (2000s)
• Furby (early 2000s)
• Hula-hoop (1958)
• Ultra wide-leg jeans (1990s)
• Macarena (1996)
• Machinima (2003)
• Mood ring (1970s)
• Nerf (1990s)
• Numa Numa (2005)
• Office olympics (2004–2005)
• Pacifiers for teens (1990s)
• Pet rocks (1970s)
• Pokemon (1990s)
• Purse dogs (2009)
• Rubik's Cube (1980s)
• Scooby Doo (1980s and 2000)
• Slap bracelets (1980s)
• Livestrong yellow bands (2005)
• Tickle Me Elmo (1990s)
• Scooters (2000s)
• Yo-yo (1930s and 1990s)
Source: crazyfads.com.
Management fads can be launched by popular books written by esteemed corporate executives and influential management gurus. In his book Managing on the Edge, Richard Pascale found 27 short-lived management fads that corporate executives pursued during the 1980s and 1990s, observing that "ideas acquired with ease are discarded with ease. Fads ebb, flow—even change by 180 degrees."25 More recent surveys conducted by consulting firm Bain & Company, as shown in Table 3.2, indicate the continued allure of management fads today.

John Micklethwait and Adrian Wooldridge, authors of a book on management consultants, The Witch Doctors, state that management fads have "always appealed to thousands of people who want to get ahead; now [they] have tapped into the market of the millions who are scared of being left behind. Humble businessmen trying to keep up with the latest fashion often find that by the time they have implemented the new craze, it looks outdated."26

Efforts to improve the quality of products and services have been especially subject to management fads, including Quality Circles, Kaizen, Total Quality Management, Reengineering, ISO 9000, the Malcolm Baldridge Award, and Six Sigma along with its brand extension Lean Six Sigma. Each of these ideas followed the same fad dynamic of spreading rapidly with a chain reaction of popularity and abruptly ending when the fad became passé.
TABLE 3.2. The Faddish Nature of Management Tools

The Top 10 Management Tools in 1993:
• Mission and Vision Statements
• Customer Satisfaction
• Total Quality Management
• Competitor Profiling
• Benchmarking
• Pay-for-Performance
• Reengineering
• Strategic Alliances
• Cycle Time Reduction
• Self-Directed Teams

The Top 10 Management Tools in 2006:
• Strategic Planning
• Customer Relationship Management
• Customer Segmentation
• Core Competencies
• Outsourcing
• Knowledge Management
• Scenario and Contingency Planning
• Benchmarking*
• Mission and Vision Statements*
• Reengineering*

(* The three top-rated management tools in 1993 that remained among the top-rated tools in 2006.)
Source: Bain & Company
A Business Week study reported, for example, that 75 percent of the Quality Circles started in 1982 had stopped by 1986. By 2006, none of these once-popular quality improvement methods ranked among Bain's top 10 management tools. The persistent need for businesses to improve their products and services will likely foster many future management fads.
Protests and Witch Hunts

Cindy Sheehan's 24-year-old Army son was killed in Baghdad in April 2004. Although she and other military families had met with President Bush in June 2004, Sheehan claimed that the meeting was inadequate because she did not get a chance to share her thoughts on the Iraqi war. In protest, she created a makeshift roadside camp on August 6, 2005, near Crawford, Texas, where the president was beginning a five-week vacation. She vowed to remain there until the president met with her.

On August 12, hundreds of supporters visited her camp, and the Veterans for Peace installed nearly 1,000 white crosses along the road near Sheehan's camp, each bearing the name of a soldier killed in Iraq. On August 13, the Washington Post wrote a front-page story about Sheehan's protest, and nearly 1,500 people gathered for a peace rally at a park in Crawford, Texas. On August 17, protesters from around the United States held 1,600 candlelight vigils in support of Sheehan's antiwar cause, including one in front of the White House. Over the next two weeks, folk singer Joan Baez, country musician Steve Earle, actress Margo Kidder, and other celebrities visited Sheehan's camp.
By the end of August, Sheehan estimated that more than 12,000 people had visited her roadside camp. She began a bus tour to take her protest to the rest of the country. Three days after her bus tour arrived in Washington, DC, on September 21, nearly 300,000 protesters joined in an antiwar rally.

President George W. Bush's refusal to meet with Cindy Sheehan in early August 2005 had the unintended consequences of making Sheehan a famous antiwar figure and of igniting a chain reaction of sympathy in a mere six weeks that evolved into a large national protest against the U.S. occupation of Iraq. Sheehan's story shows how one person's plea can explode into widespread protest. It also shows that failure to stop an emerging chain reaction can cause events to get out of hand with unexpected and undesired results.

Like protests, witch hunts are best stopped before they get out of hand. They are chain reactions of fear and false accusation that rapidly infect society like an epidemic and then decline when people recognize the lunacy of the groundless claims. Historian Michael Shermer described the dynamics of witch hunts as follows:

The mere accusation of potential perpetrators makes them guilty. Denial of guilt is further "proof" of guilt. Once victimization becomes well known in a community, others suddenly appear with similar claims. The movement hits a critical peak of accusation where almost anyone can be a potential suspect and no one is above reproach. The pendulum swings the other way as the innocent begin to fight back against their accusers and skeptics demonstrate the falsity of the claims. Finally the movement begins to fade, the public loses interest, and proponents are shifted to the margins of belief.27
The iconic witch hunts in the United States occurred in 1692, when doctors in Salem, Massachusetts, concluded that several girls affecting strange behaviors had come under the influence of witches. A court of law condemned three women who were known as nonchurchgoing social outcasts. Further accusations of witchcraft had so rapidly spread through Salem and neighboring communities that, by the end of 1692, at least 185 people in the area had been condemned as witches. The witch hunt ended in May 1693 with the disbanding of the witch trial court and acquittal of the accused.

Although hunting for witches is a thing of the past in the United States, similar episodes of false accusation have continued to pervade modern society. After the attack on Pearl Harbor, for example, the U.S. government rounded up people of Japanese descent residing in the United States and put them in internment camps for the duration of World War II. Similarly, after
the Islamic terrorist attacks on September 11, 2001, the U.S. government rounded up tens of thousands of innocent Arab and Muslim Americans for interrogation and detained several thousand of them without charges or access to legal representation.

U.S. fears of communism began what was perhaps the most harmful witch hunt in the United States since the 17th century. In 1938, the U.S. Congress launched a Special Committee on Un-American Activities, proclaiming that Communists had massively infiltrated U.S. society as part of a Moscow-directed conspiracy. Throughout the 1940s and 1950s, the actions of this investigative body had all the trappings of a medieval witch hunt.

In 1947, the United States instituted a program that required government employees to affirm their loyalty to the federal government. By 1953, 13.5 million Americans had been subjected to loyalty checks; those who were deemed to be Communist sympathizers were fired. State and local governments joined in with their own loyalty programs, banned books from libraries, and repressed schoolteachers deemed to be subversive. Universities fired professors who refused to identify suspect colleagues. Hollywood blacklisted liberal entertainers. The U.S. government even persecuted civil rights leaders, believing them to be Communist sympathizers.

In 1950, the inquisition peaked with Senator Joseph McCarthy's farcical show trials, during which he claimed to have a list of 205 known Communists in the U.S. State Department. McCarthy distorted facts and falsified evidence to link an ever-growing list of innocent people to his putative Communist conspiracy. Typical of witch hunts, the persecution ended quickly when it became obvious it had gotten out of hand. In a 67–22 vote, the U.S. Senate in December 1954 condemned McCarthy for his abusive hearings.

The inquisition, like other witch hunts, had the unintended consequence of ruining many innocent people's lives. Ironically, it also backfired in that it discouraged the aggressive pursuit of actual Communist infiltrators, prompting some senators to joke that McCarthy had been the Soviet Union's most effective agent. Like witch hunts, massacres are driven by chain reactions of fear and hatred, but with far more devastating outcomes.
Massacres

On February 21, 2001, mobs of Dayaks from the hinterland surrounding the lumber town of Sampit in the province of Central Kalimantan, Indonesia, began massacring some 500 of their Madurese rivals, beheading 100 of them. Overwhelmed police stood by as Dayaks walked through the streets
carrying the severed heads of Madurese victims in a killing spree so violent that it revived the Dayaks' headhunting tradition, which had died out almost 100 years earlier. The massacre spread to nearby cities and, within a few months, drove out nearly all of the 100,000 surviving Madurese who lived in Central Kalimantan in 2001.

Central Kalimantan is on the island of Borneo, which Indonesia shares with Malaysia and Brunei. Central Kalimantan is the ancestral homeland of the indigenous Dayaks, traditionally a forest-dwelling people. In 2001, Dayaks accounted for the majority of the province's population despite decades of immigration from other parts of Indonesia. The Dayaks' headhunting was based on their belief that beheading enemies would deprive them of their spiritual lives after death. After centuries of Dutch colonial rule, the Dayaks had largely abandoned their headhunting tradition by the early 20th century.

The massacre in Sampit arose from the Dayaks' anger over being continually marginalized and displaced from their ancestral homeland. Indonesian president Suharto had given large tracts of land in Central Kalimantan to logging companies, which destroyed the Dayaks' forest habitat. Furthermore, Indonesia passed a law in 1979 to standardize local governments throughout the country. This legislation undermined the authority of village leaders and disrupted the residents' communal lives.

Particularly vexing was the process of transmigration imposed by the Dutch colonial administration in the 1930s to resettle people from densely populated parts of Indonesia into less crowded areas. President Suharto continued transmigration for two decades after World War II, with the resettling of about 2.5 million people throughout Indonesia. By 2000, 21 percent of Central Kalimantan residents had migrated from elsewhere in Indonesia, including about 100,000 Madurese from the crowded Indonesian island of Madura. Although the Madurese were a small minority in the Central Kalimantan province, they were the majority of the population of Sampit.

The increasingly marginalized Dayaks and immigrant Madurese developed a mutual hatred that spawned periodic violence in 1996 and 1997. The immigrant Madurese became prosperous as they took control of logging, mining, and plantation operations. They considered the Dayaks to be uncivilized, primitive people with a notorious tradition of headhunting. Conversely, the Dayaks considered the Madurese to be arrogant, aggressive, and untrustworthy.

On December 15, 2000, a Madurese stabbed a Dayak to death, prompting several hundred vigilante Dayaks to go on the rampage seeking the murderer and damaging Madurese homes and cars in the process. Further
isolated killings of Dayaks and Madurese occurred, and, on February 18, 2001, a band of Madurese demonstrated in Sampit declaring that they owned the town. In response, Dayaks launched their massacre of the Madurese and drove them from the region. The massacre in Sampit, Indonesia, illustrates how small events can unexpectedly ignite a chain reaction of widespread murder—sometimes even involving otherwise peaceful people.

In the aftermath of massacres, there may be considerable debate over whether they were premeditated or spontaneous. Declaring a massacre spontaneous is a convenient way for those who committed murder to become blameless. However, major massacres since the 20th century appear to have been planned. Mehmed Talaat, leader of the Young Turks in Turkey, triggered the massacre of 800,000 Armenians by declaring in January 1915 that they had to leave the country. Hitler and his Nazi party had a plan that culminated in the slaughtering of 6 million Jews during the World War II era. Pol Pot's vision of remaking Cambodia into a preindustrial society led to the elimination of 2.5 million of its people by his Khmer Rouge rebels. The Hutu in Rwanda had been stockpiling guns and machetes for at least a year before they murdered 800,000 of their Tutsi neighbors and Tutsi-friendly Hutu. Serbian president Slobodan Milosevic was indicted and tried by the International Criminal Tribunal for planning the mass murder of 200,000 Bosnian Muslims from 1992 to 1995; he died before the trial ended. Even the seemingly spontaneous slaughter of Madurese in 2001 had an alleged group of instigators seeking to drive an enemy from their province.

A key unintended consequence of massacres is that they engage so many otherwise normal people in a contagion of mass murder. Fear of a target group, like the Madurese, leads to xenophobia and hatred. Once the killing begins, annihilation spreads like wildfire. Murder becomes the norm. Social pressures drive peaceful people to avoid being seen as sympathetic to the enemy. The massacre in Rwanda was so contagious that Hutu quickly killed 27 percent of the Tutsi population and hundreds of thousands of fellow Hutu who sympathized with them. According to Human Rights Watch, "Once the killing began, the killers could not stop and turned to killing suspect Hutu once all local Tutsi were killed. Women, men, and children took part in the killing and even killed their former neighbors."28

Another unintended consequence of an attack on targeted individuals is that it often leads to the killing of many more people than the instigators had planned. This happened during the Cultural Revolution in the People's Republic of China. Mao launched the Great Proletarian Cultural
Revolution on May 16, 1966. He commanded China's youths to form the Red Guard and rise up against counterrevolutionary forces in Chinese society. To explain his purpose, Mao instructed that "revolution is violence." According to journalist Teresa Poole, Mao "unleashed a decade of savagery and chaos encouraging a whole generation of teenagers to run riot in a frenzy of mass violence so fanatical as to be scarcely credible."29

The Red Guard beat and imprisoned teachers and attacked "capitalist roaders," "counterrevolutionary revisionists," and other class enemies. As a 12-year-old member of the Red Guard in 1966, Li Jiang recalled, "I thought that the beatings were correct, and in order to express that we were revolutionary, we must beat other people."30 Violence spread with the ransacking of factories, government offices, and the homes of landlords, wealthier peasants, and intellectuals. Enemies were beaten, tortured, humiliated with show parades in the streets, denounced at public meetings, and moved to the countryside to perform hard labor. In the suburbs of Beijing, the Red Guard eradicated entire families, including infants and the elderly.

By the close of 1966, China had gone mad. No one was beyond accusation, and "one could easily be accused of being an active counterrevolutionary if she/he questioned any practice of the [Cultural Revolution] or even merely unwittingly damaged a picture of Mao," according to former Red Guard Lu Xiuyuan.31 People committed suicide to avoid torture and beatings; workers joined the melee in even greater numbers than the Red Guards; and factions within the Red Guard fought one another, employing mass executions and cannibalism to demonstrate the depth of their class hatred and revolutionary zeal.

Mao confessed at a party meeting that he had not foreseen that his Cultural Revolution would engulf China in total anarchy, and, in 1968, he ordered the People's Liberation Army (PLA) to quash it. The PLA massacred thousands of Red Guards and exiled 16 million of them to remote areas in China to work as peasants for 10 years. Although the violence of the revolution ended in 1968, its turmoil lasted another 10 years. Exiled dissident Liu Binyan called the Cultural Revolution a "destructive movement unprecedented in human history."32

In 16th-century France, another planned assault led to limitless slaughter when Protestant leaders visited Paris. Queen Catherine de Medici had arranged for her Catholic daughter to marry Protestant leader Henry of Navarre on August 18, 1572, to ease tensions between the rival religious groups in France. Tensions mounted as Catholic priests called the marriage a "perverse union that God would avenge."33 False rumors circulated that
4,000 Protestant troops billeted outside Paris for the wedding were planning to attack Parisian Catholics. During the wedding festivities, several Catholic noblemen bungled an attempt to assassinate a Protestant leader. Fearing a Protestant reprisal, Catherine ordered 100 of the king's guards to murder several dozen Protestant leaders in their sleep. The assassinations induced Parisian Catholics on St. Bartholomew's Day to engage in a mass slaughter of Protestants, which further incited massacres in Rouen, Lyon, Bourges, Orleans, and Bordeaux. The failed assassination of one Protestant leader thus had the unintended consequence of launching the St. Bartholomew's Day Massacre, which claimed the lives of 6,000 French Protestants.

That chain reactions of fear and hatred can escalate massacres far beyond their instigators' more limited plans is one of the most sinister aspects of human behavior.
The Riot That Didn't Happen

During the 1960s, the Hell's Angels and other motorcycle gangs held large rallies that often turned into drunken riots and caused substantial property damage. In 1965, rumors circulated that bikers were planning a rally for Labor Day in the tiny town of Upper Marlboro, Maryland. Officials from Upper Marlboro were especially nervous because a similar rally held in Weirs Beach, New Hampshire, had devolved into a disastrous riot. The Labor Day rally in Upper Marlboro came and went, however, with minimal unrest, thanks to the help of two social psychologists hired to prevent rioting.

The psychologists studied previous riots at motorcycle rallies and concluded that police could either prevent or provoke riots and that a delicate balance of firmness and fairness was needed to keep the peace while preventing police brutality from becoming a rallying cry to riot. Upper Marlboro's police acted firmly, fairly, and neutrally in dealing with the bikers. They quickly stopped bikers from breaking the law but avoided stereotyping and needless harassment, despite the motorcyclists' malevolent appearance. The police even accommodated the bikers by letting them occupy a local park, which helped contain their activities.

Robert Shellow and Derek Roemer concluded from their study "The Riot That Didn't Happen" that "Riots do not have to happen. . . . The element of luck certainly cannot be controlled. A chance remark, a blocked street, a misfired gun, an unavoidable accident all can touch off panic in a crowd—or in a police unit. On too many occasions, however, it is not luck that is lacking but the elements of fairness, common sense, and communication."34
Lessons from Chain Reactions

Officials in Upper Marlboro learned the first lesson about chain reactions: avoid starting one. The town had trained its police officers to avoid inciting the bikers to riot. In contrast, an ill-trained National Guard unit in May 1970 needlessly fired upon a crowd of protestors at Kent State University, killing four students—including two bystanders—and wounding nine others. An unknown guardsman fired the first shot, prompting others to follow suit. A number of the guardsmen later testified that they "had turned and fired because everyone else was." The National Guard should have trained its armed guardsmen on how to handle unarmed student protestors safely.

The second lesson about chain reactions is to defuse them quickly before they get out of hand. Unattended problems can ignite chain reactions of rumor, panic, negative press coverage, government investigations, and destroyed reputations. Ignoring or denying problems can be lethal. Early intervention is, in fact, a central tenet of the profession of crisis management. In 1982, for example, Johnson & Johnson effectively responded to its poisoned-Tylenol crisis by quickly pulling the product off the shelves and publicizing its efforts to solve the problem, even inviting the media to film its CEO leading a task force devoted to the crisis. In contrast, Firestone's persistent denials during the late 1990s that its tires had caused several fatal accidents destroyed its reputation.

A third lesson about chain reactions is to avoid panicking when faced with spreading fear. In battle, panic causes catastrophic retreats. Panic destroys wealth when investors sell out whenever the stock market takes a plunge. Panic causes sports teams to choke after early defeats and businesses to squander precious resources warding off false threats. During the Great Depression, President Franklin Roosevelt calmed a frightened nation by declaring that "the only thing we have to fear is fear itself." Rudyard Kipling offered sound advice in his famous poem "If—":

If you can keep your head when all about you
Are losing theirs . . .
Yours is the Earth and everything that's in it.35
A fourth lesson about chain reactions is to avoid crazes, fads, and manias. Craze avoidance requires the ability to distinguish a craze from a long-term trend and the confidence to hold fast when faced with herd mentality. Neither is easy. Crazes sucker in the best and the brightest. Nobles and peasants alike fell for tulipomania. Elite investment managers fell for the dot-com craze. Nevertheless, when things sound too good to be true, they probably
are. Be wary of hot new ideas that promise quick fixes to complex management problems; they do not exist. Be cautious of hot investments with speculative earnings and popular alternative medicines whose efficacy has yet to be scientifically established. The late Michael Hammer, creator of the lauded concept of reengineering, offered sound advice regarding management fads: "If you live by the sword, you'll die by the sword."36 The unintended consequence of latching onto the latest management fad could be the demise of a career, if not a company, when the fad goes out of fashion.

Craze avoidance means bucking the crowd, being the lone voice of reason in a raging sea of mass hysteria. Often, it means being ostracized. Investment managers who failed to go along with the dot-com craze were dubbed losers, dinosaurs, idiots, and jokes; some lost their jobs. The real heroes in the wake of the dot-com craze were the business executives and investment managers who stuck to their principles and kept their heads when others around them were losing theirs.

This chapter illustrated how chain reactions of herd behavior can cause unintended consequences and escalate events. The following chapter shows how events can similarly escalate via a different social mechanism: reinforcing feedback.
Chapter 4
The Vicious Cycle

The Evolution of the IED

President George W. Bush declared victory in Iraq on May 1, 2003, during his "mission accomplished" speech given aboard the USS Abraham Lincoln. Soon after, however, U.S. forces began experiencing deadly attacks from Hussein loyalists, Sunni and Shiite insurgents, and foreign fighters under the banner of al-Qaeda in Iraq. The insurgents' weapon of choice was the homemade bomb known as the improvised explosive device (IED), assembled from cast-off artillery shells scavenged from Hussein's abandoned stock of munitions, which U.S. forces had left unguarded for months after defeating Iraq's army. Detonated remotely with cell phones and garage-door openers, IEDs were cheap, easy-to-make, deadly weapons in endless supply. The insurgents' main target was the thin-skinned Humvee, the workhorse that plied Iraq's mostly unsecured roads. An unarmored vehicle designed for the speed to avoid combat, the Humvee became an easy target for the insurgents and a death trap for U.S. troops.

Under pressure to rectify the Humvee problem, the U.S. Department of Defense began adding armor to Humvees used in Iraq. The insurgents responded by making more powerful IEDs from their vast supply of leftover munitions. The United States counteracted by adding still more armor. Using armor-penetrating shape-charge technology, the insurgents then made even bigger and better IEDs that were capable of destroying heavily armored Bradley fighting vehicles and all Humvees, however reinforced with steel. The miniature arms race between the U.S. Department of Defense and Iraqi insurgents had other unintended consequences that made the Humvee even deadlier. The 3,000 pounds of armor added to the vehicle made it much slower, less able to escape combat, and prone to rolling over. Worse still, explosions caused the doors on the armored Humvees to jam, trapping the soldiers inside.

The Humvee arms race in Iraq illustrates how a reinforcing dynamic can cause small events to escalate with unexpected outcomes. It comes into play
when a feedback loop between two parties amplifies events, as in the simple case of an escalating argument between two persons. Person A, for instance, may start the argument by making a small remark that unintentionally prompts person B to respond in a way that aggravates A. Person A then says something that further annoys B, who then says something even nastier to A, and so on. The reinforcing feedback loop has caused A's initially harmless remark to escalate into an unanticipated hostile exchange. Similarly, reinforcing feedback is a driving force behind escalating armed conflicts, arms races, trade wars, and other confrontations that often have dire unintended consequences. For example, a nation seeking to improve its security by acquiring new weapons can unintentionally trigger an arms race. The acquisition of arms induces its rivals to increase their military might, which in turn causes the first nation to invest even more in new weapons, and the stockpiling on both sides continues to escalate. Ironically, nations that seek to increase their security by amassing more weapons can find themselves in more danger than before they acquired the weapons. Reinforcing feedback can also cause trade restrictions meant to protect local industries to backfire by fomenting trade wars.

Popular metaphors and aphorisms reveal the presence of reinforcing feedback in daily life. The snowball effect, for example, refers to a snowball that gets bigger as it rolls downhill, and the bigger it becomes, the faster it rolls. The notion that success breeds success implies that success today can lead to more success in the future, because success generates a positive reputation, builds confidence, and increases access to resources. The old adage that the rich get richer reflects a similar notion. Reinforcing feedback can also work in reverse to make matters worse at a quickening pace, as implied by the phrases downward spiral and death spiral. A small failure, for example, could lead to greater failures due to lost confidence and a ruined reputation—success breeds success working in reverse. Tooth decay, cars in need of repair, and other neglected problems deteriorate at an escalating pace. An aphorism pertinent to this phenomenon is "a stitch in time saves nine."

The tar baby metaphor in Joel Chandler Harris's Tales of Uncle Remus illustrates how reinforcing feedback can exacerbate events. Brer Rabbit punches the tar baby and gets his hand stuck. The harder he tries to extricate himself from the tar baby, the more stuck he becomes. This causes him to try even harder to free himself, only to get even more enmeshed, until his situation becomes hopeless. The same death spiral happens when people get stuck in quicksand. The more they panic trying to extricate themselves, the more they sink, causing them to become more panicked. Divers trapped in deep water often succumb to a
similar vicious cycle. The panicked diver's heart and respiratory rates surge, rapidly draining the remaining air supply. The plummeting needle on the tank's pressure gauge induces further panic and a hastening of heart and respiratory rates, which increases the rate of air depletion.
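The arithmetic of such a loop is easy to see in a toy model. The sketch below is my illustration, not a model from the research literature: each party's hostility is a number that responds in proportion to the other's last provocation, and any response gain greater than 1 turns a trivial opening remark into open conflict.

```python
# Toy model of a two-party reinforcing feedback loop. Each response is
# proportional to the other party's last remark; a gain above 1.0 makes
# any opening provocation escalate without bound.

def escalate(opening_remark=1.0, gain=1.2, exchanges=10):
    a, b = opening_remark, 0.0
    history = [(a, b)]
    for _ in range(exchanges):
        b = gain * a   # B overreacts to A's last remark
        a = gain * b   # A overreacts to B's reply
        history.append((a, b))
    return history

for round_no, (a, b) in enumerate(escalate()):
    print(f"exchange {round_no:2d}: A = {a:7.2f}, B = {b:7.2f}")
```

With a gain of 1.2, ten exchanges inflate an opening remark of 1 into responses nearly 40 times as hostile; with any gain below 1, the same loop damps the quarrel away. The whole character of the exchange hangs on whether each reply slightly exceeds or slightly undershoots the provocation that prompted it.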
Feuds

Reinforcing feedback causes small provocations to escalate into unintended feuds. In 1997, for example, a minor altercation in Boston, Massachusetts, escalated into a feud that culminated in a brutal murder. While driving down a narrow street, Boston University medical student Daniel Mason found his way blocked by a moving van. Mason demanded that the driver, Eugene Yazgur, move his truck. Yazgur dawdled, prompting an impatient Mason to try to move the truck himself. A pushing and shoving match broke out, during which Mason pulled out a knife and slashed Yazgur's ear, which required 30 stitches to mend. Mason was convicted of assault and released on probation.

Unsatisfied with the court's decision, Yazgur filed a civil suit against Mason and won a $118,000 judgment. Mason's defiant boast that he would never pay any of this award prompted Yazgur to obtain a court order forcing him to pay the settlement. The court's decision enraged Mason, and, shortly before dawn the next day, he broke into Yazgur's apartment and shot and killed his roommate and dog. He then shot Yazgur 15 times in the head, chest, and legs. Yazgur's miraculous recovery enabled him to testify against Mason, who, in December 2001, was sentenced to life imprisonment without parole for first-degree murder.

A similarly small event triggered the reinforcing feedback process that became the notorious Hatfield-McCoy feud. In reflecting on his family's ruinous conflict, Ron McCoy noted, "It is the smallest of things that can serve as a catalyst for a major catastrophe. . . . So it was that an argument over livestock would set into motion years of bitter strife."1 While visiting his neighbor Floyd Hatfield in 1878, Ranel McCoy found one of his razorback hogs and accused his neighbor of stealing it. Such an accusation was a serious matter in a mountain community that valued honor and integrity. Two years later, tensions mounted with the disastrous love affair between 18-year-old Johnse Hatfield and 21-year-old Roseanna McCoy. Roseanna left her family to live with the Hatfields, who refused to let Johnse marry her. Roseanna soon became pregnant, and Johnse abandoned her to marry her cousin Nancy McCoy. Roseanna is said to have died at the early age of 30, lacking any ailment other than profound grief.
The tipping point in the feud came in 1882, when a minor argument erupted into a fight during which three McCoys killed a Hatfield. The Hatfields tracked down and executed the three McCoys in a fusillade of over 50 bullets. According to Ron McCoy, "The families had now crossed the point of no return. Blood was drawn and family honor demanded justice."2

The feud became a miniature war in 1888, when 20 Hatfields launched a surprise attack to exterminate the McCoys. The surviving McCoys retaliated by raiding Hatfield homesteads, which forced the Hatfields to form a militia to protect themselves. The two clans ultimately collided in a bloody gun battle involving 18 McCoys and 13 Hatfields, causing the death of at least 20 clan members.

The reinforcing feedback process became so powerful that even the patriarchs of the Hatfield and McCoy clans could not keep the feud from spinning out of control. "Two noble, strong-willed families [became] locked in the throes of mortal combat, bound by personal honor to avenge the smallest of grievances," noted Ron McCoy.3 The feud is a classic vicious cycle of an ever-growing conflict that began unintentionally with a minor altercation.
The Onset of War

On the night of November 26, 2008, 10 heavily armed men came ashore in rubber rafts in downtown Mumbai, India, and began a four-day shooting and bombing spree that overwhelmed local police, killed 173 people, and wounded at least 308 others. Indian security forces eventually killed nine of the attackers and captured a lone survivor, who confessed that they were members of the Pakistani-based Lashkar-e-Taiba, considered a terrorist organization by both India and the United States. Lashkar-e-Taiba in Urdu roughly means "army of the pure." Its objectives variously include liberating fellow Muslims in Kashmir from Indian control, establishing an Islamic state in southern Asia, restoring Islamic control over India, and annihilating Hindus and Jews as enemies of Islam.

India claimed that the assault originated from within Pakistan, citing as evidence the interrogation of the lone survivor and cell phone records indicating that the attackers were being directed during the assault by controllers in Pakistan. While Pakistan denied the accusation for several months, India threatened to strike terrorist camps in Pakistani-administered Kashmir, which prompted Pakistan to move troops to its border with India. Such an outrageous attack on Mumbai could have escalated into a war between two nuclear-armed, mortal rivals, with unimaginable consequences.
The Lashkar-e-Taiba had struck earlier, on December 13, 2001, with a suicide bomb that exploded in India's Parliament building. India blamed Pakistan and demanded the arrest of those involved. Pakistan denied involvement and demanded proof of its culpability. The two countries recalled their diplomats, cut transportation links between them, and sent tens of thousands of troops and strategic missiles to their common border. On December 29, 2001, Pakistan's foreign minister, Abdul Sattar, said he feared that the conflict was "growing dangerously tense and could easily spin out of control."4

The terrorist attacks on India show how small incidents can foment war by unleashing an escalating, reinforcing feedback process. "The slightest quarrel," said German war strategist Carl von Clausewitz, "can produce a wholly disproportionate effect—a real explosion."5 Small incidents driven by reinforcing feedback have, in fact, erupted into wars throughout U.S. history.

In 1754, for example, George Washington's attack on a small band of French soldiers in the Ohio territory set in motion events that led to the French and Indian War, the bloodiest war in 18th-century North America. For over a century, the French and English had peacefully coexisted in North America until their settlements began to collide in the Ohio Valley in the 1750s. France had claimed the territory west of England's coastal colonies and had built a series of forts, missions, trading posts, and villages stretching from Montreal to Louisiana to keep it. Meanwhile, English colonists, prompted by population pressures and encouraged by land grants, began settling in the Ohio Valley in 1749. Tension mounted over the next five years as both the English and French built forts in the disputed territory. In 1753, the governor of Virginia sent a military force led by George Washington, then a young and inexperienced officer, into the Ohio Valley to demand that the French abandon the territory. The French refused, and, in 1754, the governor gave Washington permission to build a fort near what is today Pittsburgh. Washington attacked and routed a small band of French soldiers and, anticipating a counterattack, retreated to his crude fort. Superior French forces compelled Washington to surrender after he had lost half of his men and run out of ammunition. A series of battles ensued, leading to a formal declaration of war between France and England in May 1756, which raged on until the English prevailed in 1759. The Treaty of Paris, which finally resolved the conflict, was signed in 1763.

Twelve years later, America's war for independence began with the famous "shot heard round the world." While marching to Concord, Massachusetts, in 1775 to capture a colonist armory, British forces encountered 70 armed citizens in Lexington called the minutemen. The British ordered the
minutemen to lay down their arms and go home. The minutemen complied, except for an unknown man who fired at the British. The British returned fire, killing 8 minutemen and wounding 10 others. After word of the conflict reached Concord, enraged minutemen held their ground when the British arrived. As tensions mounted, the two armies exchanged volleys, forcing the British to make a perilous retreat to Boston, during which they lost 273 soldiers. When news of the battles in Lexington and Concord reached Philadelphia, the Continental Congress scrapped its proposal for reconciliation with Britain and decided to fight for independence instead.

The seizure of a neutral British mail steamer during the U.S. Civil War nearly caused the British to ally with the South, which could have greatly altered the course of U.S. history. The U.S. Navy ordered Captain Charles Wilkes to deliver the San Jacinto, a ship stationed off the West African coast, to Philadelphia. Wilkes, an impetuous officer with a checkered military career, decided instead to hunt for Confederate privateers in the West Indies. Upon hearing that two Confederate commissioners were traveling to England aboard Britain's mail ship Trent, Wilkes seized the Trent and removed the two Confederates. This action most likely violated international law. The British threatened to enter the war allied with the South until President Lincoln released the two Confederate officials.

In 1898, the explosion of the battleship USS Maine triggered a feedback process that escalated into the Spanish-American War. The United States ostensibly sent the Maine to Cuba to protect U.S. citizens from the island's mounting civil unrest. While anchored in Havana Harbor, the Maine suddenly exploded, killing 262 of the 354 men onboard. U.S. officials instantly blamed the Spanish for sinking the Maine, and the U.S. press fanned anti-Spanish sentiment with the slogan "Remember the Maine, to Hell with Spain."

The destruction of the Maine set in motion a cycle of events that rapidly erupted into war. Theodore Roosevelt, then assistant secretary of the Navy, blamed the explosion on "the dirty treachery on the part of the Spaniards" and mobilized U.S. warships in the Atlantic and Pacific.6 Two months later, the American Naval Court concluded that a Spanish mine had blown up the Maine, and the U.S. Congress demanded the complete withdrawal of Spanish forces from Cuba. Seeking to retain the last vestige of its colonial empire, Spain declared war on the United States. Heavily outgunned in both Cuba and the Philippines, Spain surrendered in July 1898. Thus ended what John Hay, U.S. ambassador to Britain, called a "splendid little war."

Recent underwater examinations of the Maine's damaged hull indicate that the explosion that sank it came from within the ship and that a spontaneous
fire in its coal bunker most likely ignited the ship's magazines and blew it up. The sinking of the Maine had the unexpected consequence of ending Spain's 400 years of colonial rule in the New World. It also led to the occupation of the former Spanish colonies of Puerto Rico and the Philippines, the latter of which made the United States a naval power in the Pacific.

Similarly, an imagined incident in 1964 caused the United States' noncombatant role in Vietnam to erupt into a full-scale war. Early in the Cold War, the United States witnessed the Communist takeover of China and feared that Vietnam and other Southeast Asian countries could also fall like dominos if undeterred. Many in President Kennedy's administration, including Secretary of State Dean Rusk, military advisor General Maxwell Taylor, and Defense Secretary Robert McNamara, were eager to go to war to prevent a Communist takeover of South Vietnam. Kennedy, however, was reluctant to send U.S. troops to Vietnam and, according to historian Richard Parke, "wanted no war in Asia and had clear criteria for conditions under which he'd send Americans abroad to fight and die for their country."7 Kennedy believed that sending troops was a last resort and should be done only with a multilateral force under the guidance of the United Nations Security Council. When General Taylor requested sending thousands of U.S. combat troops to South Vietnam, Kennedy agreed to send only noncombatants to train the South Vietnamese to do their own fighting.

The United States' cautious stance toward Vietnam ended with Kennedy's assassination, and his successor, President Johnson, vowed that Vietnam would not fall to the Communists during his watch. Johnson's hawkish administration found justification for sending combat troops to Vietnam in the Gulf of Tonkin incident of August 4, 1964. On August 2, 1964, three North Vietnamese patrol boats fired at the destroyer USS Maddox while it was on a routine reconnaissance mission in the Gulf of Tonkin off the shores of North Vietnam. Hit by a single machine gun bullet, the Maddox left the gulf and returned two days later, accompanied by the destroyer USS C. Turner Joy. During the night of August 4, a U.S. sonar technician thought he detected a threatening North Vietnamese ship, prompting the two U.S. destroyers to fire into the dark at the shadowy sonar image for two hours.

On August 5, President Johnson sent planes from two aircraft carriers to destroy North Vietnamese torpedo boats and fuel facilities and asked Congress for a mandate for military action. Defense Secretary Robert McNamara testified before Congress that there was "unequivocal proof " that the North Vietnamese attack was unprovoked. On August 7, Congress nearly unanimously passed the Gulf of Tonkin Resolution granting the president war
powers without formally declaring war on North Vietnam. The resolution read, in part: "Congress approves and supports the determination of the President, as Commander in Chief, to take all necessary measures to repel any armed attack against the forces of the United States."8

A reinforcing feedback loop caused the Vietnam conflict to escalate into a full-scale war claiming the lives of some 58,000 Americans and millions of Vietnamese. Armed with a congressional mandate, the Johnson administration sent tens of thousands of U.S. troops to Vietnam. In response, the North Vietnamese shipped more divisions to South Vietnam, prompting the U.S. Department of Defense to request more troops. Hanoi responded to the U.S. bombing of North Vietnam by sending even more divisions to South Vietnam and requesting military aid from China and the Soviet Union. Thus ensued a vicious cycle, which journalist David Halberstam described in his book The Best and the Brightest as follows: "If Westmoreland had enough troops, then Hanoi would send more; if Hanoi sent more, then Westmoreland would want more. The cycle was out of their hands."9

Considerable evidence suggests that the Gulf of Tonkin incident never happened. Most likely, the North Vietnamese did not attack the U.S. destroyers on the night of August 4, and the two U.S. ships fired into empty water. John J. Herrick, captain of the USS Maddox, claimed that the incident was triggered by an "overeager sonar man hearing the ship's own propeller beat." Squadron commander James Stockdale, who flew over the Gulf of Tonkin on the night of August 4, stated in 1995 that there were no North Vietnamese ships attacking the two U.S. destroyers: "I had the best seat in the house to watch that event, and our destroyers were just shooting at phantom targets. There were no PT boats there. . . . There was nothing there but black water and American fire power." In the immediate aftermath of the supposed attack, Stockdale recalled thinking that "we were about to launch a war under false pretenses, in the face of the on-scene military commander's advice to the contrary. The decision had to be driven from way up at top."10 The sonar technician's probable mistake had the unintended consequence of launching the Vietnam War.
Arms Races and Trade Wars

At the close of World War II, the U.S. monopoly on nuclear weapons technology prompted the Soviet Union to quickly develop its own atom bomb, which it did in 1949. In 1958, the mutual acquisition of nuclear weapons escalated into a missile race when the two superpowers began stockpiling missiles capable of delivering nuclear weapons to distant targets. By 1980, the
Soviet Union and the United States had a combined arsenal of about 18,000 nuclear warheads capable of destroying each other and the rest of humanity many times over.

The missile race was a classic example of an arms race: a rapid expansion of military might induced by competition between two rivals and driven by a reinforcing feedback process. An arms race begins when a country increases its military might and induces its rivals to add to their own arsenals to stay even, which then prompts the first country to acquire even more weapons. The reciprocal process creates an upward spiral of weapons procurement. Ironically, the country that initially sought to increase its security by acquiring weapons can become much less secure after an adversary dramatically increases its own arsenal in response.

Since the beginning of recorded history, reinforcing feedback has induced perpetual arms races involving weapons and countermeasures. The skull-crushing stone mace begot the helmet. Armor led to armor-piercing crossbows and longbows. The invention of the stirrup, iron horseshoes, and saddles, together with the armored knight, enabled cavalry to dominate infantry; this led to the pike, halberd, and other long weapons with points, blades, and hooks that enabled infantrymen to pull knights from their horses. Fortresses spawned siege weapons, such as the Roman military's 150-foot towers equipped with battering rams. The 15th-century siege cannon demolished stone walls within days, which led to the use of cannon-proof earthen ramparts. According to noted historian William McNeill, "Revetments of earth could protect stone walls from the destructive force of gunfire. . . . The advantage to the attacker that the cannon had briefly conferred was abruptly cancelled."11 Trench warfare gave rise to field mines, booby traps, poisonous gas, barbed wire, and the mortar. These weapons gave rise to the armored tank, capable of maneuvering over battlefields scarred with barbed wire and trenches. Tanks then became vulnerable to bazookas and antitank bombs dropped from planes. Missiles induced the development of antiballistic missiles. The submarine led to destroyers and sonar, which begot quieter submarines capable of eluding them. A major criticism of the United States' controversial Strategic Defense Initiative, a satellite system designed to destroy intercontinental missiles and commonly called Star Wars, has been that adversaries would quickly devise ways to circumvent it.

Similar to arms races, trade wars are escalating conflicts driven by reinforcing feedback that often backfire. Countries seeking to protect their industries by instituting tariffs, quotas, and other trade barriers often induce their trading partners to retaliate with their own trade restrictions, which may induce counter-retaliation, and so forth. Ironically, a country seeking to
protect its industries can trigger a trade war that causes those industries to lose business, as happened to the United States in the 1930s when it passed the Smoot-Hawley Tariff Act.

During World War I, U.S. farm exports greatly increased as European farms became battlefields. The recovery of European farms after the war, however, created a glut of agricultural products on the world market, which caused U.S. farm prices to plunge and exports to stagnate. Armed with substantial political clout, farmers successfully lobbied the U.S. Congress to increase tariffs on agricultural imports. Lobbyists from other industries also called for trade protection, and, on June 17, 1930, Congress passed one of the most protectionist tariffs in U.S. history, the Smoot-Hawley Tariff Act. The act boosted already high tariffs by 50 to 100 percent on virtually every product that faced international competition.

The act unleashed a wave of retaliation, with more than 40 countries passing their own tariffs and other restrictions on U.S. imports. In describing the emerging trade war, Senator Connally of Texas told the New York Times in October 1932, "It is a game that two can play. Foreign countries struck back and they struck back hard."12 Canada, one of the United States' largest trading partners, imposed trade restrictions that caused its importation of U.S. goods to drop about 60 percent, from $802 million in May 1931 to $317 million one year later. Italy imposed a tariff on U.S. cars five days after the United States passed the Smoot-Hawley Tariff Act. Great Britain imposed a 50 percent tariff on 23 U.S. imports. Switzerland instituted quotas and tariffs on U.S. cars, machinery, produce, oil, coal, and other imports, and groups of its citizens swore never again to purchase U.S. products. U.S. exports to Europe dropped about 70 percent, from $2.3 billion in 1929 to $784 million in 1932. On August 28, 1930, Edward Ewing Pratt, a former acting secretary of commerce, told the media, "The Smoot Hawley Tariff bill had done more to injure American foreign trade than any official act by the Government of the United States."13

Ironically, the Smoot-Hawley Tariff Act, originally intended to help U.S. farmers, caused U.S. agricultural export income to drop 68 percent from 1930 to 1933 and had other unintended consequences. Although economists disagree on whether the act caused the Great Depression, it surely exacerbated the seriously declining world economies of the early 1930s. Furthermore, the tariff soured U.S. international relations at a time when events were leading up to World War II.

Trade wars have two reinforcing feedback processes that cause protectionist efforts to backfire. The first is the strong likelihood that retaliation will launch an escalating spiral of trade restrictions among trading partners. The
second reinforcing feedback loop involves a downward spiral. When governments shield their local industries from foreign competition, they reduce the competitive pressure on those industries to become more efficient and produce better products. The industries thus grow weaker and come to need even more protection from foreign competitors.
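Both spirals, the arms race and the trade war, share the same mathematics. Lewis Fry Richardson gave the arms-race version a simple form in the 1930s; the book does not present his equations, but a discrete sketch of them, with invented coefficients, shows how mutual fear that outweighs the fatigue of paying for arms drives both arsenals upward:

```python
# Discrete Richardson-style arms race. Each nation raises its arsenal in
# proportion to its rival's stockpile (fear), lowers it in proportion to
# its own (the fatigue of paying for arms), and adds a fixed grievance.

def arms_race(x=100.0, y=100.0, fear=0.3, fatigue=0.1, grievance=5.0, years=15):
    for year in range(1, years + 1):
        # Both updates use last year's values (simultaneous response).
        x, y = (x + fear * y - fatigue * x + grievance,
                y + fear * x - fatigue * y + grievance)
        print(f"year {year:2d}: nation X = {x:9.1f}, nation Y = {y:9.1f}")

arms_race()
```

Because fear exceeds fatigue in this run, each side's response outpaces its own restraint and the stockpiles spiral upward together; set fatigue above fear and the same equations settle into a stable balance. The sensitivity of the outcome to those two coefficients is a compact statement of why arms races are so hard to predict.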
Collective Behaviors Revisited

The previous chapter described how chain reactions can drive protests, social movements, and other forms of collective behavior to become unpredictable. Some of these collective behaviors also involve reinforcing feedback loops, a good example of which is the investment bubble. Bubbles begin with a chain reaction of speculative greed regarding some asset reputed to yield big returns. Heavy purchasing of the asset by speculators drives up its price, making it seem like a great investment. The surge in the asset's price induces further speculative buying, driving the price ever higher. The bubble bursts when speculators discover that their precious asset is overvalued. Then a chain reaction of fear creates a panicked selling spree, which drives down the value of the asset, which creates more panic. This sets in motion a downward spiral of the asset's worth and investors' confidence in it. (A toy simulation of this boom-and-bust loop appears at the end of this section.)

Reinforcing feedback loops can also cause an authority's attempts to suppress protests to backfire. A crackdown on a protest movement creates sympathy for the protesters, making their cause more celebrated. Sympathy for the protesters grows the more authorities crack down, and this sympathy emboldens the protesters to expand their activities, which, in turn, invites tougher suppression. For example, King Gyanendra of Nepal's efforts to suppress a brutal Maoist revolutionary movement in 2005 made it more successful. Nepal's democratically elected government had embarked on an aggressive crackdown on the Maoist rebels, during which police tortured suspects, drove villagers from their homes, and raped women. As the rebellion spread, the government came down harder, which had the unintended consequence of increasing support for the Maoist rebels, enabling them to gain control of Nepal except for Katmandu and the district capitals.

Similar efforts to suppress the civil rights movement in the United States made it more successful. On December 1, 1955, an African American woman named Rosa Parks refused to give up her seat to white passengers on a Montgomery, Alabama, bus and was arrested for violating segregation laws. Her arrest triggered a sequence of boycotts, protests, and legal battles that led the U.S. Supreme Court to strike down segregation laws. Martin Luther King
Jr. called Rosa Parks the "great fuse that led to the modern stride toward freedom."

Efforts in the southern United States to quash the civil rights movement backfired. In 1961, a multiracial group of civil rights activists called the Freedom Riders headed for the Deep South to test the enforcement of newly won antisegregation rulings. Local authorities opposed to desegregation permitted racist vigilantes to brutally attack the Freedom Riders. National television coverage of these violent attacks created widespread sympathy for the civil rights movement. According to historian John Dunn, the racial violence shown to millions of viewers around the country "aroused the conscience of the nation."14 It also contributed to the passage of the Civil Rights Act of 1964.

A similar phenomenon occurred during the early days of the American Revolution. The majority of American colonists in the early 1770s were proud British citizens who viewed the independence movement as a fringe protest led by rabble-rousers. Britain's crackdown on its unruly American colonists, however, helped shift the tide of colonial sympathy in favor of the independence movement. In particular, events like the British occupation of Boston and the Boston Massacre helped foment protest and expand the independence movement.

In 2005, an effort to suppress protest of Syria's occupation of Lebanon also backfired. Syria invaded Lebanon in 1976 and occupied it with thousands of troops and intelligence agents, installing a Syrian-friendly government. Lebanon's popular former prime minister, Rafik Hariri, was a loud opponent of the Syrian occupation of his country. His assassination on February 14, 2005, by suspected pro-Syrian terrorists triggered massive anti-Syrian protests that forced Prime Minister Omar Karami to dissolve his pro-Syrian government and Syria to promise to pull its forces from Lebanon.

Reinforcing feedback significantly complicates authorities' efforts to squelch protest. It creates tremendous uncertainty about whether a crackdown will stop a protest or make it more successful. So far, government authorities have successfully quelled protest in Iran and China; whether they will be able to continue to do so, however, is highly uncertain.
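As promised above, here is a toy sketch of the bubble's boom-and-bust loop. It is a caricature for illustration only, not a serious market model: buyers chase recent price gains until the price strays too far from the asset's underlying worth, and the same feedback then runs in reverse.

```python
import random

# Caricature of an investment bubble. Demand follows recent price momentum
# (reinforcing feedback) until the price detaches too far from the asset's
# underlying worth, at which point panic selling reverses the spiral.

random.seed(7)
fundamental_value = 100.0            # what the asset is actually worth
price, momentum = 100.0, 0.0

for week in range(1, 41):
    if price > 3 * fundamental_value:
        momentum = -0.15 * price     # overvaluation exposed: panic selling
    else:
        momentum = 0.9 * momentum + random.uniform(0.0, 2.0)  # chasing gains
    price = max(price + momentum, 1.0)
    print(f"week {week:2d}: price = {price:7.1f}")
```

Both halves of the ride are the same mechanism: each price change feeds back into the demand that produces the next price change.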
Staying the Course

Fred Manley, an executive at Brand & Company, convinced his firm's management committee to invest $20 million in developing a new computer system to replace a decades-old manual process used by most of the firm's
employees. He launched the project with widespread enthusiasm, forming a large internal task force and hiring a prestigious consulting firm to help design and build the system. The task force completed the system within a year and began deploying it throughout the company. Six months later, however, the task force discovered that almost no one was using the system; employees had reverted to their existing manual processes. Interviews indicated that employees shunned the system because it lacked important features, which made it less efficient than the old manual processes. The task force drew up specifications for the needed enhancements, which it estimated would cost at least $10 million to implement. The executive committee quickly approved Manley's request for the additional $10 million to ensure that its original $20 million investment would produce a favorable return.

The task force completed the new features a year later and deployed the enhanced system throughout the organization. Six months later, the task force found once again that employees had stopped using the system because further deficiencies made it less efficient. Manley returned to the executive committee to secure an additional $10 million to fix the system. Brand & Company management continued to invest millions of dollars to enhance the system for five more years, and employees still refused to use it. Although some senior managers began to doubt whether the project would ever work, the executive committee continued to spend money shoring up the system, because it seemed irrational and politically dangerous to abandon it after having made such a large investment. Ten years after the project had begun, Brand & Company's new chief executive officer ceased further development of the system and wrote off the project as a total loss.

Although I have disguised some of the details, this Brand & Company saga actually happened, and it exemplifies the widespread human condition that economists call the sunk-cost fallacy, which leads managers to waste colossal resources throwing good money after bad. Reinforcing feedback is at the heart of the fallacy: the more we invest in a failed project, the more committed we become to making it succeed, which in turn creates justification for investing further in the project.

The cold economic truth behind the fallacy is that the past is irrelevant—a truism that humans have a difficult time accepting. Money invested in a venture is gone; it can never be retrieved. The future is the only thing that matters. Whether additional money invested in a venture will provide a favorable return is a new question, regardless of what happened in the past. We continue to throw good money after bad for at least two reasons. First, humans have an inherent aversion to wasting the time and money already devoted to a venture, so past investments
justify its continued support. Second, to stop a venture before it has been successfully completed is a declaration of failure, and the fear of being blamed for wasting money on a failed project drives one to continue supporting it in the hope that it will ultimately succeed. Ironically, however, this vicious cycle sets the stage for ever greater failure.

The sunk-cost fallacy pervades our lives in many ways. Business managers continue to support products, systems, acquisitions, and other ventures long after it is clear that they have failed and are not worthy of further expenditure. In our personal lives, we keep reading a boring book because we have already spent so much time getting through a couple of chapters. We continue failed relationships because we have sustained them for many years. We stick with jobs and the pursuit of college degrees because we have already devoted so much time to these endeavors.

The most deleterious instance of the sunk-cost fallacy is continuing an ill-fated war to justify the lives already lost, as in the case of the war in Iraq that began in 2003. On October 25, 2005, when the death toll of U.S. soldiers in Iraq had reached 2,000, President George W. Bush addressed a luncheon for the Joint Armed Forces Officers' Wives' Club. He stated, "We've lost some of our nation's finest men and women in the war in Iraq. And the best way to honor the sacrifice of our fallen troops is to complete the mission and lay the foundation of peace by spreading freedom."15 The president relied on the loss of past lives to justify continuing the war in Iraq, which would lead to further soldier fatalities. He used the phrase "we will stay the course" so frequently that it ultimately became a political liability, and he stopped using it by October 2006, just before the midterm elections. According to the Washington Post, President Bush was "cutting and running from 'staying the course.' "16

President Bush was trapped by the sunk-cost fallacy. Instead of using the loss of lives in the past as justification for continuing the war in Iraq, he should have been contemplating what the future held. He should have been asking questions such as: What value do Americans or the newly liberated Iraqis derive from winning the war? Does the U.S. occupation cause more harm than good by creating a cause célèbre for terrorists? What chance of success do we have if we continue? What will it cost in American and Iraqi lives and U.S. taxpayers' money?

With striking similarity, President Johnson was also trapped by the sunk-cost fallacy in his continuation of the Vietnam War long after winning it became doubtful. Johnson even used the same justification for continuing the Vietnam War as Bush did for pressing on in Iraq. On March 15, 1967, Johnson told the Tennessee State Legislature that "America is committed to
the defense of South Vietnam until an honorable peace can be negotiated. . . . We shall stay the course."17 As U.S. casualties mounted, more troops had to be sent to win the war in order to justify the lives already lost. Psychologist Barry Schwartz noted, "It became more and more difficult to withdraw, because war supporters insisted that withdrawal would cheapen the lives of those who had already sacrificed. We 'owed' it to the dead and wounded to 'stay the course.' We could not let them 'die in vain.' What staying the course produced was perhaps 250,000 more dead and wounded."18 The cycle persisted until massive protests prompted Johnson to forgo seeking another term and forced President Nixon to leave Vietnam under the short-lived settlement with the North Vietnamese called Peace with Honor.

The reinforcing feedback at work in sunk costs creates a powerful force that keeps us committed to failed causes, whether the investment is time, money, or human lives lost in war. Recognizing the cold reality that past investments in unsuccessful initiatives are irrelevant to decisions about the future is the way to avoid the sunk-cost trap.
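The forward-looking test this section recommends fits in a few lines of code. The figures below are invented for illustration; the point is simply that the money already spent never enters the calculation.

```python
# Forward-looking project decision. Sunk costs are deliberately absent:
# only the incremental future cost and the expected future payoff matter.

def worth_continuing(future_cost, chance_of_success, payoff_if_success):
    expected_payoff = chance_of_success * payoff_if_success
    return expected_payoff > future_cost

# Hypothetical Brand & Company-style numbers: the $40 million already
# spent is irrelevant; the question is whether the next $10 million buys
# an expected payoff larger than $10 million.
print(worth_continuing(future_cost=10_000_000,
                       chance_of_success=0.2,
                       payoff_if_success=30_000_000))   # False -> stop now
```

The expected payoff here is $6 million against a further $10 million of cost, so the rational answer is to stop, and it remains the answer whether $4 million or $400 million has already been sunk.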
Lessons from Reinforcing Feedback

Understanding how reinforcing feedback can cause events to spin out of control helps one avoid many pitfalls in life. The lessons about reinforcing feedback mostly relate to avoiding conflicts and are similar to those regarding chain reactions: avoid creating a conflict, and, if one has started, put a quick end to it. In particular, the art of diplomacy takes on new meaning when one considers the power of the reinforcing feedback that drives feuds, wars, arms races, and trade wars.

It is tragically ironic that our social world is so rife with conflict when studies have shown that cooperation among rivals is much more effective. Robert Axelrod, a professor of political science at the University of Michigan, conducted a famous experiment on conflict and cooperation that is documented in his seminal book The Evolution of Cooperation. Axelrod set up a contest to find a strategy that "will yield a player the highest possible score in a world of selfish individuals seeking to maximize their own gains."19 He tested 14 competitive strategies submitted by game theorists using 120,000 computer simulations and found that the following rules of engagement maximized participants' success in dealing with rivals (a minimal simulation of the winning strategy follows the list):

• Do not start a conflict: given the power of reinforcing feedback, it is best not to start a feud. Axelrod calls this being nice.
• Do not be envious of your rival: your individual success is what matters; your rival's success is irrelevant.
• Retaliate in order to avoid being exploited by your rival, but do so only once. This, according to Axelrod, strikes a "balance between punishing and forgiving."20
• Do not be too clever. Axelrod found that complex strategies often failed because they were "making inferences about the other player and these inferences were wrong."21
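The tournament's winner, Anatol Rapoport's tit for tat, embodies all four rules: it opens by cooperating, mirrors the opponent's previous move (retaliating exactly once per defection), and attempts no clever inference. The sketch below is a minimal reconstruction using the tournament's standard prisoner's-dilemma payoffs; it is my illustration, not Axelrod's original code.

```python
# Iterated prisoner's dilemma with Axelrod's standard payoffs:
# mutual cooperation -> 3 points each; mutual defection -> 1 each;
# a lone defector scores 5 while the exploited cooperator scores 0.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    # Be nice (open with cooperation), then mirror the opponent's last move.
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(strategy_a, strategy_b, rounds=200):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_b)      # each player sees only the rival's past
        move_b = strategy_b(hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # (600, 600): steady mutual cooperation
print(play(tit_for_tat, always_defect))  # (199, 204): exploited only in round one
```

Against itself, tit for tat earns steady mutual cooperation; against a relentless defector, it loses only the opening round and is never exploited again: nice, retaliatory, forgiving, and not too clever.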
History suggests that Axelrod's rules make a lot of sense. A number of U.S. presidents have needlessly, and in some cases dangerously, provoked their rivals. On March 1, 1962, correspondent Stewart Alsop interviewed President Kennedy for an article in the Saturday Evening Post; in the interview, Kennedy baldly stated that the United States would consider a first nuclear strike against the Soviet Union: "In some circumstances we might have to take the initiative."22 The Soviets immediately ordered a special military alert and called for greater investment in nuclear missiles. President Reagan provoked the Soviet Union by calling it an "evil empire." U.S. relations with China soured when President Clinton publicly criticized its human rights abuses. U.S. relations with Iraq, Iran, and North Korea deteriorated after President George W. Bush called them an "axis of evil."

Axelrod's first and third rules are reminiscent of Teddy Roosevelt's "speak softly, but carry a big stick." Avoid provoking rivals, but let them know that you can and will retaliate with strength. Axelrod's notion of retaliating only once reflects the different outcomes of World Wars I and II. After winning World War I, the French sought to punish Germany with ruinous war reparations—retaliating twice—which set the stage for the rise of Adolf Hitler and the onset of World War II. In contrast, the victorious Allies at the close of World War II helped Germany and Japan rebuild their war-torn countries, Germany under the Marshall Plan, and sought peace with their former adversaries.

History also suggests that rivals can unilaterally stop conflicts if they possess sufficient courage, humility, and forgiveness. In 1963, President Kennedy addressed an audience at American University, saying, "If we cannot now end our differences, we can at least make the world safe for diversity."23 Kennedy's speech reduced tensions with the Soviet Union and paved the way for the two countries to sign the Nuclear Test Ban Treaty six weeks later. Two decades later, Soviet leader Mikhail Gorbachev, frustrated by the stalemate in arms talks with the United States, unilaterally decided to stop the missile race by reducing the Soviet Union's nuclear arsenal. The United States soon followed suit with its own reduction in missiles.

In 1977, Egypt's president Anwar Sadat, frustrated by years of war and conflict with Israel, flew to Jerusalem to present a peace offering to Israel's
parliament. Two years later, the two countries signed the first treaty between Israel and an Arab country, and Israel returned to Egypt the Sinai Peninsula, which Egypt had lost to Israel during the Six-Day War in 1967. Despite the devastating attack on Mumbai, India, in 2008 by terrorists from Pakistani-administered Kashmir, the leaders of the two subcontinental rivals began peace talks in 2010.

Finally, there is the question of how to avoid getting stuck in sunk costs, which poses a perplexing two-horned dilemma. An overly cautious commitment to a project could cause it to fail for lack of leadership, investment, and attention. Excessive commitment could, alternatively, trigger the vicious sunk-cost cycle. The only solution is to contain the initial scope of a project with strict milestones and modest expectations and then to proceed as follows:
• Limit the scope of the initial effort to a pilot project.
• Push the project hard while keeping an optimistic but dispassionate perspective about its probable success.
• Evaluate and candidly report the project's status and likelihood of success.
• End the project, regardless of past investment, as soon as its success becomes doubtful or the level of success cannot justify further investment.
Chapter 5
The Bandwagon Effect

Lock-In

In competitions, small events can trigger an unabated success-breeds-success dynamic that determines an uncontested winner. At the outset of a war, for example, it is generally unclear who will prevail. This was certainly true for World Wars I and II. The process may begin with small successes, such as winning a simple battle, or with misfortunes, such as encountering bad weather. Small successes can breed more success, while small misfortunes can trigger a downward spiral of greater calamity. The party that gains a slight edge increasingly deprives its enemy of soldiers, supplies, and strategic vantages. Battlefield success encourages citizens to further invest in the war; failure works in the opposite direction. The continued process of success breeding success can escalate to a point where an irreversible pathway of future events determines the victor.

Economists call this irreversible pathway of events lock-in or the bandwagon effect. Lock-in occurs during competitions among people, parties, ideas, technologies, organizations, countries, and other social entities when reinforcing feedback escalates until the pathway to success is determined for the winner. Lock-in concludes when a victor achieves monopoly status and is difficult to dislodge. Lock-in creates highly unpredictable and sometimes irrational outcomes; it can, for example, cause inferior products, ideas, and technologies to become the industry standard. The onset of lock-in can be difficult to anticipate, especially for people who believe—correctly or otherwise—that their ideas are superior. At the outset of a rivalry, it is uncertain who the winners and losers will be. It is even more difficult to determine the threshold where success begins to lock in a sure winner. Historians, for example, continually argue about the possible thresholds of reinforcing feedback that spelled the end of World War II or that triggered the Great Depression. Small decisions, acts, and chance events can have the unintended consequence of triggering the lock-in process.
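Economists often illustrate lock-in with an urn scheme in the spirit of W. Brian Arthur's work on increasing returns. The book does not present such a model, so the sketch below is a standard caricature with invented parameters: each newcomer is drawn toward the product with the larger installed base, and the pull grows faster than the base itself.

```python
import random

# Nonlinear urn model of lock-in between two equally good products.
# Each newcomer adopts product A with a probability that rises faster
# than A's installed base (increasing returns), so an early chance lead
# tips the whole market toward a single winner.

def final_share(newcomers=10_000, bias=2.0, seed=None):
    rng = random.Random(seed)
    a, b = 1, 1                                  # one initial adopter each
    for _ in range(newcomers):
        pull_a, pull_b = a ** bias, b ** bias    # pull grows faster than base
        if rng.random() < pull_a / (pull_a + pull_b):
            a += 1
        else:
            b += 1
    return a / (a + b)

# Rerun history five times: identical products, identical starting points,
# yet each run locks in a near-monopoly whose identity is set by early luck.
for trial in range(5):
    print(f"trial {trial}: product A ends with {final_share(seed=trial):.0%} of the market")
```

With the bias set to 1, the final split settles anywhere at random; any bias above 1 produces the "tipping" that Grindley describes later in this chapter, with the winner decided by the earliest few adoptions rather than by product quality.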
The De Facto Standard

Apple Computer's refusal to share its superior Macintosh operating system in the 1980s ironically caused it to become a minor player in the personal computer (PC) industry. Founded in 1976, Apple was one of the first developers of the PC and, with the introduction of its popular Apple II in 1977, was fast becoming the industry standard for software developers. Sales escalated when VisiCorp developed the first spreadsheet program for the Apple II, called VisiCalc—the predecessor of Microsoft's Excel. Apple had minimal competition until IBM introduced its own PC in 1981 using Microsoft's Disk Operating System (DOS). In the mid-1980s, Apple was poised to dominate the personal computer market with the highly innovative Macintosh, whose commercial point-and-click graphical user interface (GUI) was vastly superior to Microsoft's primitive, command-oriented DOS operating system.

The advent of the microprocessor, known as a computer chip, transformed the industry by enabling new manufacturers around the world to produce PCs. While Apple carefully guarded the inner workings of its hardware, IBM enabled other PC manufacturers to copy the architecture of its hardware by using standard components and publishing the IBM-PC Technical Reference Manual. New manufacturers flooded the market with inexpensive IBM-compatible PCs known as "IBM clones," equipped with Microsoft DOS, and the IBM/DOS platform became favored by developers of complementary hardware and software products.

Amid this intensifying competition, Apple introduced the Macintosh in January 1984. Apple was not the sole developer of the GUI; in fact, it had borrowed the idea from Xerox. VisiCorp also had its own GUI, called VisiOn, which had inspired Microsoft's CEO Bill Gates to develop a GUI operating system called Interface Manager, later renamed Windows when it was released in November 1985. Meanwhile, Microsoft had been a major software developer for Apple since the late 1970s, creating products like Applesoft Basic. At one point, Microsoft had more employees developing applications for the Macintosh than Apple had employees. Gates became concerned about Microsoft's dependency on Macintosh when its sales stagnated in 1985. Reflecting on the pivotal year of 1985, Gates said in a 1996 speech, "We'd really bet our future on the Mac. We stopped doing DOS application development and did all our work in the graphical-operating environment, so we were very worried in '85 when Mac sales slowed down actually from the first year."1

The story that Gates stole the idea for Windows from Macintosh is a bit unfair. He greatly admired Macintosh and had offered to help Apple make
it become the industry standard, despite the fact that Microsoft was developing its own GUI operating system. In a legendary letter sent to John Sculley on June 25, 1985, Gates urged Apple's CEO to license Macintosh to other computer manufacturers, arguing:

Apple must make Macintosh a standard. . . . But no personal computer company, not even IBM, can create a standard without independent support. . . . The industry has reached the point where it is now impossible for Apple to create a standard out of their innovative technology without support from, and the resulting credibility of, other personal computer manufacturers. Thus, Apple must open the Macintosh architecture to have the independent support required to gain momentum and establish a standard.2
Gates’s memo went on to say that Apple’s proprietary strategy inhibited the PC industry from enhancing Macintosh with new hardware and software applications. Conversely, Gates said that IBM’s open architecture “probably has more than a hundred times the engineering resources applied to it [than Macintosh] when investment of compatible manufacturers is included.” After receiving no reply from Apple, Gates wrote a second memo to Scully, saying, “I remain enthusiastic about the benefits of licensing Mac technology. Currently I think the following companies are the best choices. . . . I want to help in any way I can with the licensing. Please give me a call.”3 Apple believed that its products were the wave of the future and had resisted licensing them. Its second refusal to respond to Gates prompted Microsoft to launch its own GUI operating system called Windows in November 1985. Despite being vastly inferior to the Macintosh operating system, the IBM/Windows platform became the de facto global industry standard for PCs, and, as it cornered the market, Apple’s market share plummeted to around 7 percent in 1995. Apple eventually began licensing Macintosh in 1995, but by then it was too late. The market had already locked onto the IBM/Windows platform, and users and developers were reluctant to switch to a new one based on the combination of the Macintosh operating system and IBM-compatible PCs — even if it were far superior. Furthermore, Microsoft’s Windows 95 had largely caught up with Macintosh, and Apple’s future at the time looked bleak. Reflecting on crucial events in the mid-1980s, Apple cofounder Steve Wozniak commented on Apple’s failed strategy: Apple saw itself as a hardware company. In order to protect our hardware profits, we didn’t license our operating system. We had the most beautiful operating system, but to get it you had to buy our hardware at twice the price.
70
Best Laid Plans
That was a mistake. What we should have done was calculate an appropriate price to license the operating system. We were also naïve to think that the best technology would prevail. It often doesn’t.4
Apple’s management failed to see that the PC industry was rapidly becoming locked in to the IBM/Windows platform as the industry standard. Industry standards provide numerous benefits to almost all players in an industry, including users, manufacturers, service providers, and distributors. A single standard reduces costs and promotes innovation. It was highly inefficient, for example, for software firms to develop applications for both Macintosh and IBM/Windows platforms. It was also highly inefficient for corporations to maintain, support, and train employees on two incompatible PC platforms. A single standard promotes innovation by reducing the risk that investments in a particular standard will be squandered if it becomes extinct. In his book Standards Strategy and Policy, Peter Grindley notes that single standards also benefit an industry through greater connectivity, complementarity, and portability.5 Standards enable multiple users to interconnect with each other using equipment made by different companies. Single standards create a much larger market for complementary products, which increases competition and production scale, lowers prices and training costs, enhances product variety, and increases the number of people available for support. Single standards also make it easier and efficient for users to move their applications to different manufacturers’ products. In the absence of an official, agreed-upon standard, market dynamics can cause a de facto one to emerge. Reinforcing feedback drives the process. As the installed base for a product increases, it becomes that much more valuable to users and manufacturers of the product and its complements. It gains credibility, becomes more portable, reduces investment risk, and attracts more complementary products. The success-breeding-success process can continue until the industry has—in an unplanned way— established the product as the de facto industry standard. Grindley described the lock-in process as follows: “The leader gets a disproportionate number of adoptions and ‘bandwagon’ effects take over. The leader may soon have such a large advantage that all new users choose it, ‘tipping’ support towards the leading standard and sweeping the market.”6 Economists call this reinforcing feedback dynamic that can create a de facto standard a network effect” in which the value of a network—like the IBM/Windows platform—increases with its size. Ethernet founder Robert Metcalfe sought to quantify the network effect in what is called Metcalfe’s
The Bandwagon Effect
71
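The bandwagon dynamic can be made concrete with a toy simulation, sketched below. This is purely illustrative, not anything from Grindley or Metcalfe: the platform names, starting bases, and adoption rule are invented. Each new user picks between two platforms with odds proportional to each platform's network value, taken per Metcalfe's Law as the square of its installed base:

```python
import random

def simulate_lockin(new_users=10_000, seed=0):
    """Toy bandwagon model: each new user picks the platform whose
    network looks more valuable, with value growing as the square
    of the installed base (Metcalfe's Law)."""
    random.seed(seed)
    base = {"A": 51, "B": 50}          # a nearly even start
    for _ in range(new_users):
        value_a = base["A"] ** 2       # network value ~ n squared
        value_b = base["B"] ** 2
        pick = "A" if random.random() < value_a / (value_a + value_b) else "B"
        base[pick] += 1                # success breeds success
    return base

for seed in range(5):
    base = simulate_lockin(seed=seed)
    total = base["A"] + base["B"]
    print(f"seed {seed}: A = {base['A'] / total:.1%}, B = {base['B'] / total:.1%}")
```

Run repeatedly, the sketch nearly always ends with one platform holding essentially the entire market, and which platform wins turns on small early accidents of adoption rather than on any intrinsic superiority, which is the essence of the lock-in dilemma.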
The world needed a single standard for PCs and eventually got one—the wrong one. The standard platform for PCs should have been Macintosh/IBM, as Gates concluded during the summer of 1985, and had Apple agreed to license Macintosh in 1985 with Gates's help, it might have succeeded. The information technology world might be very different today if the entire PC industry had devoted all its resources to enhancing a standard Macintosh operating system that in 1985 was 10 years ahead of Windows 1.0.
When Bedfellows Trigger Lock-In

It is strangely ironic that today's keyboards are laid out in an odd, inefficient format created in 1874 called QWERTY, so named for the first six letters on the top row of keyboards. The story of how QWERTY became locked in as the standard for keyboards begins with Milwaukee newspaper editor and printer Christopher Sholes's efforts in the 1860s to create the first typewriting machine. The crude model he patented in October 1867 had two rows of letters in alphabetical order, which resulted in a serious jamming problem that greatly impeded typing speed. Sholes spent the next six years experimenting with different keyboard arrangements and discovered that the jamming problem could be reduced by separating letters that were frequently typed sequentially, for instance, S and T. Keys frequently used in a sequence would then strike from wide angles as opposed to striking side-by-side, which tended to jam. His final keyboard format is the one used today, with its illogical and obviously inefficient arrangement.

Sholes patented his typewriting machine along with its QWERTY keyboard in 1873 and hired gun manufacturer E. Remington & Sons to make them. By the end of the decade, sales took off and gave birth to the typewriter industry. By 1880, however, the future of QWERTY was in question, as its installed base included only 5,000 machines, and, from 1909 to 1924, competitors patented seven improved keyboards. One called the Ideal, for instance, arranged the letters DHIATENSOR on home row—the resting place for fingers on the keyboard. This arrangement enabled users to type 70 percent of all English words with minimal finger movement, which greatly improved typing speed and accuracy. Furthermore, engineering improvements in typewriters had minimized the key-jamming problem and, thus, the original need for QWERTY.
The biggest challenge to QWERTY came in 1932, when Professor August Dvorak of the University of Washington developed the Dvorak Simplified Keyboard. Dvorak's keyboard increased typing speed by placing all five vowels and the five most frequently used consonants on home row. This enabled users to type about 400 of the most frequently used English words and spend about 70 percent of their time on home row. In contrast, QWERTY users spent only 32 percent of their time on home row, from which they could type only 100 frequently used words. When Apple Computer introduced its IIc model, which allowed users to automatically switch from QWERTY to Dvorak, it claimed that using Dvorak increased typing speed by 20 to 40 percent. Despite these challenges, QWERTY has remained the keyboard standard even to this day, when key jamming is impossible on PCs.

The process that led to QWERTY's dominance began around 1890 with the advent of touch typing, a vast improvement over the traditional four-finger hunt-and-peck typing method. Instructors began teaching touch typing using the QWERTY format, which triggered a bandwagon effect favoring the older keyboard. Corporations in the 1890s were the biggest buyers of typewriters, and they favored machines with QWERTY keyboards because the largest pool of typists had learned touch typing using this format. The majority of aspiring typists, in turn, chose to learn touch typing on QWERTY keyboards to increase their job opportunities. This reinforcing feedback increasingly favored QWERTY and ultimately caused it to become locked in as the de facto industry standard. QWERTY remained locked in because switching to other formats—even more efficient ones—would have been prohibitively expensive. Stanford economist Paul David noted in a famous article titled "Clio and the Economics of QWERTY" that typewriter users and makers today are "held fast in the grip of events long forgotten and shaped by circumstances in which neither they nor their interests figured."7

The lock-in of QWERTY illustrates how complementary goods and services—in this case, touch typing instruction—can trigger the bandwagon effect. The same thing happened during the competition for VCR technologies between Sony and Matsushita. The two firms had banded together to create a single standard for an early version called the U-Matic, which had some successful commercial applications but was too expensive for the mass consumer market. The souring of their U-Matic partnership, however, had serious consequences when Sony and Matsushita developed their own standards for the mass-market VCR.

Sony had developed the first consumer VCR in July 1974 using a technology called Betamax and tried to convince Matsushita to adopt it as an industry standard.
FIGURE 5.1. Annual Production of Videocassette Recorders, 1975–1988 (number of VCRs in thousands; Beta versus VHS, with lock-in taking hold after the introduction of prerecorded videocassettes). Source: Michael A. Cusumano, Yiorgos Mylonadis, and Richard S. Rosenbloom, "Strategic Maneuvering and Mass-Market Dynamics: The Triumph of VHS over Beta," Business History Review 66, no. 1 (1992): 51.
During the meeting, Matsushita failed to mention that it was developing its own VCR using a different technology called VHS. Failing to get a partnership agreement with Matsushita, Sony launched its Betamax VCR in February 1976. In April 1976, Matsushita tried to get Sony to adopt the VHS standard, but Sony refused, claiming that VHS was a knockoff of its Betamax technology. A colossal battle of competing technologies erupted in June 1977, when Matsushita introduced its VHS-based VCR. At the outset, product reviewers had difficulty determining which firm had the better product. Sony's Betamax VCR was more compact and better designed for handling tapes; Matsushita's VHS model had longer playing time. Reviewers disagreed over which firm had the best picture quality. Having entered the market earlier, Sony had an initial sales advantage from 1975 to 1977, but Matsushita caught up in 1978 by charging lower prices.

Betamax and VHS technologies might have coexisted as long as VCRs were used primarily for taping television programs. The rules of the game changed, however, with the advent of prerecorded videocassettes and video rental stores. Prerecorded videocassettes became an important complement to the VCR that triggered a bandwagon effect. Rental stores had to pick one of the two technologies, because it was too costly to maintain duplicate movie inventories for both Betamax and VHS. Rental stores began favoring VHS due to its slightly larger installed base, and the initial success of VHS bred more success to the point where the bandwagon effect rapidly drove Betamax out of the market and established VHS as the locked-in de facto industry standard for VCRs, as shown in Figure 5.1.
Jumping on the Bandwagon

Lock-in also happens in political elections. In fact, the phrase "bandwagon effect" originated during 19th-century political campaigns. Wagons carrying bands were frequently used during parades, circuses, and other public events, and, in 1848, a popular clown named Daniel Rice began using bandwagons to attract attention during political campaigns. Around 1900, politicians began sitting on bandwagons to flaunt their supposed leading positions. The notion of jumping on the bandwagon became a derogatory metaphor for mindless voters supporting politicians merely because the politicians appeared to be winning.

By the mid-20th century, social scientists had coined the term bandwagon effect to describe faddish behavior in consuming goods and services. Former Harvard economist Harvey Leibenstein described the bandwagon effect in a 1950 article as "the extent to which the demand for a commodity is increased due to the fact that others are also consuming the same commodity."8 The more that others wanted the commodity, the more you would too. The bandwagon effect today is frequently used synonymously with lock-in to describe a process in which success breeding success leads to a single dominant player.

The bandwagon effect is ever present in today's political campaigns, and election officials are always concerned that the victors will be those who gained an early lead as opposed to those having the best credentials. This is why news organizations delay reporting election returns on Election Day—to avoid influencing late voters who might favor the candidates who had gained early leads. Of particular concern is that the disclosure of East Coast voting during presidential elections might influence voting in the rest of the country.

Presidential candidates routinely seek to exploit the bandwagon effect by focusing campaign resources on winning the early contests in Iowa and New Hampshire. Such a strategy, for example, helped Senator John Kerry secure the Democratic nomination in January 2004. After months of arduous campaigning in New Hampshire, Kerry ranked a disappointing third in the state's polls conducted just a month before its primary. Furthermore, interviews with potential voters raised the question of whether Kerry was electable. Kerry suspended his New Hampshire campaign to focus on winning the earlier Iowa Caucus. His win in Iowa triggered a bandwagon effect that rekindled his campaign in New Hampshire. A few days after Kerry's Iowa victory, polls showed that he had taken a commanding lead in New Hampshire, receiving 31 percent of voter support, up from a mere 12 percent in the weeks before the Iowa Caucus. Kerry's chief fundraiser, Alan Solomont, noted that, after the win in Iowa, "everything has changed. . . . We're hearing from a lot of people who thought John Kerry's candidacy was dead a month ago, and are now coming back after Iowa."9 Kerry's successful gamble on winning the Iowa Caucus staved off near-certain defeat in New Hampshire and the possible end of his national campaign.
Lingua Franca

It is interesting to speculate whether global communications and technology will have the unintended consequence of locking in English as the world's first global lingua franca, despite the fact that English ranks fourth behind Mandarin, Arabic, and Spanish as a native tongue. Lock-in frequently occurs when a dominant language in a multilingual region becomes commonly spoken by the vast majority of people, which can cause the decline and even extinction of minor languages. People most frequently learn a new language when it is practical or necessary to do so and when it makes them more successful commercially and socially. Wayt Gibbs, senior writer for Scientific American, noted, "Small communities choose, often unconsciously, to switch to the majority language because they believe it will boost their social and economic status."10

A common second language in a region is called its lingua franca—a language commonly used to communicate among persons speaking different primary languages. Sometimes common second languages become so dominant that countries declare them to be the official language for efficient communication in government affairs, legal matters, and educational systems. British linguist David Crystal noted, "Such an [official] language is often described as a 'second language', because it is seen as a complement to a person's mother tongue, or first language. . . . To get on in these societies, it is essential to master the official language as early in life as possible."11 A language is more likely to become the lingua franca in a region when spoken by its elites, as in the case of Latin in the Roman Empire, Spanish in the Americas, and Russian in the former Soviet Union.

A common second language, or lingua franca, has all the benefits of an industry technical standard. It makes it highly efficient for people speaking multiple mother tongues to communicate using a single language versus translating many combinations of primary languages. Knowing the lingua franca increases a person's marketability in the workforce and reduces the costs of switching jobs in different locations. A common language also increases the market for and variety of complementary products like literature and media.

The emergence of English as the world's dominant second language came about through a number of chance events. First, the dominance of the British Empire spread the English language around the world for several centuries. Second, the dominance of U.S. technology, communications, and global commerce in more recent decades has built English into the world's communications infrastructure. Crystal predicted in his 2003 book English as a Global Language that "We may well be approaching a critical moment in human linguistic history. It is possible that a global language will emerge only once . . . after such a language comes to be established it would take a revolution of world-shattering proportions to replace it. . . . All the signs suggest that this global language will be English."12 In reflecting on the lack of a special status for English, Crystal further noted that the locking in of English as the world's lingua franca is unplanned and unintended: "There is hardly any conscious justification for the role of English."13
Around 1950, for the first time in history, a number of global governing organizations emerged, including the United Nations (UN), the World Bank, the United Nations Educational, Scientific, and Cultural Organization, the World Health Organization, and the International Atomic Energy Agency. The UN today has 190 member countries discussing global and regional matters and communicating in so many languages that it requires costly translation services. Similarly, the Atomic Energy Agency spends half its budget on language translation. It may, in fact, be next to impossible to translate every combination of so many different primary languages. Increasingly, these global organizations use English as a standard second language. A 1996 survey of over 12,000 international organizations indicated that 85 percent of them used English as an official language and that 90 percent of those from the Asia Pacific region exclusively used English. Although the European Union initially used French as its common language, the addition of the United Kingdom and English-competent Denmark, Finland, and Sweden tipped the balance in favor of using English. The further addition of Eastern European countries accelerated the process, as people from this region were much more likely to speak English than any other European language.

English is quickly becoming the lingua franca of global commerce and multinational firms, even those domiciled in countries with native languages other than English. The Organization of the Petroleum Exporting Countries conducts its meetings in English. English is the official standard language for airline pilots and ground controllers. All major science publications are written in English. A 2004 survey found that 80 to 90 percent of scientific journals are written in English, up from 60 percent in the 1980s. Scientists publishing their work in languages other than English are greatly disadvantaged. As Alistair Wood, professor at Vanderbilt University, noted, "For a scientist to publish in a journal other than English therefore is increasingly to cut herself off from the worldwide community of scientists who publish in English. The work may be ignored simply because it is published in a language unknown to the rest of the world."14

English heavily dominates communications media. English-speaking countries publish 57 percent of the world's newspapers, and the world's top five newspapers are written in English: the New York Times, the Washington Post, and the Wall Street Journal from the United States and the Times and the Sunday Times from the United Kingdom. Major journals from non–English speaking countries like Germany's Der Spiegel have created English-language editions and Web sites to have a greater impact on global debate, garner more international advertising revenues, and gain greater access to world leaders.
The fact that English-speaking countries invented modern mass media has further propelled the dominance of English. The United States made the first public radio broadcast in 1920. Hollywood introduced sound to films in the 1920s, and English has dominated film ever since. By the mid-1990s, U.S. film studios accounted for 85 percent of global film revenues, and 97 percent of actors and actresses in the world performed in English. Ninety-nine percent of popular music groups around the world frequently perform in English.

The use of the Internet is further locking in English as the world's lingua franca. Although the U.S. Department of Defense developed the Internet strictly for military purposes, it evolved into a global platform enabling the world to freely communicate, with English being the language most commonly used. Although the Internet is used throughout the world, about two-thirds of its traffic emanates from the United States. Japan is a distant second, generating 7 percent of Internet traffic. One study estimated that 80 percent of the world's electronic information is composed in English. As Michael Specter, writer for the New Yorker magazine, noted, "If you want to take full advantage of the Internet, there is only one way to do it: learn English."15

A quarter of the world's population today can communicate in English, and another one-third is learning to do so. Three hundred fifty million Asians can speak English, which exceeds the combined populations of the United States, the United Kingdom, and Canada. Sixty percent of young Europeans are competent English speakers, and their second language is five times more likely to be English than French or German. English has official status in the governments of 70 countries, including India, Singapore, and Ghana. India, for example, has 20 official first languages and more than 1,000 spoken ones and officially uses English to run its government, educational system, and commerce. Even jihadists speak English to communicate with non-Arabs.

What might stop English from becoming the world's lingua franca? One possible answer is that no language has ever achieved this status before. Latin, for example, was Europe's lingua franca, but its use as a common second language declined after the 15th century. Some linguists argue, however, that history is a poor predictor in this case, because English is too widespread and too embedded in global technology, science, education, governance, and commerce. John McWhorter, a linguist at the Manhattan Institute, a research group, commented, "English is dominant in a way that no language has ever been before. It is vastly unclear to me what actual mechanism could uproot English."16 Linguist David Crystal estimated that it would take something from the realm of science fiction to reverse the lock-in of English, such as a cataclysmic event or a new translation technology that effortlessly permits people to communicate in thousands of mother tongues.
Resegregation

A lock-in process created the ironic situation in which blacks who left the segregated South became resegregated in northern cities. Before the 20th century, 90 percent of African Americans lived in the segregated South. Deteriorating economics, social adversities like job discrimination, and horrific events like lynchings prompted a mass migration of blacks in the early 20th century to northern cities. A second and larger northward migration of 5 million African Americans occurred between 1940 and 1970. University of Chicago sociologist Morton Grodzins theorized that this second migration was the unintended consequence of the federal government's efforts to enforce civil rights laws in the South: "The 'push' from the South may grow stronger as the consequence of growing white antagonisms following attempts to enforce the Supreme Court's non-segregation decisions."17

As southern blacks migrated to northern cities, the predominantly white urban population left en masse for the suburbs, leaving blacks once again segregated. One theory for this departure is called white flight. Grodzins described the white flight dynamic in his 1958 book, The Metropolitan Area as a Racial Problem. He theorized that predominantly white neighborhoods would stay integrated as long as the percentage of nonwhites was relatively low. But once a threshold of nonwhites was reached, whites would leave in droves, which would create an all-black neighborhood that would stay that way for the foreseeable future. Grodzins called this threshold the tipping mechanism and described it as follows: "Once the proportion of non-whites exceeds the limits of the neighborhood's tolerance for interracial living (this is the 'tip point'), the whites move out."18 Interestingly, Grodzins was the first to introduce the notion of tipping points, which Malcolm Gladwell popularized four decades later in his 2000 best-selling book The Tipping Point: How Little Things Can Make a Big Difference.
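Grodzins's tip point is, at bottom, a threshold rule, and a few lines of code can show how abruptly it behaves. This is a minimal hypothetical sketch, not Grodzins's actual model: the 30 percent tolerance and the 25 percent per-period departure rate are invented numbers chosen only to illustrate the mechanism.

```python
def tip(nonwhite_share, tolerance=0.30, leave_rate=0.25, periods=20):
    """Threshold model: once the nonwhite share of a neighborhood
    passes the tolerance threshold, a fraction of white households
    leaves each period and is replaced by nonwhite households,
    pushing the share still higher."""
    shares = [round(nonwhite_share, 3)]
    for _ in range(periods):
        if nonwhite_share > tolerance:                  # the "tip point"
            white_share = 1.0 - nonwhite_share
            nonwhite_share += white_share * leave_rate  # reinforcing feedback
        shares.append(round(nonwhite_share, 3))
    return shares

print(tip(0.25))  # below the tip point: the share holds steady at 25%
print(tip(0.32))  # just above it: the neighborhood tips toward 100%
```

Below the threshold nothing happens; just above it, the same mechanics drive the neighborhood toward complete resegregation. A small difference in starting conditions, not a gradual shift, separates the two outcomes.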
Critics of the white flight theory offered an alternative explanation for the departure of urban whites: the massive suburban development after World War II lured them into leaving cities for safer, cleaner, and roomier environs with better schools and lower taxes. A number of employers also left cities for the suburbs, so urban whites may have left cities to find jobs. Eric Bickford from the University of California, Berkeley, noted, "One cannot necessarily conclude that the black/white–city/suburb dichotomy arose purely on racial grounds. It is entirely possible that nonwhite concentration is simply a result of large-scale white suburbanization, not the cause."19

Both notions of white flight and its tipping points have been controversial. Professor William Frey from the University of Wisconsin concluded in 1979, for example, "White flight is an inappropriate description of the suburbanward movement of city whites."20 Other critics have called Grodzins's tipping points a myth. Although the reason for the massive departure of urban whites to the suburbs remains controversial, Figure 5.2 clearly indicates the lock-in of resegregated blacks in northern cities like Detroit.

Another episode of resegregation occurred with the use of forced busing to integrate public schools in the United States. Although the U.S. Supreme Court outlawed segregated schools in 1954, the practice implicitly persisted because students attended local schools in their segregated neighborhoods. In the 1970s, however, federal courts sought to enforce integration by mandating the busing of students to schools outside their neighborhoods. A 1973 Gallup Poll showed that most U.S. citizens were comfortable sending their children to integrated schools but were strongly opposed to busing. Seventy-three percent of northern white parents indicated they had no objection to sending their children to a school that was half black; the figure for southern whites was 64 percent. The same Gallup Poll reported that only 4 percent of white parents and 9 percent of black parents approved of busing as a means of integrating schools.

Forced busing had the unintended consequence of resegregating schools as white students left urban public schools. Busing prompted many urban white families to move to the suburbs, and those that remained enrolled their children in private and parochial schools or used relatives' addresses to enroll their children in suburban schools. The process continued until urban schools became locked in as predominantly black. The failure of busing to integrate urban schools prompted courts to largely abandon the controversial program in the early 1990s.

The unintended consequences of busing were especially acute in Boston, where racial conflict erupted into violence. Before busing, Boston's public schools were about 60 percent white and 30 percent black, but schools were highly segregated due to Boston's racially distinct neighborhoods. In 1974, U.S. District Court Judge W. Arthur Garrity Jr. ruled that Boston's schools were unconstitutionally segregated and ordered the forced busing of students across the city's highly segregated neighborhoods.
FIGURE 5.2. The Lock-In of Resegregation in the City of Detroit, 1900–2000 (percent of population, white versus African American, by decade). Source: "American Fact Finder," U.S. Census Bureau, Washington, D.C., 2010.
Judge Garrity became so involved in enforcing the busing program that he became known as the de facto superintendent of Boston schools for eight and a half years. Particularly ill-fated was his decision to integrate two of Boston's most ethnically distinct and poorly performing schools: the predominantly poor, Irish South Boston High School and the predominantly black Roxbury High School, which was located in a ghetto. Dozens of racial incidents erupted at South Boston High School, forcing its closure. It reopened a month later with metal detectors and 500 police officers patrolling its halls. Known for its early support of abolition in the 19th century, Boston ironically acquired a reputation for bigotry, hatred, and racial conflict.

Judge Garrity's efforts to integrate Boston's schools backfired, causing the schools to become predominantly black. Figure 5.3 shows that, although white students were already leaving Boston schools before 1974, possibly as part of the general migration of whites to the suburbs, busing greatly accelerated the process. Boston's schools became locked in to a resegregated state, transformed from 60 percent white to 14 percent white in a city whose overall population had remained 55 percent white.

Boston's busing program had further unintended consequences. Boston's school enrollment during the 1974 to 1990 busing era dropped from 93,000 to 57,000 students, which forced the closure of 78 schools. At the end of the busing era, Boston's schools ranked 275th out of 279 Massachusetts cities on SAT scores. In 1999, former Boston mayor Raymond Flynn reflected on the busing years: "I only wish that the decision [Judge Garrity] made about the Boston public schools didn't disrupt the education of our children and the stability of our city as it did. In my opinion, public schools in Boston still haven't recovered from the decision."21
Lessons from Lock-In

Lock-in causes extreme outcomes and is fraught with uncertainty and paradox. In conflicts between people, ideas, tools, technologies, cultures, and other social entities, the victor wins with the unconditional surrender of the vanquished. Lock-in can be hard to anticipate, especially for those who believe that they have the superior product, idea, technology, or practice. Early events can unwittingly trigger lock-in and ironically cause superior ideas to lose out to inferior ones. Failure to share leading ideas can cause their owners to lose out, and sometimes sharing proprietary ideas is the best strategy to avoid getting locked out of the contest.
FIGURE 5.3. Resegregation in Boston's Public Schools, 1963–2000 (number of white and African American students in all grade levels, in thousands, with court-ordered busing marked in 1974). Sources: Boston Facts and Figures 1990 and 2002, Boston Municipal Research Bureau, and "Boston's Experience with School Desegregation and White Flight," Massachusetts State Department of Education, 1977.
The first step in avoiding the lock-in dilemma is to recognize situations where it is likely to occur. Consider, for example, whether the world would benefit from a single standard. Would such a standard increase efficiency, expand markets, and reduce risks? Is there a potential network effect? Do people need to freely exchange information? Are there complementary products that would greatly benefit from a single standard? Would people greatly benefit from a standard product when they change jobs, locations, or manufacturers?

If lock-in seems possible, it is next important to consider which strategy to pursue. It might be desirable to negotiate with competitors to develop an agreed-upon industry standard, as Bill Gates sought to do for Macintosh. If this fails, you can either follow industry developments and adopt emerging de facto standards or seek to make your product the de facto standard. The latter would be a bold plan with uncertain outcomes.

Peter Grindley suggests a number of tactics to maximize success in pursuing the de facto standard strategy. First, make it easy for all parties to jump on your product's bandwagon by charging low prices and licensing fees. Share the product's inner workings so that others can copy it or make complementary products for it. Conversely, keeping one's product proprietary is a losing tactic, because it impedes adoption and reflects an unrealistic expectation of having your cake and eating it too. Such a tactic may also be impossible. As Grindley noted, "Protecting innovations from copying is notoriously difficult."22 Next, it is important to increase your installed base in order to "establish a 'critical mass' of users more quickly than competing standards and start the network [bandwagon] effects working for the product."23 This involves strong promotion, penetration pricing, and forming alliances with distributors, competitors, and makers of complementary products. Establish credibility to overcome users' fears by demonstrating commitment to the product, opening its architecture, and using the support of alliance partners. Timing is also essential during the early stages, because small events can tip the balance in the lock-in process in many uncertain ways. "There are usually many possible outcomes depending on what happens in the early stages," noted Grindley.24 Finally, it is more important to get an adequate product to market as early as possible than to spend more time perfecting it.

Adobe's strategy for making its Acrobat software an industry standard exemplifies this approach. Acrobat creates and reads Portable Document Format (PDF) files that enable users to share documents of all types in their original format without converting them. Adobe created Acrobat in the early 1990s to solve its internal file-sharing problems and then began commercializing it. Adobe's initial success was impeded by people's habit of storing paper files and by a number of existing competing products, such as Microsoft Reader. Adobe's senior management concluded that it needed a bold strategy to gain significant market share and decided to make Acrobat easily downloadable for free, dropping its original price of $50. It made Acrobat's code available to outside parties and formed alliances with some of the industry's most influential firms, including Microsoft, AOL, and Google. Also, Adobe quickly launched early versions of Acrobat followed by many subsequent releases to improve it. Adobe's strategy made Acrobat the de facto industry standard, allowing it to gain 300 million users by 2002 and to become "one of the world's most widely used software applications," according to Monitor Group partner Bhaskar Chakravorti.25

These tactics also can be used in implementing best-practice procedures in large organizations, a process usually fraught with many obstacles. Often, many practices in large organizations compete to accomplish a particular task. The widespread implementation of a new practice requires tactics to influence the organization to lock in to the new one and displace existing competing procedures. Tactics for establishing an industry standard can help with the implementation of standard procedures. Such tactics might include efforts to do the following:

• Launch the procedure into the organization as soon as possible without delaying to perfect it.
• Make it easy to use, copy, adopt, and/or incorporate into other procedures or systems.
• Build the installed base employing promotional and other tactics.
• Establish credibility with senior management sponsorship and internal alliances.
Grindley’s advice for products also pertains to procedures: “The main priority is to establish a ‘critical mass’ of users more quickly than competing [procedures] and start the network effects working for the [procedure].”26 If you are successful, your efforts will trigger a bandwagon effect that enlists an ever-growing population of users. At some point, adoption throughout an organization will reach a threshold, and your idea will become locked in as a de facto standard. Your initial success will breed more success.
Chapter 6
The Balance of Nature

Earth Goddess

In 1972, British scientist James Lovelock published a theory that living organisms were solely responsible for making Earth habitable: "Earth's biosphere, atmosphere, oceans, and soil [created] an optimal physical and chemical environment for life on this planet."1 Lovelock named his theory the Gaia Hypothesis after the Greek earth goddess. Lovelock conceived his theory while working on a NASA project to determine whether there was life on Mars. He first contemplated the composition of Earth's atmosphere and concluded that it contained far too much oxygen. Oxygen is a highly reactive element and, under normal conditions, should have rapidly combined with nitrogen and carbon and virtually disappeared. Lovelock theorized that "Earth's highly improbable atmosphere was . . . being manipulated on a day-to-day basis . . . and that manipulator was life itself."2

The scientific community initially called Lovelock a fool and his Gaia Hypothesis a spiritual explanation of natural phenomena. Lovelock's biggest problem was that he could not explain how Gaia actually worked. Nearly two decades later, however, Lovelock was vindicated when scientists discovered a plankton-climate connection. They found that single-celled ocean-dwelling organisms called plankton triggered cloud formation by emitting dimethyl sulfide (DMS), which caused light-reflecting clouds to form, which had a cooling effect on Earth's climate. Whenever Earth got too hot, the plankton proliferated and emitted more DMS, which would create more clouds that cooled the planet. Conversely, whenever Earth got too cool, the plankton population would decrease and emit less DMS, which decreased cloud cover, thereby heating up the planet by permitting more solar radiation to reach its surface. In 2000, the scholarly journal Nature proclaimed that the Gaia Hypothesis had "emerged with some respectability," but it was given a more acceptable name: Earth Systems Science.3

The plankton-climate connection worked like a household furnace thermostat. When your house gets too cold, the thermostat turns on the heat; when the house gets too hot, the thermostat shuts off the heat.
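The thermostat loop can be sketched in a few lines of code. This is illustrative only; the 68-degree setpoint, the correction gain, and the 12-hour horizon are arbitrary choices, not measurements of any real system.

```python
def thermostat(temp, setpoint=68.0, gain=0.3, hours=12):
    """Balancing feedback: each hour the furnace (or its absence)
    moves the temperature a fraction of the way back toward the
    setpoint, so deviations shrink instead of compounding."""
    readings = [temp]
    for _ in range(hours):
        error = setpoint - temp       # how far conditions have strayed
        temp += gain * error          # the correction opposes the deviation
        readings.append(round(temp, 1))
    return readings

print(thermostat(58.0))  # cold start: climbs toward 68 and levels off
print(thermostat(78.0))  # hot start: falls toward 68 and levels off
```

The signature of a balancing force is that the correction always opposes the deviation, so disturbances die out over time rather than compounding as they do under the reinforcing feedback of chapters 3 and 4.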
The household thermostat is a simple example of nature's balancing forces that stabilize environments. While the mechanisms discussed in chapters 3 and 4 amplify events, balancing forces modulate them. Balancing forces cause nature to be self-correcting. Whenever events deviate from the norm, balancing forces restore order and create a steady state. Balancing forces, in fact, typically keep reinforcing ones from getting out of control.

Balancing forces are the central mechanism in the balance of nature, which maintains the populations of plants and animals in Earth's ecosystems. Whenever a species becomes too numerous, balancing forces come into play in the form of starvation and predation to drive the population back to sustainable levels. A surge in the population of groundhogs, for example, provides more food for coyotes, which become more prolific and, in turn, cull back the surplus rodents. Once the surplus groundhogs are gone, the coyote population returns to normal as it runs out of excess food.

Thousands of balancing forces maintain the stability of animal bodies by controlling their temperatures, metabolic rates, blood chemistry, and numerous other conditions. Several balancing forces, for example, maintain our body temperature at around 98.6 degrees Fahrenheit. When we become overheated, blood vessels near our skin dilate, increasing the blood flow near the body's extremities, where it can shed excess heat. Simultaneously, sweat glands secrete water onto our skin, where it evaporates and cools the body. Conversely, when we become too cold, our muscles shiver to increase our metabolic rate, and shivering generates heat. Our brains also make us conscious of being too cold or too hot, prompting us to adjust our clothing. Other balancing forces make us feel hungry or tired when we need food or sleep. Biologists call the stable conditions maintained by balancing forces in organisms and ecosystems homeostasis.

Balancing forces also help maintain order in many aspects of our social world. Balancing forces in economies, for example, regulate things like the prices of goods and services, currency rates, and international trade balances. In his seminal 1776 book, The Wealth of Nations, Scottish philosopher Adam Smith described how self-regulating forces work as if guided by an "invisible hand." The scarcity of a commodity, for example, causes its price to increase, which makes it more profitable, which stimulates increased production that eliminates the scarcity. If a firm charges too much for its product, competitors will seek to lure away its customers by charging less, which forces the firm to fall back in line with normal market prices.
Balancing forces are clearly evident in maintaining order in international trade. Assume, for example, that a trade imbalance arises between countries A and B because country A sold more goods to B than A bought from B. In the process, country A will have acquired a surplus of country B's currency, which causes the value of B's currency to fall relative to A's. This makes B's exports cheaper for a citizen of A to buy and, conversely, makes A's exports more expensive for citizens of country B. All things being equal, country B should then see an increase in its exports to A, and A should end up exporting less to B, thereby restoring the balance of trade. The drop in the value of the dollar against the euro during the first decade of the 21st century is an example of this phenomenon. According to economic journalist Robert Gavin, the decline of the dollar against the euro has helped "hard-hit manufacturers" in the United States by "making their products cheaper when sold in Europe."4

Economies also have balancing forces that maintain their stable growth by moderating booms and recessions. Since 1950, for example, the United States has experienced 10 recessions lasting from 6 to 16 months before recovery began. Rising inflation and interest rates, for example, are balancing forces that slow down an economy that is growing too quickly. Increased productivity and depleted inventories are balancing forces that help stimulate an economy in recession.

Balancing forces can cause unintended consequences when our best laid political, social, economic, and business plans become too extreme. A powerful nation wielding too much influence, for example, can lose its dominance when its adversaries band together in alliances to thwart its ambitions. Alliance formation as a balancing force can similarly stem the ambitions of tribes, political parties, corporations, and individuals seeking too much power.
The Balance of Power

International alliance formation is a self-regulating balancing force that brings a modicum of order to international relations—and unintended consequences for individual players. In fact, one of the mainstays of international political thought is called the balance-of-power theory, which states that nations—following their own self-interest—will ally with other countries for security purposes, giving rise to an unplanned balance of power. The Journal of Peace Research defines the balance-of-power theory as follows: "The main objective of states is to secure their own safety. . . . States, by trying to avoid the dominance of one particular state, will ally themselves with other states until equilibrium is reached."5

Kenneth Waltz, a former political science professor at Columbia University and author of Theory of International Politics, is considered a key founder of the balance-of-power theory. He noted that the balance of power is a dynamic and resilient force, driven by balancing forces: "International political systems are remarkably stable. . . . Once disrupted, [the balance] will be restored in one way or another."6 Professor of political science Robert Jervis, also from Columbia University, explained: "States do not strive for balance [of power] . . . restraint and stability arise as ambition checks ambition and self-interest counteracts self-interest. . . . In a way analogous to the operation of Adam Smith's invisible hand, the maintenance of the system is an unintended consequence of states seeking to advance themselves."7

Although states prefer to ally with like-minded states, once threatened, they will ally with almost any other state that helps preserve their independence. As Waltz noted, "If pressures are strong enough, a state will deal with almost anyone. . . . States will ally with the devil to avoid the hell of military defeat."8 Indeed, the popular expressions "politics makes strange bedfellows" and "the enemy of my enemy is my friend" capture this sentiment. A seemingly improbable alliance, for example, was Pakistan's tacit support of the Taliban Islamic extremists when they ruled Afghanistan, an alliance intended to mitigate archenemy India's influence in the region. Another assortment of strange bedfellows is the alliance between Iran and Venezuela to counter U.S. strength.

The power and dynamism of alliance formation is especially evident in the fractious and volatile Middle East. In his book The Origins of Alliances, Harvard professor of international relations Stephen Walt studied the 33 treaties signed by Middle Eastern countries from 1955 to 1979.9 During this period of regional conflict and superpower meddling, Middle Eastern countries frequently switched loyalties to preserve their independence, as follows:

• Egypt and Iraq were allies twice and enemies four times.
• Egypt and Saudi Arabia were allies five times and enemies three times.
• Egypt and Syria were allies eight times and enemies four times.
• Egypt and Jordan were allies five times and enemies four times.
• Iraq and Saudi Arabia were allies twice and enemies once.
• Iraq and Syria were allies three times and enemies three times.
• Iraq and Jordan were allies four times and enemies once.
• Syria and Jordan were allies once and enemies three times.
The wielding of power by a dominant aggressor can have the unintended consequence of losing all power due to the formation of strong military alliances to counter it. As Japan sought to dominate Asia in the 1930s and 1940s, the United States allied with China and other Asian countries to stop it. When Germany sought to dominate Europe and Africa during the 1930s and 1940s, England, the Soviet Union, and the United States allied to stop it. By exerting their military ambitions, both Japan and Germany lost their dominance. Similarly, much of Europe eventually banded together to stem Napoleon's excessive military ambitions during the early 19th century, which cost him his reign and his freedom as his enemies exiled him for life on the tiny remote island of St. Helena in the Atlantic Ocean. France also lost its position as a leading military power in Europe.

The balance-of-power concept driven by military alliances is as old as history itself. Regarded as the father of history in Western societies, the fifth-century b.c.e. Greek historian Herodotus documented how the Greeks twice repelled Persian invasions, in 490 and 479 b.c.e., by forming alliances among otherwise fiercely independent and adversarial Greek city-states. This was especially true of the alliance between Athenian and Spartan rivals. After conquering much of Greece in 479, Persian general Mardonius offered a truce and alliance with Athens. Athens refused the offer and requested help from Sparta to defeat the Persians. The Spartans were disinclined to help their Athenian rival and instead were fortifying the narrow Isthmus of Corinth to keep the Persians from invading their Peloponnesian homeland. An influential person named Chileos from neighboring Tegea, however, convinced the Spartans to ally with Athens by warning that "If the Athenians are not our friend, but league themselves with the [Persian] barbarians, however strong our wall across the Isthmus may be, there will be doors enough, and wide enough open too, by [which] the Persian may gain entrances to the Peloponnese."10

After the Greek alliance defeated the Persians, Athens created a powerful empire consisting of many subject states throughout the Aegean. Flaunting its power caused Athens's demise during the fifth-century b.c.e. Peloponnesian War with Sparta and its allies. Born after Herodotus, the fifth-century b.c.e. Greek writer Thucydides is known as the father of scientific history for his assiduously fact-based account of the war between Sparta and Athens. Thucydides used notions similar to the balance-of-power theory to explain how Athenian excess power and influence prompted Sparta to form alliances that helped it defeat its rival. Athens's downfall came when it was duped into waging an ill-fated war in far-off Syracuse on the island of Sicily. Athens's defeat in Syracuse demonstrated its military vulnerability, which emboldened its Aegean allies to defect. With its empire in ruins, Athens was conquered by Sparta in 404 b.c.e.

The United States similarly lost political influence when the George W. Bush administration decided to impose its might against Islamic extremists and expected the rest of the world to fall in line with its lead.
The unintended loss of power came not from countries allying against the United States but rather from its inability to enlist the help of trusted allies in its war on terror. With the collapse of the Soviet Union in 1991, the United States' former allies against communism were free to pursue their own interests, and continued alliances with these countries had to be earned through diplomacy and statecraft, not acquired by fiat.

The world expressed great sympathy for the United States immediately after the September 11, 2001, terrorist attacks on New York City and Washington, D.C., but the sympathy quickly dissipated when the Bush administration launched an unprovoked invasion of Iraq. In his State of the Union address in January 2002, President Bush declared, "I will not wait on events, while danger gathers; I will not stand by, as peril draws closer and closer. The United States of America will not permit the world's most dangerous regimes to threaten us with the world's most destructive weapons."11 He declared Iran, Iraq, and North Korea—countries that had no involvement in the September 11 terrorist attacks—an "axis of evil" and pledged to rid them of their weapons of mass destruction. A troubled Senator Robert Byrd reacted to the address as follows: "The White House displayed, across the board, a complete lack of interest in pursuing opportunities for international efforts. It followed that a 'go-it-alone' Bush team preferred to fly solo in a global war on terror."12

President Bush's unprovoked, preemptive invasion of Iraq had disastrous consequences for U.S. standing in the world. It became evident to the world that the U.S. invasion was predicated on two falsehoods: that Iraq had weapons of mass destruction and that it was linked to the September 11, 2001, terrorist attacks. The United Nations and most of the United States' strongest allies refused to support the Iraq invasion, and those that participated—like England and Spain—did so in the face of domestic opposition that ultimately forced them to reduce their troops in Iraq or pull them out altogether. Senator Byrd believed that the Iraq war had no basis in existing law and presciently saw that it was likely to create "unintended consequences which might make the world a vastly more dangerous place as countries scrambled to acquire nuclear weapons and long-range missiles to deter the new trigger-happy United States from unprovoked attacks."13 This, in fact, is precisely what Iran and North Korea have done since the invasion of Iraq.

Although the Bush administration initially sought to establish good relations with Russia, its subsequent actions unnecessarily led to potentially dangerous conflict between the two countries. The plan to station a missile defense system in Eastern Europe prompted Russia to threaten to move its missile launchers to its European borders.
The inclusion of former Soviet allies from Eastern Europe in NATO and U.S. support for Georgia in its military conflict with Russia further strained the relationship. Russia's former president Vladimir Putin warned about the United States' show of power in his 2007 speech at the Munich Security Conference: "The global dominance of the United States is unacceptable. It leads to war and mass slaughter. Today we are witnessing an almost uncontained use of force, military force in international relations, force that is plunging the world into an abyss of permanent conflicts. We are seeing a greater and greater disdain for the basic principles of international law."14

The Bush administration's refusal to negotiate with U.S. adversaries prompted them to become better armed and more dangerous. Former vice president Richard Cheney claimed that the Bush administration does not "negotiate with evil, we defeat it."15 Fred Kaplan, author of Daydream Believers, summed up the Bush approach to foreign relations as thinking that the United States "could do pretty much as they pleased: issue orders and expect obeisance, topple rogue regimes at will, honor alliances and treaties when they were useful, and disregard them when they weren't."16 A 2009 C-SPAN survey of 64 historians ranked George W. Bush 36th out of 42 former U.S. presidents with respect to international relations.

The consequence of Bush's belligerence in dealing with foreign countries was diminished U.S. influence in the world. According to the Pew Research Center's annual Global Attitudes survey, the U.S. favorability rating dropped precipitously by the end of the Bush administration everywhere in the world except Africa.17 Turkey, which bitterly opposed the war with its Iraqi neighbor, gave the United States a 9 percent approval rating. An important ally during the Cold War and an important connection to the Islamic world, Turkey refused to let the United States use its neighboring territory to invade Iraq, which seriously constrained the U.S. invasion to Iraq's southern border.

Iran and North Korea, the remnants of the axis of evil, pose much more of a threat to the United States and the rest of the world after the Iraq invasion. According to author James Carroll, "The main outcome of the American effort in Iraq is the empowerment of Iran."18 The Iraq War has made Iran a dangerous and unpredictable regional superpower with nuclear and ballistic missile ambitions and considerable influence over the Shiite Muslims who constitute the majority of the Iraqi population. Iran has continually condemned and threatened Israel and is suspected of having armed terrorists in Lebanon and Gaza. The consequences of either Iran attacking Israel or, through its ominous threats, prompting Israel to conduct a preemptive strike would be disastrous.
The consequences of Iran either attacking Israel or, through its ominous threats, prompting Israel to conduct a preemptive strike would be disastrous. Similarly, North Korea is thought to have developed long-range missiles and enough fissionable material for about six nuclear weapons, and it continually threatens South Korea.

Whenever a country wields too much power, its threatened rivals can be expected to band together to oppose it. The same thing happens when dominant political parties push through their legislative agendas.
The Tea Party and the 41st Senator

Balancing forces maintain order in democratic political systems and work like the balance of power on a smaller scale. In democracies, political power swings back and forth to squelch political extremes and ensure, however roughly, that elected officials align their votes with the desires of the general populace. Whenever a ruling party strays too far from popular sentiment, it begins losing power. A good example of the balancing forces in U.S. politics is the emergence of the Tea Party in 2009 and the election of Scott Brown to the U.S. Senate in January 2010.

The U.S. Democratic Party's effort to push through its controversial health care legislation in the fall of 2009 had the unintended consequence of causing it to lose its filibuster-proof dominance in Congress. This happened in January 2010 with the election of Republican Scott Brown from Massachusetts, who had been supported with substantial financial contributions from outside his home state.

For decades, health care reform has been controversial and politically dangerous. The medical community, pharmaceutical companies, health insurers, and conservative voters have been strongly opposed to what they see as further incursions of big government into their lives and out-of-control federal spending. The emergence of the cantankerous Tea Party in 2009 is an example of a balancing force to counter President Obama's health care reform initiatives and Congress's massive spending on bailing out failing firms and stimulating the economy. The Tea Party gets its name from the Boston Tea Party, the famous protest against taxes levied by the British on tea consumed in the American colonies. The name also serves as a convenient acronym for "taxed enough already." The new movement employs disruptive tactics at meetings to send the message that its supporters oppose what they claim is Washington's socialist agenda.

Amid this political turmoil, voters in Massachusetts elected Republican Scott Brown in a special election in January 2010 to replace Senator Ted Kennedy, who had died in office.
A dark horse candidate, Scott Brown was a little-known state senator who was expected to lose to his heavily endorsed Democratic opponent, Martha Coakley. Brown, however, campaigned pledging to be the 41st senator who would enable the Republicans to block President Obama's health care reform bill. According to the Boston Globe, Brown "campaigned as a potentially decisive vote in the Senate who could slow down or derail Democratic initiatives."19

Brown's campaign pledge caught the attention of conservative voters throughout the United States and enabled him to raise a staggering $41.2 million in campaign funds in the 19 days before the January 19, 2010, election. This money came largely from out-of-state contributors, many of whom were seeking to defeat the president's health care bill. According to the Associated Press, 75 percent of the campaign contributions over $1,000 came from outside of Massachusetts.20 Furthermore, some of the largest donors were executives from JP Morgan Chase, Goldman Sachs, Morgan Stanley, and other financial firms faced with the imminent imposition of tighter regulations from the Democratic-controlled Congress.

The Tea Party strongly supported Scott Brown's election, which was especially surprising given that the state senator was a pro-choice, liberal Republican. Tea Party activists from around the country contributed funds to his campaign and filled their blogs with pleas to support Brown and oppose the Democratic health care bill, like "Stop Obamacare!!!" Rhode Island Tea Party president Colleen Conley predicted, "If Brown wins this election, it will be the shot heard around the world. This will be a clear indictment of the Obama Presidency and the Democratic Congress overreaching."21

Scott Brown won the election on January 19, 2010, by five points to become the 41st Republican senator, seemingly poised to stop the president's health care initiative. It is interesting to speculate, however, whether the election of Scott Brown as the 41st senator in fact emboldened congressional Democrats to pass the health care bill in March 2010 using a special provision called reconciliation, which enabled them to pass the bill with a simple majority.

Balancing forces were also evident during the Clinton and George W. Bush administrations. At the beginning of his administration, President Clinton pushed programs that his critics called too liberal, which caused the Democrats to lose control of Congress in the 1994 midterm elections. With Republican control over Congress, the Bush administration pushed an overly conservative political agenda, including tax rebates for wealthy citizens. But Republican control in the Senate hinged on one vote, and progressive Republican Senator James Jeffords of Vermont left the party to become an independent, tossing control of the Senate to the Democrats. In his speech to Vermonters, Jeffords said, "I have changed my party label, but I have not changed my beliefs."22
Dick Morris, President Clinton's campaign advisor, claims that Clinton and Bush made the same mistake early in their presidential terms, during which their parties ruled the government. "It is the risk of one-party rule," he said. "They get so tempted to pass their partisan agendas while they can that both Clinton and Bush created a situation in which the voters want to . . . check and balance their extreme proposals."23

Balancing forces in the form of political alliance formation may be greater than the will of individual politicians. As Professor Glenn Snyder from the University of North Carolina noted, "Individuals in domestic politics sometimes had major effects on alliance formation, more often they were dominated by . . . systemic effects."24 Although citizens frequently complain about government inefficiency and corruption, democracies are self-regulating systems that roughly deliver what the majority of their citizens desire. According to biologist Stuart Kauffman, who specializes in complex systems, "The apparently disjointed, opportunistic, fractured, argumentative, pork-barrel, swap, cheat, steal-a-vote, cluttered system may actually work pretty well because it is a well-evolved system to solve hard, conflict-laden problems and finds, on average, pretty good compromises."25

The process of alliance formation applies to business as well as politics.
The Great Race for Cyberspace

Google's efforts to dominate Internet space have had the unintended consequence of prompting major corporations to band together to thwart its progress. This alliance formation, prompted by Google's strategic threats, is another example of balancing forces that come into play to restore order—in this case, within industries. Corporate alliance formation is closely analogous to countries banding together, as described in the balance-of-power theory, to ensure their survival. Both countries and corporations will band together to ward off threats posed by a more dominant rival. Google's aggressive moves will no doubt spawn many unexpected alliances among threatened rivals.

By 2010, Google had come to dominate the search-engine advertising business with 86 percent market share, over $23 billion in sales, and a remarkable 37 percent profit margin. Google had rapidly introduced new software and aggressively made acquisitions to extend its reach into numerous applications in its quest to dominate Internet space. In late 2009, Google's chief executive officer Eric Schmidt announced plans to make one small acquisition each month and larger acquisitions every year or two. Google had already made seven acquisitions during the first three months of 2010.
Google has even threatened whole industries in its quest to dominate the Internet by giving away applications for free. It added, for example, free Global Positioning System (GPS) capability to its already free Google Maps. This action threatened Garmin and other firms that sell GPS devices for hundreds of dollars. In 2009, Google added free real estate listings to Google Maps, which presented a challenge to the Multiple Listing Service, Zillow, Trulia, Realtor.com, Craigslist, and other Internet real estate sites. On March 1, 2010, Google acquired the popular online photo-editing site Picnik, which had been the default editor for Yahoo's photo-sharing site Flickr since 2007. Picnik also offers a basic editing capability for free.

Although many firms threatened by Google are too small to fight back, stronger ones like Microsoft, Apple, and other media giants have begun forming alliances to challenge Google. At the beginning of 2010, for instance, Microsoft and Apple were each over twice the size of Google, with more than $50 billion in revenue apiece. Google directly competed with Microsoft's Bing search engine and threatened Microsoft's core business when it announced in July 2009 its plan to offer its Chrome Operating System, which competes with Microsoft's Windows, for free.

Google aggressively entered the Internet video business by purchasing video-sharing giant YouTube on October 10, 2006, for $1.65 billion as part of its strategy to organize the world's information. YouTube distributes videos on the Internet and had already been an irritant to traditional media providers, who claim that the Internet site steals their intellectual property. In response to Google's entry into Internet videos, News Corporation and NBC Universal announced plans in March 2007 to form an alliance to launch what they claimed would be the "largest Internet video distribution network ever assembled with the most sought-after content from television and film."26 According to MSNBC analyst Bill Briggs, this alliance is a "Strategic Missile aimed at the gritty little heart of You Tube—the media giants were essentially being upstaged (again) by the wildly popular, user-driven site that has driven them to this unholy alliance." Benjamin Gomes-Casseres, a professor at Brandeis International Business School and a specialist in alliances, noted: "A serious strategic threat is one that creates panic in competitors. Google seems to have done it again. Its You Tube strategy is driving archrivals together."27

Google's aggressive moves in search-engine advertising also prompted Microsoft to form an alliance with Yahoo. Microsoft launched its search engine Bing on June 3, 2009, and, by March 2010, it ranked a distant third with 3 percent market share compared with Google and Yahoo's respective shares of 86 and 6 percent.28
In early 2010, Microsoft began crying foul in its attempt to compete with Google, and, on February 26, 2010, its deputy general counsel Dave Heiner posted an announcement stating that Google's aggressive business tactics "appear to raise serious antitrust issues." The fact that Microsoft has a 91 percent share of the PC operating system market prompted Dan Frommer, a writer for the Silicon Alley Insider, to note the irony that the "baddest-ass monopolist on the planet is now expressing 'concerns' about Google's growing market power."29

Once the leading search engine, Yahoo fell far behind Google and by 2008 began experiencing financial problems. In February 2008, Microsoft made an unsolicited bid of $44.6 billion to buy Yahoo, which Yahoo immediately rejected as too low; Google, for its part, approached Yahoo about forming an advertising partnership. Over the following year and a half, however, Microsoft and Yahoo maintained a dialogue and, in July 2009, announced their plan for a 10-year partnership to share technology and advertising revenues. Some analysts believe that this alliance to challenge Google's search-engine advertising business was too little, too late.30

Google's aggressive moves might also drive long-time adversaries Apple and Microsoft to form an alliance to thwart the giant search engine's ambitions. On January 5, 2010, Google launched its own smartphone, the Nexus One, which competes directly with Apple's highly popular iPhone. Sometime in 2010, Google plans to launch a music service that will directly compete with Apple's iTunes, which, since its introduction in 2003, has dominated the market for digital music along with its companion iPod. Apple's iPhone has used Google as its default search engine. Google's aggressive moves may, however, prompt Apple to replace Google with Microsoft's Bing as the default search engine for the iPhone, bringing together another pair of strange bedfellows.
The Change Maker's Dilemma

Balancing forces ironically can cause firms that pay close attention to customers' needs and shareholders' interests—the hallmark of good management practice—to go bankrupt. This is essentially what happened to Digital Equipment Corporation (DEC). Vastly successful in the 1960s and 1970s selling its cutting-edge minicomputers, DEC had become the second largest employer in Massachusetts after the state government. It had grown from start-up to a $7.6 billion firm lauded as nimble and fast-moving and was featured as one of the top 20 firms in the 1982 best-selling book In Search of Excellence, written by two McKinsey consultants. DEC exploited advances in semiconductors to create smaller, cheaper minicomputers, which delighted customers and enriched its shareholders.
Yet, despite DEC's vast engineering prowess, it failed to adopt the next generation of computers based on microprocessors—the computer chips that gave birth to the microcomputer. Meanwhile, two college dropouts named Steve Jobs and Steve Wozniak used the microprocessor to assemble one of the first home computers and launched Apple Computer, Inc. With continual engineering improvements, the home computer evolved into the cheap, reliable, ubiquitous personal computer used throughout corporations. Minicomputers eventually became extinct. DEC's failure to adopt microprocessor technology quickly led to its demise in 1998. Business Week wrote as early as 1994 that DEC was in "need of triage. . . . Sales are drying up in its key minicomputer line. . . . It has squandered two years trying halfway measures to respond to the low-margin personal computers and workstations that have transformed the computer industry."31 Facing these difficulties, DEC essentially disappeared after being absorbed in a series of acquisitions.

The inability of firms to adopt new technologies is caused by balancing forces that, while ensuring a firm's short-term survival, constrain its ability to adopt radically new innovations. In their seminal 1978 book, The External Control of Organizations, organizational professors Jeffrey Pfeffer from Stanford University and the late Gerald Salancik from Carnegie Mellon University stated: "Organizations are inescapably bound up with the conditions of their environment."32 They devised resource dependency theory, which stipulates that the actions of firms are especially constrained by meeting customer needs and providing shareholder profits, because this is how firms acquire the financial resources needed to survive. If a firm does not provide what customers want, it gets starved of revenue; if it fails to provide steady earnings growth, investors deprive it of capital.

Harvard Business School professor Clayton Christensen elaborated on resource dependency theory in his popular 1997 book, The Innovator's Dilemma. Christensen noted the irony: "Precisely because these firms listened to their customers, invested aggressively in new technologies that would provide their customers more and better products of the sort they wanted, and because they carefully studied market trends and systematically allocated investment capital to innovations that promised the best returns, they lost their positions of leadership."33 The dilemma is that firms like DEC, while great at sustaining innovation by making incremental improvements to existing products, fail to develop transformative innovations that revolutionize industries. Christensen attributes this dilemma to the fact that transformative innovations generally create cheaper products that are less profitable and that a firm's biggest customers rarely ask for.
The personal computer, for example, was much cheaper and less profitable than minicomputers, and its initial market consisted of home hobbyists long before the PC became standard equipment for major corporations. Furthermore, firms employ rigid investment approval processes that require inventors to estimate the market for their innovations, which is unknowable. Christensen noted that the best-performing companies "have well-developed systems for killing ideas that their customers don't want . . . until their customers want them. And by then it is too late."34

The balancing forces underlying the innovator's dilemma similarly impede efforts to introduce any type of change in organizations. The late management guru Michael Hammer characterized organizational resistance to change as follows: "Who resists change in an organization? Everyone. How do they resist change? Every way possible, including subversion, starting competing initiatives, delaying, and limiting resources. Why do they resist? Resistance is 10 percent logical and 90 percent psychological, including fear, loss of control, anger, lack of knowledge, and the not-invented-here syndrome."35 When faced with an organizational change, employees typically hope that if they hold out long enough, it, too, will pass.

The failure rate of efforts to change organizations is staggeringly high. In his popular 1996 book, Leading Change, Harvard Business School professor John Kotter reported that only a small percentage of the change programs he had studied were successful. Numerous books and journal articles have subsequently been published on managing change, yet a 2008 McKinsey global survey of 3,199 executives found that only one out of three change management efforts had succeeded. According to Peter Senge, author of The Fifth Discipline, "Whenever there is resistance to change, you can count on there being one or more hidden balancing processes."36 The source of the resistance can derive from many aspects of the firm's organization, including job descriptions; compensation and reward systems; promotional patterns; performance measures; budget and project approval processes; and the firm's organizational structure and corporate culture. When employees are rewarded for performing well at business as usual, why would they embrace doing something new? The irony of organizational change efforts is that the harder you try to force change, "the more strongly the balancing process resists, and the more futile your efforts become," according to Senge.37

The balancing forces in an organization create something akin to the stable condition in nature called homeostasis, which the psychologist Kurt Lewin claimed would keep organizations in a steady state when faced with potentially disruptive change.
In other words, organizations resist any form of change, even when it is planned and beneficial. Lewin also found that organizations that have undergone change have a tendency to revert to their original state.38 Kotter similarly found that, years after a successful change effort, organizations can revert to their pretransformation condition when the executive leading the change leaves office.

Efforts to reduce organizational costs can encounter balancing forces that unintentionally cause costs to increase. Cutting back on routine equipment maintenance, for example, can backfire, because it merely pushes the maintenance cost into the future, when it will be more expensive and the equipment will have further degraded. Efforts to save money by stretching the use of a vehicle beyond its normal replacement age are self-defeating, because older vehicles require more maintenance, have more downtime, consume more fuel, and no longer qualify for warranty coverage. Budget freezes also can backfire as understaffing generates backlogs of unfinished work, leading to poor customer service, which forces management to spend more money on overtime and hiring contract workers. Hiring freezes, furthermore, prompt managers to retain unproductive workers because they cannot hire more qualified ones. Costs creep back into the organization in many subtle ways in the form of rework, increased customer inquiries, system downtime, decreased productivity, and absenteeism.
Lessons from Balancing Forces

A major lesson from balancing forces is that the arrogant flaunting of power by a dominant party can have the unintended consequence of the party losing power as rivals ally against it. This lesson is all the more important today, when there is a growing need for humility and diplomacy in an increasingly dangerous world where allies and friends have to be created and maintained.

During the Cold War, many nations sought alliances with the United States to protect them from the Soviet Union. European nations allied with the United States to form the North Atlantic Treaty Organization to prevent a Soviet invasion of Europe; South Korea allied with the United States to ward off invasions from communist North Korea and China; Turkey and Iran provided pro-Western military outposts on the very borders of the Soviet Union. Fred Kaplan, author of Daydream Believers, noted, "In a world with no opposing superpower to cement its alliances by default, the United States would need allies more than ever and would have to work harder at diplomacy to lure—and keep—them on board."39
U.S. power was also diminishing as the world entered the 21st century, with the rise of China and India as economic powers and the spread of religious extremism throughout Islamic nations. To maintain world order, the United States needed allies to help fund, man, and legitimize military opposition in regional conflicts.

President George W. Bush's go-it-alone war on terror and invasion of Iraq have strong parallels with President Johnson's fight against communism and the Vietnam War: both caused a drop in U.S. power and respect around the world. Senator J. William Fulbright said in his 1966 book, The Arrogance of Power, that the United States had viewed communism as "a kind of absolute evil, as a totally pernicious doctrine"40—an attitude strikingly similar to the Bush administration's view of terrorism. Fulbright's warning applies equally to preemptive invasions like the war in Iraq: such acts "outrage the conscience of peoples all over the world. . . . One cannot defend human values by calculated and unprovoked violence without doing mortal damage to the values one is trying to defend."41 Fulbright envisioned the dangers of flaunting power and sought a more humble United States that would be a positive force in the world. He hoped that "America will escape those fatal temptations of power which have ruined other great nations and will instead confine herself to doing only that good in the world which she can do." He saw the irony in flaunting power: "The more one does that sort of thing, in fact, the more people doubt."42

That the United States recovered from its unpopular Vietnam War is encouraging today. After the Johnson administration, the United States regained its stature in the world as a positive force and achieved surprising diplomatic accomplishments. President Nixon established relations with the People's Republic of China; President Carter promoted human rights throughout the world; President Reagan established good relations with the Soviet Union and ended the nuclear missile race; President George H. W. Bush emphasized a "kinder, gentler America." As a result, both presidents George H. W. Bush and Bill Clinton were able to enlist broad support from many countries to drive Iraq out of Kuwait during the 1991 Gulf War and to stop ethnic cleansing in Bosnia in 1995.

President Obama took office in January 2009 promising to reemphasize diplomacy and a willingness to speak with the leaders of any country, including traditional enemies like Cuba, Iran, and North Korea. His promises earned him the Nobel Peace Prize and have improved U.S. approval ratings in much of the world. According to the Pew Survey on Global Attitudes, U.S. approval ratings during President Obama's first year in office rose 20 to 30 percentage points in Europe. These improved ratings show that balancing forces can bring about desirable outcomes when one understands the winning advantages of diplomacy.
Chapter 7
Perverse Adaptations

Gaming the System

In June 1955, Sergey Egorov faced the biggest challenge in his stellar career as director of Volga Metal Works, a Soviet factory that made nails used in housing construction. In December 1954, Gosplan, the central planning organization in Moscow, had doubled the tonnage of nails that his factory was required to make for the upcoming year. Sergey was six months into his annual plan and falling farther and farther behind in meeting his goals. He was well aware that failure to meet the goals meant losing his annual bonus—or worse, receiving punishment for sabotaging the system.

Sergey was the first member of his family to attend college, where he earned the engineering degree essential for becoming a plant manager in the Soviet Union. After graduation, he joined Volga Metals as a junior engineer and worked his way up to second-in-command as chief engineer. Two years later, the Council of Commissars appointed him to the coveted directorship of the factory. For three years, Sergey had worked closely with Commissar Mikhail Kiselev, who oversaw the production of metal products as part of the Soviet Union's all-important Five-Year Plan. In 1954, however, the Council of Commissars replaced Kiselev with Andrey Popov and commanded Sergey to substantially increase the production of nails to support the Soviet Union's ambitious new housing construction program. Popov conveyed to Sergey that his 1955 production goals were nonnegotiable.

Late in June, Volga's chief engineer, Nikolay Chaykovsky, proposed a solution to Sergey's dilemma: Volga would make only foot-long spikes, which weighed far more than the wide variety of smaller nails. Knowing that large spikes had limited use in housing construction, Sergey initially rejected Nikolay's idea but relented when he realized that this was the only way to meet his production goals. Volga went on to exceed its production goals for 1955. The Council of Commissars paid Sergey a handsome bonus and nominated him for the prestigious Lenin Award for leadership.
Other nail factories also made a surfeit of large, heavy nails in 1955, which created a critical shortage of the smaller nails used in housing construction. Gosplan responded by basing its goals for 1956 not on the weight of nails but on their number, and it set unrealistic production targets to make up for the shortages of 1955. Now, faced with the daunting challenge of making an unrealistically large number of nails, Sergey again turned to Chief Engineer Chaykovsky for advice. In a flash of inspiration, Chaykovsky replied, "Let's make millions of tacks."

This apocryphal story is reflected in a cartoon in the Soviet newspaper Pravda lampooning Soviet-era economic planning and is emblematic of the unimaginably ridiculous ways in which Soviet managers responded to state-mandated production goals to earn bonuses and avoid punishment. As Paul Craig Roberts and Karen LaFollette noted in their book Meltdown: Inside the Soviet Economy, "Examples of unforeseen outcomes are everywhere."1 When the state evaluated plant managers on the total weight of goods produced, for example, they made excessively heavy products. They made chandeliers so heavy they pulled ceilings down. They filled desk lamps with lead, making them so heavy it took two people to lift them. They made metal roofing materials so heavy they collapsed buildings.

Soviet geologists failed to discover oil in their oil-rich country because planners evaluated them on the number of meters they drilled per month. Geologists drilled many shallow wells rather than the few deeper ones needed to tap Russia's abundant oil reserves, because drilling deeper increased friction and slowed the drilling process. Pravda observed that "deeper drilling means reducing the speed of the worker and reducing the group's bonuses."2 When measured on the basis of square meters of buildings under construction, Soviet construction groups started many new building projects while delaying the completion of existing ones. Construction groups had no incentive to finish their projects, some of which took a decade to complete. Meanwhile, unfinished buildings deteriorated, weeds and brush infested the grounds, equipment rusted, and unfinished plants caused production delays that rippled through the Soviet economy.

Revised incentives similarly backfired. When planners changed the goals for producing metal roofing from total weight to total square footage produced, plants made roofing materials so thin that the wind blew them off. When planners changed the goals for producing light bulbs from the total number of bulbs to their total wattage, plants stopped making numerous tiny light bulbs and made a small number of giant, high-wattage flood lamps.
When planners changed the goals for construction from square meters under construction to the number of deadlines met, builders declared their projects completed even when they lacked roofs, plumbing, and electricity.

The introduction of a more sophisticated measure called the gross ruble value had the unintended consequence of promoting waste and inefficiency. The gross ruble value measured the total value of labor, materials, and other inputs used to make a product, and the more rubles' worth of time and materials that went into making a product, the easier it was for plant managers to meet their production goals. The gross ruble measure thus encouraged plant managers to use large quantities of costly materials, labor, fuel, and other production inputs, which made goods excessively costly.

These ridiculous Soviet events illustrate how deft humans are at gaming the system for their benefit. They were caused by a particular class of human adaptation that economists call perverse incentives, which induce unexpected—and usually unwanted—behaviors. Perverse incentives pervade human society and are not restricted to the Soviet era. When measured on the number of patents they produce, for example, U.S. engineers produce a lot of patents with limited commercial value and neglect the few with high profit potential. When measured on meeting delivery dates, plant managers set delivery dates so far in the future that they adversely affect client service. When measured on meeting shipping dates, managers ship unfinished products to temporary holding areas. Efforts to control the efficiency of customer service employees backfire when the employees are measured on the time required to handle service calls: they handle customer calls so hurriedly that they are unable to solve customers' problems, and frustrated customers have to call back repeatedly to get their problems resolved, which decreases customer service efficiency. When measured on the time it takes to respond to opened e-mails from customers, employees stop opening them.

Efforts to control costs using head-count goals can ironically cause costs to increase, as the sketch below illustrates. Head count is a measure of the number of employees on a manager's payroll; it treats all employees the same regardless of their seniority or salary, and it can induce managers to employ too many highly paid senior employees who are more productive than less-expensive junior ones. As a result, high-cost employees perform work that could have been performed by less-expensive ones, which increases the average cost per hour and total operating expense. Head-count measures also encourage managers to use temporary workers, overtime, consultants, and other expensive resources that are not included in the head-count measure.
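The head-count trap is easy to see with a little arithmetic. The following Python sketch is purely illustrative: the salaries, team sizes, and the assumption that the two staffing mixes produce the same output are hypothetical, chosen only to show what a head-count metric can and cannot see.

```python
# A toy illustration of the head-count trap. Salaries and staffing mixes
# are hypothetical; the point is only that capping the number of heads,
# rather than the payroll, rewards hiring fewer but pricier people.

SENIOR_SALARY = 180_000   # assumed fully loaded annual cost
JUNIOR_SALARY = 90_000

def payroll(seniors, juniors):
    """Total annual payroll for a given staffing mix."""
    return seniors * SENIOR_SALARY + juniors * JUNIOR_SALARY

# Under a 10-person head-count cap, a manager seeking maximum output
# per head fills every slot with senior staff...
capped = payroll(seniors=10, juniors=0)

# ...even though much of the work could be done by a cheaper mix.
mixed = payroll(seniors=4, juniors=6)

print(f"All-senior team under head-count cap: ${capped:,}")  # $1,800,000
print(f"Mixed team doing the same work:       ${mixed:,}")   # $1,260,000
```

Head count is identical in both cases, so the metric cannot see the $540,000 difference, and it ignores contractors and overtime entirely, which is where the remaining costs migrate.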
Compensation schemes for sales representatives often create perverse incentives. Rigid monthly sales goals cause sales representatives to make their monthly goals artificially by assigning sales made during prosperous months to leaner ones, making financial reports unreliable. Compensation schemes based only on revenue production can cause sales representatives to acquire unprofitable clients, because the representatives benefit from generating any and all revenue. Compensation schemes can also motivate salespeople to push products that customers do not need.

Efforts to minimize problems by tracking their occurrence backfire when people stop reporting them. When measured on the number of plant injuries, some safety managers do not report small incidents that can eventually become very costly when neglected. Similarly, when measured on the number of defective products, plant managers hide them.

A particularly troubling case of perverse incentives arose when a law passed by Congress in 1992 to force fertility clinics to publish their success rates for in vitro fertilization (IVF) procedures had the unintended consequence of substantially increasing the number of premature births in the United States. A 2009 study published by the March of Dimes indicated that premature births had increased 36 percent in the past 25 years and attributed much of the surge to fertility treatment. Premature births are a major health problem that can cause infant mortality, ear and eye problems, and mental retardation. They are also more costly for parents: a normal birth costs about $5,000; a premature birth averages about $51,000; and extreme cases can run into millions of dollars, a share of which will not be covered by insurance.

Fertility clinics are lucrative businesses, and Congress mandated the publishing of their IVF success rates, compiled by the Centers for Disease Control and Prevention, to prevent them from exaggerating their outcomes. The fertility industry is also highly competitive. Many prospective patients use the published IVF success rates to find clinics that will maximize their chances of giving birth and help them avoid multiple procedures. To boost their success rates, some clinics implant multiple embryos: a woman has a 20 percent chance of giving birth with a single embryo and 40 percent with two. Not surprisingly, clinics implant single embryos in only 4.5 percent of the procedures performed. The problem with the implantation of multiple embryos is that it increases the incidence of twins, and twins are much more likely to be born prematurely. Single births have a 10 to 15 percent chance of being premature; the figure increases to 50 to 60 percent for twins. The incidence of twins has doubled over the past 25 years, and the American Society for Reproductive Medicine has published guidelines against implanting multiple embryos. Its president, William E. Gibbons, noted, "Twins are not a good outcome."3
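A back-of-the-envelope calculation using the figures just cited makes the perverse incentive concrete. The Python sketch below is only illustrative: it uses midpoints of the quoted ranges and assumes, for simplicity, that every successful two-embryo transfer produces twins, which overstates the effect.

```python
# Illustrative only: probabilities are the midpoints of the ranges
# quoted in the text, and every successful two-embryo transfer is
# assumed to yield twins (a simplifying overstatement).

def outcomes(p_birth, p_premature):
    """Published success rate and chance a procedure ends in a premature birth."""
    return p_birth, p_birth * p_premature

single = outcomes(p_birth=0.20, p_premature=0.125)  # 10-15% premature
double = outcomes(p_birth=0.40, p_premature=0.55)   # 50-60% premature for twins

print(f"One embryo:  success {single[0]:.0%}, premature birth {single[1]:.1%}")
print(f"Two embryos: success {double[0]:.0%}, premature birth {double[1]:.1%}")
```

Implanting two embryos doubles the success rate a clinic gets to publish, from 20 to 40 percent, while raising the chance that a procedure ends in a premature birth from roughly 2.5 percent to roughly 22 percent. The mandated report card rewards the first number and never mentions the second.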
The effect of the perverse incentives created by government-sponsored social programs is continually debated. Critics of welfare programs, for example, claim that granting money to people reduces their motivation to seek employment and makes them permanent wards of the state. They argue that payments to single parents with dependent children cause some families to break up, because an unmarried head of household can receive more benefits than an intact family. The controversy is whether the benefits of a social program outweigh the adverse impacts of the perverse incentives it creates. Stereotypically, conservatives claim that social programs create perverse incentives that make the initiatives self-defeating. Conversely, liberals claim that, on balance, social programs do more good than harm.

In addition to perverse incentives, human adaptability thwarts our best laid plans in other ways. We adapt to new safety measures by becoming more reckless. We adapt to new regulations by finding ways to circumvent them. We adapt to new taxes by inventing loopholes. We adapt to new products and technologies by employing them in ways their inventors never anticipated.
The Titanic Effect

The U.S. Department of Defense spent $17 billion to develop the Global Positioning System (GPS), a network of satellites that can determine locations on Earth with great precision. Now in widespread commercial use, GPS has so greatly simplified and improved navigation that it has unintentionally endangered users who forgo compasses, charts, and other traditional navigational practices. Ship captains have set their automatic pilots to the GPS coordinates of a buoy, gone to sleep, and awakened when their ships hit the buoy. Hikers in an Alaskan snowstorm perished when they decided not to dig a snow shelter—a conventional safety practice—in favor of trusting their GPS to lead them to safety.

GPS has created two unintended problems: (1) a decline in traditional navigational skills among people who once had them and (2) overreliance on GPS by those who never did. Such dangers prompted the outdoor equipment evaluator gearreview.com to place this warning for hikers on its Web site: "A GPS seems so accurate and easy to use that it has a couple of inherent dangers. The danger enters in the common thinking that, with a GPS, you no longer need a map or compass. Not so. We recommend always using a map and compass with a GPS receiver. Traveling anywhere in back country without a map is just stupid."4
In 2007, about 50 million vehicles in the United States were equipped with some form of GPS equipment. The careless use of GPS is becoming epidemic, because the device is now widely used by commercial and private motorists who rely on it exclusively for directions and forgo the traditional use of maps, oral directions, eyesight, and common sense. There are countless stories of people obediently following the driving instructions meted out by pleasant, confident voices only to get hopelessly lost; channeled onto impassable dirt roads, railroad tracks, oncoming traffic, and people's driveways; and taken to dead ends on cliff tops and in forests and streambeds. GPS directions have also led drivers into heavily settled suburbia with narrow, winding roads, creating increased traffic that poses danger to both residents and drivers. One Web site called the GPS-equipped car a "kid killer." Heavy trucks that are directed onto suburban roads are especially problematic, because they crash into trees, fences, walls, and other obstructions, damaging vehicles and property and endangering lives.

GPS driving instructions have a number of flaws that make them potentially unreliable. First, users often select the shortest path to a destination, which is not always the quickest, safest, easiest, or most appropriate route. Next, keeping track of construction projects, road closings, one-way designations, and many other daily changes in road conditions nationwide is daunting—and perhaps unrealistic. Finally, the manufacturers of the electronic maps used in GPS devices make mistakes. One manufacturer, for example, tracks five and a half million miles of road in the United States and makes 3 million edits to its database each month to try to keep it accurate. The number of variables inherent in such an undertaking necessarily leads to some amount of error.

Behavioral scientists use the term risk compensation for the ironic fact that safety equipment and measures can induce people to become more reckless. It is also known as the Titanic effect, because the belief that the Titanic was unsinkable had the unintended consequence of causing a number of people to fail to take actions that could have prevented 1,513 passengers and crew from dying when the ship sank on its first voyage on April 14, 1912.

Just under 900 feet in length, the Titanic was at the time the largest maneuverable object ever made. Advances in ship safety, like a double hull with one-inch steel plates and 16 watertight chambers sealed by electric switches controlled from the bridge, created the dangerous notion that the Titanic was unsinkable. These safety features instilled overconfidence and carelessness when the Titanic began its voyage from Southampton, England, to New York on April 10, 1912, through the icy North Atlantic.
Although the Titanic's blueprints specified 32 lifeboats to accommodate the ship's 3,500 passengers and crew, the ship's owners felt that so many lifeboats would needlessly clutter the ship's decks and called for only 20, which could hold just 1,178 of the 2,224 passengers and crew aboard. Further, one owner on board the Titanic ordered the ship's captain to sail full speed through iceberg-infested waters to arrive in New York a day ahead of schedule to gain media publicity for the much-vaunted ship. The Titanic's captain also dismissed six telegrams from other ships warning of an unusually large number of icebergs in the area. The Titanic's unsinkable reputation even fooled captains of nearby ships into thinking that its distress signals were celebratory fireworks. "Belief in the safety of the ship became the greatest single hazard to the survival of its passengers, greater than the icebergs themselves," according to journalist Edward Tenner.5
Hikers, mountaineers, and skiers take more risks exploring the wilderness when they believe park rangers can rescue them if they get into trouble. Sports equipment designed to reduce injuries has made some sports more dangerous. The National Sports Safety Organization, for example, observed the following ironic impact of safety equipment in football: “Better, more lightweight padding was devised that impeded the wearer’s movement less but helped minimize injury to players. An unintended consequence of these equipment improvements is increasing levels of violence in the game. Players may now hurl themselves and collide with more force without significant risk of injury. However, when an injury does occur, it is apt to be severe and often season or career ending.”7
Moral Hazard

The U.S. government's National Flood Insurance Program ironically makes people more vulnerable to flood damage. Congress passed the National Flood Insurance Act in 1968 to offer low-cost property insurance to homeowners living in flood-prone locations that private insurers avoid. Flood insurance pays homeowners up to $250,000 for flood-related losses. This insurance is funded by artificially low premiums that cover only sixty percent of the program's total cost; the U.S. government funds the balance with tax dollars.

Before National Flood Insurance, people typically avoided building on the water's edge for fear that they could not afford to rebuild homes damaged by floods or storms. National Flood Insurance has mitigated this fear, and analysts estimate that flood insurance has increased the building of homes on ocean fronts by 40 percent and on floodplains by 18 percent. Flood insurance has also prompted people to rebuild homes, 60 percent of which are vacation homes, on the same disaster-prone spots.

This flood insurance dilemma is caused by what economists call a moral hazard. Similar to the Titanic effect, moral hazard occurs when efforts to compensate people for adversity induce careless—or even fraudulent—behaviors that make the adversity more likely to happen. Moral hazard pervades private and governmental insurance programs. As economists Timothy Lane and Steven Philips noted, "Moral hazard is . . . an unavoidable consequence of any insurance."8

Insurance executives are ever wary about whether their property policies make some people less cautious about preventing fires and, worse, tempted to torch properties that are insured for more than they are worth.
Life insurance executives worry about whether their product fosters suicide and murder. Private mortgage insurance insulates banks from loan defaults, prompting bankers to lend more aggressively to people who are more likely to default, which has the unintended consequence of increasing overall mortgage default rates.

Deposit insurance, introduced by the U.S. government to protect depositors against bank failures, has had the unintended effect of making banks more likely to fail. Numerous bank failures during the Great Depression prompted the U.S. government to insure customers' deposits up to $3,000 should their bank fail. The program gradually expanded to the point where it now covers deposit losses of up to $250,000, making depositors even less cautious about the soundness of their banks. Similarly, regulators have speculated that government pension insurance has endangered pension plans by making employees and unions more focused on pay raises than on the solvency of their pension plans.

We encounter moral hazard in many facets of everyday life. Company cars, for instance, depreciate 10 to 15 percent faster than the ones we own, indicating that people take less care of cars when someone else is paying to maintain them. Parents of adolescents face a perpetual problem of wanting to help solve their children's problems while not creating a moral hazard that impedes their growth. Paying for children's parking tickets, for example, could make the children more careless about where they park; paying for bank overdrafts could make them less diligent in managing their finances.
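Returning to the flood insurance example that opened this section, a stylized calculation shows how a subsidized premium can flip a homeowner's decision. In the Python sketch below, the flood probability and damage figure are hypothetical; only the 60/40 split between premiums and tax dollars comes from the program as described above.

```python
# Hypothetical flood risk for a waterfront lot; the 60/40 split between
# premiums and tax dollars is the one described in the text.

FLOOD_PROB = 0.02     # assumed annual chance of a damaging flood
DAMAGE = 250_000      # assumed loss, equal to the program's payout cap

expected_loss = FLOOD_PROB * DAMAGE        # what a fair premium must cover
fair_premium = expected_loss               # actuarially fair price: $5,000
subsidized_premium = 0.60 * fair_premium   # homeowner's price: $3,000

print(f"Expected annual flood loss: ${expected_loss:,.0f}")
print(f"Actuarially fair premium:   ${fair_premium:,.0f}")
print(f"Subsidized premium:         ${subsidized_premium:,.0f}")
```

The taxpayer silently absorbs the other $2,000 a year, so building on a site that would not pay its own way at a fair price suddenly looks affordable, and the risky behavior the insurance was meant to cushion becomes more common.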
Loopholes

The Endangered Species Act has hastened the demise of certain species it was intended to protect. The U.S. Congress passed the Endangered Species Act (ESA) in 1973 to protect the habitats of plants and animals facing extinction. Environmentalists exploited the act to restrict land development by finding endangered species requiring large, undisturbed habitats and then seeking a court order to protect large tracts of public and private property. Developers fought back with species cleansing—the preemptive killing of endangered species and elimination of their habitats before environmentalists discover them. The National Association of Home Builders tacitly endorsed species cleansing in its book Developer's Guide to Endangered Species Regulation by describing techniques for species cleansing and advising its members to "be aware of [these methods] as a means employed in several areas of the country to avoid ESA conflicts."9
According to Charles Mann and Mark Plummer, authors of Noah's Choice: The Future of Endangered Species, "The Endangered Species Act of 1973 . . . ended up turning landowners into the enemies of species on their land."10

When the U.S. Congress passed the 1970 Clean Air Act, it never anticipated that the law would result in the creation of acid rain. The bill was enacted to force electric utilities to reduce their air pollution, which regulators measured at ground level near power plants. Utilities circumvented the new regulation by building 1,000-foot chimneys that reduced ground-level air pollution by sending emissions into the upper atmosphere, where they transform into sulfuric and nitric acids. Borne by the winds in the upper atmosphere, these corrosive acids combine with water and become precipitation that damages lakes and forests hundreds of miles from the power plants that generated them.

Legislation intended to conserve oil by imposing efficient mileage standards for automobiles similarly backfired by spawning the sport utility vehicle. In response to the oil embargo of 1973, the U.S. Congress passed the Energy Policy Conservation Act of 1975, requiring new cars and trucks to average at least 27.5 and 21 miles per gallon, respectively. Prior to the act, auto manufacturers made a variety of popular station wagons but had to cut production of these inefficient cars to meet the new mileage standards. The manufacturers responded by introducing the SUV, which regulators classified as a light truck, subject to the lower 21-miles-per-gallon standard, because it was built on a truck chassis. As the SUV became popular, customers switched from fuel-efficient cars to less-efficient SUVs.

The failure of the Endangered Species, Clean Air, and Energy Policy Conservation Acts illustrates another form of adaptation: people respond to new laws and regulations by finding creative ways to circumvent them. People create loopholes, for example, in response to new tax laws, and efforts to close them spawn new loopholes. People's deft ability to create loopholes is especially evident in the failed efforts to reform campaign financing and to restrict the promotion of cigarettes.

For over a century, congressional reformers have tried and failed in continual efforts to limit the influence of wealthy individuals, corporations, and labor unions on federal elections. Not only have these efforts failed to constrain moneyed interests, they have unintentionally spawned new loopholes that distort and weaken the election process. As New York Times journalist Alison Mitchell noted, "The whole struggle devolves into a tale of unintended consequences."11
Early in U.S. history, people expected to receive or keep a plum government job in compensation for contributing money to a successful candidate's political campaign. The Pendleton Act of 1883 banned the raising of campaign funds from government employees, only to create a new demon: corporate campaign funds, which created a scandal in 1896 when businesses doled out today's equivalent of $82 million for McKinley's presidential campaign. McGill University history professor Gil Troy noted that "The law of unintended consequences took hold: by barring federal assessments, the Pendleton Act increased parties' need for more corporate money."12

When Congress passed the Smith-Connally Act in 1943 barring the use of labor unions' dues for campaign financing, the Congress of Industrial Organizations responded by forming the nation's first political action committees (PACs) to help fund Franklin Roosevelt's reelection campaign. PACs became more influential when Congress limited individual contributions to candidates to $1,000 but allowed $5,000 contributions to PACs, spurring unions, corporations, and special interest groups to use PACs as a new loophole.

Another unintended consequence of campaign finance reform was the emergence of soft money and issue ads. Congress passed laws in the 1980s permitting candidates, parties, PACs, and special interest groups to raise unlimited soft money from deep pockets as long as the money was spent on promoting issues instead of a particular candidate. The issue-ad provision created a large loophole allowing unlimited dollars to be spent on ads that associated a candidate with a particular issue—thus, in effect, promoting the candidate. The AFL-CIO, for example, sought to defeat Pennsylvania congressman Phil English by running an issue ad featuring a concerned mother saying, "My husband and I both work . . . and next year, we'll have two kids in college." The announcer then added, "Working families are struggling. But Congressman Phil English voted with Newt Gingrich to cut college loans, while giving tax breaks to the wealthy." "The only thing [the issue ad] doesn't tell viewers to do is throw English out of office," noted journalist Chris Conway in an article titled "Issue Ads Allow Unlimited Political Pitches."13

Congress's effort to limit candidates' use of soft dollars with the passage of the Bipartisan Campaign Reform Act in 2002 gave birth to 527 groups, so named for the section of the tax code under which they are organized. The act enabled the ostensibly independent 527 groups to raise unlimited campaign funds, which they have lavishly spent to influence elections. Two 527 groups, for example, were very influential in the 2004 presidential election: the Swift Boat Veterans for Truth spent $8.9 million on anti-Kerry ads, while MoveOn.org spent $18.2 million raised from billionaire George Soros to help defeat incumbent George W. Bush.
The emergence of PACs, 527 groups, and other special interest groups with a stake in campaign financing reform has unfortunately caused candidates and parties to lose control over their own elections. It has also fragmented the election process. Parties traditionally promoted political cohesion with broad, unifying platforms, whereas powerful special interest groups often promote singular divisive issues such as abortion and gun ownership.

The U.S. Supreme Court in January 2010 turned back the clock on campaign financing by permitting corporations to spend unlimited funds in support of political candidates, a practice that had been prohibited since 1947. The Court ruled that corporations, like individuals, have First Amendment rights to freedom of speech and thus can participate freely in political elections. The profound unknown consequences of the Court's decision prompted President Obama to predict that the ruling would provide "a green light to a new stampede of special interest money."14

Efforts to restrict the promotion of cigarettes, especially to youth, have similarly backfired, as tobacco companies created highly effective promotional schemes that were difficult for the U.S. government to regulate. Massey University professor Janet Hoek noted that "Ironically, the regulatory changes introduced to limit the extent and influence of tobacco advertising seem instead to have led directly to the refinement of new 'legal' promotions that have high reach among young people."15

The government's banning of tobacco advertising from broadcast media in the early 1970s prompted tobacco companies to sponsor sporting events, giving them access to a broad global audience with a positive sports-related image. The use of logos on players' uniforms, signage at events, and naming rights created a more effective promotional medium than traditional media advertising. Consumer research shows that tobacco sports promotions have increased tobacco product recognition and diminished the perceived danger of smoking among adolescent males. The government's banning of sports promotions prompted tobacco companies to pay film companies to include smoking characters. The smoking characters created highly influential smoking role models that, unlike television ads, were hard to ignore and came with no health warning. Tobacco companies have also associated their cigarettes with popular non-tobacco products, such as Camel clothing stores, Salem Cool Planet music stores, and Benson & Hedges Bistros. As Hoek noted, "The effect of legislation designed to curtail sponsorship seemed unintentionally to have prompted the development of increasingly subtle methods of promoting smoking."16
Aptitude Test (SAT) in ways that defeat its original purpose. Prior to the 1950s, children of affluent families routinely attended elite colleges, which gave them access to high-paying careers. Seeking to make admissions more equitable, the leaders of several elite colleges began selecting students based on their SAT scores. Affluent families adapted by enrolling their children in SAT preparation courses, giving them an added advantage in college admission that only they could afford.
Gadgetry

Breathtaking developments in technology have been designed to allow people to communicate instantly throughout the world and to provide a vast flow of information. The expectation might be that they would bring people closer together and put knowledge to beneficial use. In some ways, the exact opposite is true.

The engineers who equipped cell phones with audiovisual recording, for example, could hardly have imagined that their innovation would enable the world to watch the grisly hanging of Saddam Hussein in December 2006 and the sectarian tensions it caused in war-torn Iraq. A witness at the highly secured event used a cell phone to surreptitiously record the hanging and upload the video to the Internet. The Shiite guards' graphic taunting of Hussein in his final moments enraged his fellow Sunnis and escalated terrorism in Iraq.

The hanging video illustrates how new technologies almost always create unintended consequences as people adapt to them in unexpected ways. Whole books have been devoted to this topic. This is also an age-old problem, as illustrated by the late-18th-century introduction of the cotton gin, which had the unintended consequence of dramatically increasing slavery in the United States. Before the cotton gin, cotton farming was only viable in lands with highly fertile soil, because slaves could only remove the seeds from a single bushel of cotton per day. Eli Whitney's cotton gin was 50 times more efficient in removing seeds, making cotton profitable to grow in vast tracts of less-fertile lands. The cotton gin caused cotton production to increase tenfold from 1820 to 1860 and the slave population needed to plant and harvest the crop to grow from 1.4 million to 3.9 million people.

New technologies create two levels of uncertainty. First, it is hard to predict the ultimate applications for a new technology. The laser, for example, had no known applications when it was first developed around 1960, and it took decades before scientists and engineers found innovative applications
in medicine, telecommunications, defense, retail shops, and many other fields. Second, each new application will change society in unknown ways.

The development of GPS is a good example. GPS, like the Internet, was originally intended exclusively for military use and unexpectedly morphed into widespread commercial use. As military spokesperson Aaron Renenger noted, "[GPS] was developed as a military system, and never intended for commercial use. It allows us to guide bombs to target within meters of accuracy."17 President Reagan opened the use of GPS to civilians in 1983, and President Clinton solidified government support for commercial use by signing the Presidential Decision Directive on GPS in 1996. By 2009, GPS had become widely used in boating, hiking, driving, commercial logistics, and many other applications, and its use is sure to spread as the price of the technology continues to plummet by as much as 50 percent per year.

In addition to the unintended safety problems previously discussed, GPS has created other unintended social consequences as users adapt to the device in unexpected ways. In particular, the embedding of GPS in cell phones and other small devices is enabling the continuous tracking of people, creating new uncertainties as people use these devices in new ways. People-tracking applications will surely have uncertain social impacts, both good and bad. For example, GPS may reshape criminal and civil justice systems as the locations of criminal suspects and unfaithful spouses are used as evidence during trials. Companies have equipped their trucks with GPS to track their locations and drivers' routes and speeding patterns, much to the concern of the drivers' unions. Another concern is the sale of location data for commercial applications. Such uses of GPS to track people have engendered new civil liberties debates over locational privacy, a brand-new concept.

The widespread use of cell phones while driving has unintentionally increased the rate of accidents, especially when the phones are used for texting. Drivers using cell phones become distracted and have a fourfold higher rate of accidents, prompting states like New York to ban their use while driving. In May 2009, the Massachusetts Bay Transportation Authority banned its subway drivers from carrying cell phones at work after a driver rear-ended a stopped trolley while texting his girlfriend, injuring 46 passengers. Even pedestrians distracted by cell phones are endangered: they pay less attention to traffic and signal to potential muggers that they are unaware of their surroundings.

The widespread use of the Internet is changing society as people adapt to its many emerging applications. It has transformed how people and businesses shop for goods and services and communicate via many channels like
e-mail, texting, Skype, Facebook, MySpace, and Twitter. It has also created new social problems with the proliferation of pornography and its access by children; new intrusive advertising with pop-ups and spam; and the exposure of people, especially women and children, to crooks, molesters, murderers, and other social deviants searching the Web for potential victims.

Especially unclear are the social impacts of new Internet applications like Facebook, which in the nine months ending in June 2009 grew from 100 million to 200 million users. Facebook surely can broaden one's network of friends and contacts, and users as of 2009 believed that it deepened those relationships as well. Conversely, there is evidence that Facebook diminishes children's time spent with their families. In a 2008 study, the Annenberg Center for the Digital Future found that 28 percent of children surveyed spent less time with their families, up from 11 percent two years prior.18 The survey also found a significant drop in the number of hours spent with family, from 26 to 18 per month. The study suggests that the solitary act of using social technology limits face-to-face time with families; even television is a somewhat shared experience.

Sociologists are also concerned about the negative impact of texting, beyond promoting improper grammar. A three-year study by MIT's Initiative on Technology and Self raises the concern that texting impedes teenagers' emotional growth by enabling them to continually text their parents for all manner of advice instead of learning to make their own decisions.19 Worse still, texting is not face-to-face communication, and it might weaken teenagers' interpersonal skills.

The unexpected uses and impacts of Twitter are also changing society, beyond its vain and banal use of keeping friends up to date on one's every moment. Twitter users, for example, can effectively become embedded journalists around the world. During the controversial 2009 presidential election in Iran, the Iranian government prohibited foreign correspondents from reporting live events, and news media like CNN had to rely on written and video reports sent in via Twitter from people in the streets. This surprising use of Twitter prompted the notion that the news no longer breaks, it tweets, as captured by a political cartoon lampooning the media clampdown during the Iranian election. The cartoon shows one religious leader addressing the masses, who are all using Twitter, demanding, "Expel the Correspondents," while another leader standing by says, "But they're all correspondents!!"

Twitter is affecting society in other ways. The U.S. military is using Twitter, Facebook, and other social technology to counter Taliban propaganda by communicating in real time the number of civilian and Taliban
casualties in Afghanistan. Conversely, U.S. intelligence warns of terrorists using Twitter to coordinate attacks. Twitter is also used by consumers to communicate complaints about products and services, prompting corporations to use Twitter as an electronic complaint box.

Another unexpected outcome is the availability of free and highly current news via the Internet, which is fueling the rise of blogs and the decline of traditional newspapers with their professional journalistic standards. Editors at leading newspapers strive to report factual information; journalists lose their jobs for reporting untruths, and newspapers routinely publish corrections when mistakes are discovered. Conversely, blog publishers operate independently, are not all subject to the same high journalistic standards, and can report whatever they desire. It is interesting to speculate whether the reporting of factual news will give way to the publishing of falsehoods, lies, myths, and opinions or whether professional journalism will migrate to the Web as an island of truth amid a sea of erroneous information.

Only time will tell how these new technologies will shape society with new applications and user behaviors. What is clear, however, is that the Internet has provided a malleable platform for creating new forms of commerce and communication with unknown consequences for future societies.
Lessons from Adaptation

Humans are highly adaptive creatures. We quickly change behaviors in response to change and do so in unexpected and sometimes undesirable ways. We defy efforts that seek to influence our behavior and constrain our actions, in ways that lead to unintended consequences. The following questions, comments, and suggestions could help anticipate and thereby avoid unexpected outcomes in developing new products, programs, laws, rules, incentives, penalties, restraints, and other initiatives.

Does the initiative create incentives that might induce a variety of behaviors: expected, unexpected, surprising, logical, and seemingly illogical? Think beyond what you intended or hoped would be the outcomes. How might people game the system, either to attain some benefit or to avoid costs and extra effort? According to Loyola University professor of philosophy Thomas Carson, "Rules, decision procedures, and schemes for reward and compensation all need to be scrutinized for the incentives they create."20

Does the initiative shield people from the negative consequences of poor decisions, carelessness, or reckless behaviors? Does it provide increased safety or security, or merely the illusion of it? If so, what unexpected and undesired behaviors might
emerge? Are there ways at the outset to prevent or greatly limit these undesired outcomes?

How might some people try to circumvent the new initiative, especially in creative and unexpected ways? Are there potential loopholes? How might people unexpectedly exploit vague language and subjective provisions in the initiative for their own gain? What might various people do to defeat the initiative?

How might the initiative affect different groups of people? How might they variously respond to it? Are there clever or unscrupulous people who might exploit the initiative for their own gain? Could the initiative unexpectedly attract a particular group of customers or users who cost far more than average ones?
Chapter 8
Coming into Being The World’s Most Dangerous Gang The U.S. government’s decision in the mid-1990s to deport a small band of illegal Salvadoran immigrants in Los Angeles unwittingly gave rise to the “world’s most dangerous gang,” so named in a 2007 National Geographic documentary.1 The gang is the MS-13, short for Mara Salvatrucha. The name comes from a violent street gang in El Salvador called Mara and Salvadoran guerilla fighters called Salvatrucha.2 The MS-13 operates throughout the United States, Central America, and Canada, and its criminal activity includes drug smuggling and sales, arms trafficking, auto theft, car jacking, home invasion, drive-by shootings, contract killing, murder, rape, robbery, prostitution, extortion, immigrant smuggling, racketeering, kidnapping, and police assassination. The hallmark of the heavily tattooed MS-13 is extreme violence. According to a January 2008 FBI report, “They perpetrate violence—from assaults to homicides, using firearms, machetes, or blunt objects—to intimidate rival gangs, law enforcement, and the general public.”3 They brutally murder gang defectors and police informants and, in Central America, have massacred innocent people on buses and have assassinated police and politicians. The FBI reported in 2007 that MS-13 has about 10,000 members in the United States operating in at least 42 states and has about 50,000 members in Central America.4 It has grown through local recruitment and immigration and has replicated its gang model as it has spread geographically. According to the Federal Bureau of Investigation’s Brian Truchon, director of the MS-13 National Gang Force, “MS-13 has the unique, unfortunate ability to replicate themselves in similar ways across the United States, exactly like a virus. . . . It is known for its ability to operate between borders, to effectively communicate and move between Central America and the U.S.”5 MS-13 originated when some one million to two million Salvadorans migrated to the United States in the mid-1980s to avoid El Salvador’s
devastating civil war. Many settled in Los Angeles, where they were abused by local gangs. To protect themselves, they formed the MS-13 gang, wielding machetes and employing violent tactics they had learned as guerrilla warriors back home. The original gang had a few thousand members whose crimes mostly involved street violence and turf warfare.

In the mid-1990s, however, the U.S. government began deporting Los Angeles MS-13 members who were illegal immigrants with criminal records back to El Salvador and, by 2005, had deported 50,000 of them. When the deportees arrived home, the Salvadoran police housed them all in the northern Quezaltepeque Prison. The concentration of MS-13 prisoners in the same jail enabled the gang to coordinate international gang activity via cell phones, plot the expansion of new gang franchises throughout the United States, and easily recruit thousands of new members from the impoverished Salvadorans traumatized by years of brutal civil war. The government of El Salvador routinely released the imprisoned deportees because they had not committed crimes in that country. Upon their release, they plotted their quick return to the United States.

A number of factors created the medium from which MS-13 emerged as a large international crime organization. The Salvadoran immigrants settled in a high-crime Los Angeles neighborhood plagued by existing gangs. The densely packed community provided the opportunity for the immigrants to interact, plan criminal activity, and self-organize into a local gang. Ethnic cohesion and the need to protect their community from existing gangs fostered the formation of MS-13. It was the deportations of MS-13 members back to El Salvador, however, that transformed the small local gang in Los Angeles into a massive international crime organization spanning the United States, Canada, and Central America. The deportations concentrated in the same Salvadoran prison large numbers of MS-13 gang members who were familiar with and had easy access to the United States. Julie Ayling, a criminal investigator and author of "Criminal Organizations and Resilience," noted, "Institutions of incarceration have proved fruitful incubators of gangs."6 The Los Angeles Times reported in 2005 that "deportations have helped create an unending chain of gang members moving between the U.S. and Central America."7

MS-13's coming into being as a large international crime organization illustrates how acts can create ideal conditions that unintentionally spawn new social phenomena with surprising outcomes. No one planned MS-13; it emerged on its own as a by-product of the concentration of deported gang members in a single jail in El Salvador, where it took on a life of its
own. MS-13 had no central authority that planned its activities as it spread as an international crime organization.

That living or lifelike phenomena can come into being unplanned and thrive without central control can be a disturbing thought, but it is a well-established fact in biology and the social sciences. Before modern biology, scientists thought that natural order came from mystical forces; later it was attributed to central control mechanisms. Modern science has debunked both notions, showing that life self-organizes without central planning. Queen ants and bees, for example, do not rule over their colonies. "Contrary to popular belief, the queen of a colony is not an omniscient individual that issues orders; rather, she is an oversized individual that lays eggs," noted biologist Thomas Seeley from Cornell University.8 Similarly, we do not consciously control all of our bodily functions. "Who or what within the brain monitors all this [bodily] activity: no one. Nothing," proclaimed Harvard entomologist Edward O. Wilson.9

Order in nature arises by itself in a process called self-organization. Order emerges from within ecosystems, animal bodies, cells, and other systems solely from the interaction of the systems' components. Quite literally, nature takes on a life of its own. According to Seeley, "Self-organization is widespread in biological systems. That is to say, it is common to find biological systems that function without guidance from an external controller, or even from an internal control center."10

Similarly, much of our social world, like MS-13, comes into being via self-organization without central control. Conventional wisdom holds that presidents, dictators, generals, government planners, and other authorities run our social world and that, without them, "the life of man [would be] solitary, poor, nasty, brutish, and short," as British philosopher Thomas Hobbes proclaimed in 1651.11 Order in social systems emerges from the interaction among people, networks, organizations, and institutions. German sociologist Norbert Elias observed, "Civilization is . . . set in motion blindly, and kept in motion by the autonomous dynamics of a web of relationships."12 That cultures, organizations, movements, institutions, and other social phenomena come into existence unplanned, uncontrolled, and unanticipated can be unsettling. "One is brought face to face . . . to an uncomfortable position: how to conceive of an order without an orderer," observed Kenneth Waltz from the University of California, Berkeley.13

Emergent social phenomena like MS-13 are highly resilient and adaptive to changing conditions. Julie Ayling defined resilience as "the capacity
to absorb and thus withstand disruption and the capacity to adapt, when necessary, to changes arising from that disruption. Adaptation can range from minor evolutionary adjustments through to robust transformations."14 Gangs like MS-13 "may be nearly invulnerable to repression," according to Ayling.15 Their existence in high-crime areas enables members to learn from more experienced criminals and provides safe havens in which to recover from arrests and injuries. Tight-knit communities shelter gangs because community members are likely to have friends and family who belong to the gangs and because gangs protect the community. Gangs' loose organization also contributes to their resiliency; they are able to respond to opportunities and threats more quickly than hierarchical, bureaucratic organizations. The loose structure also makes gangs less dependent on individual members who might be arrested or killed.

The idea of deporting criminal Salvadorans backfired. Had they been dealt with locally, MS-13 might have remained a small Los Angeles gang. Based on her crime research, Ayling cautions, "Interventions risk having unintended consequences. . . . Law enforcement action might stimulate an organizational adaptation that is more resilient and perhaps more harmful. . . . In some instance[s], it may be wiser not to intervene."

This chapter provides additional examples of how certain deeds have created the right conditions for social phenomena to come into being unexpectedly, with surprising unintended consequences, both favorable and catastrophic.
Star Wars All Over Again

The fact that the U.S. Congress in 2009 passed the highest military budget in human history, featuring the highly dubious Star Wars missile defense program, is an unintended consequence of the massive military spending during World War II.16 Prior to World War II, the U.S. government would ramp up military spending with the outbreak of a war and then slash the budget when the war ended. The government procured weapons from manufacturers of civilian goods that temporarily diverted their plants to making weapons. When the war was over, the weapons makers resumed making civilian products.

With the onset of World War II, government spending rose from $5 billion to $89 billion in 1944, all but $1.6 billion of which was spent on the military. Prior to 1940, the United States produced 921 military planes a year; in 1944, it produced 96,318 of them. By 1945, the United States was
producing one-half of the world's armaments. This critical mass of military spending during World War II gave rise to a permanent arms industry. In his famous January 1961 Farewell Address to the Nation, President Dwight D. Eisenhower called attention to this new arms industry: "Until the latest of our world conflicts, the United States had no armaments industry. American makers of plowshares could, with time and as required, make swords as well. But now we . . . have been compelled to create a permanent armaments industry of vast proportions."17 In this speech, Eisenhower coined the term military-industrial complex to describe this phenomenon.

The military-industrial complex (MIC) emerged unplanned and unexpected as a self-sustaining, highly resilient entity that consists of the Department of Defense, politicians, weapons manufacturers, lobbyists, scientists, universities, and labor unions inextricably woven together in a web of common interest: maximizing government spending on military arms. President Eisenhower was one of the first to warn the U.S. public about the dangers of the MIC in his farewell speech: "The total influence—economic, political, and even spiritual—is felt in every city, every state house, and every office of the federal government. . . . In the councils of government, we must guard against the acquisition of unwarranted influence . . . by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist."18

The MIC's backbone is the tight links among the U.S. Department of Defense, the arms industry, and Congress. Since World War II, the U.S. military has continually rationalized the need for massive defense spending, starting with the Cold War. When the Cold War ended with the collapse of the Soviet bloc in 1989, the Department of Defense predicated military spending on a "two-war" scenario, which meant that the United States had to be equipped to simultaneously fight two regional wars with "rogue states" like North Korea, Iraq, and Iran. Michael Klare, author of Rogue States and Nuclear Outlaws, observed that "Colin Powell devised the two-war strategy once he realized that the United States was 'running out of enemies' large enough to justify spending hundreds of billions on the Pentagon every year."19 To put this in perspective, the United States spends 19 times more on its military than all of the rogue states combined, a list that includes Cuba, Iraq, Iran, Sudan, Libya, Syria, and North Korea.20

Two of the MIC's closest partners are Congress and the arms industry. Lobbyists for the arms industry heavily influence members of Congress by offering millions of dollars in campaign contributions. Furthermore, congressional leaders with arms manufacturers in their districts have a strong self-interest in funding them to provide jobs for their constituents. This
conflict of national interest is most obvious when Congress buys weapons that the military does not need or want, which happens frequently. From 1978 to 2003, for example, the U.S. Air Force requested 5 C-130 transport planes, but Congress purchased 256 of them. Newt Gingrich, in particular, was a big supporter of the C-130, because its maker, Lockheed Martin, is located near his congressional district.

Congress's budgeting of $9.4 billion for fiscal year 2010 to continue President Reagan's Star Wars program is clear evidence that the MIC is alive and well. In a March 1983 speech, President Reagan pledged to make nuclear weapons "impotent and obsolete" by developing a space-based antimissile shield called the Strategic Defense Initiative (SDI).21 Claiming that SDI was unrealistic and unscientific, critics dubbed it Star Wars after George Lucas's space fantasy movie. The creation of SDI was also a blatant violation of the Anti-Ballistic Missile (ABM) Treaty that the United States had signed with the Soviet Union in 1972 to curtail the missile race between the two superpowers.

Support for SDI diminished later in Reagan's presidency and continued to wane during the George H. W. Bush and Clinton administrations as SDI encountered technical problems and cost overruns and as the United States engaged in arms-reduction negotiations with the Soviet Union. SDI's mission ended with the collapse of the Soviet bloc in 1989, which prompted Clinton's secretary of defense, Les Aspin, to declare that SDI was "designed to meet a threat that has receded to the vanishing point."22

SDI was reborn when the Republicans took control of the House of Representatives in 1994 as part of their Contract with America platform, which included restoring the missile defense program. Congress appointed Donald Rumsfeld, formerly President Ford's secretary of defense, to chair a committee to assess the United States' vulnerability to missile attacks. The committee was stacked in favor of reviving SDI: Rumsfeld had been a strong advocate of missile defense, and 9 of the 13 committee members who helped write the report had ties to aerospace contractors or to the Center for Security Policy, a think tank that supported SDI. In July 1998, Rumsfeld presented the committee's findings in the report of the Commission to Assess the Ballistic Missile Threat to the United States, which concluded that "rogue states" like Iran and North Korea were within five years of developing intercontinental ballistic missiles that could be launched with little advance warning.23

Although Rumsfeld's report drew heavy criticism, North Korea's missile launch in August 1998 paved the way for passage of the National Missile Defense Act of 1999. The act stated that the United States should "deploy as soon as is technologically possible an effective National Missile Defense
system capable of defending the territory of the United States against limited ballistic missile attack (whether accidental, unauthorized, or deliberate)."24 President Clinton signed the bill with the provision that National Missile Defense (or NMD, the new name for SDI) had to be technically feasible, affordable, and have a clear mission that did not disrupt arms control negotiations: three would-be showstoppers.

NMD got a huge boost when George W. Bush became president in 2000. He nearly doubled the budget for NMD from $4.2 billion in fiscal year 2000 to $7.7 billion in fiscal year 2002. Bush also appointed Donald Rumsfeld secretary of defense and added to his administration 22 officials from the Center for Security Policy and 32 officials with ties to missile defense contractors. In December 2001, President Bush announced the United States' withdrawal from the ABM Treaty with Russia to eliminate legal hurdles in expanding his missile defense program. Researcher Michelle Ciarrocca from the World Policy Institute noted that the resurgence of missile defense "has been politically driven, spurred on by [the] missile defense lobby, which is thoroughly entrenched in the Bush Administration."25

Missile defense has come under intense criticism since it was conceived in 1983. In 2002, the Central Intelligence Agency questioned NMD's mission and provided testimony that the United States was unlikely to be attacked with missiles because lower-tech delivery vehicles like shipping containers were cheaper, readily available, more accurate, and untraceable. In 2003, the American Physical Society called the NMD technology infeasible.26 NMD also became one of the most incestuous and corrupt sagas in the MIC's history. By 2007, defense contractors had paid $1.6 million in bribes to midlevel defense employees to lobby Congress to support NMD. Walter E. Braswell, a lawyer for one of the defendants, commented, "Ever more grotesque is the way defense procurement has disintegrated into an incestuous relationship between the military, politicians, and contractors."27

By 2008, NMD had cost taxpayers $110 billion and had become the military's biggest weapons program despite its questionable mission and unproven technology. Although Congress in July 2009 proposed increasing the NMD's budget to $9.4 billion, President Obama cancelled the program two months later, saying that there were more effective ways to safeguard the United States from rogue states.

Ironically, the pursuit of missile defense to make the United States safer actually diminished U.S. national security. The junking of the ABM Treaty and plans to base the system in Eastern Europe near Russia's border needlessly provoked a former enemy. The ABM Treaty had helped deter nuclear warfare for 30 years, and when the United States abandoned it in 2002, Russia pulled
out of the START II arms reduction treaty the next day. In April 2007, Russia's President Putin warned of a return to the Cold War.

No one planned the MIC; it emerged on its own as a by-product of the massive military spending during World War II and thereafter took on a life of its own. No central authority has controlled the MIC or planned its activities over its six decades of existence. Nor did the MIC arise from a conspiracy between weapons makers and Congress, as some critics have alleged. As former U.S. senator William Fulbright noted, "I do not think the military-industrial complex is the conspiratorial . . . band of 'merchants of death.' One almost wishes that it were, because conspiracies can be exposed and dealt with."28 The MIC has also demonstrated tremendous resilience. Despite dramatic reductions in threats to U.S. national security, the defense budget has steadily grown.
Jihad Blowback

It is tragically ironic that the United States' largest covert operation, conducted decades ago in one of the most remote places on Earth, inadvertently caused the largest terrorist catastrophe ever to occur on U.S. soil. Desperate to thwart Soviet expansion during the 1980s, the United States secretly funneled over $3 billion in military aid to the mujahideen Islamic forces fighting the Soviet army occupying Afghanistan. After the mujahideen prevailed in the late 1980s, the "Afghan freedom fighters," as the Reagan administration described them, evolved into the well-armed and disciplined Islamic terrorist group al-Qaeda, which, in the 1990s, bombed Cairo, Riyadh, U.S. embassies in Africa, and New York's World Trade Center. On September 11, 2001, al-Qaeda struck again with devastating suicide attacks using commercial airliners that destroyed the World Trade Center and damaged the Pentagon.

The massive support that the United States gave the mujahideen during the Afghan war with the Soviet Union created ideal conditions that spawned Islamic militancy and the terrorist group al-Qaeda. The Afghan war became a celebrated cause among the world's 1 billion Muslims. Some 30,000 Muslims from nearly 30 countries, including Osama bin Laden from Saudi Arabia, traveled to Afghanistan to fight the Soviet occupiers. At no other time in history had so many of the world's most extreme and militant Muslims come together in a critical mass, which fostered the spirit of jihad.

Without $3 billion in U.S. military aid, the poorly armed and undisciplined mujahideen would likely not have defeated the Soviet Union. Equipped with deadly helicopter gunships, the Soviets had routinely massacred the
mujahideen until the United States provided them with state-of-the-art shoulder-fired surface-to-air missiles. The flow of U.S. weapons turned the tide in favor of the mujahideen, and their success fostered a culture of jihad. Militant Muslims became more strident in their cause and confident in their ability to defend Islam around the world. After the war, some 1,000 foreign Islamic militants armed and trained by the United States settled in the remote border region of Afghanistan and Pakistan, bent on taking their jihad to new places like Cairo, Algiers, Manila, Bangkok, Islamabad, Riyadh, Peshawar, and U.S. and European cities.

The U.S. occupation of Iraq has had the unintended consequence of creating a much larger and stronger Islamic terrorist network than the al-Qaeda organization it sought to destroy. On September 26, 2006, the U.S. government made public portions of the National Intelligence Estimate on Iraq, which represented the official position of U.S. intelligence agencies. The report clearly stated that the U.S. occupation of Iraq had given rise to a much larger terrorist organization that is "increasing in both number and geographic dispersion." The report further stated that the "Iraqi conflict has become a 'cause celebre' for jihadists, breeding deep resentment of U.S. invasion in the Muslim world and cultivating supporters for a global jihadist movement. . . . We assess that the Iraq jihad is shaping a new generation of terrorist leaders and operatives."29

The invasion of Iraq, in fact, created the ideal conditions for the emergence of a new terrorist organization, al-Qaeda in Iraq. The former Iraqi leader Saddam Hussein had never tolerated the presence of al-Qaeda in the police state he ruled. The chaos that ensued after the United States' defeat of Iraq's military and the absence of police and border guards created an ideal medium to nurture the emergent terrorist organization. The invasion of Iraq not only created another rallying point for terrorists, it provided the ideal place to fight the infidel enemy firsthand.

Positive phenomena can also emerge from ideal conditions, as in the case of the unplanned and unexpected transformation of global commerce.
Spawning a New Industry

A confluence of unrelated events created the ideal conditions that unexpectedly gave rise to the Indian outsourcing business, which emerged around 2000 and, by 2009, had become a $47 billion industry employing over 2 million Indians. India's outsourcing business began with computer programming and expanded into numerous information technology (IT), engineering, business process, and design functions
ranging from mundane tasks, such as call centers and data entry, to sophisticated activities, such as reading X-rays. The Indian outsourcing phenomenon came into being unplanned and unpredicted.

The sequence of events began in 1617, when India became a trading partner of Great Britain and later, until 1947, the "crown jewel" of its empire. India is linguistically diverse, with more than 1,500 spoken languages. The widespread use of English as a second language throughout India is one of the most important legacies of its colonial era. The ability of many Indians to speak English has given them a significant advantage in interacting with U.S. industry. In the decades leading up to 2000, India had trained millions of engineers and scientists but had too few jobs to employ them locally. Many of these technically trained Indians became employed in the United States, where they gained firsthand knowledge of its industry and made contacts that would later serve them well as outsourcing entrepreneurs.

The next critical event was the invention of the Internet and the promise of e-commerce. The rapid growth in Web traffic prompted the conversion from copper wire telecommunications networks to the much more efficient fiber-optic cables. The wildly optimistic predictions for e-commerce in the late 1990s spurred the massive laying of fiber-optic cables, including telecommunications links between the United States and India. By removing regulatory barriers, the Telecommunications Act of 1996 encouraged many new firms to enter the industry, which further fueled the laying of fiber-optic cables.

The bursting of the e-commerce bubble in 2000 set the stage for Indian outsourcing. Intense competition among telecommunications firms faced with excess cable capacity sharply drove down long-distance rates to the point where it cost as little to transfer data between the United States and India as to make a local call. For the first time, this cheap communications highway made it economically feasible to outsource work to India.

The bursting of the e-commerce bubble also caused the U.S. economy to slip into a recession, and many Indian nationals working in the United States on temporary work visas lost their jobs and had to return home to limited job prospects. The recession also forced U.S. corporations to cut costs, especially their large IT budgets. Technically skilled, proficient in English, familiar with U.S. industry, and armed with contacts in U.S. IT departments, the Indian engineers who returned home had all the resources necessary to offer outsourced IT work from the United States to India for a fraction of the cost.

The need to cut IT costs in 2000 came at the exact same time that U.S. corporations had to devote more programming resources to deal with the
Y2K problem. As the year 2000 approached, there was widespread concern among corporate IT departments that their older legacy systems might crash after midnight on December 31, 1999, because they had been programmed decades earlier and might be unable to handle 21st-century dates. If a system stored dates in two-digit fields, for example, it might interpret the year 2001 as 1901, because both dates would be stored as 01 (a minimal illustration appears at the end of this section).

This surge in the need for Y2K programming spurred the growth in outsourcing IT work from the United States to India. This, in turn, enabled Indian IT engineers and programmers to become intimately familiar with U.S. firms' core IT systems. It also confirmed that IT work could be outsourced to India with great savings and reliability. The success of IT outsourcing, furthermore, led to the outsourcing of other business functions, such as billing and call centers, and eventually to engineering, design, and other higher-skilled functions.

Thus, today's massive Indian outsourcing industry came into being, without specific planning, as the unintended consequence of a series of unexpected and largely unrelated events. Similarly, the massive transformation of U.S. society and economy after World War II emerged unplanned and unexpected as an unintended consequence of a benefits program for returning veterans.
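To make the two-digit-date failure concrete, here is a minimal sketch in Python. The book itself contains no code, and these function names are hypothetical illustrations rather than code from any real legacy system: a program that stores only the last two digits of the year silently assumes the 1900s, so date arithmetic that was correct in 1999 produces nonsense in 2001.

    # A minimal sketch of the two-digit-year bug described above; all names
    # are hypothetical illustrations, not code from any actual system.

    def legacy_year(yy: int) -> int:
        """How a pre-Y2K system typically interpreted a stored two-digit year."""
        return 1900 + yy  # 99 -> 1999, but 01 -> 1901 rather than 2001

    def age_in_years(birth_yy: int, current_yy: int) -> int:
        """Age computed from two-digit fields: fine in 1999, wrong in 2001."""
        return legacy_year(current_yy) - legacy_year(birth_yy)

    print(age_in_years(65, 99))  # 34: correct for someone born in 1965, run in 1999
    print(age_in_years(65, 1))   # -64: the stored year "01" is read as 1901

    # "Windowing" was one common remediation: pick a pivot and map two-digit
    # years below it into the 2000s instead of widening every stored field.
    def windowed_year(yy: int, pivot: int = 30) -> int:
        return 2000 + yy if yy < pivot else 1900 + yy

    print(windowed_year(1))   # 2001
    print(windowed_year(65))  # 1965

Auditing millions of lines of legacy code for exactly this kind of bug is what generated the surge in programming demand described above, much of which was met by Indian programmers.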
The Accidental Remaking of America

Signed into law by President Roosevelt in June 1944 in hopes of pacifying the millions of soldiers returning home from World War II to bleak job prospects, the GI Bill unexpectedly transformed U.S. society in unimaginable ways. A college education, traditionally reserved for the wealthy and elite, became the norm for middle-class Americans. The education and training of over 7 million veterans boosted family incomes and spurred rapid growth in the U.S. economy. The low-cost mortgage benefits in the GI Bill encouraged many Americans to move from urban rental apartments to homes they owned in the new suburbs. Journalist Edward Humes noted that this transformation happened "not by grand design, but quite by accident, as much a creation of petty partisans as of political visionaries."30

The GI Bill was the product of the American Legion's vision and its strong influence over Congress. Its passage came at an odd time, when a more conservative Congress had become less supportive of Roosevelt's progressive social policies. The president himself was much more heavily focused on winning the war than on postwar social planning.
He told his staff, "This is no time for public interest in or discussion of post-war problems on the broad ground that there will not be any post-war problems if we lose this war."31

When Roosevelt's administration turned to postwar planning, it envisioned a pragmatic and restrictive program for returning soldiers. The goal was to avoid the riots among soldiers that had erupted after previous U.S. conflicts dating back to the Revolutionary War. Especially troubling was the memory of 1932, when 20,000 veterans marched on Washington, DC, seeking bonuses for their military service during World War I. At a low point in U.S. history, President Hoover had ordered the military to run this so-called bonus army out of town by force. Seeking to avoid the unpopular and potentially corrupt idea of giving cash to veterans, Roosevelt proposed to fund one year of training for the few individuals who passed examinations to fill occupations that lacked trained workers.

The American Legion developed its own proposal for compensating World War II veterans and enlisted Legion official and lawyer Harry Colmery to draft it. Like the president's proposal, the Legion's plan avoided cash payments and focused on training soldiers to reenter the workforce. The Legion's bill, however, called for providing four years of college education for any veteran who had served at least 90 days in the military. The Legion initially called its plan the Servicemen's Readjustment Act and renamed it the GI Bill of Rights before submitting it to Congress in January 1944.

Both the president's and the Legion's proposals were specifically intended to compensate veterans for their service, not to transform U.S. society. Colmery told Congress that the bill "has to take them back sympathetically away from the horror and stark reality of war and give them every opportunity to again become disciplined forces for peaceful progress through educational opportunities in every respect."32 He further warned that his proposal might have limited popularity among returning soldiers, estimating that between 10 and 20 percent of veterans would take advantage of the educational benefits.

The GI Bill easily passed the U.S. Senate but got held up in the House of Representatives amid debate over the cost of its unemployment insurance provision. According to Milton Greenberg, author of The G.I. Bill of Rights, "Few people, including many closely connected to the GI Bill's development, were aware of the implications of this revolutionary new law. Commentary of the time—inside and outside of Congress—tended to stress the costs and benefits of the unemployment readjustment allowance contained in the bill and to underestimate the education and loan program provisions."33
Representative John Rankin of Mississippi, chairman of the House Committee on Veterans' Affairs, warned that too many veterans would go on the dole. Suspicious that colleges were populated with Communists, he also objected to the educational benefits, saying, "I would rather send my child to a red schoolhouse than a red teacher."34 The bill eventually passed both houses of Congress, providing all veterans who had served at least 90 days in the military with unemployment insurance of $20 per week for up to a year, four years of tuition and expenses at any college or training program that would admit them, and low-interest home loans with no down payments. According to the House Committee on Veterans' Affairs, "Congress's motivation was to create an economic 'cubby hole' into which America could put 12 million demobilized troops, as we transformed our economy from making machine guns to making Maytags."

President Roosevelt signed the GI Bill on June 22, 1944, with minimal news coverage and little ceremony, as public attention remained fixed on the massive D-Day invasion of continental Europe earlier that month. Experts predicted that the vast majority of the veterans would return home to jobs on farms and in factories and that fewer than 10 percent of them would take advantage of the GI Bill's college benefits or the generous home-loan provisions. Few envisioned that it would become, according to Humes, "the most successful socially uplifting legislation the world has ever seen."35

Before 1944, only the United States' upper classes attended college, while the rest of society toiled away on farms and in factories. Only about 10 percent of Americans attended college, and many were lucky to finish high school. American men typically pursued their fathers' occupations as farmers, factory workers, mechanics, carpenters, and workers in other manual trades. Colleges were the bastion of wealthy, white, Protestant legacies, and some even had limits on ethnic enrollment. College leaders feared that the GI Bill would lower academic standards by flooding colleges with unqualified students. Harvard University president James Conant, for example, called the GI Bill a poor substitute for a public works program and a serious threat to private higher education.36 University of Chicago president Robert Hutchins warned that the bill would turn the nation's colleges into "educational hobo jungles."37

Predictions about the popularity and impact of the GI Bill proved false. A surprising 7.8 million of the 16 million returning soldiers took advantage of the educational benefits offered by the GI Bill: 2.2 million veterans went to college, while 5.6 million received vocational training. Seventy-seven percent of these GI students believed that they could not otherwise have afforded further education after the war.38 With their wartime
experience, greater maturity, and a desire to get on with life, the veterans proved to be excellent students.

The GI Bill's educational benefits created tremendous occupational advancement. Most veterans' fathers were manual laborers, but those who took vocational training became skilled technicians and business owners. Veterans who attended college "experienced breathtaking transformation in their life circumstances," according to Suzanne Mettler from Syracuse University.39 The GI Bill created 450,000 engineers, 240,000 accountants, 238,000 teachers, 91,000 scientists, 67,000 doctors, 22,000 dentists, and millions of skilled technicians. GI Bill recipients included 14 Nobel Prize winners, 3 Supreme Court justices, 3 presidents, and 12 senators.

Another unintended consequence of the GI Bill was the American dream of home ownership and suburban life. Before World War II, most Americans lived in urban rental apartments, but afterward most owned their homes. The GI Bill offered mortgages with low interest rates and no down payments, which made owning a home cost about the same as renting. Home ownership grew from 44 percent in 1940 to 62 percent in 1960. Bill Levitt, a veteran builder of military housing, took advantage of what he foresaw as a surge in the demand for homes by creating cheap, prefabricated houses in the suburbs. He and the many home builders who copied him became developers of whole neighborhoods, making the rapid move to the suburbs possible.

The GI Bill substantially increased America's middle class and fueled the country's economic growth with highly trained veterans. The bill is also credited with strengthening U.S. democracy by increasing civic involvement and expanding political leadership to include a broader segment of the population. Ultimately, the GI Bill yielded a huge return on investment for the U.S. government. The Congressional Research Service estimated that every dollar invested in veterans generated somewhere between 5 and 12 dollars in increased tax receipts due to their higher incomes. Former president Clinton called it "the greatest investment in our people in American history."40

In a more recent war, executive orders regarding the treatment of prisoners created the right conditions for a cult of illegal torture to emerge among U.S. troops.
The Lucifer Effect

President Bush's war on terrorism gave rise to a shockingly widespread culture of illegal torture among traditionally morally disciplined U.S. soldiers.
Evidence of prisoner abuse came to light on April 28, 2004, in a 60 Minutes II program titled "Abuse of Iraqi POWs by GIs Probed," which displayed disturbing photographs of U.S. soldiers abusing detainees at the Abu Ghraib prison in Iraq. The pictures showed stacks of naked prisoners, sexual taunting by guards, a hooded prisoner standing on a box with electric wires attached to his fingers, the presence of vicious dogs, and, most infamous of all, a guard leading a naked prisoner on all fours with a dog leash attached to his neck. Particularly appalling were the smiles on the guards' faces, showing that they were enjoying themselves. This was not how U.S. soldiers were trained to behave.

Congress was incensed by this illegal and internationally embarrassing saga portraying the United States as an abusive occupier of Iraq. The incident would surely make it harder to quell the insurgent violence that continued to plague Iraq. Senator John McCain said, "The mistreatment of prisoners harms us more than our enemies."41 Surely the guilty soldiers would serve time in jail, which they did, but there was much more to the story.

The abuse at Abu Ghraib was a clear violation of the Geneva Convention, which Knut Dormann, head of the Red Cross's Legal Division, called "the cornerstone of contemporary International Humanitarian Law."42 The United States signed the Geneva Convention in 1882. It was amended in 1929 to protect prisoners of war from abuse, and its 1949 revision has been ratified by 194 countries. Under the Geneva Convention, the prisoner abuse at Abu Ghraib included "grave breaches," like torture, inhumane treatment, infliction of great suffering, deprivation of a fair trial, and unlawful deportation, transfer, or confinement of prisoners. Article 27 of the Fourth Geneva Convention specifically says that prisoners "shall at all times be humanely treated, and shall be protected especially against all acts of violence or threats thereof and against insults and public curiosity."

Initially, the U.S. government claimed that the prisoner abuse was an isolated case involving a few bad apples during the night shift at Abu Ghraib. When asked what went wrong at Abu Ghraib on the 60 Minutes II program, Brigadier General Mark Kimmitt, deputy director of coalition operations in Iraq, said, "Frankly, I think all of us are disappointed by the actions of the few." The chairman of the U.S. Joint Chiefs declared that the fact that "only a small section of the guards . . . participated in this" was "a pretty good clue that it wasn't a more widespread problem."43 President Bush promised that the "wrongdoers will be brought to justice."44 The military did, in fact, convict seven military police officers of the torture at Abu Ghraib and sentenced them to prison terms ranging from 45 days to eight years.
Soon after, however, evidence of widespread prisoner abuse began to surface. After an official investigation, former defense secretary James Schlesinger reported instances of prisoner abuse throughout military operations in Iraq, Afghanistan, and at the Guantanamo Bay detention base. A report in November 2004 found hundreds of cases of prisoner abuse and dozens of instances where prisoners died during interrogations. In an April 2005 report titled "Getting Away with Torture?" Human Rights Watch claimed: "It has become clear that torture and abuse have taken place not solely at Abu Ghraib but rather in dozens of detention facilities worldwide, that in many cases the abuse resulted in death or severe trauma, and that a good number of victims were civilians with no connection to al-Qaeda or terrorism."45

Philip Zimbardo, professor emeritus of psychology at Stanford University, explained how a culture of torture could emerge among normally well-disciplined U.S. soldiers in his book The Lucifer Effect: Understanding How Good People Turn Evil.46 Zimbardo had conducted the notorious Stanford Prison Experiment, in which students played the roles of prisoners and guards. He had to stop the experiment prematurely because the guards became too abusive. Parallels between Abu Ghraib and his Stanford experiment prompted Zimbardo to write The Lucifer Effect to "understand the processes of transformation at work when good or ordinary people do bad or evil things."47

Zimbardo attributes the widespread prisoner abuse to the system in which the soldiers operated, which he described as "the complex of powerful forces that create the Situation." He also noted, "Systems provide the institutional support, authority, and resources that allow Situations to operate as they do."48 He observed that people behave differently in groups, in which "emergent norms" rule the day and "some new practice quickly becomes the standard that must be complied with and conformed to."49 Zimbardo summed up the abuse problem as follows: "The military and civilian chain of command had built a 'bad barrel' in which a bunch of good soldiers became transformed into 'bad apples'."50

Human Rights Watch came to the same conclusion, citing the fact that the prisoner abuse at Abu Ghraib "did not result from the acts of individual soldiers who broke the rules. It resulted from decisions made by the Bush administration to bend, ignore, or cast rules aside. Administration policies created the climate for Abu Ghraib and for abuse against detainees worldwide in a number of ways."51

A series of internal investigations eventually convinced military leaders that the system the military had created was at fault. A 2004 investigation by Lieutenant General Anthony R. Jones and Major General George R. Fay blamed
the prisoner abuse on the "operations environment." They noted, "Looking beyond personal responsibility, leader responsibility and command responsibility, systemic problems and issues also contributed to the volatile environment in which abuse occurred."52

The architects of systemic torture worked at the highest levels of the Bush administration. In a September 2001 interview on Meet the Press, Vice President Richard Cheney said, "We also have to work through, sort of the dark side, if you will. We've got to spend time in the shadows in the intelligence world. . . . It's going to be vital for us to use any means at our disposal, basically to achieve our objective."53 Secretary of Defense Donald Rumsfeld implemented Cheney's vision by authorizing harsh interrogation methods for prisoners at Guantanamo Bay and in Iraq and Afghanistan that violated the Geneva Convention, including removing prisoners' clothes, exploiting detainee phobias, isolating prisoners, putting prisoners in stress positions, using hoods, employing forced grooming, and depriving prisoners of light and sound. Human Rights Watch noted, "Secretary Rumsfeld created the conditions for U.S. troops to commit war crimes and torture by sidelining and disparaging the Geneva Conventions, by approving interrogation techniques that violated the Geneva Conventions as well as the Convention against Torture, and by approving the hiding of detainees from the International Committee of the Red Cross."54

The Bush administration created the conditions for a culture of torture to emerge among front-line soldiers. The top of the command chain told them to get more intelligence and to "take the gloves off" to get it. It is unknown whether such harsh interrogation practices gave the Bush administration its desired intelligence, but experts from the military and from the Federal Bureau of Investigation claim that torture does not provide reliable information and that more humane practices are more effective.

A major unintended consequence of the prisoner abuse scandal was the tarnishing of the United States' reputation as a positive force for peace, democracy, and civil behavior. Senator John McCain warned, "Prisoner abuses extract a terrible toll on us in this war of ideas. They inevitably become public, and when they do they threaten our moral standing."55 In an earlier war, the punitive treatment of the vanquished helped create the conditions for another evil force to emerge.
A Carthaginian Peace

The harsh terms imposed on Germany by the Treaty of Versailles in 1919 to conclude World War I unwittingly created conditions that gave rise to
the cult of ultranationalism, Nazism, and Adolf Hitler. Germany was in shambles economically and politically in 1919, and, had the Allies sought to restore the country instead of punishing it, the horrors surrounding World War II might have been avoided. This is precisely what British economist John Maynard Keynes recommended when he attended the treaty conference as Britain’s Treasury delegate. Keynes argued that the conference should focus on rebuilding postwar Europe’s economy, including Germany’s. He argued against imposing reparations on Germany for wartime damages, believing that such reparations would destroy Europe economically. He further proposed forgiveness of war debts and the creation of a U.S. credit program to invest in European industry. Delegates from the United States, France, and Britain did not understand Keynes’s economic ideas, and, in fact, President Wilson forbade U.S. Treasury officials from even discussing the credit provision in Keynes’s plan. Keynes noted that France was especially keen to “crush the economic life out of Germany.”56 The victorious allies were preoccupied with preventing Germany from reemerging as a future military threat. The Treaty of Versailles imposed stiff reparations and humiliating conditions on Germany. The country was to pay £6.6 billion in reparations and to limit its military to 100,000 soldiers, 15,000 sailors, a few ships, and no air force. Germany also had to cede 13 percent of its home territory to France, Poland, Denmark, Czechoslovakia, and Belgium, along with all of its overseas colonies. Especially humiliating was the treaty’s “guilt clause,” which laid the blame for the war entirely on Germany when, in fact, a chain reaction of mutual defense treaties triggered by a Serbian terrorist in Bosnia had started the war. Keynes left the conference in frustration on May 26, 1919, returning to Cambridge University to write his famous book, The Economic Consequences of the Peace. In the book, Keynes harshly criticized the Treaty of Versailles, calling it a “Carthaginian peace,” a reference to how the Romans in 146 b.c.e. achieved lasting peace with their Phoenician enemy in Carthage by killing or enslaving the entire population and demolishing the city. He warned that the German economy would be destroyed by inflation if it printed money to pay for reparations. He further noted that Lenin had said that the best way to ruin a capitalist country was to “debauch its currency.” Keynes presciently predicted that some malevolent force might emerge from the treaty’s harsh treatment of the German people: “Who can say how much is endurable, or in what direction men will seek at last to escape from their misfortunes?”57 At the signing of the treaty, Germany was run by a
newly formed, weak government beset with a disastrous economy and political chaos. Although historians disagree on whether the treaty’s imposed reparations ruined Germany’s economy, its humiliation of the German people and failure to help restore their economy set the stage for the emergence of Hitler’s Nazi Party and the end of Germany’s nascent democracy. When military action ended in November 1918, Germany’s Social Democratic Party forced Kaiser Wilhelm II to resign, ending Germany’s monarchy. In January 1919, a national constitutional assembly created a new parliamentary government modeled after Britain’s, called the Weimar Republic, so named after the out-of-the-way city where the assembly took place to avoid the political violence that had erupted in Berlin. In the final years of World War I, Germany suffered from mounting inflation, a declining standard of living, increasing food shortages, and even starvation. Economic matters deteriorated after the war, with soldiers returning home to unemployment, huge war debts, a crippled industry, and the payment of reparations. In need of cash, the Weimar government resorted to printing currency, which caused disastrous hyperinflation. In 1915, 50 million marks were worth about $12 million; by 1923, the 50-million-mark bank note was worth about $1 million and soon after became virtually worthless. Germany eventually decided it could no longer afford to make reparations and stopped paying them in 1923. France retaliated by occupying Germany’s Ruhr Valley, which was Germany’s most important industrial area and critical for its economic recovery. Industrial activity ceased in the Ruhr when German workers passively refused to run factories, which further increased Germany’s unemployment and poverty. The French further humiliated Germans by attacking the striking workers to force them back to work. In addition to economic woes, the Weimar Republic was created amid a violent political conflict between Communists and right-wing parties. In October 1918, the Communists took over much of the country, and many feared that Germany would become a Communist state, as Russia had a year earlier. The German Communist Party was founded in January 1919, and further uprisings led to Communist takeovers in Saxony, Hamburg, and the Ruhr Valley. Pitted against the Communists were right-wing groups that favored totalitarian rule. In March 1920, a right-wing group took over Berlin, forcing the new Weimar government to flee to Stuttgart. In 1920, the rightists formed the National Socialist German Workers’ Party, nicknamed the Nazi Party, and a year later appointed Adolf Hitler as its chairman. Hitler’s failure
to take over Munich in the Beer Hall Putsch of 1923 landed him in jail for eight months, during which he wrote Mein Kampf and plotted to take over all of Germany. The years of economic hardship, political instability and violence, and national humiliation prompted the rise of German nationalism and the Nazi Party. Hitler blamed the Weimar Republic for signing the Treaty of Versailles with its oppressive and humiliating terms. As the Nazi Party became increasingly popular among German voters, Hitler eventually became chancellor in 1933 and, within months, ended the Weimar government and began destroying his political opposition.
Lessons from Emergence

The major lesson about emergence is that it is vitally important to consider the conditions that decisions and acts create, conditions that could unintentionally spawn unexpected phenomena. Sometimes this requires counterintuitive thinking. On the surface it may seem more logical, for example, to punish a defeated enemy than to help it rebuild, but the punishment may create the conditions that give rise to a more formidable foe. Similarly, it may seem logical to deport illegal immigrants who commit crimes, but the deportations can give rise to a much larger international crime organization. It is especially important to be wary when introducing massive programs, efforts, or expenditures, which are more likely to give rise to the unexpected. The massive military expenditures during World War II gave rise to the military-industrial complex that is today stronger than ever. The massive spending on education under the GI Bill transformed U.S. society. The massive spending on fiber-optic cable helped create the Indian outsourcing industry. John Maynard Keynes exemplified this type of thinking when he wisely envisioned that the punitive treatment of Germany under the Treaty of Versailles would create the right conditions—economic hardship and resentment—for something nefarious to emerge as men sought “at last to escape from their misfortunes.” Unfortunately, his advice went unheeded, and we suffered the dire consequences of Nazi Germany. At the conclusion of World War II, the victorious allies were perilously close to repeating the same mistakes they had made in punishing Germany after World War I. Instead, they created the right conditions for the emergence of a vital and peaceful postwar era in Western Europe. After Germany was defeated in World War II, the United States, Soviet Union, France, and Britain adopted the Morgenthau Plan to convert Germany
into a pastoral state by reducing its heavy industry to 50 percent of its prewar capacity, thus making it incapable of reemerging as a powerful aggressor. In the aftermath of World War II, Germany and the rest of Europe were once again in economic ruins and chaos, which set the stage for a Communist takeover of Europe. The Soviet Union occupied half of Germany and much of Eastern Europe, installed Communist systems in every country it occupied, and was spreading Communist influence throughout the rest of Europe. Punishing West Germany and failing to rebuild Western Europe would have made a Communist takeover of all of Europe all the more likely. U.S. statesmen eventually recognized the impracticality and dangers of the Morgenthau Plan. As a member of Truman’s commission on postwar planning for Germany, former president Herbert Hoover bluntly stated, “There is the illusion that the New Germany left after the annexation can be reduced to a ‘pastoral state.’ It cannot be done unless we exterminate or move 25,000,000 people out of it.” Other U.S. officials began recommending the rebuilding of Germany and Europe to promote peace, prosperity, and political stability on the continent. The United States abandoned the Morgenthau Plan in September 1946. In April 1948, President Truman signed the Marshall Plan, named after Secretary of State General George Marshall, which called for spending billions of dollars rebuilding West Germany and Western Europe. It was a plan similar to what Keynes had proposed for settling matters after World War I. In total, the United States invested $12 billion in the rebuilding of Western Europe, and from 1948 to 1952 Western Europe’s industrial production increased 35 percent, the fastest period of growth in its history. Postwar poverty and starvation vanished, and European living standards improved. Europe’s need to import goods even helped stimulate the U.S. economy. The Marshall Plan helped create political stability, which greatly reduced the prospect of a Communist takeover of Europe. Although historians debate how much the Marshall Plan can be credited for the rapid restoration of Western Europe, there is general agreement that, at the very least, it accelerated the recovery. Those who conceived the Marshall Plan had learned a vital lesson about emergence: avoid a ruinous repeat of the Treaty of Versailles and create the right conditions for peace and prosperity to emerge in postwar Europe. Similarly, the generous treatment of a defeated Japan proved beneficial to both the victors and the vanquished. The decades of peace and strong alliances between the United States and its defeated enemies of Germany and Japan and between Germany and the rest of Europe after World War II make a strong case for thinking through how decisions
and deeds can create contexts from which beneficial or disastrous phenomena arise. A major point in this chapter is that social and biological systems come into being as self-sustaining entities—and that sometimes we unintentionally cause this to happen. The next chapter shows how disrupting established social and biological systems can also cause unintended consequences.
Chapter 9
Breaching the Peace

The Worst Ecological Disaster

Imported in the 1970s with the best laid plans to clean fish farms and sewage plants, the Asian carp could become one of the worst manmade ecological disasters in history.1 The carp escaped from fish ponds and sewage treatment plants in Arkansas and, by 2009, had migrated to the outskirts of Lake Michigan. If they enter the lake, the carp could destroy the world’s largest freshwater ecosystem by eradicating native fish from all the Great Lakes and their numerous connecting rivers. The situation has become so grave that the state of Michigan filed a lawsuit in December 2009 in the U.S. Supreme Court to require the state of Illinois to permanently close its locks in Chicago and block the carp’s entrance to Lake Michigan. The culprits are three large species of Asian carp—bighead, silver, and black—all of which are bottom feeders that remove plankton and other life forms at the bottom of the aquatic food chain on which native fishes directly or indirectly subsist. These Asian carp can consume up to 40 percent of their body weight each day. The black carp can grow to five feet in length and weigh 100 pounds. The Asian carp are also highly prolific; they spawn several times a year, producing millions of offspring. Moreover, they have no natural predators or diseases to cull their explosive population. The fact that the Asian carp like cold water makes their northerly migration to the Great Lakes even more troublesome. The potential destruction to the Great Lakes is clearly evident from the widespread damage the Asian carp have caused during their migration to Chicago through the Mississippi River Basin and the riverways that lead to Lake Michigan. Asian carp now dominate many parts of the Mississippi, Tennessee, Missouri, Ohio, Columbia, and Platte Rivers. When waters on an offshoot of the Mississippi River near St. Louis receded after a 1999 flood, the muddy flats revealed that 97 percent of all dead fish left high and dry were large Asian carp. The smaller, 50-pound silver carp is notorious for
injuring boaters as scores of them simultaneously leap up to 10 feet out of the water when disturbed. The Asian carp invasion is a story of federal and state government agencies’ best intentions gone seriously awry as they experimented with using bottom-feeding alien carp to clean fish farms and sewage treatment plants. The first documented importation of Asian carp was in 1963 by the Stuttgart National Aquaculture Center in Arkansas, which was part of the U.S. Department of the Interior until it was transferred to the U.S. Department of Agriculture in 1996. The center imported finger-sized grass carp from Malaysia to replace herbicides used to control weeds in southern fishing holes and farms. In 1966, the center accidentally spilled its grass carp into runoff waters that eventually drain into the Mississippi River Basin. The carp made their way to the basin, but they seemed to have had little impact on its ecosystem. The ecological disaster began four years later, however, with the accidental introduction of much larger Asian carp. In 1970, Arkansas fish farmer Jim Malone ordered Asian grass carp to control weeds in his fish ponds. When his shipment arrived, it mistakenly included 60 bighead, silver, and black carp. Because these species of carp do not eat weeds, Malone donated them to the Arkansas Game and Fish Commission. The commission devised an experiment to see whether these large carp could help consume human waste in municipal sewage plants and become a revenue-generating source of food. The Environmental Protection Agency liked the idea enough to provide $81,000 to fund the project. Although the Asian carp proved to be highly efficient at consuming vast quantities of human waste, they had no market value, because the U.S. Food and Drug Administration banned the sale of animal products raised on human waste, even for dog food. When the commission terminated the experiment, it kept the Asian carp in poorly maintained tanks. The carp escaped into nearby flooded ditches and eventually made their way to the Mississippi Basin. As Mike Freeze, former chairman of the Arkansas Game and Fish Commission, confessed, “None of us were as careful as we should have been.”2 Although critics frequently blame southern fish farmers for importing the large carp and the Mississippi River floods in 1990 for enabling them to escape, the large Asian carp had escaped much earlier from the commission’s Arkansas lab and reached the Mississippi Basin by 1980. The escaping Asian carp entered the Mississippi River via its many tributaries and headed north, eating, spawning, and destroying ecosystems along the way. By 2006, bighead and silver carp had swum up the Illinois and Des Moines rivers and entered the Chicago Sanitary and Ship Canal. The
canal provides ships access to Lake Michigan through a series of locks and was opened in 1900 to stop the dumping of human waste into Lake Michigan, the city’s source of drinking water. Although the American Society of Civil Engineers has called the canal one of the greatest feats of the 20th century, it has provided a highly controversial two-way conduit for alien species to invade the previously separated Great Lakes and Mississippi River Basin ecosystems. If the large carp enter Lake Michigan, they could quickly destroy the world’s largest freshwater ecosystem, which encompasses 95,000 square miles of water, 5,000 tributaries, and 30,000 islands. The delicate ecosystem contains many endangered species and has been weakened for centuries by invasive alien animals and plants. The Asian carp could quickly displace valued game and food fish like sturgeon, Coho salmon, yellow perch, and brook and lake trout. The Great Lakes are also a vital nesting and migratory corridor for many species of birds whose food sources might get disrupted with the disappearance of fish on which they subsist. The states bordering the Great Lakes stand to lose $7 billion in commercial and sport fishing and another $8 billion to $10 billion in recreational boating. Worse still, the large carp have limited value because they are too bony to be used as food and are not a popular sport fish. The last line of defense against the Asian carp entering Lake Michigan is an electrical mesh that the Army Corps of Engineers installed on the bottom of the canal. On November 20, 2009, however, the corps discovered DNA evidence that the Asian carp had crossed the electrical barrier and had swum to within eight miles of Lake Michigan. If the large carp can breach this electric barrier, they can easily swim through the canal’s locks into Lake Michigan. In early 2010, biologists detected DNA evidence indicating that Asian carp had already made it into the lake. Faced with this impending disaster, Michigan’s attorney general, Michael Cox, filed a lawsuit on December 21, 2009, with the U.S. Supreme Court to force the state of Illinois to permanently shut the locks that link the canal with Lake Michigan. The governments of Indiana, Minnesota, New York, Ohio, Pennsylvania, Wisconsin, and Ontario, Canada, have supported the lawsuit. The Illinois Chamber of Commerce and the American Waterways Operators, however, countersued, saying that the closure of the canal would cost the shipping industry $1.5 billion per year and thousands of jobs. On January 19, 2010, the U.S. Supreme Court refused to consider Michigan’s lawsuit. Biological ecosystems like the Great Lakes are complex, self-organized, and self-sustaining. They are complex in that they have many diverse
interconnected components—plants, animals, soil, water, and nutrients—and behave in complex ways that do not lend themselves to simple formulaic explanations. They are self-organized and self-sustaining in that they came into existence on their own and maintain their own balance of nature. The invasion of the Asian carp illustrates that meddling with ecosystems can destroy them. Even attempts to prevent forest fires can disrupt ecosystems and be self-defeating. Periodic forest fires in redwood groves, for example, actually help preserve the ancient trees by clearing out saplings and flammable debris on the forest’s floor. Mature redwoods have thick, fire-resistant bark that protects them from ground-level fires. The suppression of forest fires, however, causes redwood groves to become clogged with dead branches and tall saplings. When lightning ignites the debris, the saplings catch fire and carry flames high up into the forest’s flammable foliage, where wind-driven fire spreads through the forest. The well-intended effort to control forest fires results in killing trees that have survived thousands of years and countless forest fires without human intervention. Similarly, cities, cultures, communities, economies, and other social entities are complex systems that have evolved over many years to become reasonably stable phenomena. If permitted the freedom to do so, these complex social systems can also be self-organizing and self-sustaining. Economies and cities thrived, for example, thousands of years before experts arrived to meddle with them, and disturbing social systems, like disrupting the balance of nature, can have many dire unintended consequences. In many instances politicians, city planners, economists, and other professionals who sought to fix what they perceived as broken social systems have made them worse and, in some instances, have destroyed them altogether.
The Death of Great American Cities

City planners’ efforts to improve old—but otherwise healthy—urban neighborhoods have unintentionally caused their decline in ways that are analogous to the withering of coral reef ecosystems subject to pollution. Coral reefs consist of millions of tiny organisms that make vast limestone structures that support a complex and vibrant ecosystem, including 4,000 species of fish and 25 percent of all marine species in the world. Undisturbed, the coral reef is a self-sustaining, adaptive ecosystem. However, when pollution kills the coral organisms, the entire ecosystem collapses, leaving behind a lifeless, rotting, limestone tomb encrusted in slime.
Similarly, older cities that have grown organically are social ecosystems consisting of numerous individuals in pursuit of business, pleasure, and domestic matters. They have thrived unplanned and are self-sustaining, vibrant, safe, adaptive places. Urban planners with the best intentions to improve these cities have caused a number of them to collapse, leaving behind decaying stone structures that once housed a diversity of human activity. Jane Jacobs described in her seminal book, The Death and Life of Great American Cities, how cities can thrive as complex systems similar to ecosystems. Cities should be thought of, she wrote, “as problems in organized complexity—organisms that are replete with unexamined, but obviously intricately interconnected, and surely understandable, relationships.”3 She further noted that cities were adaptive, self-sustaining entities that can improve themselves “in spite of planning and counter to ideals of city planning.”4 What makes a neighborhood a vibrant ecosystem is city life, and what makes a city safe is a diversity of activity that puts eyes on the streets around the clock seven days a week. Jacobs noted, “The first thing to understand is that the public peace—the sidewalk and street peace—of cities is not kept primarily by the police, necessary as police are. It is kept primarily by an intricate, almost unconscious, network of voluntary controls and standards among the people themselves, and enforced by the people themselves.”5 She further attributes urban safety to pedestrian traffic: “Lowly, unpurposeful and random as they may appear, sidewalk contacts are the small change from which a city’s wealth of public life may grow.”6 Jacobs cited four factors that are required to put eyes on the street day and night, day after day. First, a neighborhood must have multiple functions mixed together for people to come and go at different hours. For example, a neighborhood with a tight mixture of residential dwellings, businesses, stores, restaurants, bars, and libraries generates pedestrian traffic at different times of day and night. This is in stark contrast to the modern U.S. city that consists primarily of office buildings, which attract people only during the 40-hour work week. Many of these modern cities are dangerous ghost towns after 5:00 p.m. and on weekends. Second, a dense concentration of residents is needed to supply constant vigilance and to support a large number of restaurants, bars, stores, and other establishments that sustain foot traffic at different times of day and night. Furthermore, a high density of dwellings and establishments is important to encourage walking instead of driving. Empty land such as parking lots and parks decreases the density of people and makes cities more dangerous. Third, blocks must be short, so that pedestrians have frequent opportunities to turn corners, which spreads foot traffic along a variety of routes. Finally, the presence of older buildings provides inexpensive
rents to support a variety of less-profitable establishments like bars, foreign-food restaurants, antique stores, small bookstores, shoe repair shops, and corner stores. The North End in downtown Boston closely matches Jacobs’s model for a healthy neighborhood, as she acknowledged when she wrote her book in 1961. The North End was first settled in the 17th century and contains older buildings such as Paul Revere’s house, which dates back to 1680. The neighborhood is densely packed with mixed-use buildings, including dwellings, restaurants, shops, churches, and a diversity of other businesses sustained by lower rents in older buildings. It is a labyrinth of narrow streets and small blocks with minimal parkland and has one of the highest dwelling densities in the United States. Pedestrian traffic remains heavy until very late at night. The neighborhood has a very low crime rate and, according to Jacobs, is “as safe as any place on earth.”7 By 1960, residents of the North End had dramatically improved their neighborhood on their own. Bankers and urban planners considered the North End to be the worst slum in Boston due to its overcrowding; children playing in its narrow, winding streets; and a seemingly chaotic mix of residential and commercial buildings. Ironically, it was this messy organic nature of the North End that made it vital and safe but misled city planners into thinking it was a slum in need of urban renewal. After World War II, urban planners throughout the United States similarly misperceived older inner-city neighborhoods to be slums. Their best intentions to revitalize these older neighborhoods destroyed many of them. State and federal governments provided billions of dollars to cities to confiscate properties via eminent domain, demolish whole neighborhoods, and replace them with large housing projects, wide parks, and civic centers. The planners also adhered to the mistaken notion that it was important to separate residential and commercial use of property. Urban renewal forcibly evicted thousands of people from close-knit neighborhoods; they were “expropriated and uprooted much as if they were the subjects of a conquering power,” according to Jacobs.8 Today it is widely acknowledged that raze-and-build urban renewal was a disastrous mistake. Low-income projects became drug-infested and far more dangerous than the neighborhoods they replaced. A number of these new housing projects became so hopelessly crime-ridden that they had to be demolished. Highways bisected cities and isolated whole neighborhoods, causing them to decline. Parks and civic centers became dangerous places frequented mostly by vagrants and criminals. The former sense of community and of neighborly relations disappeared along with the eyes on the street that naturally deterred crime. Being one of the oldest cities in the United States, Boston lost nearly one-third of its inner-city neighborhoods to raze-and-build urban renewal. Granted federal money and the power of eminent domain, the Boston Redevelopment Authority (BRA) set out in 1957 to eradicate the city’s “substandard residences, narrow, costly, outmoded streets, and public utilities.”9 The BRA first targeted Boston’s West End, which it mistook for a slum. Like the nearby North End, the West End was a thriving neighborhood densely populated with Italian, Jewish, Polish, Greek, Albanian, and African Americans. The BRA condemned the West End as a slum, razed the neighborhood, and replaced it with middle-income housing, parks, and broad streets. West Enders fought to save their neighborhood, sometimes resorting to violence, but were eventually evicted and forced to relocate. In 2002, the Boston Globe noted, “The old West End was the new agency’s [BRA’s] first target and ultimately its most remembered failure.”10 Eventually, the high-rise buildings became luxury apartments and condominiums, and, ironically, some of the parklands between them have recently been filled with new townhouses not too dissimilar from the ones that the BRA demolished. The tragic raze-and-build era of urban renewal ended in the early 1970s. By then, city planners had adopted much less invasive means for revitalizing downtown areas, such as rehabilitating existing buildings instead of demolishing them. Nevertheless, a number of smaller projects like skywalks and pedestrian walkways have unintentionally made cities less safe by decreasing the number of people on the street. A number of cities introduced skywalks in the 1960s to insulate pedestrians from inclement weather by creating networks of second-story bridges connecting downtown buildings. Planners have subsequently acknowledged that skywalks made cities more dangerous because they decreased street traffic and caused street-level shops to close and property values to decline, all of which put fewer eyes on the streets. Skywalks also created dark areas that harbored vagrants, impaired building facades and city vistas, and segregated people according to income. Des Moines, Iowa, for example, built three miles of skywalks in 1983 for the purpose of saving its inner city. Decades later, however, city planners found that the skywalks had greatly diminished street-level pedestrian traffic, which caused ground-level vacancy rates to soar to 60 percent. Urban planners have also found that the conversion of city streets into pedestrian walkways, free of cars, has backfired. Although intended to
promote pedestrian traffic, these walkways are used only during work hours. After 5:00 p.m. and on weekends, they are empty and provide fewer eyes on the street than if they had some automobile traffic. Shop owners along such walkways have complained about crime and loitering, and cities like Chicago, Tampa, and Eugene, Oregon, have reopened their streets to vehicle traffic. Another example of well-meaning urban planning gone awry is the development of large, full-service projects that have sucked the vitality from adjacent neighborhoods. Baltimore’s popular Harborplace Mall, for example, was built in 1980 to revitalize the city’s barren waterfront with a complex of office buildings, hotels, restaurants, and shops. Although the project greatly improved Baltimore’s waterfront, it virtually destroyed the adjacent Sandtown area, which previously was a thriving low-income neighborhood and one of Baltimore’s safest downtown places. Sandtown lost 120 establishments and became one of Baltimore’s most dangerous areas, with 72 blocks of abandoned or dilapidated homes, 50 percent unemployment, and widespread drug traffic.11 The Renaissance Center in downtown Detroit similarly destroyed adjacent areas. Built to be a city within a city with seven interconnected skyscrapers, a shopping center, hotels, and movie theaters, the Renaissance Center sapped the vitality of surrounding neighborhoods that had already been in a state of decline. Outside of the Renaissance Center, street crime is appallingly high, and downtown Detroit routinely ranks among the most dangerous cities in the United States.
A Peace to End All Peace

When Britain granted its subcontinental colony of India independence on August 15, 1947, it unwittingly created one of the most dangerous conflicts in today’s world. In the process of granting India its independence, Britain acceded to Muslim demands for carving out the Islamic state of Pakistan. The partitioning of the subcontinent unleashed centuries of religious tension between indigenous Hindus and conquering Muslims with waves of genocide, terrorism, and ethnic cleansing. The partitioning needlessly created two mortal enemies, both equipped with nuclear arms, who have been waging continual warfare over the ownership of the northern province of Kashmir. Indian civilization is one of the world’s oldest, dating back to around 3000 b.c.e. in the delta of the Indus River in today’s south-central
Pakistan. India’s primary religion of Hinduism includes thousands of gods and goddesses whose mythical lives are lavishly depicted in temple carvings and sculptures. In sharp contrast, the Muslims who settled in India are strict monotheists who believe that graven images are sinful, pagan idolatry. Muslim influence in India began with early seventh-century trade and accelerated with the Arab invasion beginning in the eighth century and the Mongol invasions first led by Timur in 1398. Timur’s invasion set the stage for the Islamic Mughal dynasties that held power over much of India until the British colonial administration called the Raj took control in 1857. For centuries, the Muslim invaders ruled over India’s Hindu-majority population with vacillating periods of conflict and peaceful coexistence. Some Islamic conquerors, like Timur, massacred Hindus as idolaters, while other Mughal leaders intermarried with Hindus and sought to blend in with the majority population. Over time, the two religious groups shared villages and mixed socially, though they tended to live in separate neighborhoods and harbor mutual mistrust. According to Larry Collins and Dominique Lapierre, coauthors of Freedom at Midnight, the British Raj “had forced its Pax Britannica over the warring subcontinent, but the distrust and suspicion in which the two communities dwelt remained.”12 Immediately following World War II, a beleaguered Britain was no longer willing to forcibly subdue India’s independence movement, with its military mutinies, civil disobedience, and the pro-independence stance of the Royal Indian Armed Forces. Britain’s Prime Minister Attlee appointed Lord Mountbatten to be India’s last viceroy and charged him with winding down the British Raj at midnight on August 15, 1947. Complicating matters was Britain’s decision to carve up its subcontinental colony into the predominantly Hindu Dominion of India and an Islamic state called the Dominion of Pakistan. The idea of creating an Islamic state originated in 1933, when an Indian Muslim named Rahmat Ali first envisioned a new Islamic country called Pakistan, named for the Muslim-majority provinces it would include: Punjab, Kashmir, Sind, the Frontier, and Baluchistan. India’s Muslim League embraced Ali’s plan when it became concerned that, within an independent India, “they would be drowned by Hindu majority rule, condemned to existence of a powerless minority in the land their Mogul forebears had once ruled,” according to Collins and Lapierre.13 Fearing that civil war would break out in the newly independent India and acceding to pressure from the Muslim League, Mountbatten decided to create the Islamic state of Pakistan, which would be officially established the day before the rest of India would be granted its independence.
Mountbatten delegated the details of partitioning Pakistan and India to a patrician English barrister named Sir Cyril Radcliffe and gave him mere weeks to do the job. Radcliffe knew very little about India, and, as he drew border lines around enclaves of Muslims and Hindus, he was well aware that errors in his delineations could have dire consequences for relations between the two new states. Radcliffe’s plan was to create West Pakistan from the heavily Islamic areas in northwestern Punjab and East Pakistan from the northeastern Muslim area of Bengal. To implement Radcliffe’s plan, Mountbatten negotiated with India’s provincial maharajas to get them to cede their dominions to either India or Pakistan. Mountbatten’s failure to deal with Hari Singh, maharaja of Kashmir, has had dire consequences. Mountbatten believed that heavily Muslim Kashmir should become part of Pakistan, and Jinnah, who was head of the Muslim League, assumed that it would. In fact, Jinnah had welcomed Singh to join Pakistan, promising him “an honored place in his new dominion,” even though the maharaja was a Hindu. An indecisive Singh, however, foolishly thought he could rule over Kashmir as a newly created independent country. Mountbatten visited Singh in July 1947 to determine which of the new countries he planned to join and assured the maharaja that India had no objections to his joining Kashmir with Pakistan. Singh told Mountbatten that he had no interest in joining Pakistan, to which Mountbatten replied, “I think you should consider it very carefully, since after all, almost 90 percent of your people are Muslim. But if you don’t, then you must join India.”14 The maharaja stated that he wanted to rule over an independent Kashmir, to which Mountbatten replied, “You’ll lose your throne and your life, too, if you’re not careful.”15 Feigning sickness, Singh refused to meet with Mountbatten the next day to give him his decision, and the fate of Kashmir was left unsettled when India and Pakistan gained their independence in August 1947. Matters came to a head on August 24, 1947, when Jinnah, who had become Pakistan’s first governor general, decided to take a two-week vacation in Kashmir to recuperate from tuberculosis and the exhausting independence negotiations. He sent his British military secretary to make arrangements for the visit. Five days later, the secretary reported that Singh would bar Jinnah from visiting Kashmir, even to vacation. In response, Jinnah sent a spy to determine Singh’s plans for Kashmir; upon returning, the spy reported that Singh had no intention of ever joining Pakistan. Jinnah and fellow Pakistani leaders conspired in a secret plot to take Kashmir by force. They would unleash a band of fearsome Pathan warriors
from Pakistan’s semiautonomous northwest region to invade Kashmir and wrest control of the province. The Pakistani leaders motivated the Pathans with the need to wage a holy war to prevent Hindu India from taking control of their Muslim brothers in Kashmir and with the promise that they could loot the province. On the night of October 22, 1947, the Pathans and their Pakistani overseers crossed into Kashmir unopposed. Singh’s private army had abandoned him and left unguarded the 135 miles of paved road that led to Singh’s palace in Srinagar. The Pakistanis leading the Pathan invaders expected to reach Srinagar and force Singh’s abdication within a few hours, at daybreak. Soon after the Pathans crossed into Kashmir, however, they became preoccupied with looting a nearby market, and it took the warriors two days to travel 75 miles to blow up a power station on October 24. The resulting power outage cast Srinagar and Singh’s palace into darkness while he was celebrating a Hindu festival, prompting the maharaja to flee for his life. Singh found refuge in another palace and sought military assistance from India to fend off the Pathan invaders. India replied that it had an agreement with Pakistan not to intervene in independent Kashmir and could only do so if Kashmir were to become part of India. V. P. Menon, an Indian civil servant assigned to collect accessions from former maharajas, flew to Kashmir to meet Singh on October 26 and returned to Delhi the same day with his signed accession giving Kashmir to India. Showing the document to a British commissioner, he said, “Here it is. We have Kashmir. The bastard signed the Act of Accession. And now that we’ve got it, we’ll never let it go.”16 Considering itself the legal owner of Kashmir, India immediately flew troops to Srinagar the next day, on October 27. By then, the Pathans were lagging five days behind Pakistan’s plan to invade Srinagar. They had paused once again, just 30 miles from Singh’s palace, to rape and pillage a local convent. When the Pathans finally arrived in Srinagar, they encountered Indian troops, which drove them out of Kashmir. To this day, Pakistan refuses to recognize India’s legal possession of Kashmir, because it believes that Singh had no authority to cede it to India and only did so under duress. One of the most dangerous situations in today’s world is the unintended consequence of Mountbatten’s failure to assign Muslim Kashmir to Pakistan before Britain left its former colony on August 15, 1947. India and Pakistan have gone to war over Kashmir in 1947, 1965, and 1999 and have been in a state of perpetual conflict ever since partition. Indeed, the clash over Kashmir is the only situation in the world that could easily trigger a nuclear war between two atomic powers. Moreover, Islamic extremists see India’s
possession of Kashmir as a cause célèbre for conducting a holy war to rescue their Muslim brothers from their infidel masters. Of special concern are Islamic extremists operating in Kashmir who have ties to al-Qaeda terrorists. U.S. secretary of defense Robert Gates stated, while on a visit to Pakistan in January 2010, that these Kashmiri extremists were seeking to provoke a nuclear war between India and Pakistan. There are many instances like India’s, where planners have unintentionally breached the peace of settled societies by arbitrarily redrawing country borders. In his book, A Peace to End All Peace, David Fromkin describes, for example, how the carving up of the Ottoman Empire by Britain and France in the aftermath of World War I created a number of artificial countries that cut across ancient tribal societies and has caused decades of conflict.17 The British cobbled together the modern state of Iraq, for example, by joining Kurds with two Arab sects—Sunnis and Shiites—that have been locked in a bloody religious conflict with each other since the death of Mohammed in 632 c.e. It was only the iron rule of Iraqi dictator Saddam Hussein that kept these rival groups at bay. His removal by U.S. forces in 2003 unleashed the ethnic tensions among these groups that threaten Iraq’s viability and the stability of the region today. Like settled societies, economies are self-sustaining systems, and, despite decades of scholarly research, it remains highly controversial whether meddling with them does more harm than good.
The Late Great Stimulus

Congress passed the American Recovery and Reinvestment Act of 2009 to mitigate the recession that began in the fourth quarter of 2007. Much of the $356 billion in domestic spending provided for in the act, however, might ironically come into effect after the economy has already recovered and could unintentionally become inflationary. A recession occurs when consumers and businesses spend too little to support the economy’s production capacity, thereby creating unemployment. According to the National Bureau of Economic Research’s Business Cycle Dating Committee, a recession is a “significant decline in economic activity spread across the economy, lasting more than a few months, normally visible in real GDP [gross domestic product], real income, employment, industrial production, and wholesale-retail sales.”18 Figure 9.1 shows the National Bureau of Economic Research’s data on recessions over the past 156 years, indicating that they have become a little less frequent and considerably shorter lived. Recessions over the past 50 years have lasted only 10 months before the economy turned around
FIGURE 9.1. Profile of the 32 Business Cycles in the U.S. Economy from 1854 to 2001. The figure’s two panels chart, for the periods 1854-1919, 1919-1945, and 1945-2001: “Recessions Have Become Less Frequent” (average number of months between cycles) and “Recessions Have Become Shorter” (number of months from peak to trough). Source: National Bureau of Economic Research Cycle Dating Committee, Cambridge, MA, December 1, 2008.
and began growing again, whereas it took a bit longer than 20 months before recovery began for recessions occurring during the prior 100 years. Economies, like ecosystems, are complex adaptive systems with balancing forces that come into play to eventually self-rectify recessions. The fact that economies can recover from recessions on their own is self-evident from the fact that they have done so since recorded history, long before there were economists proposing government policies to stabilize business cycles. It was not until 1936, for example, that British economist John Maynard Keynes advocated government intervention in economic matters in his famous book, The General Theory of Employment, Interest, and Money. The Congressional Research Service, the nonpartisan research arm of Congress, noted in a 2009 policy paper that “recessions generally are short-term in nature—eventually, markets adjust and bring spending and output back in line, even in the absence of policy intervention.”19 The issue of economies being self-correcting has given rise to heated political debate over whether governments should intervene to stabilize fluctuations in business cycles or let the economy heal itself. Strong advocates of laissez-faire policy, which broadly means to leave it alone, believe that government intervention makes matters worse and creates dire unintended consequences. A good example of this laissez-faire perspective was written by Bruce Bartlett, a former U.S. Treasury Department official, in a popular 1992 Wall Street Journal article titled “If It Ain’t Broke, Don’t Fix It.” Bartlett states that nearly every U.S. stimulus spending effort to mitigate recessions has unintentionally “exacerbated inflation, raised interest rates and made the next recession worse.”20 Conversely, there are those who strongly believe that government policy can and should intervene to mitigate recessions, though they frequently argue over how to do so. Decades after his death in 1946, Keynes remains an influential proponent of government intervention, and numerous economists and politicians routinely refer to themselves as Keynesians. Although Keynes recognized that economies eventually self-recover from recessions, he believed that governments should intervene in the short run to alleviate economic hardship because, as he famously said, “in the long run we are all dead.”21 The U.S. government, in fact, has been politically pressured to intervene in every economic decline since the Great Depression by variously stimulating the economy with tax decreases and deficit spending and by encouraging the Federal Reserve to increase the money supply. Of the various ways to stimulate an economy, deficit spending has the highest potential for coming into effect too late, with the unintended consequence of causing inflation due to the inherent time lags involved. The Congressional Research Service noted:

When the economy is in a recession, fiscal stimulus could mitigate the decline in GDP growth by bringing idle labor and capital resources back into use. When the economy is already robust, a boost in spending could be largely inflationary—since there would be no idle resources to bring back into production when spending is boosted, the boost would instead bid up the prices of those resources, eventually causing all prices to rise.22
The first timing problem is that economists cannot reliably predict the onset of recessions or when the economy will recover. The official start dates of recessions, for example, are usually established months or quarters after the fact. My own analysis of the forecast accuracy of two of the most influential groups of economists—the Council of Economic Advisors and the Congressional Budget Office—indicated that they are no better than chance at predicting recessions one year into the future.23 Figure 9.2 illustrates the inaccuracy of economic predictions for the severe downturn in the economy that occurred during the first quarter of 2009. In September 2008, for example, IHS Global Insight, which claims to be the “most consistently accurate economic forecasting firm in the world,” predicted that the U.S. economy would grow by 1 percent in the first quarter of 2009. Instead, the economy experienced a severe, record decline of 6 percent.24 The process of governmental approval for enacting stimulus spending can add months and even quarters to the time delays. The authorization of stimulus spending requires both chambers of Congress to separately pass legislation, then reconcile any differences they have, and, finally, have the president sign the finished document. As shown in Table 9.1, the recessions that occurred from 1945 to 2001 lasted only about 10 months before the economy began to recover, which is a very short period relative to the inherent delays in government stimulus spending. In fact, Table 9.1 shows that all but one of the stimulus spending bills enacted to rectify the recessions that have occurred since 1948 were approved too late. Finally, it can take years before the money actually is spent on various projects and even longer for the economy to react to it. The deficit spending portion of the Recovery Act of 2009, signed by President Obama on February 17, 2009, was highly controversial, with no House Republicans and only three Senate Republicans voting for it. Before it was enacted, 200 economists purchased ads in the New York Times and the Wall
FIGURE 9.2. Gross Domestic Product Growth Forecasts for First Quarter 2009 Made at Different Dates in the Final Months of 2008. The figure plots the forecast percent change in real GDP for the first quarter of 2009, as predicted each month from September 2008 through March 2009, against the actual growth of about −6 percent. Source: IHS Global Insight, http://www.ihsglobalinsight.com.
TABLE 9.1. Economic Stimulus Legislation Was Enacted Only Once Before the Last Eight Recessions Ended

Beginning of Recession    Number of Months after Recession Ended That Stimulus Was Enacted
November 1948             0
August 1957               0
April 1960                3
December 1969             8
November 1973             0
July 1981                 3
July 1990                 8
March 2001                −5

Source: Congressional Research Service, Economic Stimulus: Issues and Policies, Washington, D.C., December 9, 2009.
Street Journal on January 28, 2009, to argue against its passage, while 200 other economists signed a petition on February 8, 2009, in favor of it. The Recovery Act consisted of $787 billion in economic stimulus in the form of tax cuts, unemployment relief, and domestic spending, the first two of which have relatively quick impacts on the economy. Some portion of the $356 billion in stimulus spending might come too late. The Congressional Budget Office, which provides impartial analyses for the House and Senate Budget Committees, estimated that only 40 percent of the domestic spending actually gets paid out in 2009 and 2010, as shown in Table 9.2. The Congressional Research Service noted, “By historical standards, the recession would be expected to end before the stimulus could be delivered, but forecasters are predicting this recession will be longer than usual.”25 As shown in Figure 9.3, the recession officially began in the fourth quarter of 2007. The Recovery Act was not enacted until the first quarter of 2009, five quarters later. As the graph indicates, the legislation was enacted the same quarter that economic growth commenced, though, as of this writing, whether the recession that began in 2007 actually ended in January 2009 will remain unknown for some time. If, in fact, the recession is officially declared to have ended in January 2009, it will have done so just when the stimulus package was enacted, as has happened for all but one recession since World War II.
TABLE 9.2. The Estimated Timing of Domestic Spending Outlays from the Recovery Act of 2009

Year of Outlay    Amount (in Billions of Dollars)    Percent of Outlays
2009              29.0                               8.0
2010              115.8                              32.5
2011              105.5                              29.6
2012              53.6                               15.1
2013              25.6                               7.4
2014              13.0                               3.7
2015              6.9                                1.9
2016              3.0                                0.8
2017              1.6                                0.5
2018              0.9                                0.3
2019              0.4                                0.1
Source: Congressional Budget Office, Cost Estimates for the American Recovery and Reinvestment Act of 2009, Washington, D.C., January 26, 2009.
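As a quick arithmetic check on the 40 percent figure cited above, the short Python sketch below accumulates the CBO outlay estimates from Table 9.2. The dollar amounts come straight from the table; the variable names and rounding are, of course, only illustrative.

# CBO-estimated domestic spending outlays from the Recovery Act of 2009,
# in billions of dollars (the figures in Table 9.2).
outlays = {
    2009: 29.0, 2010: 115.8, 2011: 105.5, 2012: 53.6, 2013: 25.6,
    2014: 13.0, 2015: 6.9, 2016: 3.0, 2017: 1.6, 2018: 0.9, 2019: 0.4,
}
total = sum(outlays.values())  # about $355 billion in all

paid_so_far = 0.0
for year in sorted(outlays):
    paid_so_far += outlays[year]
    print(f"Through {year}: {100 * paid_so_far / total:.1f}% of outlays paid")

# Through 2010 the cumulative share is roughly 41 percent, matching the
# "only 40 percent paid out in 2009 and 2010" figure cited in the text.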
FIGURE 9.3. Time Lags in Fiscal Policy (Growth Rates of Real Gross Domestic Product). The figure charts the percent growth rate of real GDP by quarter from 2007 through 2010, marking the official start of the recession (fourth quarter of 2007), the enactment of the stimulus (first quarter of 2009), and the estimated rate of stimulus spending (8 percent in 2009, 32 percent in 2010). Source: Recession dates from the National Bureau of Economic Research Cycle Dating Committee, Cambridge, MA, December 1, 2008; gross domestic product data from the Bureau of Economic Analysis, Gross Domestic Product 2010 3rd Quarter, Washington, D.C.; spending rate from the Congressional Budget Office, Budget Review, Washington, D.C., June 2010.
The possible unintended consequence of the domestic spending part of the Recovery Act of 2009 is that it might in hindsight prove to be inflationary.
The Dutch Disease Paradox

Providing economic aid to the least developed countries in the world can have the unintended consequence of weakening their economies and inhibiting their ability to lift themselves from poverty. Development economists are troubled by the fact that the $2.74 trillion spent on foreign aid from 1970 to 2008 has had little impact on alleviating poverty. Paul Collier, an economics professor at Oxford University, analyzed the growth rates of the countries that are home to the poorest billion people on Earth and that received the trillions in aid. He concluded that, over this period, the countries had not grown at all and that, by 2000, “they were poorer than they had been in 1970.”26 This enigmatic finding exacerbates the persistent controversy regarding the efficacy of economic aid, in which liberals believe that wealthy, developed nations are obligated to aid impoverished ones, while conservatives believe that it only makes recipients dependent on aid that usually ends up in dictators’ Swiss bank accounts to fund their wars, coups, and lavish lifestyles. “The left seems to want to regard aid as some sort of reparations for colonialism. . . . The right seems to want to equate aid with welfare scrounging . . . rewarding the feckless and so accentuating the problem,” according to Collier.27 A number of economists devoted to fighting poverty are convinced that just pouring money into an impoverished country is ineffective and may even harm its economy. Collier, who has devoted his entire professional life to aiding poor African countries, for example, has concluded that “aid alone is really unlikely, in my view, to be able to address the problems of the bottom billion.”28 Raghuram Rajan, a professor at the University of Chicago and chief economist at the International Monetary Fund, has stated: “One point about which there is general agreement among economists is that there is little evidence of a robust unconditional effect of aid on growth.”29 In a paper titled “What Undermines Aid’s Impact on Growth?” Rajan and Professor Arvind Subramanian of Johns Hopkins University found that “economic aid doesn’t have a robust positive correlation with long-run growth.” Furthermore, they found that good governance on the part of the aid recipient was necessary but insufficient for aid to spur economic growth. They pondered a number of significant questions: “Why is it so hard to find a robust effect of aid on the long-term growth of poor countries, even those
with good policies? . . . Why do countries with better policies and governance not seem to use aid any better?”30 Collier, Rajan, and Subramanian theorized that a major reason for the ineffectiveness of economic aid is Dutch Disease: a phenomenon whereby large, sustained inflows of aid money weaken the aid recipient’s economy. Rajan and Subramanian’s research indicated that “aid inflows have systematic adverse effects on a country’s competitiveness.”31 According to the theory, aid weakens a recipient country’s export manufacturing industries, which are vital to the country’s economic growth because manufacturing industries are labor intensive and help reduce unemployment. Export industries also generate the foreign currency needed to purchase imported goods that help fuel economic growth. Dutch Disease, named after the stunting of Holland’s economy that followed its natural gas windfall, works in two ways to weaken the aid recipient’s export manufacturing industries. First, a large and sustained inflow of aid money inflates the prices of limited resources, such as the wages of skilled workers, which makes the country’s goods manufactured for export uncompetitive on world markets. Second, the aid inflow raises the value of the country’s currency relative to that of other countries, making the country’s export products even less competitive. These two forces combine to cause the country’s export manufacturing to decline, thereby increasing unemployment, decreasing exports and access to foreign currency, and making the country more dependent on foreign aid. Rajan and Subramanian examined countries that received substantial aid during the 1980s and 1990s and concluded that “aid inflows do alter the growth trajectory of a country—moving it away from export-oriented labor intensive manufacturing.”32 In another article, Rajan noted, “We know that the law of unintended consequences is always at work. This means that few programs ever operate as the designers intended, and that ever present Dutch Disease can be mitigated through sensible policies. But to do so, one must first acknowledge its existence and its pernicious effects.”33 He advised that, instead of providing large inflows of unconditional aid, donors should experiment with micro-interventions, monitor and evaluate results, and share best practices that promote the aid recipient’s export manufacturing industries. Collier suggests, for example, that aid for improving infrastructure like ports can help a poor country’s export industries and can be scaled back when the project is finished in order to avoid Dutch Disease.34 Economists generally agree that humanitarian aid, as distinguished from aid for economic development, is vitally important and less subject to Dutch Disease. Nevertheless, they caution that donors need to make sure
Rajan and Subramanian examined countries that received substantial aid during the 1980s and 1990s and concluded that "aid inflows do alter the growth trajectory of a country—moving it away from export-oriented labor intensive manufacturing."32 In another article, Rajan noted, "We know that the law of unintended consequences is always at work. This means that few programs ever operate as the designers intended, and that ever-present Dutch Disease can be mitigated through sensible policies. But to do so, one must first acknowledge its existence and its pernicious effects."33 He advised that, instead of providing large inflows of unconditional aid, donors should experiment with micro-interventions, monitor and evaluate results, and share best practices that promote the aid recipient's export manufacturing industries. Collier suggests, for example, that aid for improving infrastructure like ports can help a poor country's export industries and can be scaled back when the project is finished in order to avoid Dutch Disease.34 Economists generally agree that humanitarian aid, as distinguished from aid for economic development, is vitally important and less subject to Dutch Disease. Nevertheless, they caution that donors need to make sure they do not disrupt local industries or sustain aid so long that Dutch Disease comes into play. This occurred, for example, when humanitarian aid was provided to Haiti in the wake of the disastrous January 2010 earthquake that killed nearly 230,000 people and left 1 million homeless. The U.S. Agency for International Development donated 7,000 tons of rice just as Haiti's locally grown crop was coming to market, which made it hard for Haitian farmers to sell their produce while rice was being handed out for free. Haiti's president, René Préval, wants future assistance to deemphasize food aid in favor of aid that restores the nation's crippled agriculture.35 In a paper titled "Food Aid's Intended and Unintended Consequences," economist Christopher Barrett of Cornell University cautions that food aid can harm local agriculture by depressing local food prices and making it hard for local farmers to stay in business.36 He found that sustained food aid was especially harmful to local agriculture because it depressed prices for much longer. He advised that donors should focus on emergency aid that does not last long enough to hurt local agriculture.
Lessons from Breaching the Peace Among the most disastrous unintended consequences wrought by the developed world in the past 100 years has been thoughtless meddling with nature's and societies' complex systems. Ecosystems have been and continue to be destroyed not so much by manmade disasters like BP's oil spill in the Gulf of Mexico in April 2010 as by well-intended efforts to improve nature that have backfired by introducing seemingly helpful alien species like Asian carp, gypsy moths, killer bees, and thousands of other foreign plants and animals. Urban renewal efforts have razed thriving city neighborhoods that were mistakenly seen as slums and destroyed a number of America's older cities. The thoughtless creation of arbitrary countries like Pakistan, Afghanistan, and Iraq that cut across ancient tribal lands has created, in David Fromkin's words, "a peace to end all peace." Fooled by our advanced knowledge and technology, we have arrogantly thought that we can improve the world by intervening in its natural and social systems, when in fact many of our efforts to do so have failed miserably and with dire consequences. The quandary is whether and how to intervene. One of the few easy answers is to minimize human intervention in ecosystems by not introducing alien species, even with the best intentions of improving nature. I can think of no case where such an introduction was helpful and many that have been disastrous. We even need to be more thoughtful about subtle interventions in
nature, like preventing forest fires in ways that unintentionally cause more serious fires in the future. Beyond this, the issue of intervention becomes less clear. It would appear unwise, for example, to stop introducing genetically altered plants and animals that deliver higher yields, resist disease, and help feed a hungry world. Yet the repercussions of such high-tech marvels remain unknown; there is speculation, for instance, that a single disease could wipe out an entire population of genetically identical plants or animals. Our record of meddling with social systems has been equally poor, and the quandary remains whether and how to intervene to improve matters, which is the subject of the next chapter.
Chapter 10
Thinking through the Maze Failure This book is about the prevalence of unintended consequences arising from decisions and initiatives that failed to achieve their intended goals. An InfoWorld article provocatively claimed that calling failed initiatives unintended consequences was management-speak for "you didn't bother to think them through."1 In other words, unintended consequences are actually failures caused by ignorance, error in analysis, or overzealousness, which is what sociologist Robert Merton asserted in his famous 1936 article, "The Unanticipated Consequences of Purposive Social Action." The thesis of this book, however, is that unintended consequences are primarily caused by eight social mechanisms that complicate the process of thinking through decisions in advance. As described in the preceding eight chapters, complex social systems contain many elements—people, organizations, and institutions—that interact in ways that make their behavior hard to anticipate. Small events can give rise to huge outcomes. People rapidly adapt to changes in their environment to defeat our best intentions. We unintentionally create conditions that spawn phenomena that take on lives of their own. Furthermore, the complexity of social systems has kept the social sciences from developing scientific theories to help make reliable predictions about how events will unfold, as is done in the hard sciences of physics and chemistry. In his book Consilience, Harvard biologist Edward O. Wilson noted that "the social sciences are hyper complex. They are inherently far more difficult than physics and chemistry, and as a result they, not physics and chemistry, should be called the hard sciences."2 Wilson concluded that the social sciences "lack what can be called a true scientific theory . . . [that is] . . . precise in the predictions they make across many phenomena."3 The failure of experts
to make reliable predictions, documented in my earlier book The Fortune Sellers, supports Wilson's assertion. Economists' inability, for instance, to reliably predict turning points in the economy constrains their ability to develop sound policy recommendations.4 Furthermore, the many failed initiatives documented in this book and in other sources are humbling. For example, studies conducted by Harvard Business School professor John Kotter and the consultancies McKinsey and Bain indicate that the failure rate in attempting to introduce changes to organizations is about 70 percent. The McKinsey study, titled "The Inconvenient Truth about Change Management," is especially authoritative with its large sample of 1,546 firms.5 Although these studies focused on businesses, there is no reason to believe that government agencies, associations, and other nonbusinesses would fare any better in introducing changes to their organizations. Studies have similarly found that the failure rate in implementing information technology (IT) is in the 70 percent range.6 The British research group Organizational Aspects of Information Technology studied 14,000 organizations, for example, and found that 70 percent of IT projects had "failed in some way." A KPMG study of 1,450 companies found that 61 percent of IT projects failed. A Standish Group study of 8,380 IT projects found that 84 percent had failed to meet their deadlines or budget goals and that 31 percent of the projects were cancelled midstream. The failure rate of new businesses is similarly high. According to a 2009 Small Business Administration report, one-third of new businesses fail within two years, and half fail within five years. Scott Shane, author of Illusions of Entrepreneurship, found that 70 percent of new businesses fail within 10 years of their founding.7
Our Simplicity-Seeking Minds Psychologists have found that our decision-making ability is inhibited by inherent mental blind spots in dealing with complex social systems. Amos Tversky, formerly of Stanford University, and Daniel Kahneman, of Princeton University, made ground-breaking discoveries about these blind spots, which they documented in a popular 1974 article titled "Judgment under Uncertainty: Heuristics and Biases." Kahneman won a Nobel Prize in 2002 for his and Tversky's discovery of the systematic decision rules and biases that cause errors in human decision making; Tversky died in 1996 and thus was ineligible for the award. Tversky and Kahneman found that our minds have a number of flaws that can lead to erroneous decision making, especially in dealing with complex social systems. We oversimplify complex situations. We envision familiar
events as being more likely to occur than unfamiliar ones. We overestimate the likelihood of success in pursuing projects. We misunderstand the probability of events happening and infer patterns from randomness. Our minds seek to simplify matters by generalizing and stereotyping, which causes us to make snap decisions on faulty information. People often infer, for instance, a tennis player's ability from the skills of the people with whom he or she plays. This oversimplification can lead to invalid conclusions, because a stronger player might frequently play with less skilled partners for social reasons. Tversky and Kahneman concluded that this simplifying process can lead to erroneous decision making: "Many decisions are based on beliefs concerning the likelihood of uncertain events . . . people rely on a limited number of heuristic principles [decision rules] which reduce the complex tasks of assessing probabilities and prediction values to simpler judgmental operations . . . [which can] lead to severe and systematic errors."8 We have difficulty anticipating outcomes with which we are unfamiliar because we assign a lower probability to their occurring. Given the unpredictable nature of our complex social systems, we thus often fail to anticipate the unintended consequences of our deeds because we are less familiar with them—we might not even recognize unintended outcomes at the outset. We envision future events as extensions of our current surroundings, which prevents us from anticipating unusual outcomes. We envisioned future flying machines that resembled birds, automobiles as horseless carriages, and the radio as wireless. Prior to the September 2001 al-Qaeda terrorist strikes in New York and Washington, DC, it would have been hard to envision Islamic terrorism threatening the Western world. We are overconfident in undertaking initiatives because we underestimate the probability of failure, which may cause us to pursue goals with too little caution and preparation. In part, we are overconfident because we overestimate the likelihood of a project's success by incorrectly inferring it from the likelihood of success of its components. If the components seem likely to succeed, we believe that the project will succeed. We fail to see, however, that it often takes only one component to fail for the overall project to fail and that the probability that at least one component fails is quite high. For example, we infer that a project with 10 components, each with a 95 percent chance of succeeding, would have a high probability of success, when, in fact, the probability of all 10 components working is only 60 percent.
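The arithmetic behind this example is simple compounding of independent probabilities. A few lines of Python, offered as my own illustration rather than anything from Tversky and Kahneman, verify the figure:

    # Probability that a project succeeds when every one of its independent
    # components must work: p_project = p_component ** n.
    p_component = 0.95
    n_components = 10

    p_project = p_component ** n_components
    print(f"P(all {n_components} components work) = {p_project:.3f}")  # 0.599
    print(f"P(at least one failure) = {1 - p_project:.3f}")            # 0.401

Ten components that each look like near-certain bets still leave the project with roughly a 40 percent chance of failure, which is why intuitions built up from the individual components mislead.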
We leap to conclusions, misjudging the true probabilities of events. A classic example of this blind spot is the gambler's fallacy. After a long run of red outcomes on a roulette wheel, gamblers put their chips on black because they erroneously believe that the odds of a black result are much higher after the long run of reds. In fact, the probability of a black result is the same on every spin (just under 50 percent, since the green zero gives the house its edge) and has nothing to do with prior spins of the wheel. Similarly, we tend to believe that, after flipping a coin and getting heads five times in a row, the odds of getting tails have become much higher than the odds of getting another head. Our faulty minds believe that a tails result is long overdue and that the odds of getting heads six times in a row seem remote, when, in fact, the odds of getting a sixth head are still 50 percent regardless of prior results. We see patterns and make erroneous inferences from random events. For example, if events X and Y occur simultaneously due to chance, we tend to believe that X caused Y to happen or vice versa. This is not just a layman's problem. Academic researchers continually face the problem of spurious correlations: mathematical relationships between events that have no causal connection. Dutch statisticians found, for example, a mathematical relationship between the number of storks nesting in Holland and the number of babies born at the same time, which obviously was a relationship with no causal connection. The problem is serious enough to have prompted U.S. and European social scientists to create the Journal of Spurious Correlations.
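The stork-and-baby trap is easy to reproduce. In the short simulation below (my own illustrative sketch, not the Dutch statisticians' data), two series are generated independently, but both drift upward over time, as stork counts and birth counts might in a growing country, and they end up strongly correlated:

    import random
    random.seed(1)

    years = range(50)
    # Both series rise with time for unrelated reasons, plus independent noise.
    storks = [50 + 2 * t + random.gauss(0, 10) for t in years]
    babies = [1000 + 20 * t + random.gauss(0, 100) for t in years]

    def correlation(xs, ys):
        # Pearson correlation coefficient, computed from scratch.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var_x = sum((x - mx) ** 2 for x in xs)
        var_y = sum((y - my) ** 2 for y in ys)
        return cov / (var_x * var_y) ** 0.5

    print(f"correlation: {correlation(storks, babies):.2f}")
    # Prints a strong positive correlation even though neither series
    # influences the other; time is the hidden confounder.

A strong correlation emerges from two series that never interact, which is exactly how the storks and babies fooled the unwary.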
The Art of Meddling At last we come to the concluding question in this book. Given that our living world consists of tricky, complex systems that our faulty minds have difficulty dealing with, what can and should we do with respect to intervening in our organizations, economies, and other social systems? The late Lewis Thomas, a famous physician and award-winning author, suggested in his essay "On Meddling" that the best way to avoid unintended consequences is to stop interfering with complex systems—both biological and social ones. "Intervening is a way of causing trouble . . . the safest course seems to be to stand by and wring hands, but not to touch," he counseled.9 Thomas's advice is similar to that of extreme proponents of laissez-faire government like Milton Friedman, who have suggested that the best way to avoid unintended consequences is for governments to do as little as possible. Thomas went on to identify "interveners" who meddle with life's complex systems as the root cause of those systems' problems. He said that germs are interveners in our bodies and that removing them restores our health. He similarly blamed problems in our social world on human interveners whose "efforts to be helpful . . . may have caused things to have gone wrong." As an example, he proposed the following remedy to improve the health of cities like New York: "I do not know who the [interveners] of New York City may be, but it seems to me a modest enough proposal that they be looked for,
identified, and then neatly lifted out. Without them and their intervening, the system will work nicely. Not perfectly, perhaps, but livably enough.”10 Thomas’s advice on removing interveners has, in fact, proven to be sound in dealing with the biological world. The banning of the pesticide DDT helped the bald eagle make an impressive comeback. The removal of dams from rivers enabled the return of spawning salmon. The banning of chlorofluorocarbons has enabled the thinning ozone layer to show signs of recovery. When it comes to our social world, however, meddle we must, for without social intervention our world would be an undesirable place to live. It would be heavily polluted and riddled with chaotic infrastructure and dangerous workplaces. Social intervention has undoubtedly accomplished much in the way of public health, civil rights, fair voting, a national highway system, and many other things that make our world a better place than if we had just let matters evolve on their own. Furthermore, when left alone, organizations stagnate and become ineffective, and continual small changes and periodic larger ones are needed to sustain them. If meddle we must, then how should we do so in a manner that minimizes the tyranny of unintended consequences? The following six-step process is a proposal for intervening in complex social systems that I developed by synthesizing key points from the preceding chapters and pertinent ideas from change management, development economics, strategic planning, urban planning, complex systems, and other fields.
Step 1: Avoid Rushing the Big Bang The biggest mistake one can make is to rush overconfidently into a major initiative. Major initiatives are inherently riddled with unintended consequences—mostly very bad ones. Examples of these initiatives include wars, large social programs, major organizational change, corporate mergers, untargeted economic aid, large urban renewal projects, and the imposition of political ideology, whether it be communism or Western-style democracy. Rushing into large projects with too little information, forethought, and planning is a recipe for disaster, as in the case of the 2003 invasion of Iraq. The United States rushed into the invasion lacking knowledge about Iraq's weapons of mass destruction, religions, culture, society, and political system. The invasion also commenced without a plan to secure the peace once Iraq's army was defeated and without an exit strategy to leave the country. The ill-conceived war has spawned a number of significant deleterious unintended consequences that may persist for decades.
Step 2: Adopt a Humble Frame of Mind The way to begin intervening in complex social systems is to adopt a humble perspective by understanding the nature of the challenge and the uncertainty of success. Urban analyst Aaron Renn advised that one should approach such projects with "humility and rich understanding of the limits of what we can accomplish."11 There are no scientific theories, road maps, complete descriptions of the system, or valid precedents to copy in undertaking an initiative. Lewis Thomas cautioned, "When you are confronted by any complex social system, such as an urban center or a hamster, with things about it that you're dissatisfied and anxious to fix, you cannot just step in and set about fixing with much hope of helping. This realization is one of the sore discouragements of our century."12 Particularly humbling are the massive failure rates of the endeavors previously mentioned and the fact that failure might be more the norm than success. Renn noted that "it never ceases to amaze me that after all the failures of the past, all the unintended consequences, people are still ready to attempt to radically remake our cities on the basis of fairly simplistic policy approaches."13 One should recognize that intervening in a complex system will unavoidably yield unintended consequences and that the best one can hope for is that the benefit of intended outcomes outweighs the downside of unplanned side effects. Other psychologists in addition to Amos Tversky and Daniel Kahneman have found that overconfidence is a serious problem in undertaking projects. J. Edward Russo from Cornell University and Paul Schoemaker from the University of Pennsylvania stated in a 1992 article, "Managing Overconfidence," that "overconfidence has remained a hidden flaw in managerial decision making."14 At the outset, it is useful to recognize that we tend to overestimate the likelihood of success of initiatives. Russo and Schoemaker have proposed a number of ways to address excess optimism, starting by identifying what is known and unknown about a complex system and the nature of the change one seeks to make to it. Russo and Schoemaker said that not knowing what you do not know is especially dangerous, reflecting a line from Confucius: "Real knowledge is to know the extent of one's ignorance." They next proposed that people can be trained to be more proficient in assessing the likelihood of a project's success by providing continual feedback on actual versus projected progress. Another tool they suggest is counterargumentation, a process whereby you ask yourself, or enlist others to explore, how your initial assumptions and beliefs might be wrong. Make sure you talk to people with different experiences, backgrounds, perspectives, and interests.
Step 3: Develop a Deep Understanding of the System Before developing a specific action plan, make sure you fully understand the system you are seeking to change by filling in the unknowns identified in the prior step. Thomas cautioned that "if you want to fix something you are first obliged to understand, in detail the whole system."15 Think of the system as an ecosystem with many different players. Cast a large net to identify all the existing and potential stakeholders and assess their current and evolving roles. Identify their motives and how they might react to your proposed endeavor. Use chapter 7, "Perverse Adaptations," as a checklist to identify potential unexpected behaviors. In their popular book Nudge, Richard Thaler from the University of Chicago and Cass Sunstein from Harvard University suggest that one should ask four questions: "Who uses? Who chooses? Who pays? Who profits?"16 Determine how these players are interlinked within the system, and identify their factions and sources of power. Analyze the history of the system, its trends, and prior efforts to change it. Avoid the human tendency to oversimplify the system. As Tversky and Kahneman found, we inherently seek to reduce the number of variables and factors in analyzing complex systems. Thaler and Sunstein provided good advice on this matter: "Small and apparently insignificant details can have major impacts on people's behavior. A good rule of thumb is to assume that 'everything matters.'"17 Avoid trying to solve parts of the problem in isolation without considering the broader perspective of the entire system. As Renn noted, we tend to view problems "in isolation and develop problem-specific policies. But most problems are linked in a complex system."18
Step 4: Draft Plans Specifically Suited to the System Avoid a one-size-fits-all approach in developing the plan. Consider how the system differs from others that are seemingly similar, recognizing that no two cities, organizations, countries, or cultures are ever exactly alike and that the same plan will not work in every case. Renn noted, "It is the nature of government to promote uniform laws and policies—which is a good thing in many contexts. But it would be a disastrous thing from an urban policy perspective because our cities are so diverse. They don't all need the same thing. They need different things."19 A particularly paradoxical example of the one-size-fits-all mentality is what Thaler and Sunstein call the "just maximize choices" mantra, which is the mistaken belief that all initiatives are improved by offering people as many choices as possible and letting citizens choose the one that they like best.20 Thaler and Sunstein documented a number of cases where providing too much choice had the unintended consequence of confusing people and
causing them to make very poor decisions. When Sweden privatized its social security system, for example, it provided its citizens with more than 1,000 investment options. Overwhelmed by so many options, Swedes violated the basic tenets of financial planning and made very poor investment decisions, which resulted in substandard returns. Recognize that initiatives that worked in the past may no longer be effective. Conversely, programs that previously failed might now be viable; perhaps they were poorly implemented or the timing was wrong. The worst excuse for dismissing an idea is to claim "we tried that before and it didn't work." Formulaic solutions and dogmatic ideas are recipes for disaster. Renn noted that "we need to be very skeptical of dogma and silver bullet solutions."21 The failed attempts by the U.S. government to promote—and, in some instances, impose—Western-style democracy throughout the world are a good example. The United States could have worked with Afghans, for instance, to build a democratic model based on their centuries-old Loya Jirga, which in Pashto means "great council." Tribal leaders from the Pashtun, Tajik, Hazara, and Uzbek tribes have traditionally assembled for the Loya Jirga to select kings, amend constitutions, and settle tribal disputes. The Loya Jirga has long been successful in Afghanistan because it utilizes the tribal leaders who effectively run the country. In contrast, Western-style democracies are based on citizens voting directly for political officeholders, thereby limiting the role of tribal leaders in the governing process. The collapse of Argentina's economy in 2002 is another example of the failure of imposing a single model on countries with diverse economies. According to economist Robert Kuttner, "The economic collapse of Argentina is the latest failure of the one-size-fits-all model that the United States tries to impose on developing countries."22 The economic model promoted by the United States and enforced by the International Monetary Fund (IMF) as a condition of obtaining its loans involves opening markets to foreign trade, limiting government, and balancing budgets in order to become globally competitive and attract foreign investment. Kuttner noted that other countries that had followed the U.S. model encountered severe economic troubles when "[too] much foreign capital poured in, and when the bubble burst, it poured right out again. The IMF then came in to shoot the wounded."23 Argentina followed the U.S. market model closely by widely opening its economy to foreign trade and investment. The foreign investment that flowed into Argentina's economy caused its currency to become overvalued, which made its exports uncompetitive in global markets. Faced with Argentina's economic decline, the IMF imposed austerity measures that exacerbated the country's economic problems. Nobel Prize–winning economist Joseph Stiglitz noted that "international financial institutions [like the IMF] have pushed a particular ideology—market fundamentalism—that is both bad economics and bad politics. . . . The IMF has pushed these policies in ways that have undermined emerging democracies."24 He noted further that countries with the highest economic growth rates—like China and Korea—had avoided the U.S. model.
Step 5: Thinking through the Maze The next step is to subject your plan to a mental acid test by thinking through how events might unfold and how unintended consequences might arise. This involves crafting potential outcome scenarios that consider the full array of possible results of implementing your plan. Scenario planning is a useful tool to envision potential future environments that might affect your plan. In this instance, it is more important to consider how your plans might change the world, even if in a small, local way—this is essentially the subject of this book. Use the preceding chapters as a guide to assess the potential impacts of your plan. Consider, for example:
• What are the potential knock-on effects of your plan? What are the furthest ramifications of implementing your plan throughout an organization, within an industry, across economic sectors, over time, and across geography?
• Are there any ways that your plan could unleash escalating forces that could greatly amplify events beyond your intent?
• How might your plan instill resisting forces within an organization, industry, or political sphere that could impede your progress?
• Does your plan pose a threat to existing players who might form competing alliances?
• How might your plan create incentives that cause people to change their behaviors in unexpected ways?
• Does your plan create conditions that might spawn unanticipated people, parties, ideas, movements, or other phenomena to come into existence?
• Does your plan disrupt established systems in a way that could unleash an unexpected series of events?
Step 6: Start Small and Learn by Doing One tactic that leading experts on implementing change agree upon is the importance of starting with small projects. This can include a limited version of the full program or a complete program piloted locally on a small scale. Starting with small projects enables you to quickly implement new concepts and learn by doing. You can experiment with a small initiative and get quick
feedback to discover what works while encountering failures on a small scale. It is essential to be highly flexible; if something is not working, stop it and proceed with Plan B. Avoid becoming too wedded to original plans or getting trapped by the sunk-cost fallacy. Renn noted that "we need to cast a wide net and be willing to try lots of things, knowing some will fail. . . . Failure isn't necessarily bad, particularly if we are able to fail quickly and cheaply."25 Starting with small initiatives also has political advantages. Limited programs have a greater chance of survival because they fly under the radars of financial executives, controllers, and others charged with curtailing endeavors that appear to be wasting money. Small ventures pose substantially less political risk for project sponsors and team members if they fail. If your experiment with a single skywalk is ruining a city block, you can tear it down without expending too much political capital. Small pilot projects are also an effective way to gain organizational support for implementing the program on a larger scale. Starting small, for example, is a central element of the highly successful Toyota Production System, also known as lean manufacturing, which has become a widely accepted practice throughout the world for both manufacturing and service operations. Lean manufacturing calls for creating a limited but fully functioning pilot program that is minimally disruptive to day-to-day operations. It enables a project team to experiment with different features and to expose the concept to a broad audience so they can observe the change concept and become more comfortable with the idea before it is fully implemented. According to Cynthia Karen Swank, who wrote an article titled "The Lean Service Machine" in Harvard Business Review, this fully functioning pilot program "allows managers to conduct experiments and smooth out the kinks while working towards an optimal design. It also gets people throughout the organization excited about the process, paving the way for the broader transformation effort that will follow."26 It is encouraging to see an emerging wave of interest in conducting small experiments to assess the effectiveness of potential social programs instead of implementing them widely at the outset, which causes adverse unintended consequences. The Poverty Action Lab (PAL) at the Massachusetts Institute of Technology, for instance, tests the efficacy of potential social programs in a manner similar to the way pharmaceutical companies test drugs before commercializing them. The PAL researchers create hypotheses—not assumptions—about how the program will work and set out to test them. PAL randomly divides populations into two large groups of people and introduces a potential program to one group while leaving the other untouched as a control group. The use of large random samples enables the researchers to derive reliable conclusions about the program's impact on a society.
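The logic of this method, a randomized controlled trial, is easy to see in miniature. The sketch below is my own simplified illustration, not PAL's procedure or code; the program and its effect are invented for the example:

    import random
    random.seed(7)

    # A population of 10,000 households with varying incomes.
    population = [random.gauss(100, 20) for _ in range(10_000)]

    # Random assignment: who gets the program is decided by chance alone.
    random.shuffle(population)
    control = population[:5_000]                   # left untouched
    treated = [y + 5 for y in population[5_000:]]  # the program adds a true +5

    def mean(values):
        return sum(values) / len(values)

    effect = mean(treated) - mean(control)
    print(f"estimated program effect: {effect:.2f}")  # close to the true +5

    # Because assignment is random, the groups differ only by chance and by
    # the program itself, so the difference in means estimates its impact.

With thousands of households in each group, chance differences shrink to a fraction of the program's true effect; with a few dozen, they could easily swamp it, which is why large random samples matter.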
There are a number of other instances where small experiments led to vastly improved social outcomes. Sweden, for example, experimented with a privatized social security system and ended up promoting a well-designed default plan that enabled citizens to greatly improve their investment performance. Companies have improved their employees' savings rates by experimenting with programs that automatically enroll them in corporate pension plans. The success of school-choice programs has been greatly improved by testing whether providing parents with fact sheets describing each school—including test scores—would enable them to make better selections. Despite this book's gloomy portrayal of failed human endeavors, I conclude it with an optimistic vision of economists, planners, and the many other meddlers taking a humble approach to intervening in our social and biological worlds. I envision more questioning of established wisdom and avoidance of one-size-fits-all solutions. In particular, I see great hope in the use of small-scale experimentation with potential programs before they are broadly launched in organizations, societies, and environments. This new approach to meddling in our social world should yield productive results and mitigate the tyranny of unintended consequences.
Notes Chapter 1 The Tyranny of Unintended Consequences 1. Daniel Boorstin, Cleopatra's Nose (New York: Vintage Books, 1995), 143. 2. David Maraniss, "Armey Arsenal: Plain Talk and Dramatic Tales," Washington Post, February 21, 1995: A1. 3. Daniel Balz and Ronald Brownstein, Storming the Gates (Boston: Little, Brown, 1996), 358. 4. Edmund L. Andrews, "Greenspan Concedes Error on Regulation," New York Times, October 23, 2008: 1. 5. Adam Smith, An Inquiry into the Nature and Causes of the Wealth of Nations (Chicago: University of Chicago Press, 1976), chapter 2. 6. Frederic Bastiat, Selected Essays on Political Economy (Irvington-on-Hudson, NY: Foundation for Economic Education, 1995), 2. 7. Robert K. Merton, "The Unanticipated Consequences of Purposive Social Action," American Sociological Review 1, no. 6 (1936): 894. 8. Nicholas Rescher, Luck: The Brilliant Randomness of Everyday Life (Pittsburgh: University of Pittsburgh Press, 2001), 13. 9. Rescher, Luck, 32. 10. Voltaire, Zadig, and Other Tales (London: George Bell, 1907), 138. 11. Warren Weaver, "Science and Complexity," American Scientist 36 (1948): 536. 12. Aristotle, Politics, book 7, chapter 1.
Chapter 2 The Web of Life 1. Susan S. Lang, “Cornell Ecologist’s Study Finds That Producing Ethanol and Biodiesel from Corn and Other Crops Is Not Worth the Energy,” Cornell University News Service, July 5, 2005: 30. 2. Lester Brown, “Starving for Fuel: How Ethanol Production Contributes to Global Hunger,” Globalist, August 1, 2006. 3. Charles Darwin, On the Origin of Species (New York: Modern Library, 1993), 59.
4. Stanley Milgram, “The Small World Problem,” Psychology Today, 2 (1967): 60. 5. Lewis Thomas, The Medusa and the Snail (New York: Viking, 1979), 110. 6. Robert Jervis, Systems Effects: Complexity in Political and Social Life (Princeton, NJ: Princeton University Press, 1997), 18. 7. Drake Bennett, “Paradigm Lost,” Boston Globe, December 21, 2008: D1. 8. Bennett, “Paradigm Lost,” D1. 9. “A Mortgage Fable,” Wall Street Journal, September 22, 2008: A22. 10. Neil Bhutta and Glenn B. Canner, “Did the CRA Cause the Mortgage Market Meltdown?” Community Divided, March 2009. 11. Robert Gavin, “Study Puts New Spin on the Housing Bubble,” Boston Globe, May 5, 2010: B11. 12. Warren Buffett, Annual Letter to Berkshire Hathaway Shareholders, 2002. 13. Baram Marcus, “Who’s Winning Now? Gramm Slammed by Economists,” ABC News, September 19, 2008. 14. Derek Thompson, “Blaming Things Not Named Greenspan for the Great Recession,” Atlantic Business, January 12, 2010: 2. 15. Jeremy Grantham, “Just Desserts and Markets Being Silly Again,” GMO Quarterly Letter, October 2009: 6. 16. Paul Krugman, “Responding to Recession,” New York Times, January 14, 2008: 25. 17. Grantham, “Just Desserts,” 6. 18. Evan Thomas, Michael Hirsh, and Mathew Philips, “Rubin’s Detail Deficit,” Newsweek, December 8, 2008: 44. 19. Bennett, “Paradigm Lost,” D2. 20. “Behind the Steel-Tariff Curtain: A Blow-by-Blow Look at the Struggle That Culminated in Bush’s Decision To Impose the Levies,” Forbes, March 8, 2002. 21. Roe v. Wade, 410 U.S. 110, 153 (1973). 22. John J. Donohue and Steven D. Levitt, “The Impact of Legalized Abortion on Crime,” Quarterly Journal of Economics 116, no. 2 (2001): 379. 23. Jon Jeter, “The Dumping Ground,” Washington Post, April 22, 2002: A1. 24. Jeter, “Dumping Ground,” A1. 25. Neil Reynolds, “Goodwill May Be Stunting African Growth,” AfricaFiles, December 24, 2009: 1. 26. “The Invisible Green Hand: A Survey of Global Environment,” Economist, July 4, 2002. 27. National Aeronautics and Space Administration, “Aerosols May Drive a Significant Portion of Arctic Warming,” Washington, DC: NASA, April 8, 2009. 28. Anya Kamenetz, “What? Clean Air Act Caused Half of Global Warming, Says NASA,” Fast Company, April 13, 2009. 29. “Skies Scrubbed Clean Have Contributed to Half Europe’s Recent Warming,” New Scientist, July 5, 2008: 16. How declining aerosols and rising greenhouse gases forced rapid warming in Europe since the 1980s.
30. “The Six Degrees of Separation Is Now Three,” O2 press release, August 19, 2008.
Chapter 3 The Domino Effect 1. Charles A. Radin, “Rumors of Rape Fan Anti-American Flames,” Boston Globe, January 4, 2004. 2. Radin, “Rumors of Rape.” 3. Radin, “Rumors of Rape.” 4. Radin, “Rumors of Rape.” 5. Philip Van Munching, “The Devil’s Adman,” Brandweek, 42 (2001): 42. 6. Van Munching, “Devil’s Adman,” 42. 7. David L. Miller, Introduction to Collective Behavior (Belmont, CA: Wadsworth, 1985), 18. 8. Gustave Le Bon, The Crowd: A Study of the Popular Mind (1896; New York: Viking, 1960), 4. 9. Le Bon, Crowd, 23. 10. Robert Bartholomew, “The Martian Panic Sixty Years Later: What Have We Learned?” Skeptical Inquirer, 22 (1998): 440. 11. Bartholomew, “Martian Panic,” 440. 12. Carl von Clausewitz, On War (Princeton, NJ: Princeton University Press, 1976), 37. 13. John Kenneth Galbraith, The Great Crash (Boston: Houghton Mifflin, 1972), 113. 14. Geoffrey Colvin, “Old Consultants Never Die: They Just Go ‘e,’ ” Fortune, 141–12 (2000): 130. 15. “In Come the Waves,” Economist, June 16, 2005. 16. Aaron Lucchetti, “Buffett Defends Moody’s Managers,” Wall Street Journal, June 3, 2010: C1. 17. Ben S. Bernanke, “Monetary Policy and the Housing Bubble,” Board of Governors of the Federal Reserve System, January 3, 2010. 18. Henry Shannon, “Store-to-Door Shopping,” Washington Post, November 25, 1999: E1. 19. Del Jones, “New Economy Ideas Bit the Dust Faster than Usual,” USA Today, June 13, 2001: B1. 20. Adam Shell, “Tech Teetotalers Have the Last Laugh,” USA Today, January 11, 2001: B2. 21. Burton G. Malkiel, A Random Walk Down Wall Street (New York: W. W. Norton, 1990), 35. 22. Martin S. Fridson, Extraordinary Popular Delusions and the Madness of Crowds (New York: John Wiley, 1996), 119. 23. Miller, Introduction to Collective Behavior, 141. 24. Miller, Introduction to Collective Behavior, 148.
25. Richard Pascale, Managing on the Edge (New York: Simon & Schuster, 1990), 20. 26. John Micklethwait and Adrian Wooldridge, The Witch Doctors (New York: Times Books, 1996), 15 and 44. 27. Mack P. Holt, The French Wars of Religion, 1562–1629 (Cambridge, UK: Cambridge University Press, 1995), 81. 28. Donald G. Dutton, The Psychology of Genocide, Massacres, and Extreme Violence (Westport, CT: Praeger Security International, 2007), 35. 29. Teresa Poole, “Mao’s Frenzy of Mass Violence,” World Press Review 43, no. 8: 18. 30. Poole, “Mao’s Frenzy,” 18. 31. Lu Xiuyuan, “A Step toward Understanding Popular Violence in China’s Cultural Revolution,” Pacific Affairs, 67, no. 3: 563. 32. Poole, “Mao’s Frenzy,” 21. 33. Holt, French Wars, 81. 34. Helen MacGill Hughes, Crowd and Mass Behavior (Boston: Allyn & Bacon, 1972), 155. 35. Hazel Felleman, The Best Loved Poems of the American People (New York: Doubleday, 1989), 65. 36. Michael Hammer, Seminar on Six Sigma, Cambridge, MA, 2002.
Chapter 4 The Vicious Cycle 1. Janet L. Holt, “Hatfield-McCoy Feud Goes to the Graves,” June 1, 2002, www.thefreelibrary.com/Hatfield-Coy+feud+goes+to+the+graves.-a088764287. 2. Holt, “Hatfield-McCoy Feud.” 3. Holt, “Hatfield-McCoy Feud.” 4. Scott Canon and Ron Hutcheson, “Bush Intervenes in India-Pakistan Conflict as Leaders Warn of War,” Knight-Ridder Tribune News Service, December 29, 2001: 37. 5. Carl von Clausewitz, On War (Princeton, NJ: Princeton University Press, 1976), 20. 6. “Forget the Maine!” Economist 346, no. 8049 (1998): 32. 7. Richard Parke, “Galbraith and Vietnam: Kennedy, Unlike Bush, Had One Advisor Who Told Him What He Needed to Hear,” Nation 280, no. 10: 16. 8. 88th Congress of the United States of America, Second Session, August 10, 1964. 9. David Halberstam, The Best and the Brightest (New York: Random House, 1972), 618. 10. Jim and Sybil Stockdale, In Love and War (New York: Harper & Row, 1984), 25. 11. William McNeill, “The Changing Shape of World History,” History and Theory 34, no. 2 (1995): 8.
12. “Says Hoover Drove British Trade Away,” New York Times, October 14, 1932: 10. 13. “Pratt Says Tariff Cripples US Abroad,” New York Times, August 29, 1930: 1. 14. John M. Dunn, The Civil Rights Movement (San Diego: Lucent Books, 1998), 79. 15. Debra Riechmann, “Bush Tries to Boost Support,” Los Angeles Daily News, October 26, 2005: 13. 16. Peter Baker, “Bush’s New Tack Steers Clear of ‘Stay the Course,’” Washington Post, October 24, 2006. 17. Douglass Daniel, “Bush’s Words on Iraq Echo LBJ in 1967,” Associated Press, September 22, 2005. 18. “The Sunk-Cost Fallacy, Bush Falls Victim to a Bad New Argument for the Iraq War,” Slate, September 9, 2005: 1. 19. Robert Axelrod, The Evolution of Cooperation (New York: Basic Books, 1981), 14. 20. Axelrod, Evolution of Cooperation, 119. 21. Axelrod, Evolution of Cooperation, 120. 22. Martin Walker, The Cold War (New York: Henry Holt, 1993), 167. 23. Walker, Cold War, 181.
Chapter 5 The Bandwagon Effect 1. John Ward, “The Macintosh Licensing Conundrum,” Vectronic’s Apple World, www.vectronicsappleworld.com/macintosh/licensing.html. 2. A copy of the letter is available at http://www.scripting.com/specials/gates letter/text.html. 3. See note 2. 4. Ward, “Macintosh Licensing Conundrum.” 5. Peter Grindley, Standards Strategy and Policy (Oxford,UK: Oxford University Press, 1995). 6. Grindley, Standards Strategy, 27. 7. Paul A. David, “Clio and the Economics of Qwerty,” Economic History 75, no. 2 (1985): 333. 8. Harvey Leibenstein, “Bandwagon, Snob, and Veblen Effects in the Theory of Consumers’ Demand,” Quarterly Journal of Economics 64, no. 2 (1950): 183. 9. Patrick Healy, “Kerry’s Win in Iowa Attracts Cash for N.H. Test,” Boston Globe, January 22, 2004: A19. 10. W. Wayt Gibbs, “Saving Dying Languages,” Scientific American 287, no. 2 (2002): 80. 11. David Crystal, English as a Global Language (Cambridge, UK: Cambridge University Press, 2003), 4. 12. Crystal, English as a Global Language, 28. 13. Crystal, English as a Global Language, 84.
14. Alistair Wood, “International Scientific English: Some Thoughts on Science, Language, and Ownership,” Science Tribune, 2–4 (1997): 71. 15. Michael Specter, “Computer Speak; “World, Wide, Web: Three English Words,” New York Times, April 14, 1996: 4–1. 16. Seth Mydans, “Across Cultures, English Is the Word,” New York Times, April, 9, 2007. 17. Morton Grodzins, The Metropolitan Area as a Racial Problem (Pittsburgh: University of Pittsburgh Press, 1958), 4. 18. Grodzins, Metropolitan Area, 6. 19. Eric Bickford, “White Flight: The Effect of Minority Presence on Post–World War II Suburbanization,” Working Paper, University of California, Berkeley, 1997. 20. William H. Frey, “Central City White Flight: Racial and Nonracial Causes,” American Sociological Review, 44 (1979): 443. 21. Mark Feeney, “Boston Desegregation Judge Is Dead at 79,” Boston Globe, September 18, 1999: A1. 22. Grindley, Standards Strategy, 37 23. Grindley, Standards Strategy, 39. 24. Grindley, Standards Strategy, 39. 25. Bhaskar Chakravorti, “The New Rules for Bringing Innovation to Market,” Harvard Business Review 82, no. 3 (2004): 63. 26. Grindley, Standards Strategy, 39.
Chapter 6 The Balance of Nature 1. James E. Lovelock, GAIA (Oxford, UK: Oxford University Press, 1987), 10–11. 2. Connie C. Barlow, From Gaia to Selfish Genes: Selected Writings in the Life Sciences (Cambridge, MA: MIT Press, 1991), 9. 3. Jim Gillon, "Feedback on Gaia," Nature 406, no. 6797 (2000): 685. 4. Robert Gavin, "Strength in Weakness," USA Today, May 13, 2003: D1. 5. Partha Chatterjee, "The Classical Balance of Power Theory," Journal of Peace Research 9, no. 1 (1972): 51. 6. Kenneth Waltz, Theory of International Politics (Reading, MA: Addison-Wesley, 1979), 128. 7. Robert Jervis, System Effects (Princeton, NJ: Princeton University Press, 1997), 132. 8. Waltz, Theory of International Politics, 166. 9. Stephen Walt, The Origins of Alliances (Ithaca, NY: Cornell University Press, 1990), xx. 10. Herodotus, The History of Herodotus, trans. G. Rawlinson (London: Nonesuch Press, 1935), section 9.9. 11. President George W. Bush, State of the Union Address, January 2002.
12. Robert C. Byrd, Losing America: Confronting a Reckless and Arrogant Presidency (New York: W. W. Norton, 2004), 125. 13. Byrd, Losing America, 125. 14. J. R. Nyquist, “Putin’s Munich Speech,” Geopolitical Global Analysis, February 16, 2007. 15. Fred Kaplan, Daydream Believers: How a Few Grand Ideas Wrecked American Power (New York: John Wiley, 2008), 171. 16. Kaplan, Daydream Believers, 171. 17. Pew Research Center, Pew Global Attitudes Project, July 2009. 18. James Carroll, “On the Verge of Collapse,” Boston Globe, June 21, 2010: A11. 19. Brian C. Mooney, “Outside Donations Buoyed Brown,” Boston Globe, February 24, 2010: 6. 20. Steven LeBlanc, “Senator Scott Brown Reports Raising $14 Million since January 1,” Associated Press, February 18, 2010. 21. Laura Crimaldi, Jessica Van Sack, and Hillary Chabot, “Tea Party Members Brew Scott Brown Boost,” Boston Herald, January 16, 2010. 22. Joan Vennochi, “A Vermonter All the Way,” Boston Globe, May 25, 2001: A23. 23. Scot Lehigh, “Bush and Lott Blew It on Jeffords,” Boston Globe, May 25, 2001: A23. 24. Glenn H. Snyder, Alliance Politics (Ithaca, NY: Cornell University Press, 1997), 143. 25. Stuart Kauffman, At Home in the Universe (Oxford, UK: Oxford University Press, 1995), 270. 26. “NBC Universal and News Corp. Announce Deal with Internet Leaders AOL, MSN, MySpace and Yahoo! to Create a Premium Online Video Site with Unprecedented Reach,” Time Warner Newsroom, March 22, 2007. 27. Bill Briggs, “Fox ‘Circling Wagons’ with Web Venture,” MSNBC.com, March 23, 2007. 28. “Global Market Share Statistics,” http://marketshare.hitslink.com, June 2010. 29. Daniel Frommer, “Microsoft’s Latest Attempt to Derail Google: Sic the Antitrust Cops on Them,” Silicon Alley Insider, February 26, 2010. 30. Michael Liedtke, “Yahoo-Microsoft Deal Set, Taking Aim at Google,” Associated Press, AP Technology, February 18, 2009. 31. “Desperate Hours at DEC—Two Years after a Management Shakeup It’s in Even Worse Shape,” Business Week, May 9, 1994: 26. 32. Jeffrey Pfeffer and Gerald R. Salancik, The External Control of Organizations: A Resource Dependence Perspective (New York: Harper & Row, 1978), 1. 33. Clayton M. Christensen, The Innovator’s Dilemma (Cambridge, MA: Harvard Business School Press, 1997), xii. 34. Christensen, Innovator’s Dilemma, xix. 35. Hammer & Co, Seminar on Change Management, Cambridge, MA, 2002. 36. Peter M. Senge, The Fifth Discipline: The Art and Practice of the Learning Organization (New York: Currency Doubleday, 1990), 88.
37. Senge, Fifth Discipline, 101. 38. Kurt Lewin, Field Theory in Social Science (New York: Harper & Row, 1945). 39. Kaplan, Daydream Believers, 3. 40. William Fulbright, The Arrogance of Power (New York: Random House, 1966), 106. 41. Fulbright, Arrogance of Power, 154. 42. Fulbright, Arrogance of Power, 222.
Chapter 7 Perverse Adaptations 1. Paul Roberts and Karen LaFollette, Meltdown: Inside the Soviet Economy (Washington, DC: Cato Institute, 1990), 8. 2. Marshall I. Goldman, USSR in Crisis: The Failure of an Economic System (New York: W. W. Norton, 1983), 38. 3. Stephanie Saul, “21st Century Babies,” New York Times, November 11, 2009: 24. 4. Jeff Porcaro, “Have GPS Will Travel (to the Right Place!),” www.gearreview. com/gpsreview98.asp. 5. Edward Tenner, Why Things Bite Back: Technology and the Revenge of Unintended Consequences (New York: Knopf, 1996), 19. 6. Malcolm Gladwell, “Big and Bad: How the S.U.V. Ran over Automotive Safety,” New Yorker, January 12, 2004: 55. 7. Jim Brown, “Is American Football Too Violent?” National Sports Safety Organization, April 27, 2007, 1. 8. Timothy Lane and Steven Philips, “Does IMF Financing Result in Moral Hazard?” IMF Working Paper 00/168 (Washington, DC: International Monetary Fund, 2000). 9. National Association of Home Builders, Developer’s Guide to Endangered Species Regulation (Washington, DC: Home Builder Press, 1996), 109. 10. Charles C. Mann and Mark L. Plummer, Noah’s Choice: The Future of Endangered Species (New York: Alfred A. Knopf, 1995), 220. 11. Alison Mitchell, “The Law of Unintended Consequences,” New York Times, March 18, 2001: WK3. 12. Gil Troy, See How They Ran: The Changing Role of the Presidential Candidate (Cambridge, MA: Harvard University Press, 1991). 13. Christopher Conway, “Issue Ads Allow Unlimited Political Pitches,” Charleston Gazette and Daily Mail, September 15, 1996: 19A. 14. Robert Barnes and Dan Egger, “Supreme Court Rejects Limits on Corporate Spending on Political Campaigns,” Washington Post, January 22, 2010. 15. Janet Hoek, “Tobacco Promotion: Restrictions Ironies and Unintended Consequences,” Journal of Business Research 57, no. 11 (2004): 1250. 16. Hoek, “Tobacco Promotion,” 1250.
17. Claire Tristram, “Has GPS Lost Its Way?” Technology Review 102, no. 4 (1999): 72. 18. Barbara Ortutay, “The FaceBook Age, Families Spend Less Time Together,” Boston Globe, June 16, 2009: A4. 19. “Texting Is Not Talking,” Boston Globe, June 16, 2009, editorial page. 20. Thomas L. Carson, “Self-Interest and Business Ethics: Some Lessons of the Recent Corporate Scandals,” Journal of Business Ethics 43, no. 4 (2003): 393.
Chapter 8 Coming into Being 1. “The World’s Most Dangerous Gang,” National Geographic special, May 23, 2007. 2. There are a number of different interpretations of the MS-13 name. 3. “The MS-13 Threat: A National Assessment,” Federal Bureau of Investigation, Headline Archives (Washington, DC: FBI National Press Office, January 14, 2008). 4. Cara Buckley, “A Fearsome Gang and Its Wannabes,” New York Times Week in Review, August 19, 2007. 5. Buckley, “Fearsome Gang.” 6. Julie Ayling, “Criminal Organizations and Resilience,” International Journal of Law, Crime and Justice 37, no. 4 (2009): 187. 7. “Gang Uses Deportation to Its Advantage to Flourish in U.S.,” Los Angeles Times, October 30, 2005: A1. 8. Thomas D. Seeley, “When Is Self-Organization Used in Biological Systems?” Biological Bulletin, 202 (2002): 315. 9. Edward O. Wilson, Consilience (New York: Alfred A. Knopf, 1998), 110. 10. Seeley, “When Is Self-Organization Used,” 314. 11. Thomas Hobbes, Leviathan (Oxford, UK: Oxford University Press, 1996), 84. 12. Lars-Erik Cederman, Emergent Actors in World Politics (Princeton, NJ: Princeton University Press, 1997), 46. 13. Kenneth Waltz, Theory of International Politics (Reading, MA: AddisonWesley, 1979), 88. 14. Ayling, “Criminal Organizations,” 185. 15. Ayling, “Criminal Organizations,” 193. 16. “U.S. Spends More on Its Military Than the Rest of the World Combined,” International Herald Tribune, September, 16, 2006. 17. Herbert I. Schiller and Joseph D. Philips, Super State: Readings in the Military-Industrial Complex (Urbana: University of Illinois Press, 1970), 31. 18. Schiller and Philips, Super State, 31–32. 19. Michael Klare, Rogue States and Nuclear Outlaws (New York: Farrar, Straus and Giroux, 1995). 20. International Institute for Strategic Studies, The Military Balance, 1998/99 (New York: Routledge, 1998).
21. President Ronald Reagan, address to the nation on national security, March 23, 1983. 22. Les Aspin, “The End of the Star Wars Era,” Department of Defense news briefing, May 1993. 23. Donald H. Rumsfeld, “Report of the Commission to Assess the Ballistic Missile Threat to the United States,” presented to the 104th Congress of the United States, July 15, 1998. 24. William J. Clinton, “Statement on Signing the Nuclear Missile Defense Act of 1999,” American Presidency Project, July 22, 1999. 25. Michelle Ciarrocca, “Missile Defense All Over Again,” Foreign Policy in Focus (Washington, DC), September 30, 2005. 26. “Report of the American Physical Society Study Group on Booster-Phases Intercept System for Nuclear Missile Defense: Scientific and Technical Issues,” Review of Modern Physics 76, S1 (2004). 27. Eric Lipton, “Insider Projects Drained Missile-Defense Millions,” New York Times, October 11, 2008. 28. Schiller and Philips, Super State, 174. 29. “Declassified Key Judgments of the National Intelligence Estimate on Global Terrorists,” New York Times, September 27, 2006: 6. 30. Edward Humes, Over Here: How the G.I. Bill Transformed the American Dream (New York: Harcourt, 2006), 5. 31. Suzanne Mettler, Soldiers to Citizens (Oxford, UK: Oxford University Press, 2005), 4. 32. Mettler, Soldiers to Citizens, 11. 33. Milton Greenberg, The G.I. Bill of Rights: Historians on America (Washington, DC: U.S. State Department, September 2007), 46. 34. Mettler, Soldiers to Citizens, 9. 35. Humes, Over Here, 39. 36. Michael J. Bennett, When Dreams Came True (Washington, DC: Brassey’s, 1996), 155. 37. Laurel Walters, “Surge of US Education under GI Bill Recalled by D-Day Commemorations,” Christian Science Monitor, June 8, 1994: 1. 38. Mettler, Soldiers to Citizens, 3. 39. Mettler, Soldiers to Citizens, 95. 40. William J. Clinton, address to the Veterans’ Affairs Department, June 1994. 41. “The Truth about Torture,” Newsweek, November 21, 2005: 35. 42. Knut Dormann, “The Geneva Conventions Today,” address in London, July 9, 2009. 43. “The Torture Question.” Frontline transcript. Public Broadcasting System. pbs.org/wgbh/pages/frontline/torture/etc/script.html. 44. Eric Schmitt, “The Reach of War: Abu Ghraib Report: Abuses at Prison Tied to Officers in Intelligence,” New York Times, August 26, 2004. 45. Human Rights Watch, “Getting Away with Torture?” April 23, 2005.
46. Philip Zimbardo, The Lucifer Effect: Understanding How Good People Turn Evil (New York: Random House, 2007). 47. Zimbardo, Lucifer Effect, 5. 48. Zimbardo, Lucifer Effect, x and 226. 49. Zimbardo, Lucifer Effect, 424. 50. Zimbardo, Lucifer Effect, x. 51. Human Rights Watch, “Getting Away with Torture?” 52. Zimbardo, Lucifer Effect, 393. 53. Meet the Press with Tim Russert, September 16, 2001. 54. Zimbardo, Lucifer Effect, 406. 55. “The Truth about Torture,” 35. 56. John Maynard Keynes, The Economic Consequences of the Peace (New York: Harcourt, Brace, and Howe, 1920), 226. 57. Keynes, Economic Consequences, 251.
Chapter 9 Breaching the Peace 1. “Hot Topic: Asian Carp,” U.S. Fish and Wildlife Service Midwest Region, 2010. 2. Dan Egan, “Troubled Waters: The Asian Carp Invasion,” Journal Sentinel (Milwaukee, WI), October 15, 2006. 3. Jane Jacobs, The Death and Life of Great American Cities (New York: Modern Library, 1993), 438. 4. Jacobs, Death and Life, 290. 5. Jacobs, Death and Life, 32. 6. Jacobs, Death and Life, 72. 7. Jacobs, Death and Life, 33. 8. Jacobs, Death and Life, 5. 9. Scott S. Greenberger, “Calendar, Critics May Curb Agency That Reshaped Boston,” Boston Globe, November 25, 2002: A6. 10. Greenberger, “Calendar, Critics” A6. 11. Jeff Dannes, “Collateral Damage: Unintended Consequences of Urban Renewal in Baltimore, MD,” unpublished report. 12. Larry Collins and Dominique Lapierre, Freedom at Midnight (New York: Simon & Schuster, 1975), 38. 13. Collins and Lapierre, Freedom at Midnight, 40. 14. Collins and Lapierre, Freedom at Midnight, 242. 15. Collins and Lapierre, Freedom at Midnight, 242. 16. Collins and Lapierre, Freedom at Midnight, 412. 17. David Fromkin, A Peace to End All Peace: The Fall of the Ottoman Empire and the Creation of the Modern Middle East (New York: Henry Holt, 1989). 18. “Business Cycle Expansions and Contractions” (Cambridge, MA: National Bureau of Economic Research, December 1, 2008).
19. Jane G. Gravelle, Thomas L. Hungerford, and Marc Labonte, “Economic Stimulus: Issues and Policies,” (Washington, DC: Congressional Research Service, December 9, 2009), 17. 20. Bruce Bartlett, “If It Ain’t Broke, Don’t Fix It,” Wall Street Journal, December 2, 1992. 21. John Maynard Keynes, A Tract on Monetary Reform (London: Macmillan, 1923), chapter 3. 22. Gravelle, Hungerford, and Labonte, “Economic Stimulus,” 17. 23. William Sherden, The Fortune Sellers (New York: John Wiley, 1998). 24. IHS Global Insight. http://www.ihsglobalinsight.com/. 25. Gravelle, Hungerford, and Labonte, “Economic Stimulus,” 17. 26. Paul Collier, The Bottom Billion: Why the Poorest Countries Are Failing and What Can Be Done about It (Oxford, UK: Oxford University Press, 2007), 9. 27. Collier, Bottom Billion, 1. 28. Collier, Bottom Billion, 99. 29. Raghuram G. Rajan, “Aid and Growth: The Policy Challenge,” Finance and Development 42, no. 4 (2005): 1. 30. Raghuram G. Rajan and Arvind Subramanian, “What Undermines Aid’s Impact on Growth?” International Monetary Fund Working Paper, WP/05/126, Washington, DC, 2005, 53. 31. Rajan and Subramanian, “What Undermines Aid’s Impact,” 1. 32. Rajan and Subramanian, “What Undermines Aid’s Impact,” 22. 33. Rajan, “Aid and Growth,” 2 and 5. 34. Collier, Bottom Billion, 121. 35. Jonathan M. Katz, “With Cheap Food Imports Haiti Can’t Feed Itself,” Associated Press, March 20, 2010. 36. Christopher B. Barrett, “Food Aid’s Intended and Unintended Consequences,” Agricultural and Development Economics Division of the Food and Agriculture Organization of the United Nations, ESA Working Paper no. 06-05.
Chapter 10 Thinking through the Maze
1. Robert Lewis, “Every Cost-Cutting Decision Has a Consequence—Even the Easy Ones,” Advice Line InfoWorld, April 14, 2009.
2. Edward O. Wilson, Consilience: The Unity of Knowledge (New York: Alfred A. Knopf, 1998), 183.
3. Wilson, Consilience, 189 and 198.
4. William Sherden, The Fortune Sellers (New York: John Wiley, 1998), chapter 3.
5. Scott Keller and Carolyn Aiken, “The Inconvenient Truth about Change Management,” McKinsey Paper, Bain & Company, http://www.mckinsey.com/clientservice/organizationleadership/the_inconvenient_truth_about_change_management.pdf.
6. KPMG, The KPMG Canada Survey (n.p.: KPMG, 1997); Standish Group, The Chaos Report (Boston, MA: Standish Group, 1995); The OASIG Study (London: Organizational Aspects of Information Technology, 1995).
7. Scott Shane, Illusions of Entrepreneurship: The Costly Myths That Entrepreneurs, Inventors, and Policy Makers Live By (New Haven, CT: Yale University Press, 2008), 99.
8. Amos Tversky and Daniel Kahneman, “Judgment under Uncertainty: Heuristics and Biases,” Science 185 (1974): 1124.
9. Lewis Thomas, The Medusa and the Snail (New York: Viking Press, 1979), 110.
10. Thomas, Medusa and the Snail, 110.
11. Aaron M. Renn, “The Logic of Failure,” Urbanophile, February 7, 2009.
12. Thomas, Medusa and the Snail, 110.
13. Renn, “Logic of Failure.”
14. J. Edward Russo and Paul Shoemaker, “Managing Overconfidence,” Sloan Management Review 32, no. 2 (1992): 7.
15. Thomas, Medusa and the Snail, 110.
16. Richard H. Thaler and Cass R. Sunstein, Nudge (New York: Penguin Books, 2008), 99.
17. Thaler and Sunstein, Nudge, 3.
18. Renn, “Logic of Failure.”
19. Renn, “Logic of Failure.”
20. Thaler and Sunstein, Nudge, 9.
21. Renn, “Logic of Failure.”
22. Robert Kuttner, “US Fueled Argentina’s Economic Collapse,” Boston Globe, January 7, 2002: A15.
23. Kuttner, “US Fueled Argentina’s,” A15.
24. Kuttner, “US Fueled Argentina’s,” A15.
25. Renn, “Logic of Failure.”
26. Cynthia Karen Swank, “The Lean Service Machine,” Harvard Business Review 81, no. 10 (2003): 126.
Bibliography
Andrews, Edmund L. “Greenspan Concedes Error on Regulation.” New York Times, October 23, 2008: 1.
Aristotle. Politics.
Aspin, Les. “The End of the Star Wars Era.” Department of Defense News Briefing, May 13, 1993.
Axelrod, Robert. The Evolution of Cooperation. New York: Basic Books, 1981.
Ayling, Julie. “Criminal Organizations and Resilience.” International Journal of Law, Crime and Justice 37, no. 4 (2009): 183–96.
Bain & Company. http://www.bain.com.
Baker, Peter. “Bush’s New Tack Steers Clear of ‘Stay the Course.’” Washington Post, October 24, 2006.
Balz, Daniel, and Ronald Brownstein. Storming the Gates. Boston: Little, Brown, 1996.
Baram, Marcus. “Who’s Winning Now? Gramm Slammed by Economists.” ABC News, September 19, 2008.
Barlow, Connie C. From Gaia to Selfish Genes: Selected Writings in the Life Sciences. Cambridge, MA: MIT Press, 1991.
Barnes, Robert, and Dan Egger. “Supreme Court Rejects Limits on Corporate Spending on Political Campaigns.” Washington Post, January 22, 2010.
Barrett, Christopher B. “Food Aid’s Intended and Unintended Consequences.” Agricultural and Development Economics Division of the Food and Agriculture Organization of the United Nations. ESA Working Paper no. 06–05.
Bartholomew, Robert. “The Martian Panic Sixty Years Later: What Have We Learned?” Skeptical Inquirer 22 (1998): 440–43.
Bartlett, Bruce. “If It Ain’t Broke, Don’t Fix It.” Wall Street Journal, December 2, 1992.
Bastiat, Frederic. Selected Essays on Political Economy. Irvington-on-Hudson, NY: Foundation for Economic Education, 1995.
“Behind the Steel-Tariff Curtain: A Blow-by-Blow Look at the Struggle That Culminated in Bush’s Decision to Impose the Levies.” Forbes, March 8, 2002.
Bennett, Drake. “Paradigm Lost.” Boston Globe, December 21, 2008: D1.
Bennett, Michael J. When Dreams Came True. Washington, DC: Brassey’s, 1996.
Bernanke, Ben S. “Monetary Policy and the Housing Bubble.” Washington, DC: Board of Governors of the Federal Reserve System, January 3, 2010.
Bhutta, Neil, and Glenn B. Canner. “Did the CRA Cause the Mortgage Market Meltdown?” Community Dividend, March 2009.
Bickford, Eric. “White Flight: The Effect of Minority Presence on Post World War II Suburbanization.” Working Paper, University of California, Berkeley, 1997.
Boorstin, Daniel. Cleopatra’s Nose. New York: Vintage Books, 1995.
Brown, Lester. “Starving for Fuel: How Ethanol Production Contributes to Global Hunger.” Globalist, August 1, 2006.
Buckley, Cara. “A Fearsome Gang and Its Wannabes.” New York Times Week in Review, August 19, 2007.
Buffett, Warren. Annual Letter to Berkshire Hathaway Shareholders, 2002.
Bush, George W. State of the Union Address, January 2002.
“Business Cycle Expansions and Contractions.” Cambridge, MA: National Bureau of Economic Research, December 1, 2008.
Byrd, Robert C. Losing America: Confronting a Reckless and Arrogant Presidency. New York: W. W. Norton, 2004.
Canon, Scott, and Ron Hutcheson. “Bush Intervenes in India-Pakistan Conflict as Leaders Warn of War.” Knight-Ridder Tribune News Service, December 29, 2001: 37.
Carroll, James. “On the Verge of Collapse.” Boston Globe, June 21, 2010: A11.
Carson, Thomas L. “Self-Interest and Business Ethics: Some Lessons of the Recent Corporate Scandals.” Journal of Business Ethics 43, no. 4 (2003): 389–94.
Cederman, Lars-Erik. Emergent Actors in World Politics. Princeton, NJ: Princeton University Press, 1997.
Chakravorti, Bhaskar. “The New Rules for Bringing Innovation to Market.” Harvard Business Review 82, no. 3 (2004): 59–67.
Chatergee, Patricia. “The Classical Balance of Power Theory.” Journal of Peace Research 9, no. 1 (1972): 51–61.
Christensen, Clayton M. The Innovator’s Dilemma. Cambridge, MA: Harvard Business School Press, 1997.
Ciarrocca, Michelle. “Missile Defense All Over Again.” Foreign Policy in Focus (Washington, DC), September 30, 2005.
Clinton, William J. Address to the Veterans’ Affairs Department, June 1994.
Collier, Paul. The Bottom Billion: Why the Poorest Countries Are Failing and What Can Be Done about It. Oxford, UK: Oxford University Press, 2007.
Collins, Larry, and Dominique Lapierre. Freedom at Midnight. New York: Simon & Schuster, 1975.
Colvin, Geoffrey. “Old Consultants Never Die: They Just Go ‘e.’” Fortune 141, no. 12 (2000): 130.
Conway, Christopher. “Issue Ads Allow Unlimited Political Pitches.” Charleston Gazette and Daily Mail, September 15, 1996: 19A.
Crimaldi, Laura, Jessica Van Sack, and Hillary Chabot. “Tea Party Members Brew Scott Brown Boost.” Boston Herald, January 16, 2010.
Crystal, David. English as a Global Language. Cambridge, UK: Cambridge University Press, 2003.
Daniel, Douglass. “Bush’s Words on Iraq Echo LBJ in 1967.” Associated Press, September 22, 2005.
Dannes, Jeff. “Collateral Damage: Unintended Consequences of Urban Renewal in Baltimore, MD.” Unpublished report.
Darwin, Charles. On the Origin of Species. New York: Modern Library, 1993.
David, Paul A. “Clio and the Economics of QWERTY.” American Economic Review 75, no. 2 (1985): 332–37.
“Declassified Key Judgments of the National Intelligence Estimate on Global Terrorists.” New York Times, September 27, 2006: 6.
“Desperate Hours at DEC—Two Years after a Management Shakeup It’s in Even Worse Shape.” Business Week, May 9, 1994: 26.
Donohue, John J., and Steven D. Levitt. “The Impact of Legalized Abortion on Crime.” Quarterly Journal of Economics 116, no. 2 (2001): 379–420.
Dormann, Knut. “The Geneva Conventions Today.” Address in London, July 9, 2009.
Dunn, John M. The Civil Rights Movement. San Diego: Lucent Books, 1998.
Dutton, Donald G. The Psychology of Genocide, Massacres, and Extreme Violence. Westport, CT: Praeger Security International, 2007.
Egan, Dan. “Troubled Waters: The Asian Carp Invasion.” Journal Sentinel (Milwaukee, WI), October 15, 2006.
88th Congress of the United States of America. Second Session, August 10, 1964.
Feeney, Mark. “Boston Desegregation Judge Is Dead at 79.” Boston Globe, September 18, 1999: A1.
Felleman, Hazel. The Best Loved Poems of the American People. New York: Doubleday, 1989.
“Forget the Maine!” Economist 346, no. 8049 (1998): 32–34.
Frey, William H. “Central City White Flight: Racial and Nonracial Causes.” American Sociological Review 44 (1979): 425–48.
Fridson, Martin S. Extraordinary Popular Delusions and the Madness of Crowds. New York: John Wiley, 1996.
Fromkin, David. A Peace to End All Peace: The Fall of the Ottoman Empire and the Creation of the Modern Middle East. New York: Henry Holt, 1989.
Frommer, Daniel. “Microsoft’s Latest Attempt to Derail Google: Sic the Antitrust Cops on Them.” Silicon Alley Insider, February 26, 2010.
Fulbright, William. The Arrogance of Power. New York: Random House, 1966.
Galbraith, John Kenneth. The Great Crash. Boston: Houghton Mifflin, 1972.
“Gang Uses Deportation to Its Advantage to Flourish in U.S.” Los Angeles Times, October 30, 2005: A1.
Gavin, Robert. “Strength in Weakness.” USA Today, May 13, 2003: D1.
Gavin, Robert. “Study Puts New Spin on the Housing Bubble.” Boston Globe, May 5, 2010: B11.
Gibbs, W. Wayt. “Saving Dying Languages.” Scientific American 287, no. 2 (2002): 78–85.
Gillon, Jim. “Feedback on Gaia.” Nature 406, no. 6797 (2000): 685–86.
Gladwell, Malcolm. “Big and Bad: How the S.U.V. Ran over Automotive Safety.” New Yorker, January 12, 2004: 55.
Goldman, Marshall I. USSR in Crisis: The Failure of an Economic System. New York: W. W. Norton, 1983.
Grantham, Jeremy. “Just Desserts and Markets Being Silly Again.” GMO Quarterly Letter, October 2009: 6.
Gravelle, Jane G., Thomas L. Hungerford, and Marc Labonte. “Economic Stimulus: Issues and Policies.” Washington, DC: Congressional Research Service, December 9, 2009.
Greenberg, Milton. The G.I. Bill: Historians on America. Washington, DC: U.S. State Department, September 2007.
Greenberger, Scott S. “Calendar, Critics May Curb Agency That Reshaped Boston.” Boston Globe, November 25, 2002: A6.
Grindley, Peter. Standards Strategy and Policy. Oxford, UK: Oxford University Press, 1995.
Grodzins, Morton. The Metropolitan Area as a Racial Problem. Pittsburgh: University of Pittsburgh Press, 1958.
Halberstam, David. The Best and the Brightest. New York: Random House, 1972.
Hammer, Michael. Seminar on Six Sigma. Cambridge, MA: Hammer & Co., 2002.
Healy, Patrick. “Kerry’s Win in Iowa Attracts Cash for N.H. Test.” Boston Globe, January 22, 2004: A19.
Herodotus. The History of Herodotus. Translated by G. Rawlinson. London: Nonesuch Press, 1935.
Hobbes, Thomas. Leviathan. Oxford, UK: Oxford University Press, 1996.
Hoek, Janet. “Tobacco Promotion: Restrictions Ironies and Unintended Consequences.” Journal of Business Research 57, no. 11 (2004): 1250–57.
Holt, Janet L. “Hatfield-McCoy Feud Goes to the Graves.” June 1, 2002. www.thefreelibrary.com/Hatfield-Coy+feud+goes+to+the+graves.-a088764287.
Holt, Mack P. The French Wars of Religion, 1562–1629. Cambridge, UK: Cambridge University Press, 1995.
Honey, Martha, and Tom Barry. Global Focus: U.S. Foreign Policy at the Turn of the Millennium. New York: Macmillan, 2000.
“Hot Topic: Asian Carp.” U.S. Fish and Wildlife Service Midwest Region, 2010.
Hughes, Helen MacGill. Crowd and Mass Behavior. Boston: Allyn & Bacon, 1972.
Human Rights Watch. “Getting Away with Torture?” April 23, 2005.
Humes, Edward. Over Here: How the G.I. Bill Transformed the American Dream. New York: Harcourt, 2006.
IHS Global Insight. http://www.ihsglobalinsight.com.
“In Come the Waves.” Economist, June 16, 2005.
International Institute for Strategic Studies. The Military Balance, 1998/99. New York: Routledge, 1998.
“The Invisible Green Hand: A Survey of Global Environment.” Economist, July 4, 2002.
Jacobs, Jane. The Death and Life of Great American Cities. New York: Modern Library, 1993.
Jervis, Robert. System Effects: Complexity in Political and Social Life. Princeton, NJ: Princeton University Press, 1997.
Jeter, Jon. “The Dumping Ground.” Washington Post, April 22, 2002: A1.
Jones, Del. “New Economy Ideas Bit the Dust Faster Than Usual.” USA Today, June 13, 2001: B1.
Kamenetz, Anya. “What? Clean Air Act Caused Half of Global Warming, Says NASA.” Fast Company, April 13, 2009.
Kaplan, Fred. Daydream Believers: How a Few Grand Ideas Wrecked American Power. New York: John Wiley, 2008.
Katz, Jonathan M. “With Cheap Food Imports Haiti Can’t Feed Itself.” Associated Press, March 20, 2010.
Kauffman, Stuart. At Home in the Universe. Oxford, UK: Oxford University Press, 1995.
Keller, Scott, and Carolyn Aiken. “The Inconvenient Truth about Change Management.” McKinsey Paper, Bain & Company, http://www.mckinsey.com/clientservice/organizationleadership/the_inconvenient_truth_about_change_management.pdf.
Keynes, John Maynard. The Economic Consequences of the Peace. New York: Harcourt, Brace, and Howe, 1920.
Keynes, John Maynard. A Tract on Monetary Reform. 1923. Amherst, NY: Prometheus Books, 1999.
KPMG. The KPMG Canada Survey. N.p.: KPMG, 1997.
Krugman, Paul. “Responding to Recession.” New York Times, January 14, 2008: 25.
Kuttner, Robert. “US Fueled Argentina’s Economic Collapse.” Boston Globe, January 7, 2002: A15.
Lane, Timothy, and Steven Philips. “Does IMF Financing Result in Moral Hazard?” International Monetary Fund Working Paper 00/168. Washington, DC, 2000.
Lang, Susan S. “Cornell Ecologist’s Study Finds that Producing Ethanol and Biodiesel from Corn and Other Crops Is Not Worth the Energy.” Cornell University News Service, July 5, 2005: 30.
Le Blanc, Steven. “Senator Scott Brown Reports Raising $14 Million since January 1.” Associated Press, February 18, 2010.
Le Bon, Gustave. The Crowd: A Study of the Popular Mind. 1896. New York: Viking, 1960.
Lehigh, Scot. “Bush and Lott Blew It on Jeffords.” Boston Globe, May 25, 2001: A23.
Leibenstein, Harvey. “Bandwagon, Snob, and Veblen Effects in the Theory of Consumers’ Demand.” Quarterly Journal of Economics 64, no. 2 (1950): 183–207.
Lewin, Kurt. Field Theory in Social Science. New York: Harper & Row, 1945.
Lewis, Robert. “Every Cost-Cutting Decision Has a Consequence—Even the Easy Ones.” Advice Line InfoWorld, April 14, 2009.
Liedtke, Michael. “Yahoo-Microsoft Deal Set, Taking Aim at Google.” Associated Press, AP Technology, February 18, 2009.
Lipton, Eric. “Insider Projects Drained Missile-Defense Millions.” New York Times, October 11, 2008.
Lovelock, James E. GAIA. Oxford, UK: Oxford University Press, 1987.
Lucchetti, Aaron. “Buffett Defends Moody’s Managers.” Wall Street Journal, June 3, 2010: C1.
Malkiel, Burton G. A Random Walk Down Wall Street. New York: W. W. Norton, 1990.
Mann, Charles C., and Mark L. Plummer. Noah’s Choice: The Future of Endangered Species. New York: Alfred A. Knopf, 1995.
Maraniss, David. “Armey Arsenal: Plain Talk and Dramatic Tales.” Washington Post, February 21, 1995: A1.
McNeill, William. “The Changing Shape of World History.” History and Theory 34, no. 2 (1995): 8.
Meet the Press with Tim Russert. September 16, 2001.
Merton, Robert K. “The Unanticipated Consequences of Purposive Social Action.” American Sociological Review 1, no. 6 (1936): 894–904.
Mettler, Suzanne. Soldiers to Citizens. Oxford, UK: Oxford University Press, 2005.
Micklethwait, John, and Adrian Wooldridge. The Witch Doctors. New York: Times Books, 1996.
Milgram, Stanley. “The Small World Problem.” Psychology Today 2 (1967): 60.
Miller, David L. Introduction to Collective Behavior. Belmont, CA: Wadsworth, 1985.
Mitchell, Alison. “The Law of Unintended Consequences.” New York Times, March 18, 2001: WK3.
Mooney, Brian C. “Outside Donations Buoyed Brown.” Boston Globe, February 24, 2010: 6.
“A Mortgage Fable.” Wall Street Journal, September 22, 2008: A22.
“The MS-13 Threat: A National Assessment.” Federal Bureau of Investigation Headline Archives. Washington, DC: FBI National Press Office, January 14, 2008.
Mydans, Seth. “Across Cultures, English Is the Word.” New York Times, April 9, 2007.
National Aeronautics and Space Administration. “Aerosols May Drive a Significant Portion of Arctic Warming.” Washington, DC: NASA, April 8, 2009.
National Association of Home Builders. Developer’s Guide to Endangered Species Regulation. Washington, DC: Home Builder Press, 1996.
“NBC Universal and News Corp. Announce Deal with Internet Leaders AOL, MSN, MySpace and Yahoo! to Create a Premium Online Video Site with Unprecedented Reach.” Time Warner Newsroom, March 22, 2007.
Nyquist, J. R. “Putin’s Munich Speech.” Geopolitical Global Analysis, February 16, 2007.
The OASIG Study. London: Organizational Aspects of Information Technology, 1995.
Ortutay, Barbara. “The Facebook Age, Families Spend Less Time Together.” Boston Globe, June 16, 2009: A4.
Parke, Richard. “Galbraith and Vietnam: Kennedy, Unlike Bush, Had One Advisor Who Told Him What He Needed to Hear.” Nation 280, no. 10: 16–20.
Pascale, Richard. Managing on the Edge. New York: Simon & Schuster, 1990.
Pew Research Center. The Pew Global Attitudes Project. Washington, DC: Pew Research Center, July 2009.
Pfeffer, Jeffrey, and Gerald R. Salancik. The External Control of Organizations: A Resource Dependence Perspective. New York: Harper & Row, 1978.
Poole, Teresa. “Mao’s Frenzy of Mass Violence.” World Press Review 43, no. 8 (1996): 18–21.
Porcaro, Jeff. “Have GPS Will Travel (to the Right Place!).” www.gearreview.com/gpsreview98.asp.
“Pratt Says Tariff Cripples US Abroad.” New York Times, August 29, 1930: 1.
Radin, Charles A. “Rumors of Rape Fan Anti-American Flames.” Boston Globe, January 4, 2004.
Rajan, Raghuram. “Aid and Growth: The Policy Challenge.” Finance and Development 42, no. 4 (2005): 53–55.
Rajan, Raghuram G., and Arvind Subramanian. “What Undermines Aid’s Impact on Growth?” International Monetary Fund Working Paper WP/05/126. Washington, DC, 2005.
Reagan, Ronald. Address to the nation on national security, March 23, 1983.
Renn, Aaron M. “The Logic of Failure.” Urbanophile, February 7, 2009.
Rescher, Nicholas. Luck: The Brilliant Randomness of Everyday Life. Pittsburgh: University of Pittsburgh Press, 2001.
Reynolds, Neil. “Goodwill May Be Stunting African Growth.” AfricaFiles, December 24, 2009: 1.
Riechmann, Debra. “Bush Tries to Boost Support.” Los Angeles Daily News, October 26, 2005: 13.
Roberts, Paul, and Karen LaFollette. Meltdown: Inside the Soviet Economy. Washington, DC: Cato Institute, 1990.
Roe v. Wade. 410 U.S. 113, 153 (1973).
Russo, J. Edward, and Paul Shoemaker. “Managing Overconfidence.” Sloan Management Review 32, no. 2 (1992): 7–17.
Saul, Stephanie. “21st Century Babies.” New York Times, November 11, 2009: 24.
“Says Hoover Drove British Trade Away.” New York Times, October 14, 1932: 10.
Schiller, Herbert I., and Joseph D. Philips. Super State: Readings in the Military-Industrial Complex. Urbana: University of Illinois Press, 1970.
Schmitt, Eric. “The Reach of War: Abu Ghraib Report: Abuses at Prison Tied to Officers in Intelligence.” New York Times, August 26, 2004.
Seeley, Thomas D. “When Is Self-Organization Used in Biological Systems?” Biological Bulletin 202 (2002): 314–18.
Senge, Peter M. The Fifth Discipline: The Art and Practice of the Learning Organization. New York: Currency Doubleday, 1990.
Shane, Scott. Illusions of Entrepreneurship: The Costly Myths That Entrepreneurs, Inventors, and Policy Makers Live By. New Haven, CT: Yale University Press, 2008.
Shannon, Henry. “Store-to-Door Shopping.” Washington Post, November 25, 1999: E1.
Shell, Adam. “Tech Teetotalers Have the Last Laugh.” USA Today, January 11, 2001: B2.
Sherden, William. The Fortune Sellers. New York: John Wiley, 1998.
“The Six Degrees of Separation Is Now Three.” O2 press release, August 19, 2008.
“Skies Scrubbed Clean Have Contributed to Half Europe’s Recent Warming.” New Scientist, July 5, 2008: 16.
Smith, Adam. An Inquiry into the Nature and Causes of the Wealth of Nations. Chicago: University of Chicago Press, 1976.
Snyder, Glenn H. Alliance Politics. Ithaca, NY: Cornell University Press, 1997.
Specter, Michael. “Computer Speak: World, Wide, Web: Three English Words.” New York Times, April 14, 1996: 1–4.
Standish Group. The Chaos Report. Boston, MA: Standish Group, 1995.
Stockdale, Jim, and Sybil Stockdale. In Love and War. New York: Harper & Row, 1984.
“The Sunk-Cost Fallacy: Bush Falls Victim to a Bad New Argument for the Iraq War.” Slate, September 9, 2005: 1.
Swank, Cynthia Karen. “The Lean Service Machine.” Harvard Business Review 81, no. 10 (2003): 123–29.
Tenner, Edward. Why Things Bite Back: Technology and the Revenge of Unintended Consequences. New York: Alfred A. Knopf, 1996.
“Texting Is Not Talking.” Boston Globe, June 16, 2009: editorial page.
Thaler, Richard H., and Cass R. Sunstein. Nudge. New York: Penguin Books, 2008.
Thomas, Evan, Michael Hirsh, and Matthew Philips. “Rubin’s Detail Deficit.” Newsweek, December 8, 2008: 44.
Thomas, Lewis. The Medusa and the Snail. New York: Viking Press, 1979.
Thompson, Derek. “Blaming Things Not Named Greenspan for the Great Recession.” Atlantic Business, January 12, 2010: 2.
“The Torture Question.” Frontline transcript. Public Broadcasting System. pbs.org/wgbh/pages/frontline/torture/etc/script.html.
Tristram, Claire. “Has GPS Lost Its Way?” Technology Review 102, no. 4 (1999): 70–74.
Troy, Gil. See How They Ran: The Changing Role of the Presidential Candidate. Cambridge, MA: Harvard University Press, 1991.
“The Truth about Torture.” Newsweek, November 21, 2005: 35.
Tversky, Amos, and Daniel Kahneman. “Judgment under Uncertainty: Heuristics and Biases.” Science 185 (1974): 1124–31.
“U.S. Spends More on Its Military Than the Rest of the World Combined.” International Herald Tribune, September 16, 2006.
Van Munching, Philip. “The Devil’s Adman.” Brandweek 42 (2001): 42.
Vennochi, Joan. “A Vermonter All the Way.” Boston Globe, May 25, 2001: A23.
Voltaire. Zadig, and Other Tales. London: George Bell, 1907.
Von Clausewitz, Carl. On War. Princeton, NJ: Princeton University Press, 1976.
Walker, Martin. The Cold War. New York: Henry Holt, 1993.
Walt, Stephen. The Origins of Alliances. Ithaca, NY: Cornell University Press, 1990.
Walters, Laurel. “Surge of US Education under GI Bill Recalled by D-Day Commemorations.” Christian Science Monitor, June 8, 1994: 1.
Waltz, Kenneth. Theory of International Politics. Reading, MA: Addison Wesley, 1976.
Ward, John. “The Macintosh Licensing Conundrum.” Vectronic’s Apple World. www.vectronicsappleworld.com/macintosh/licensing.html.
Weaver, Warren. “Science and Complexity.” American Scientist 36 (1948): 536–44.
Wilson, Edward O. Consilience: The Unity of Knowledge. New York: Alfred A. Knopf, 1998.
Wood, Alistair. “International Scientific English: Some Thoughts on Science, Language, and Ownership.” Science Tribune 2, no. 4 (1997): 71–83.
“The World’s Most Dangerous Gang.” National Geographic special, May 23, 2007.
Xiuyuan, Lu. “A Step Toward Understanding Popular Violence in China’s Cultural Revolution.” Pacific Affairs 67, no. 3 (1994): 533–63.
Zimbardo, Philip. The Lucifer Effect: Understanding How Good People Turn Evil. New York: Random House, 2007.
Index
Abortion issues, 20 –21, 114 Adams, John, 31 Adaptation. See Perverse adaptation Adobe Acrobat software, 84 – 85 Afghanistan, 90, 118, 128 –29, 136, 163, 172 Africa, 22 –24, 90, 93, 128, 161 African American issues: civil rights movement, 59 – 60; forced busing, 80, 83; resegregation, 79 – 83; urban renewal and, 149 Aid to Families with Dependent Children Act, 2 Ali, Rahmat, 151 Alliance formation, 90 – 91 al-Qaeda, 49, 128 – 29, 154, 167 Alsop, Stewart, 64 Amazon.com, 36 American Legion, 131–32 American Recovery and Reinvestment Act, 154 – 61 American Revolution, 53 – 54, 60 American Society for Reproductive Medicine, 106 American Society of Civil Engineers, 145 Anti-Ballistic Missile (ABM) Treaty, 126, 127–28 Antisegregation laws, 59 – 60 Apple Computer, 68 –72, 97–99 Argentina, 172 –73 Aristotle (Greek philosopher), 7 Arms race, 56 – 57 Arrogance of Power, The (Fulbright), 102 Arthur, Brian, 7– 8
Asian carp, 143 – 46 Atomic Energy Agency, 77 Axelrod, Robert, 63 – 64 Ayling, Julie, 122 –24 Babson, Roger, 33 Balance-of-power theory, 89 – 94 Balancing forces: balance-of-power theory and, 89 –94; homeostasis as, 7, 87– 89, 100; lessons from, 101–2; in nature, 87– 89; online, 96 – 98; in politics, 94 – 96; transformative innovations and, 98 –101 Balz, Daniel, 2 Bandwagon effect: de-facto standard and, 68 –71; elections and, 74 –75; lingua franca, 75 –79; QWERTY keyboards and, 71–72; resegregation and, 79 – 82; triggering, 71–74; VCR vs. VHS battle and, 72 –74. See also Lock-in concept Bartholomew, Robert, 32 Bartlett, Bruce, 156 Bastiat, Frederic, 5, 6 Bennett, Drake, 15 Bernanke, Ben, 18, 35 Best and the Brightest, The (Halberstam), 56 Bhutta, Neil, 16 Bickford, Eric, 79 – 80 Block, Susan, 29 – 30 Blogging, 95, 118 Blowback, 3– 4, 128 –29 Boorstin, Daniel, 1 Born, Brooksley, 17–18
Boston Globe, 15 Boston Redevelopment Authority (BRA), 149 Braswell, Walter E., 127 Briggs, Bill, 97 Brown, Lester, 13 Brown, Scott, 94 –96 Brownstein, Ronald, 2 Bubbles: California gold rush bubble, 38; chain reaction in, 59; dot-com bubble, 15, 36 –37; housing bubble recession, 15 –19, 35 –36; tulipomania, 37– 38 Buffet, Warren, 17 Bush, George H. W., 102, 126 Bush, George W., 5, 19 –20, 31; IEDs and, 49; military protests and, 40 – 41; NMD and, 126 –27; political agenda of, 95 –96; popularity of, 93, 113; prisoner abuse and, 135 –37; U.S. relations and, 64; war on terror and, 91 –92, 102, 134 Business Week, 39 – 40 Byrd, Robert, 92 California gold rush bubble, 38 Canada, 24 –25, 58, 78, 121, 122, 145 Canner, Glenn B., 16 Cannon, Walter, 7 Cantwell, Maria, 17 Carbon dioxide emissions, 12, 25, 87, 169 Carroll, James, 93 Carson, Thomas, 118 Carter, Jimmy, 102 Queen Catherine de Medici, 45 – 46 Catholics, 45 – 46 Cattle mutilation rumor, 33 Cell phones, 115, 116 Central Intelligence Agency (CIA), 127 Chain reactions. See Domino effect Chakravorti, Bhaskar, 85 Chaykovsky, Nikolay, 103 – 4 Cheney, Richard, 93, 137 Chicago Sanitary and Ship Canal, 144 – 45 China, 15, 19, 44 – 45, 55 –56, 60, 64, 90, 101–2 Christensen, Clayton, 99 –100 Civil rights movement, 59– 60 Civil War (U.S.), 54 Clean Air Act, 4, 25 –26, 112
Clinton, Bill, 18, 64, 95, 102, 116, 126, 134 Cold War, 125, 128 Collective behavior, 7, 31, 56, 58 – 60. See also Domino effect Collier, Paul, 161– 62 Collins, Larry, 151 Colmery, Harry, 132 Combat retreat rumors, 33 Coming into being. See Emergence Commodity Futures Modernization Act, 17–18 Common second language. See Lingua franca Communism, 42, 139, 141 Community Reinvestment Act (CRA), 15 Compensation schemes, 105 – 6 Conant, James, 133 Congressional Budget Office (CBO), 159 Congressional Research Service, 156 – 57 Conley, Colleen, 95 Consuming Industries Trade Action Coalition, 19 Conway, Chris, 113 Coral reef ecosystem, 146 Cox, Michael, 145 Crazes, 47– 48 Credit default swaps, 17–18 Crowd: A Study of the Popular Mind, The (Le Bon), 7 Crystal, David, 76, 78 –79 Darwin, Charles, 6, 8, 14 David, Paul, 7 Dayaks’ massacre, 42– 44 Daydream Believers (Kaplan), 93, 101 Death and Life of Great American Cities, The (Jacobs), 8, 147 Decision-making ability, 166 – 68 De-facto standard, 68 –71 Deficit spending, 156 –57 Department of Defense, 49, 56, 78, 107, 125 Deposit insurance, 111 Derivatives, 16 –18 DHIATFNSOR keyboard, 71 Digital Equipment Corporation (DEC), 96 –97 Dimethyl sulfide (DMS), 87
Index Disturbed equilibria, 7– 8 Domino effect: bubbles and, 35 – 38; chain reactions and, 7, 30 –31; crazes and, 47– 48; fads and, 38 – 40; herd mentality and, 7, 31 –35; lessons from, 47– 48; massacres and, 42 – 46; protests and, 40 – 41; rape and, 29 – 30; riots and, 46; witch hunts and, 41– 42 Donohue, John, III, 20 Dormann, Knut, 135 Dot-com bubble, 15, 36 – 37 Dow Jones Industrial Average, 33, 37 Dunn, John, 60 Dutch Disease, 160 – 64 Dvorak, August, 72 Dvorak Simplified Keyboard, 72 Earth Systems Science, 87 Economic Consequences of the Peace, The (Keynes), 138 Economy/economic issues: abortion and, 20 –21; African textile industry, 22 –24; balancing forces in, 89; Dutch Disease, 160 – 64; ethanol and, 11–15; gross domestic product and, 154, 160; housing bubble recession, 15–19; the Internet, 26 –27; railroad gauge specifics, 21–22; steel industry and, 19–20; stimulus plan and, 154 – 61; sunk-cost fallacy and, 60 – 63; turning points in, 166; Viagra drug and, 24 –25 Ecosystem interventions: Asian carp and, 143 – 46; disturbed equilibria in, 7; extinction and, 14, 76, 111; flora and fauna in, 6; foreign aid and, 160 – 63; lessons from, 163 – 64; meddling and, 168 – 69; religious tension and, 150 –54; stimulus plan and, 154 – 61; understanding, 171; urban renewal, 146 –50, 163 Egorov, Sergey, 103 Egypt, 64 – 65, 90 Eisenhower, Dwight D., 125 Elections and bandwagon effect, 74 –75 Elias, Norbert, 123 Emergence (coming into being): al-Qaeda and, 128 –29; defined, 7; GI Bill and,
131–34, 140; Indian outsourcing business and, 129 –31; lessons from, 140 – 42; military-industrial complex and, 124 –28; MS-13 gang and, 121–24; postwar Germany, 134 – 40; prisoner abuse and, 134 –37 Endangered Species Act (ESA), 111 Energy Policy Act, 11, 12 Energy Policy and Conservation Act, 112 English, Phil, 113 English as a Global Language (Crystal), 76 English as lingua franca, 77–79 Environmental Protection Agency (EPA), 144 Ethanol, 11–15 European Union, 19, 77 Evolution of Cooperation, The (Axelrod), 63 Ewing, Thomas, 18 External Control of Organizations, The (Pfeffer, Salancik), 99 Extinction concerns, 14, 76, 111 Facebook (online social networking), 26, 117 Fads, 38 – 40 Failed initiatives, 165 – 66 Fay, George R., 136 –37 Federal Bureau of Investigation (FBI), 121, 137 Federal Reserve, 2, 4, 15 –18, 35, 156 Feuds, 51–52 Fifth Discipline, The (Senge), 7, 100 Flora and fauna in ecosystems, 6 Food and Drug Administration (FDA), 144 Forced busing, 80, 83 Foreign aid, 160–63 Forester, Jay, 7 France, 45, 53, 91, 138–41, 154 Frazer, Garth, 24 Freedom at Midnight (Lapierre, Collins), 151 Freedom Riders, 60 Freeze, Mike, 144 French and Indian War, 53 Friedman, Milton, 1 Fromkin, David, 154, 163 Fulbright, J. William, 102 Gadgetry, 115–18 Gaia Hypothesis, 87
Galbraith, John Kenneth, 33 Gambler’s fallacy, 167– 68 Gaming the system, 103 – 7 Garrity, W. Arthur, Jr., 80, 82 Gatchyell, Howard, 23 Gates, Bill, 68 – 69, 84 Gavin, Robert, 89 General Theory of Employment, Interest, and Money, The (Keynes), 156 Geneva Convention, 135, 137 Germany, 18, 30, 64, 77–78, 90 –91, 137– 41 G.I. Bill of Rights, The (Greenberg), 132 GI Bill, 2, 131–34, 140 Gingrich, Newt, 113 Gladwell, Malcolm, 79, 109 Glaeser, Edward, 16 Glass-Steagall Act, 2, 17 Global Positioning System (GPS), 2, 4, 97, 107– 8, 116 Global warming, 11, 25 –26 Gomes-Casseres, Benjamin, 97 Goodwill Industries, 22 –23 Google, 96 –98 Gorbachev, Mikhail, 64 Gottlieb, Joshua, 16 Government Accountability Office (GAO), 3 Gramm, Phil, 18 Gramm-Leach-Bliley Act, 17 Grantham, Jeremy, 18 Graphic user interface (GUI), 68 –70 Great Britain, 150–54 Great Depression, 2, 15, 18 –19, 32–33, 47, 58, 156 Great Lakes ecosystem, 143 – 46 Great Proletarian Cultural Revolution, 44 – 45 Greece, 91 Greenberg, Milton, 132 Greenhouse gas emissions. See Carbon dioxide emissions Greenspan, Alan, 2, 15, 17, 18 Grindley, Peter, 70, 84, 85 Grodzins, Morton, 79 Gross domestic product (GDP), 154, 160 Guantanamo Bay detention base, 136, 137 Guare, John, 14 Gulf of Tonkin Resolution, 55 –56 Gyourko, Joseph, 16
Haiti, 13, 163, 164 Halberstam, David, 56 Hammer, Michael, 47– 48, 100 Harp seal harvesting, 24 Hatfield-McCoy feud, 51–52 Head-count measures, 105 Hell’s Angels motorcycle gang, 46 Herd mentality, 7, 31–35 Herrick, John J., 56 Hindus, 52, 150 –53 Hitler, Adolf, 138 – 40 Hobbes, Thomas, 123 Homeostasis (balancing forces), 7, 87– 89, 100 Hoover, Herbert, 141 Housing bubble recession, 15 –19, 35 –36 Hula-hoop fad, 38 Human Rights Watch, 44, 135 –36 Humes, Edward, 131 Humility, 170 Humvee problem, 49 –50 Hussein, Saddam, 115, 154 Hutchins, Robert, 133 IBM/DOS platform, 68 IBM/Windows platform, 69 –71 IHS Global Insights, 157 Illusions of Entrepreneurship (Shane), 166 Improvised explosive devices (IEDs), 49 Increasing Returns and Path Dependency in the Economy (Arthur), 7– 8 India, 52, 101–2, 129 –31, 150 –54 Industrial Dynamics (Forester), 7 Industry standards, 70 Information technology (IT), 71, 129, 166 Info-World, 165 Innovator’s Dilemma, The (Christensen), 99 Institute for Atmospheric and Climate Science, 26 International Monetary Fund (IMF), 161, 172 –73 International Trade Commission, 19 Internet: balance of nature on, 96 –98; blogging on, 95, 118; English language and, 78; Indian outsourcing business and, 130; New Economy panic and, 34 –35; news on, 118; social chain reactions
LaFollette, Karen, 104 Laibson, David, 18 Lane, Timothy, 110 Language lock-in, 75 –79 Lapierre, Dominique, 151 Laser applications, 115 –16 Lashkar-e-Taiba terrorists, 52–53 Latin as lingua franca, 78 Leading Change (Kotter), 100 Lebanon, 60, 93 Le Bon, Gustav, 7, 32 Leibenstein, Harvey, 74 –75 Levitt, Arthur, 18 Levitt, Bill, 134 Levitt, Steven, 20 Lewin, Kurt, 100 –101 Lingua franca, 75 –79 Lock-in concept: defined, 6, 7– 8, 67, 70, 75; lessons from, 82 – 85; lingua franca and, 75 –76, 79; resegregation and, 79 – 83; triggering, 71–74 Los Angeles Times, 2, 122 Lovelock, James, 87 Loya Jirga tribe, 172 Lucifer Effect: Understanding How Good People Turn Evil, The (Zimbardo), 136 Lugar, Richard, 18 Mackay, Charles, 38 Madurase people, 43 – 44 Major initiatives, 169 Malkiel, Burton, 37 Malone, Jim, 144 Management fads, 38 – 40 Managing on the Edge (Pascale), 39 Manly, Fred, 60 – 61 Mann, Charles, 111–12 Marshal, James, 38 Marshall, Alfred, 7 Marshall Plan, 141 Martian attack hoax, 32 Mason, Daniel, 51 Massacres, 42 – 46 Matsushita company, 72–74 McCain, John, 17, 18, 135, 137 McCarthy, Joseph, 42 McWhorter, John, 78
Meddling, 168 –75 Meltdown: Inside the Soviet Economy (Roberts, LaFollette), 104 Merton, Robert, 5 – 6, 165 Metcalfe, Robert, 70 –71 Metropolitan Area as a Racial Problem, The (Grodzins), 79 Mettler, Suzanne, 134 Mexico, 13, 38 Micklethwait, John, 39 Milgram, Stanley, 14, 26 –27 Military-industrial complex (MIC), 124 –28 Military protests, 40 – 41 Miller, David, 38 60 Minutes II, 135 Mitchell, Alison, 112 Moral hazard, 110 –11 Morgenthau Plan, 140 – 41 Morris, Dick, 96 Mountbatten, Louise, 151–52 MS-13 gang, 121–24 Mujahideen, 128 –29 Muslims, 150 –54 MySpace (online social networking), 117 NASDAQ index, 36 –37 National Aeronautics and Space Administration (NASA), 25–26 National Association of Home Builders, 111 National Bureau of Economic Research, 154, 156 National Flood Insurance Act, 110 National Highway Traffic Safety Administration (NHTSA), 109 National Missile Defense (NMD), 126 –27 National Missile Defense Act, 126 National Sports Safety Organization, 110 Native Americans, 30, 38 Nazi Party, 138 – 40 Nepal, 59 Network effect, 70 –71 New Economy panic, 34 –35 Nike (shoe company), 31 Nitrous dioxide emissions, 12 Nixon, Richard M., 102 NMD. See National Missile Defense
Noah’s Choice: The Future of Endangered Species (Mann, Plummer), 111–12 North Atlantic Treaty Organization (NATO), 101 North Korea, 64, 92 – 94, 101, 102, 125, 126 Nuclear Test Ban Treaty, 64 Nuclear weapons, 56 –57 Nudge (Thaler, Sunstein), 171 O2 (phone carrier), 26 –27 Obama, Barack (administration), 94 –95, 102, 114, 157 Oil embargoes, 13 –14 On Meddling (Thomas), 168 On the Origin of Species (Darwin), 8, 14 Organizational Aspects of Information Technology, 166 Organized complexity, 6 –7 Origins of Alliances, The (Walt), 90 PACs. See Political action committees Pakistan, 52 –53, 90, 129, 150 –54, 163 Park, Joseph, 37 Parke, Richard, 55 Parks, Rosa, 59 – 60 Pascale, Richard, 39 Peace to End All Peace, A (Fromkin), 154 Pendleton Act, 113 People’s Liberation Army (PLA), 45 Peretti, Jonah, 31 Personal computer (PC) industry, 68 –71 Perverse adaptation: gadgetry with, 115 –18; gaming the system, 103 –7; lessons from, 118 –19; loopholes and, 111–15; moral hazard and, 110 –11; overview, 6, 8 –9; Titanic effect and, 107–10 Pfeffer, Jeffrey, 99 Pfizer pharmaceutical company, 24 –25 Philips, Steven, 110 Pimentel, David, 11 Plankton-climate connection, 87– 88 Plummer, Mark, 111–12 Political action committees (PACs), 113 –14 Politics, 74 –75, 94 –96, 113 –14 Poole, Teresa, 45 Popov, Andrey, 103 Poverty Action Lab (PAL), 174 –75
Schmidt, Eric, 96 Scholastic Aptitude Test (SAT), 4, 114 –15 Scully, John, 69 Securities Exchange Commission (SEC), 18 Seeley, Thomas, 123 Senge, Peter, 7, 100 Shane, Scott, 166 Sheehan, Cindy, 40 – 41 Shellow, Robert, 46 Shindell, Drew, 25 –26 Shoemaker, Paul, 170 Sholes, Christopher, 71 Singh, Hari, 152 –53 “Six degrees of separation,” 14, 26 Skype (online social networking), 117 Skywalks, 149 – 50 Small scale projects, 173 –75 Smith, Adam, 4 –5, 8, 88, 90 Smith-Connally Act, 113 Smoot-Hawley Tariff Act, 19, 58 Snyder, Glenn, 96 Social chain reactions, 30 –31 Social mechanism of unintended consequences, 1, 6 –8 Solomont, Alan, 75 Sony company, 72 –74 Soros, George, 113 South Korea, 94, 101 Soviet Union, 64, 101, 103 – 4, 125, 128 Spanish-American War, 3, 54 –55 Specter, Michael, 78 Sport utility vehicles (SUVs), 109, 112 Standards Strategy and Policy (Grindley), 70 Star Wars missile defense program, 124 –26 Steel industry, 19 –20 Stephenson, George, 22 Stiglitz, Joseph, 17, 173 Stimulus plan issues, 154 – 61 Stockdale, James, 56 Strategic Defense Initiative, 57 Subramanian, Arvind, 161– 62 Suicide bombs, 32, 53 Sunk-cost fallacy, 60 – 63 Sunstein, Cass, 171–72 Swank, Karen, 174 Sweden, 77, 172, 175 Swift Boat Veterans for Truth, 113
Syria, 60, 90, 125 System dynamics, 7 Szilard, Leo, 7 Taliban, 90, 117–18 Tea Party, 94 –96 Telecommunications Act, 130 Temin, Peter, 18 Tenner, Edward, 109 Terrorism. See Islamic terrorists; War on terror Texting technology, 116, 117 Thaler, Richard, 171–72 Theory of International Politics (Waltz), 89 –90 Thomas, Lewis, 14, 168 – 69, 171 Timur’s invasion, 151 Tipping Point: How Little Things Can Make a Big Difference, The (Gladwell), 79 Tipping point dynamic, 79 – 80 Titanic effect, 107–10 Tobacco advertising bans, 114 Trade wars, 57–59 Traditional Chinese Medicine (TCM), 24 –25 Transformative innovations, 98 –101 Treaty of Versailles, 18, 137, 138, 140 Trilla, Lester, 19 –20 Truman, Harry S., 141 Tulipomania, 37–38 Turkey, 30, 37, 44, 93, 101 Tversky, Amos, 166, 170, 171 Twitter (online social networking), 26, 117–18 Unintended consequences: as beneficial, 2; blowback and, 3 – 4; decision-making ability and, 166 – 68; failed initiatives and, 165 – 66; meddling and, 168 –75; social mechanisms of, 1, 6 –8 United Nations (UN), 77 Urban renewal, 146 –50, 163 VCR vs. VHS battle, 72 –74 Viagra drug, 24 –25 Vicious cycles: arms race and, 56 –57; collective behavior in, 59– 60; feuds and, 51–52; Humvee problem, 49 –50;
reinforcing feedback and, 49 –51, 63 – 65; sunk-cost fallacy and, 60 – 63; trade wars and, 57–59; war and, 52–54 Vietnam War, 55 –56, 62– 63 VisiCorp, 68 – 69 Volcker, Paul, 4 Volga Metal Works, 103 Von Clausewitz, Carl, 33, 53 Von Hippel, Frank, 24 –25 Wall Street Journal, 16, 31, 77, 156 Walt, Stephen, 90 Waltz, Kenneth, 89 –90, 123 War and vicious cycles, 52 –54 War of the Worlds, The (Wells), 32 War on terror, 91–92, 102, 134 Washington, George, 30, 53 Washington Post, 2, 40, 62, 77 Wealth of Nations, The (Smith), 4 –5, 8, 88 Weaver, Warren, 6 –7 Welles, Orson, 32 Wells, H. G., 32 White flight dynamic, 79 – 80 Kaiser Wilhelm III, 139 Wilson, Edward O., 165 – 66 Wisdom of the Body, The (Cannon), 7 Witch Doctors, The (Micklethwait, Wooldridge), 39 Witch hunts, 41– 42 Wooldridge, Adrian, 39 World Bank, 13, 23, 77 World Health Organization (WHO), 77 World War I (WWI), 3, 18, 58, 64, 132, 137, 139 World War II (WWII), 2, 64, 67, 79, 124 –25, 132, 138, 140 Wozniak, Steve, 69 –70, 99 Yahoo, 97–98 Yazgur, Eugene, 51 Yeni Safak journal, 29–30 Y2K problem, 131 YouTube, 97 Zambia, 23 Zimbardo, Philip, 136 Zoellick, Robert, 13
About the Author
William A. Sherden is an adjunct professor at Brandeis University’s International Business School. He spent most of his career as a management consultant working with major international corporations. Sherden has published two books and a number of business-related articles. His first book, Market Ownership, was chosen as Fortune’s alternative book of the month. His second book, The Fortune Sellers, has been favorably reviewed by leading newspapers and journals, including the New York Times and the Wall Street Journal. William Sherden lives with his wife in Boston, Massachusetts.