Marshall McLuhan once quipped, "I don't know who discovered the water but it sure wasn't a fish." Typical of his nutty wit, although I've heard that he actually paraphrased it from a Chinese proverb. And come to think of it, this also reminds me of another ancient dude who liked deep thoughts. Isn't this akin to the conundrum that Plato's cave presents? Indeed, the meta-challenge of being aware of our context—and how it influences and conditions our perspectives—is a perennial one.
To that end, I wrote a working paper on the use of learning journeys, mainly for a corporate audience. Learning journeys, incidentally, are one of those tools for getting out of the proverbial "box." This is a draft version so comments and suggestions are welcome.
This paper also explains my blog silence lately.
But what I want to know is this: where do you swim?
When do "natural" disasters become "unnatural" ones? Without question, the line between these two phenomena is getting blurred and increasingly unhelpful. Human activities precipitate or exacerbate natural weather patterns and events—and vice versa.
Famines, droughts, and flooding often have social and political aspects, even causes, with linkages to everything from poor farming practices to the consequences of prolonged civil war. The exploitation of natural resources makes regions more vulnerable to weather hazards. In the developed world, the collateral damage of hurricanes, and their more deadly companion flooding, has increased because individuals and property developers are making stupid choices like building near coastlines and other places prone to the cycles of nature. In the poor world, more people are being killed not because of peoples' choices (they have fewer options) but because of the poor choices of governments and international donors, not to mention the fact that disaster protection is just not a priority when millions are hungry. For instance, poor housing standards in earthquake zones are the cause of many needless deaths; and deforestation and poor city planning trigger massive flooding and landslides. (Yup, turns out those trees have many purposes, including the stabilization of soil.)
The reverse is also true, if less frequent: large-scale natural disasters, the kind that have nothing to do with human agency, such as the impact of asteroids and the enormous volcanic eruption of Krakatoa in Java, Indonesia, have been identified as initial triggers of a cascade of significant social, economic and political changes. Nature, in the end, has the final trump card. Two fascinating histories come to mind, both recent: Catastrophe: An Investigation into the Origins of the Modern World by David Keys, 2000 (he also had a BBC TV series); and Krakatoa: The Day the World Exploded: August 27, 1883 by Simon Winchester, 2003.
Thomas Homer-Dixon, the author of The Ingenuity Gap, was one of the first researchers to highlight these linkages between the natural and unnatural, arguing that we need to look at these things systemically, framed in terms of "environmental security." This was, and still is, a counter-intuitive idea for most policy-makers, but the concept is likely to ascend as we start seeing just how important our ecological context is to a smoothly running, prosperous society. Incidentally, Homer-Dixon's work was in turn taken up by Robert Kaplan in his famous "The Coming Anarchy" article in The Atlantic Monthly, which caused a huge stir in the mid-1990s.
The good news is that earth systems scientists, a new breed of researchers armed with the very best of computing technology, will make important and timely contributions (we hope) to understanding these relationships. Meanwhile, we still have a formidable educational and awareness problem. Human perception is notoriously bad at intuitively understanding "lag times" and exponential change—two classic traits of complex adaptive systems, whether it be an ocean or the economy. For instance, it took twenty years for scientists to understand the cause-and-effect of our depleting ozone layer, long after it started becoming a problem. Public perception also has built-in lag times; it often takes years before an important issue builds a head of steam and crosses the threshold of mainstream interest in an enduring way. I don't have polling data on this, but it might be safe to say that public perception about the "weirding of the weather" is now tipping—regardless of whether this is "real" or not from a scientific perspective—as exemplified by Hollywood's sensational interpretation of the "rapid climate change" scenario in The Day After Tomorrow (the trailers look ridiculous) and this recent article, "Unnatural Weather, Natural Disasters: A New U.N. Focus" by Elizabeth Olson, The New York Times. Also see The Change in the Weather: People, Weather, and the Science of Climate by William K. Stevens.
As Olson reports: "The cost of natural disasters and their negative effects on development have attracted the attention of the World Bank, which no longer thinks of disasters as a purely humanitarian issue. Natural disasters can decimate a country's economy. Venezuela's 1999 mudslides cost the country $3.2 billion. Honduras lost 41 percent of its annual gross domestic product when Hurricane Mitch barreled through in October 1998, according to the World Bank."
But why do more "unnatural" things happen in the developing world? We in the developed world take for granted the many subsidies, institutions and tools at our disposal that protect us from bad weather. These subsidies range from investments in R&D focused on weather forecasting to good transportation networks to government disaster assistance. Access to things like insurance also cushions many people from the worst effects of weather gone wrong. The developing world doesn't have these.
Fortunately there have been "sweeping improvements in forecasting" which have "made it possible to notify people of impending disasters in time to evacuate them or shore up their defenses."
"'Five-day forecasts today are as good as two-day forecasts were about 20 years ago'... And they can be broadcast almost instantaneously, almost anywhere in the world. 'This is not just a natural phenomenon and there's nothing to do about it,' said Margaret Arnold, a hazard management official at the bank. 'There is a lot you can do.'"
Unfortunately, many of our institutions and systems are still not up to the task of doing something about this, but the causes are quite complex and it's not just a matter of bringing in better technology.
"Dr. McPherson noted that developing countries often lacked preparedness plans. 'Part of it is lack of money and lack of experience,' he said, 'and it's lack of political will.' Many politicians do not understand what modern forecasting can do, he said, and some cultures are fatalistic about such catastrophes...
In research, he said, every nation would benefit from more systematic studies and observations of weather phenomena. The world needs to know more about 'how and why natural hazards happen, and how they can escalate into disasters,' he said."
There are also a host of bureaucratic and organizational problems in how the international system responds to disasters. For instance, going back to our blurred categories, the distinction between "natural" and "man-made" cleaves disaster relief systems in unhelpful, if not counter-productive and damaging ways.
A project I'm helping to launch called "Humanitarian Futures" aims to improve these organizations' capacity to adapt to this shifting context. Dr. Randolph Kent, an experienced humanitarian hand and now a senior researcher at King's College London, is leading this project. A brief summary of the project can be found here. Kent has written some excellent articles on this topic; I recommend "Humanitarian Failures, Adaptive Failures."
One of the hypotheses of the project is that in the future we will have more "unnatural" disasters, and these will occur not just in the developing world but in the rich countries as well. Witness the extreme temperatures in Europe last summer, particularly in France. Witness the wild flooding of Prague. Witness the rapid depletion of water tables in California, huge forest fires, and concerns about the West Nile Virus. Another hypothesis is that, as in the distant past, we may be confronted with some truly global disasters with global impacts. At present, there is a serious "ingenuity gap", especially of the institutional kind. Our systems, tools, and thinking are not prepared for this kind of large-scale, unconventional disaster. (The OECD recently did a study of this.) Wish us luck.
"Fire in a Ponderosa Pine Forest"
Forest fire burning out of control in a pine forest on the Mescalero Apache Indian Reservation in New Mexico.
© Raymond Gehman/CORBIS
Maureen Dowd, a columnist for The New York Times, is famous for mixing over-the-top criticism verging on satire with humorous irreverence.
Regrettably, this is in contrast to the recent reporting in this paper, which has shied away from thorough criticism and the inclusion of dissenting views. See "Now They Tell Us" by Michael Massing in The New York Review of Books (Volume 51, Number 3, February 26, 2004) for a piece documenting the reporting biases in the Times leading up to the Iraq invasion. Many blogs have covered this ground as well—bless them all and bless the blog—but this is quite a detailed and thorough treatment.
Now, what to make of Dowd's writing? I find her entertaining, daring, sometimes witty but bordering on trite, silly, hyperbolic, and cutesy. Granted, this approach is probably a stylistic strategy; otherwise the truth of her barbs, if taken seriously, might never have a chance to be said. She is clearly playing the Court Jester, the Foil. I'm sure she incenses right wingers, especially hard-core Bushies: the temptation to find ways of "getting rid of her," I'm sure, is very great. And this administration has had a rather scary record of punishing people for speaking out. For example: Former Ambassador Joe Wilson's experience, which he writes about in his new book The Politics of Truth: Inside the Lies that Led to War and Betrayed My Wife's CIA Identity: A Diplomat's Memoir.
Anyway, perhaps because my expectations were low, I was struck by her quoting Boorstin's snippet of wisdom, especially timely now. These words were found in the closing sentences of "Clash of Civilizations", May 13, 2004.
"The hawks, who promised us garlands in Iraq, should have recalled the words of the historian Daniel Boorstin, who warned that planning for the future without a sense of history is like planting cut flowers."
History is such a fertile place for learning and new ideas, but it can also constrain and mislead people. The past is not always a reliable predictor of the future. While these are huge generalizations, Europe is often said to have too much history, and America too little, which is possibly why these two perspectives are important complements to each other—and why the current disaccord is so worrisome. Unencumbered by too much past, new things are easier to start in America. This freedom from the past is America's great virtue and asset. But serious and long-lasting mistakes can be made if the past is not understood as part of the context; indeed, the thing we call "history" is often the only place where we can see slower-moving drivers at work, drivers that may pop up to the surface and bite you in the ass. So good planners, good learners, need to always look both forwards and backwards. A rule of thumb we have: for whatever time interval you are looking ahead, you need to look backwards twice as long.
America is now living with the costs of not paying attention to history. And moments like these only accentuate certain proclivities, certain cultural predispositions. While this is true in the specific case of Iraq, a case has been made that this is also a broader "asymmetric weakness" of the US: the difficulty of having a Long View, the difficulty of understanding the power of history beyond its borders—especially in contrast to places like China, which is more culturally accustomed to thinking ahead in 100-year terms. (This was something we identified when doing national security scenarios for DARPA in the summer of 2001.)
Switching the conversation to Boorstin, the person Dowd quotes, I think he should be mandatory reading. He recently passed away at the age of 89, which is partly why the quote jumped out at me. The Economist's obituary featured him as one of the last great amateurs. This would have pleased him greatly, since his work (The Discoverers, The Creators) documented how many of the world's most important developments and discoveries were made by amateurs—not the experts and specialists. He wrote extensively about America's exceptionalism on this front as well.
This in turn reminded me of a saying I love: "The Titanic was built by experts, whereas the Ark was built by amateurs." We have too much Titanic thinking these days, and not enough amateur wisdom. How can this emerge in our highly specialized, knowledge-intensive world? Is it possible to be an amateur in things like the biosciences? (Some people tell me that bio-hacking is just hitting the "garage" startup stage; that is, it's possible to do with basic tools and tech. But the knowledge barriers are still quite high, no?) Or in finding solutions to climate change? Where does amateur wisdom, the Beginner's Mind in us all, fit into the collective problem-solving process? The phenomenon of blogging may be part of the answer, filling this need, the precursor to future innovations of this sort.
I think we are at a point where the existing knowledge paradigm, which prized rationality and vertical specialization over other ways of knowing, has reached its logical limitations. The most important breakthroughs and insights are now coming at the intersection of disciplines and domains. Horizontal or "lateral" knowledge, intuition, and wisdom (dare I say it?) are starting to be valued, that is, the kinds of skills that link, connect, and make sense of patterns beyond the traditional parameters of a problem. So stay tuned. Start looking for new knowledge strategies, and other ways of knowing, emerging to the forefront and increasingly gaining acceptance. This is what's needed now.
For more about the role of private contractors in the Iraqi prisoner abuse scandal, listen to Security Analyst Peter Singer on NPR's Fresh Air (May 11, 2004):
Singer, an analyst at The Brookings Institution, is the author of the book Corporate Warriors: The Rise of the Privatized Military Industry. He'll discuss the use of private military contractors in Iraq, especially in light of the abuses at the Abu Ghraib prison where civilian military contractors were involved in interrogations. Singer is an Olin Fellow in the Foreign Policy Studies Program at the Brookings Institution and coordinator of the Brookings Project on U.S. Policy Towards the Islamic World.
CACI and Titan are the two companies named specifically. Titan is currently being bought by Lockheed Martin. There are more firms, of course, but these are the names I picked up.
First thing that I thought interesting: this private military industry didn't really exist, at least in this incarnation, until the 1990s. (Mercenaries pop up throughout history like bad weeds.) Why?... Perhaps to fill certain post-Cold War vacuums; and commerce follows the money: this is incredibly lucrative work, with many multi-million-dollar contracts for everything from IT services, logistics, and analytic capabilities to training the Iraqi Army and interrogation and translator services.
About 15,000-20,000 private contractors are now in Iraq. The US has never outsourced so many responsibilities and roles to private contractors before. This phenomenon is causing many concerns and raising critical questions. Here are some of Singer's worries:
Why is the US outsourcing these critical jobs to the private sector? Singer's theory: it was probably a function of a lazy bureaucracy, a way to get around all of the red tape within the current military organizational setup. They got a big check from Congress, and this was the easiest way to spend it. While outsourcing is often used to save money, improve efficiencies and quality of service—all good things, and probably needed in the military—it's not clear yet that this was the case. The early evidence suggests these good outcomes have not materialized, especially given the fact that these incidents may undermine the entire war project!
Where do these contractors fit into the chain of command? How are they held accountable? What kind of oversight do they have? Not very much, it turns out. This is especially troubling since contractors are not subject to the court-martial system. None of the private contractor suspects have been charged with anything yet because of certain legal loopholes and gaps. Contractors, for instance, are provided special immunity from prosecution for acts committed as part of their official duties. Since when do official duties include raping a prisoner? In a nutshell, these companies' activities are largely unregulated. Some of the contractors have unsavory backgrounds, like former apartheid police and individuals associated with Chile's Pinochet regime. These legal gaps are not a problem until things go wrong. But this isn't the first time things have gone wrong. Recent history in the Balkans made this problem very clear when a private contractor was caught committing major sexual crimes (raping and enslaving refugee women, the evidence being his own videotapes), yet he was never convicted. Congress tried to make some changes, but they were superficial.
Is this appropriate work for the private sector to do? Governments do "guardian" work, said the sage thinker Jane Jacobs, whereas commercial actors have a "trading" moral system—a crucial distinction that many policy-makers should be reviewing right now. In an interview, she explains:
"...in a book called Systems of Survival, in which I separated the worlds of work into guardian operations, which have to do with territory—that means politics, religion, all the things that people have to do who are responsible for administering or guarding territory, and providing public amenities for it—and the other division is what is usually referred to as the private sector—commerce, manufacturing, banking, that sort of thing. When the two get mixed up—it’s sometimes inevitable that they do, but mostly it’s not, and in every case it’s hazardous to mix them—they get corrupt, and they get skewed in non-functional ways. The urban renewal kind of subsidies are a terrible example of mixing this sort of thing, so that the politicians you’re talking about, the policy-setters, they’re really guardians, they’re really territorial administrators, but they have provided monetary incentives for the private sector to do things it wouldn’t do otherwise, and that’s a mess." [My italics.]
Clearly, companies have certain conflicts of interest that complicate things when it comes to military situations. For one, companies have added organizational layers; they have shareholders and employees. Their shareholders want them to make money and make good risk-reward decisions. Military situations invariably take you well into high-risk territory. If these jobs were non-essential ones (mowing the grass, taking out the garbage, etc.), then I could see this as an attractive business opportunity. Yet recently many of these contractors have been carrying out tactical military jobs. Another factor to consider: the organizational relationship between employer and employee is quite different. This is a contractual relationship, and companies can't order employees around the way the military can a soldier.
The upshot raises a key question: is it a good idea to be moving the location of military decisions outside of the military system? As the Taguba report revealed, some of the contractors didn't even have security clearances. Not a good idea, if you ask me.
So if this is a pattern, part of a broader systemic issue, why has there been little or no policy response within the US? As Singer notes, there was some low-level action. But Congress's attention span tends to follow what the public knows and cares about. Since the public didn't really know about any of this (it has been carefully kept out of the public eye), it simply wasn't a priority.
What troubles Singer the most? Watching how Rumsfeld and some of the companies involved in this kind of work have responded to these questions. No transparency, and even blatantly untruthful spinning, which is something Rumsfeld was recently caught out on in a letter he wrote to certain Congressmen before the scandal broke. In gold-rush situations, when you don't know when the luck is going to run out, all kinds of bad behaviour starts to happen.
If this privatization of the military is proving to be a bad idea, who made this strategic decision within the US government? Who should be held accountable for this? Rumsfeld, of course. This was part of his grand plan to do "War-Lite". While enriching his friends and Republican party donors may have something to do with it as well, the main issue is his faulty policy judgment. This is enough reason to get rid of him. We can no longer trust this watcher, this Guardian of the state, to watch his watchers.
Okay. If I were to try to glean any lessons from what I've been reading thus far from all the reports and interviews of the Abu Ghraib disaster, I'd say these failures can be attributed to three interacting factors:
1) Organizational problems, e.g. the clumsy and slow process of risky information filtering up through the chain of command structure; the mismatch between putting military-intelligence operatives in charge instead of military police (misaligned roles, incentives, etc.); and what is getting much coverage lately, the extensive use of the private sector, which is not subject to military discipline, to perform key interrogation roles.
All of this will become clear to you after reading "Chain of Command: How the Department of Defense Mishandled the Disaster at Abu Ghraib" by Seymour M. Hersh in The New Yorker. Hersh, btw, was one of the journalists who broke the Abu Ghraib story, also in this magazine.
Here are some choice quotes:
“Our prerequisite of perfection for ‘actionable intelligence’ has paralyzed us. We must accept that we may have to take action before every question can be answered.” The Defense Secretary was told that he should “break the ‘belt-and-suspenders’ mindset within today’s military . . . we ‘over-plan’ for every contingency. . . . We must be willing to accept the risks.” With operations involving the death of foreign enemies, the memo went on, the planning should not be carried out in the Pentagon: “The result will be decision by committee.”
"By placing military-intelligence operatives in control instead, Miller’s recommendations and Sanchez’s change in policy undoubtedly played a role in the abuses at Abu Ghraib. General Taguba concluded that certain military-intelligence officers and civilian contractors at Abu Ghraib were “either directly or indirectly responsible” for the abuses, and urged that they be subjected to disciplinary action."
"Human Pyramid" © Images.com/CORBIS
Creator: David Ridley
2) Cultural issues within the military that encouraged dehumanizing practices such as taking pictures while torturing. This started before Iraq, in Afghanistan, and possibly earlier still.
3) And then, when the issue was known, a pervasive mindset problem, starting from the very top (Rummy and Bushy): suppress and/or procrastinate on dealing with bad news, news that doesn't "fit" their mental map or plan. As Hersh reports, Rummy in particular didn't want to hear bad news, and if there was bad news, the best thing to do was to cover it up and hope it would go away.
In our business, we call this "willful wishful" thinking, a particularly bad form of denial combined with an arrogant belief that most events can be controlled and manipulated using the levers of power and authority. This combo is a classic recipe for setting yourself and your organization (in this case, the United States of America) up to be blind-sided by the unexpected, and, most crucially, for weak decision-making with a brittleness locked in because decisions were made using only a narrow range of information—the information that they wanted to see or hear. (This Administration is going to make a great case study one of these days.)
While organizations may have gotten away with this mentality in the past, and still do today, in highly complex and uncertain situations command-and-control structures start to break down in their effectiveness. After Sept 11, you would think this lesson was starting to sink in, but when you are dealing with a culture as dysfunctional as the US military's right now, perhaps it will take a decade or more before things really change.
A copy of the Taguba Report, named after the general who blew the whistle, is found here.
I discovered some notes I jotted down from a piece written in The New York Times Magazine by Andrew Sullivan called "This is a religious war" (October 7, 2001.) I found a copy here. Also check out Sullivan's pioneering, eponymous blog The Daily Dish.
Initially, one quote grabbed my attention: "All monotheisms have an inherent temptation for terrorism." Yes, 'dem are fighting words. (Bad pun intended.) Provocative, without question, but not without substantial backing, especially through the brilliant research found in One True God: Historical Consequences of Monotheism by Rodney Stark, which Stewart Brand featured as a GBN Book Club selection. There are, of course, downsides to a strictly scholarly treatment of the spiritual, but it's worth the read. Educational is putting it mildly. "Monotheism is inherently combative—one true God," summarized Brand. "The four monotheistic religions enthusiastically battle with each other and within themselves, often over issues that seem trivial to outsiders but are deemed worth fighting over to the death by believers."
But what really made me think, going back to Sullivan's wee article, was the part where he alluded to the famous "Grand Inquisitor" passage from The Brothers Karamazov by Fyodor Mikhailovich Dostoevsky (1879). I needed to re-read it, and saw why, years after my liberal arts education, it was such an important and subversive book.
In Dostoevsky's infamous scene, Jesus Christ comes back to earth and stumbles upon the Grand Inquisitor at his grim work dispensing justice to the heathen. Shockingly, the Inquisitor, when confronted with his Savior, is compelled to condemn him, to burn him at the stake with the rest of the sinners. Why? Not necessarily out of a cynical desire to preserve power, at least not at first. No, he thinks Jesus should burn as punishment because he gave humanity the prospect of salvation, the freedom to choose the "right" path, knowing full well that most of us will fail to do so. In short, this freedom was a form of cruelty, argued the Inquisitor. (Not coincidentally, this resonated with my recent blog post on The Paradox of Choice.)
To paraphrase Sullivan, the Grand Inquisitor passage is really about the tension between the transcendent claims of most religions and our human inability to live up to these ideals. Many people "want to resist the terror of choice, the abyss of unbelief." People need the certainty of certain truths to get through life's doldrums, dead-ends, and disappointments. Above all, people crave the comfort of a community of believers.
As much as I'm critiquing our abundance of choice, I'm in no way advocating the reverse, that is, a systemic taking away of freedoms or reverting back to a day when authoritarian systems dominated our lives. No way, José. We just need to get better at understanding the relationship between freedom and choice, its limitations and "initial conditions", on multiple levels. We also need to recognize the kind of serendipity and structure that comes with the interaction of choice and life's randomness, which is only nature's way of making things interesting, diverse, and dynamic. But I digress. Back to religion...
Detail of Hands from Creation of Adam by Michelangelo Buonarroti
© World Films Enterprises/CORBIS
In March, when my mentor and colleague Napier Collyns was visiting, we touched on aspects of this conversation as we walked to Parc Monceau, one of the most treasured and aristocratic of greenspaces in Paris. In absolute frustration and disgust with all of this "God business", all of this sectarian violence and escalating fundamentalism, Napier asked, why do people need religion? I tried to argue what Dostoevsky and Sullivan were really saying, but without the sophistication and nuance. I think human beings will always need some larger narratives to help them make meaning of their lives, to help them negotiate the hardest questions of human existence. Modern life, which has essentially fragmented and challenged most larger narratives, is obviously failing to provide the goods. The hallmarks of modern life—art, shopping, sports—are not sustainable alternatives to many of the world's religions. In fact, modernity is likely strengthening religion, if certain statistics on the rise of Islam and church attendance are correct.
This immediately reminded me of what Don Michael, another mentor of mine and a long-standing friend of Napier's, wrote in Planning to Learn and Learning to Plan, an almost forgotten piece of deep thinking and insight (1973, reprinted in 1996).
"A major cause and consequence of planet-wide social turmoil is increased efforts and demands to establish, change, or remove boundaries, and counter-efforts to preserve them. Networks may encourage boundaries to alter more rapidly, but boundaries are not disappearing. Perhaps more than in pre-network times, today's boundaries are built around concepts, convictions, relationships, and flows of information in the form of money or other symbols. But while flows of symbols may be unbounded in the abstract, they are absorbable only in the concrete — via organizational and personal interests, and actual operating modes. Ultimately, via individual human minds. All bounded. The mode and degree of absorption involve crucially important, mostly unmet, learning challenges, certainly in civil society."
So we need to ask the question, how can modern life and secular ways of living do a better job? What can we learn from these religions, from their spiritual experiences and practices? Or will, over time, something emerge from the post-modern period, a new meta-framework that helps people manage uncertainty and the creative challenge of meaning-making on an individual and collective level? Or is this division of labour between the secular and spiritual a good one?
After our walk, I became convinced that part of the problem is that non-believers don't understand what drives believers. This lack of understanding is compounded by a lack of respect, a feeling that this way of being is inferior to a "rational" or secular modus vivendi. Hopefully new insight from the cognitive sciences into how the rational and irrational interact, into how our emotional and intellectual states fuse and influence each other, might take the edge off some of these biases. But no, the Rationalist Paradigm is still Top Dog in the ways of knowing game.
Taking this back to my work, as part of our orthodoxy as scenario thinkers, the postmodern practitioners and framers of the future, we preach that living with uncertainty and ambiguity will be a core competency, something to embrace and master. That individuals and groups who develop this ability will be the ones that adapt and survive the current set of transitions. Intellectually, I buy this thesis, especially when I scan back over much of the literature in evolutionary anthropology. When restricted to certain domains (e.g. shifting political and commercial structures) this is definitely true. I also agree with Mary Catherine Bateson, who claims that women are much better at uncertainty management than men, given our extensive practice with this throughout the ages. But emotionally, as someone who lives with a great amount of uncertainty partly by choice, as someone who has actively courted ambiguity with an almost perverse fascination, I am starting to appreciate the downsides of its embrace. It's not for everyone, or in fact, for most people. In Michael's words, "Most humans are insufficiently educated, skilled and motivated for reasoning systematically, cybernetically, in multivariate terms, and in terms of both/and, rather than either/or. They are unskilled at bringing together rational reasoning, emotions, and feelings in the service of solving problems and living with predicaments."
Perhaps the biggest creative challenge, then, is helping people discover different pathways to meaning-making that avoid the structural disadvantages (if indeed there are any) of monotheism. Instead of just saying "live with uncertainty", an unsustainable vacuum in my opinion, perhaps the greatest thing we can do is help people live with uncertainty through active engagement with it, by bridging these two states somehow, by teaching "dilemma management" and deep Dialogue where the task is co-creating common ground. We also need to be clearer about the advantages of living in a post-modern sense: of being boundary-spanners instead of boundary-defenders, of being learners and questioners, and co-creators of meaning.
I'm a believer in these things, yet even I struggle. The advantages, in brief, are a strengthened sense of self, relatedness, and ultimately freedom: freedom gleaned from the realization that reality is socially constructed, both from the inside out (that is, as individuals) and from the outside in (society, culture). Hopefully this understanding will help us create better futures, futures less constrained by maladaptive dogmas, either secular or religious. This is probably not enough, not the stuff that can glue together a society, but I may lack the imagination to see how it might, how these new narratives with new meta-values—"bounded" tolerance being a good candidate—might keep at bay the powerful drivers of fear, greed and the darker side of the Rogue Primate in us all. It's not coherent yet, at least in my mind. Another big project for the Now. But I'm quite certain that I'm a boundary-spanner, that this is my telos in life. Now put me to the test, Universe.
NPR's Fresh Air Radio program just featured the retail anthropologist, Paco Underhill. [What a wonderfully apt name, digging underneath the perceptual surface of things, as he does.]
From the radio show, Thursday May 6th, 2004:
"Underhill studies and tracks the habits of shoppers in order to learn the best way to lead them to make purchases. His retail consulting firm, Envirosell, has helped big-name companies such as McDonald's, Levi Strauss and Blockbuster to study their customers' browsing and buying habits. He's the author of the book Why We Buy, and the new book Call of the Mall."
In his new book he observes how "somehow the glorious history of commerce has culminated in a sanitized architectural cliché in which you typically find not exquisite treasures and exotic wares but rather eighty styles of sneakers or sixteen varieties of chocolate chip cookies."
On the outside, malls are often aesthetic abominations. "Boxes with mouse holes in them," said a colleague of Underhill. They are so damned ugly, he explains, because of how they are conceived: the process starts with a leasing agent, a spreadsheet and lawyers, with aesthetics considered much further down the line. And once built, they become part of our environment, sunk infrastructure, and thus hard to tear down or remove from our visual consciousness.
Most developers never really thought about how long these malls would last—or should last; part and parcel of the short-termist culture in the US. Most malls are about 20 years old right now, in decay and dying. And you don't see anyone rushing out to grant them landmark status, nor are they likely to win any architectural awards. Contrast this with Europe: when the Grand Magasins du Bon Marche was first built in Paris in 1876 by Gustave Eiffel—the first department store in the world—the designers had longevity and beauty in mind. And sure enough, it is still a lovely landmark, a physical asset to its surroundings, a pleasure to behold and be in. We need much more of this longer-term mindset in our designers today.
In pure business terms, the opportunity to fill some evident vacuums is ripe for the taking. As Underhill observes, no one is building a shopping mall to service new markets; rather, they are being built to steal someone else's market. We know that the best business opportunities, the best margins, are those where you don't compete with existing players but instead create an entirely new marketspace. Home Depot and Quicken are classic examples. In business literature, this is known as "value innovation." (You can check one Value Innovation article describing these tools here.) Other factors that make this ripe for new entrants include changing demographics, especially the role and shopping patterns of women, around whom malls were originally built. Women now have multiple roles: breadwinner, mother, spouse, purchasing agent. Why not create something much more conducive to these new needs? Also, watch where the boomers flock in the coming years. As a rule, retail follows housing, which means that when the baby boomers start deciding how they want to spend the last third of their lives, this is likely to put significant pressure on how malls are conceived today.
Out of the decaying malls, however, we are seeing signs of renewal. Malls become "ethnicized" or get taken over by churches. The New Urbanism ethos is also having an impact. New malls tend to be more integrated into communities and part of larger redevelopment visions. Malls are also being repurposed into interesting experiences like London's Selfridges and its latest controversial Bullring store in Birmingham. Selfridges' business model is also quite different, sharing risk with its merchants and giving them much more autonomy. This is smart because, as Underhill notes, independent merchants make department stores interesting because they bring goods that are not found in other places. The dilemma for most mall owners, however, is that independent merchants can't afford to pay the steep rents. Perhaps some clues to resolving this dilemma and innovating a new value proposition can be found in experimental places like Selfridges. And, it should be noted, Selfridges HAS won heritage status and awards.
Why is such a reflective researcher so intently focused on malls and the science of shopping? Apart from the fact that this is where he makes lots of money, Underhill believes that shopping is the "dipstick of our culture." Much of human civilization is wrapped up in the exchange of goods and money. This is what drove us to the New World, built the Silk Route, and tracks the evolution of the human condition (especially the Western version of this story).
Malls have been places where commerce and community interact, where social capital is built. They performed this social role for a while, though perhaps not as well as the agoras or small village markets before them. This is why the return of small markets and the focus on local production in North America is such an interesting sign, with California leading the way. Of course in Europe this way of shopping never went away, and it is one of the reasons why we love living in Paris: our open-air market, full of amazingly fresh produce, is just 50 metres from our front door. But even in Paris, these markets have dwindled. In the suburbs, most families hit Carrefour and other supermarchés.
Underhill started out studying environmental engineering. Perhaps he feels that by influencing the future of malls, and how we design them, he can make a dent in improving how we live by improving the quality of our consumption experience. In a resource-constrained future where global consumers can't possibly consume the same quantities of goods and services—especially as China and India hit their growth targets— this kind of thinking will be sorely needed.
Related to the running topic— why happiness eludes many people — I found some interesting research in this new book, The Paradox of Choice: Why More is Less, by Dr. Barry Schwartz. Learn more about it on a great CBC radio program, "Quirks and Quarks".
"We've never had more choices, more selection, or more trouble dealing with the multitude of options that modern society presents us with. And, according to Dr. Barry Schwartz, a professor of psychology at Swarthmore College in Pennsylvania, it's making us miserable. In his book, The Paradox of Choice: Why More is Less, he argues against the conventional wisdom that more choice brings greater contentment. His studies show that as options multiply, various factors lead us to choose less well, and to enjoy the fruits of our decisions less. He thinks this paradox has implications for medicine, marketing and social policy."
There is another good review of this book in The New Yorker.
The logic explaining why we are unhappy and uneasy in a world of choices seems counter-intuitive, but it makes sense to me after hearing Schwartz's interview. It goes something like this:
The more options we have -> the more information and effort we have to go into evaluating them -> the more likely we are to be dissatisfied with the outcome. Why?
1) Most people hate making trade-offs and will often avoid making choices until they absolutely have to; an abundance of choices reminds us of this dilemma: that life is about making choices, yet we must make them in a vacuum of uncertainty and an unknown future.
2) Most people are bad at dealing with uncertainty and estimating odds, and often don't calculate probabilities properly because we have incomplete information. Chalk this up to certain cognitive flaws in our human decision-making apparatus.
3) Our expectations get raised after spending time weighing the tradeoffs and understanding the choices, so we are disappointed when the outcome is not as perfect as we expected. The irony is that the outcome is almost always better, i.e. a better-fitting pair of jeans, but our perceptions don't register this rationally or objectively. As we know from countless studies, not to mention certain wisdoms found in traditions like Buddhism, our satisfaction is often a function of how well our expectations match our perceived reality. In economist language, dissatisfaction occurs when the transaction costs of making the decision exceed the actual benefit.
4) What is called "adaptation." In a nutshell, we adapt to our circumstances. This happens within our hedonic system as well, i.e. the internal system that modulates what feels good and bad. Things that feel good feel less and less good over time, just as things that feel bad, like the grating sound of construction work outside my window, feel less bad over time (although that's debatable). So the more we have, the more we get used to this stuff, and the less special it feels.
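The chain above can even be sketched as a toy model (my own illustration, not from Schwartz's book): suppose each option's quality is a random draw between 0 and 1, and comparing each extra option costs a fixed amount of effort. Picking the best of n options improves the expected outcome only marginally as n grows, while the evaluation cost grows linearly, so net satisfaction rises, peaks, and then falls—more becomes less.

```python
import random

def net_benefit(n_options, eval_cost=0.02, trials=10000):
    """Average quality of the best of n random options,
    minus a per-option evaluation cost.
    The best of n uniform draws has expected value n/(n+1),
    so the gain from each extra option shrinks while the
    cost of weighing it does not."""
    total = 0.0
    for _ in range(trials):
        best = max(random.random() for _ in range(n_options))
        total += best - eval_cost * n_options
    return total / trials

for n in (1, 5, 10, 25, 50):
    print(n, round(net_benefit(n), 2))
```

Running this, net benefit climbs from one option to a handful, then declines as the comparison cost swamps the ever-smaller improvement in the chosen option. The cost figure is, of course, an arbitrary assumption for illustration.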
The author makes clear that providing more choice is not a bad thing and is often an important social goal. The trouble is that we, in the rich West, are spending too much time going through this cycle making trivial choices. We have too much of a good thing. He cites research which tracks the relationship between well-being and choice. These two factors are highly correlated and rise together over time, but after a certain point they become decoupled—the law of diminishing returns kicks in. More becomes less. In fact, most research tells us that happiness, across most cultures, is largely unrelated to material well-being once people pass the subsistence level of existence. Of course, we have billions of people who are still at this dismal level, still too many people without many choices. And I believe on a larger ecological scale we are collectively eroding future choices and options for sustainable living.
Interestingly, the author links the explosion of depression over the last few decades (news to me, but a theme that keeps pinging me of late) to the vertigo of choice, suggesting it may be contributing to extra stress, anxiety and uncertainty in people's lives.
I'm now thinking about the relevance to the commercial sphere, and I believe this is already impacting businesses. Indeed, this research is strongly connected to the trend of commoditization and why brands are struggling, which I talk about in an earlier blog post. (See "Towards the Commoditization of Love".)
From a social policy perspective, there are important implications as well. Governments invest a great deal in helping people have more choice. But perhaps this is a waste of money, Schwartz suggests, if in fact more choice is reducing people's well-being. This is especially so given that opportunities are not equally distributed. In effect, we have an imbalance between the people who have too few choices (the poor) and the people who have too many (the rich). Perhaps it is the role of tax policy and government to manage this imbalance? He is a utilitarian, in essence, and I can see things wrong with his argument—namely, that younger generations will likely adopt new cognitive strategies for dealing with excess choice, and that the marketplace will respond and "self-correct" this problem—but the research might be worth looking at in closer detail. He's on to something, without question.
All of this reminds me of the closing lines of Samuel Beckett's play, "Waiting for Godot."
Wait! (He moves away from Vladimir.) I sometimes wonder if we wouldn't have been better off alone, each one for himself. (He crosses the stage and sits down on the mound.) We weren't made for the same road.
(without anger). It's not certain.
No, nothing is certain.
Vladimir slowly crosses the stage and sits down beside Estragon.
We can still part, if you think it would be better.
It's not worthwhile now.
No, it's not worthwhile now.
Well, shall we go?
Yes, let's go.
They do not move.
Paralysis is indeed a hallmark of the Now. I'm getting a better sense of why.
"Meanwhile: You can't buy happiness, but you can measure it" by Andrew Johnston IHT (Tuesday, May 4, 2004.)
"Quantifying happiness is a relatively new endeavor. In the past, if we wanted to know whether life was getting better, we simply measured wealth. The most useful figure was gross domestic product, the measure of the goods and services that a nation produces. When GDP went up, crime generally went down and health got better. Some time in the 1970s, according to researchers, that connection broke. It will always be true that people suffer in a recession. But when social policy experts subtracted from GDP the cost of unpaid work, crime, family breakdown, pollution, environmental damage and other factors, they found that for the last 30 years, rising incomes have not necessarily meant better lives. In fact, the number of people suffering from depression in industrialized countries had grown tenfold in 50 years." (My emphasis)
Very interesting, I thought. And this resonates with a whole range of things I'm seeing around me, everything from increasing angst amongst executives in large companies to the kinds of questions "ordinary" folks are asking themselves about how they want to live their lives. See the Netherlands-based magazine Ode, for instance, for one version of this from a younger generation of change-makers.
The original meaning of "wealth" actually came from the word "well-being". So wealth in its first usage really meant something much broader than just material and economic prosperity. I think one of the great projects of the present is to reintroduce this fuller meaning of wealth and well-being back into our political and social consciousness—and thus our choices. Helping people see a broader range of choices, of possible routes towards happiness, is a noble and exciting place to focus one's energies these days; and I see more and more people deciding that their professional life should facilitate this.
Shifting our metrics and measurements is where many people are focusing their attention. Often the dry and boring domain of accountants, what and how we measure is a high-leverage place to intervene in any system, because metrics form the "soft infrastructure" and shape the rules of the game. Metrics incent, prevent, and reward behaviours, and they focus attention on what's important and what isn't, which is why many measures are often proxies for deeper values and beliefs within a society. The fact that priority has been given to economic progress over the last 70 years speaks volumes about what our society cares about, or, less judgmentally, felt it was supposed to care about. The fact that software is not measured in GNP is also an example of how hard it is for metrics to change with the times. But metrics and rules can change and should evolve with the times. Most people think these measures are sacrosanct and static laws of nature, which they are not. They are human inventions: GDP and GNP were invented by a group of experts in the 1930s.
So this article is just another reminder of the much broader conversation about what metrics and measures we need to create for our current context. I'm not 100% sure where the serious research and experiments are happening towards this end. One place to look is the New Economics Foundation. We also have the Himalayan country of Bhutan (population 750,000), a Buddhist theocracy, developing a Gross National Happiness index, which it has been doing since 1972. Most "serious" economists just laugh at this project, but getting better insight into happiness--and its implications for social policy--is likely to come from reperceiving our world through a different lens. Other disciplines beyond the "hard" sciences will also play an important role. Psychology is an obvious place to look, as is philosophy: wasn't it Aristotle who first talked about the Good Life? According to Seligman, a professor at the University of Pennsylvania, there are three ways to find happiness:
1. The Pleasant life, which consists of experiencing as many of life's pleasures as possible, is what we mean most often when we talk about happiness.
2. The Good life, which is more about self-actualization and deriving joy and satisfaction from a narrower band of relations, e.g. family and friends.
3. The Meaningful life "where engaging with a cause or an institution supplies a sense of belonging to something much bigger than you are."
Again from the IHT article: "we should be asking what governments and corporations can do to recraft work so that people can see their jobs as part of something larger," wrote Seligman in an article, Beyond Money: Toward an Economy of Well-Being. It's not surprising, then, that we have grassroots phenomena like the Socrates Cafe and Philosophers' Club popping up all over the world.
[A book from one of the originators of the idea of promoting Socrates Cafes.]
What's amazing is that it's not just well-educated elites joining in, but people from all backgrounds who are simply curious and want to discuss the big questions of life in a safe setting. (Read this article to learn more.) What does this mean? It's hard to say, but for me it's an indicator that some bottom-up forces might be building a head of steam in reaction to a wide range of dissatisfactions and angst within modern life.
There was a time when I was fascinated with the European project, so much so that I spent a year writing about it for my undergraduate thesis. My fascination started with the observation that we were seeing twin and potentially contradictory forces at work: economic integration and ethnic or political fragmentation—the EU being the best example of this phenomenon.
Today, now living in Europe, my fascination is more muted. Like most people outside of evangelized Europhiles, I'm ambivalent. My opinions oscillate with my shifting perceptual moods, varying depending on which lens I choose to see this through. From a Long View, for instance, and on the surface level, the EU project and its recent expansion is unquestionably optimistic, a bold experiment in governance and human organization. But closer in, and perhaps deeper down—unarticulated and fermenting in people's hearts and minds—there are some major uncertainties.
I think this recent article captures some of these discontents. "Europe comes together in fear and trepidation" by Dominique Moïsi, International Herald Tribune (Friday, April 30, 2004.)
"In 2004, the European Union is much less secure about itself, and about the quality and future of its model. Deep down, the EU has greater doubts than before about the ability of the new Europeans to "learn" from the old.
In fact, Europe, old as well as new, is in the midst of a deep identity crisis. How can we be secure about what we are bringing, when we no longer know what we are? We do not know where our continent ends. Is Turkey in Europe or not? The question goes to the heart of our identity debate. Is Europe about the value of geography or about the geography of values? Turkey is clearly not in Europe, but that must be balanced against the risk of saying no to the only example of a modern, democratic and secular Islam.
More than ever, we ignore our institutional future. We can no longer be sure that we shall have a constitution ratified by all, with a preamble encompassing all the values we are so proud of. And we do not know - especially with the war in Iraq and our fundamental divisions with respect to Washington and the future of trans-Atlantic relations - whether we can have a common foreign and security policy.
Even if we ignore our geography and our institutional and diplomatic future, we know too well the depressing state of our demography, which makes of us - in contrast to that other West, the United States - an aging continent, with a need to integrate others, who most likely are not going to be Europeans.
We realize also the decline of the European ideal. Europe may have become a growing and expanding reality, from the euro to the Schengen Accords, but it is less and less an enthusiastic dream, a project that can mobilize us. Even among us, in the old West, one can perceive signs of regression in the form of a re-nationalization, if not a retribalization of political exchanges.
We must avoid any form of arrogance when we confront the question of what we will bring to the new members, but we should also refrain from too negative an assessment of ourselves."