Think Smart Act Smart by Jim Nightingale
Contents:
- Chapter 1 Avoiding Error: An Introduction
- Chapter 2 Wishful Thinking
- Chapter 3 Mythical Thinking
- Chapter 4 Tribal Thinking
- Chapter 5 Royal Thinking
If this book had a hero, it would probably be Sir Isaac Newton. Today there is an entire branch of physics known as Newtonian mechanics. Newton made great discoveries in mathematics, optics, and physics, and his formulation of the law of gravitation is still in use today.
As president of the Royal Society, Newton became the world’s first great administrator of science, laying the foundation for the way research is conducted today. In short, Newton was a true genius.
Not just a very smart guy, but the real deal, a man who could, by sheer force of intellect, uncover the deepest mysteries of nature. Yet, as this book will describe, when Newton deduced that a crash in England’s young stock market was inevitable, he chose not to withdraw from the market and lost the equivalent of over a million dollars (in today’s money).
He later commented, ‘‘I can predict the motions of the planets, but not the limits of human folly.’’ His mistake looked much like those made by many investors in America’s dot-com crash. Sometimes apparently intelligent individuals, even geniuses, do things that look exactly like stupidity.
That is what this book is about. In general, we don’t worry about smart people making stupid decisions. We believe that having smart people involved in an enterprise means things will turn out well. Smart people are well paid.
Corporations, the government, and academia seek them out for their talent. When things go wrong, we look for smart people to fix them.
Why? Because we think that they will get it right. Yet if we look around us, we can find all kinds of examples of things that smart people got wrong. In hindsight, some of them seem so simple.
We look at them and say, ‘‘How could a smart guy like him do something so stupid?’’ The Ford Edsel was created by some of Detroit’s best automobile brains, men who achieved great success both before and after that debacle.
Similarly, the Bay of Pigs invasion was approved and executed by a group of individuals who were supposedly America’s ‘‘best and brightest,’’ yet it was laughably ill-conceived and poorly thought through.
How did the space shuttle Challenger crash when, looking back, the data and causes were so obvious? How many of us have asked ourselves why such a wonderful person as our friend is dating such a loser when the entire rest of the planet knows she deserves better?
The list of high-profile stupidities committed by intelligent individuals is endless. The examples come from all aspects of society.
The commercial arena, for instance, gives us examples like the never-ending stream of management fads embraced since the 1980s, the buying behavior of the investing public during the dot-com run-up of the 1990s, and the Long-Term Capital Management disaster, in which Nobel Prize winners presided over the collapse of a successful hedge fund.
Every segment of society has offered up gems of idiocy, or at least something that looks like idiocy, for our consideration. This book is about why these things happen, and why they have to happen, at least until we understand their causes.
All the errors presented in this book were caused by ways of thinking that are not stupid; indeed, we need them in order to live. But like many good things, too much of them can be harmful. There are trade-offs in ways of thinking, just like in anything else.
If I buy a car that is big and comfortable, I have to spend more on gas; if I buy a small, fuel-efficient car, I lose in safety.
The same trade-offs apply to the ways we think. For example, if we are wired to value associations with other people, to be pack creatures who must obtain crucial support from our fellows, we will sometimes make mistakes by ‘‘going along with the crowd’’ because we value our group membership so highly.
Every error in this book is the result of this kind of trade-off; they are good things gone bad. Each one is an example of a behavior we need but that sometimes leads us to error.
But learning also progresses with every ‘‘Well, that stung. Not going to try that again!’’ Certainly, if we couldn’t learn from our mistakes, humanity’s time on this planet would have been very short. But there is a larger unknown about errors: Why do we make them in the first place?
Early in my career I worked for a big company, a leader, at least in terms of size, in building and operating nuclear power plants, with all the complex technology that involves. After a brief stint as a trainer, I went to work helping to build and test a new plant.
I was surrounded by smart people: everyone involved was intelligent and educated, and many were already experienced in the power industry. Yet all was not well.
Time and again I saw us make mistakes that we should have avoided, not ‘‘China Syndrome’’ material, but things that kept us from bringing the plant online as planned.
When my consulting work took me outside the power industry, I saw the same thing in other areas. Yet they all had smart people.
In some ways being part of the nuclear power industry was very educational. One facet of my involvement in that industry was being a bit on the inside of the public debate that went on over the safety of the technology.
There were clearly smart people on both sides of the debate, yet some thought that this technology was literally the end of the world, and others felt it was its salvation.
If they couldn’t both be right, how could they both be smart? As an engineer working on the plants, I was nominally on the pro side, but I tried to understand everyone’s arguments. One particular incident stuck with me.
In the latter stages of plant construction, there was a finding of low-level radioactivity in the local water supply. The uproar was immediate.
The anti-nuclear activists were furious; the plant was not even running yet and already it was polluting the local environment. There were protests and articles in the local papers. Those inside the plant were equally incensed.
At that time there was not a gram of nuclear fuel on the site. The reason for the low level of radioactivity in the local water was that the plant was being built over an abandoned coal mine, and coal has a certain natural radioactivity, which had always been in the water on that site and always would be. We perceived the protesters as cynical.
I now believe that the anti-nuclear crowd, at least most members, were probably sincere, just mistaken in this case. They were neither stupid nor dishonest, but the victims of some of the thought patterns you will discover in this book.
This type of incident interested me enough to look into the psychology of belief and the process of how humans make decisions.
What I found startled me. The errors that I had noticed in the business world paled next to some of the weird things that people believe in everyday life.
How on earth can people convince themselves, as members of the Heaven’s Gate group did, that if they committed suicide, an alien spaceship would transport them to heaven? Most of us just shrug; those people were just weird, after all.
The more I thought about it, the more all these mistakes started to look alike to me. Even some of the most far-out beliefs were rooted in the same type of thought processes that made the smartest engineers and managers I knew do things wrong.