Do What You Feel, Maybe – the power and perils of relying on intuition

David G. Myers | Posted on 01/01/07

Consider these statements of President George W. Bush:

• After meeting Russian president Vladimir Putin, Mr. Bush had him sized up: “I looked the man in the eye. I was able to get a sense of his soul.”

• Explaining to journalist Bob Woodward his decision to launch the Iraq War, he said, “I’m a gut player. I rely on my instincts.”

• The purpose of the president’s 2006 fly-in to Baghdad was, he explained to American troops, “to look Prime Minister Maliki in the eyes — to determine whether or not he is as dedicated to a free Iraq as you are.” The president’s snap assessment: “I believe he is.”

• When interviewed on TV by Larry King, Bush confidently said, “If you make decisions based upon what you believe in your heart of hearts, you stay resolved.”

The president, in expressing such self-reliant faith in his gut, echoes Pascal, who opined in 1670 that “[t]he heart has its reasons which reason does not know.” But it’s by no means obvious that Pascal had it right. King Solomon offered in Proverbs 28:26, “He that trusteth in his own heart is a fool.” So who offers better advice? Is Pascal generally right that we should tune in to our right-brain intuition and trust the force within? “Buried deep within each and every one of us, there is an instinctive, heartfelt awareness that provides — if we allow it to — the most reliable guide,” Prince Charles has said. We need, he adds, “to listen rather more to the common sense emanating from our hearts.” Trust yourself. Rely on your gut instincts. Whether hiring, firing, or investing, silence distraction and tune into the whispers of your intuition — your own immediate wisdom.

Or is Solomon more reliable when he suggests that we should be wary of such gut inclinations? Given our vulnerability to self-inflated overconfidence, should humility trump self-reliance? With bright people so often doing demonstrably dumb things, do we instead need more left-brain rationality, more checking of hunches against reality, more critical thinking? “The first principle,” says physicist Richard Feynman, “is that you must not fool yourself — and you are the easiest person to fool.”

This much is sure: Intuition — our effortless, immediate, unreasoned sense of truth — is inescapable. Self-reliant intuitions shape our fears, our first impressions, and our relationships. Intuition influences prime ministers in times of crisis, gamblers at the table, and personnel directors eyeing applicants.

In flying by the seat of his pants, President Bush has a lot of company. For those disposed to follow that inner guide, today’s pop psychology offers books on “intuitive healing,” “intuitive learning,” “intuitive managing,” “intuitive trading,” and much more. Magazines have offered scores of articles on topics such as how to “let intuition be your guide” (by giving “yourself permission to listen to ... your intuitive voice” and learning to exercise your “intuitive muscle”). If you wonder whether your partner may be cheating on you, you can learn to “trust your body. One way intuition speaks to us is through actual physical sensations.”

Are these intuitions about intuition valid? If our consciousness is sometimes invaded by unbidden truths, then there is innate wisdom for us to behold, if only we would desist from thinking and listen to the still, small voice within. But if that premise is incorrect — if such advice on intuition is to cognitive science what professional wrestling is to athletics — then we had better also rely on evidence and intellect. The consequences, in either case, could be profound.

.....

Recent cognitive science reveals a fascinating unconscious mind — a below-the-radar mind — that Freud never told us about. More than we realized a decade ago, thinking occurs not onstage but offstage, out of sight. Studies of automatic processing, subliminal priming, implicit memory, heuristics, spontaneous trait inference, right-brain processing, instant emotions, nonverbal communication, and creativity unveil our intuitive capacities. Thinking, memory, and attitude all operate on two levels — conscious/deliberate and unconscious/automatic. “Dual processing,” today’s researchers call it. As the following examples illustrate, we know more than we know we know.

Blindsight. Some individuals, after stroke or surgical damage to the brain’s visual cortex, experience blindness in part of their field of vision. Shown a series of sticks in their blind visual fields, they report seeing nothing. Yet when asked to guess whether the sticks are vertical or horizontal, they unerringly offer the correct response. When told, “You got them all right,” they are astounded. There is, it seems, a “second mind” — an intuitive parallel processing system — operating unseen.

Indeed, “sight unseen” is how University of Durham psychologist David Milner describes the brain’s two visual systems — “one that gives us our conscious perceptions, and one that guides our actions.” The second he calls “the zombie within.” Milner describes a woman with brain damage who can see fine details — the hairs on the back of a hand — without being able to recognize the hand. Asked to use her thumb and forefinger to estimate an object’s size, she can’t do it. Yet when reaching for the object, her thumb and forefinger are appropriately placed. She knows more than she is aware of.

Automatic processing. A number of recent experiments confirm that most of our everyday thinking, feeling, and acting operates outside conscious awareness, often “primed” by subtle influences. This big idea of contemporary psychological science “is a difficult one for people to accept,” report researchers John Bargh and Tanya Chartrand. The nature of our consciousness gives us a bias toward thinking that our intentions and deliberate choices rule our lives (understandably, since tip-of-the-iceberg consciousness is aware only of itself). But consciousness often overrates its own control. In reality, we operate like jumbo jets, flying through life mostly on intuitive autopilot.

Take something as simple as speaking. Strings of words in unplanned sentences effortlessly spill out of our mouths with near-perfect syntax (amazing, given how many ways there are to mess up). We hardly have a clue how we do it. But there it is. We just know, without knowing how we know, that “a big red barn” sounds better than “a red big barn.” Cognitive psychologist George Miller offers a conversational metaphor for our out-of-sight information processing: “There sure is a lot of water in the ocean,” said one ship passenger to another as they gazed over the sea. “Yes,” said the other, “and we’ve only seen the top of it.”

Reading “thin slices.” Sometimes, after observing someone for a mere “blink,” we find ourselves responding positively or negatively. Our gut reactions after witnessing such “thin slices” can indeed be revealing, report Nalini Ambady and Robert Rosenthal.

They invited people to rate college professors’ teaching skills after observing them for only ten seconds. With remarkable accuracy, these ratings predicted end-of-the-term ratings by the professors’ own students. Even thinner slices — three two-second clips — yielded ratings that were similarly congruent with student evaluations. To get a sense of someone’s personality — his or her confidence, energy, and warmth — six seconds will often do.

Even micro-slices can be revealing. When John Bargh flashed an image of an object or face for just two-tenths of a second, his New York University students evaluated it instantly. “We’re finding that everything is evaluated as good or bad within a quarter-second,” he reports. Before engaging in rational analysis, we may find ourselves mildly loathing or loving someone.

There is biological wisdom in this express link between perception and emotion. When confronting someone or something in the wilderness, one needed instantly to decide: Is this a threat? Those who could read facial expressions quickly and accurately were more likely to survive and leave descendants (us). And that helps explain our intuitive prowess at instantly distinguishing among expressions of anger, sadness, fear, and pleasure.

Intuitive expertise. We are unthinkingly reliant on our automatic processing as we read faces, experience unbidden creative insights, and perceive the everyday world. From our two eyes our brain receives slightly differing images of an object. In a microsecond, it analyzes the difference and infers the object’s distance. Even with a calculator at hand, our conscious mind would make the same geometric computation much more slowly. No matter — our intuitive mind already knows.
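(A rough way to see the geometry involved: with the two eyes treated as a stereo pair, an object’s distance can be recovered from the small angular difference between the two retinal images. The sketch below is only illustrative, using an assumed interocular baseline and made-up distances rather than figures from the research described here.)

```python
import math

# Illustrative depth-from-disparity sketch (assumed figures, not from the essay).
BASELINE_M = 0.063  # assumed distance between the eyes, about 6.3 cm

def disparity_from_distance(distance_m: float) -> float:
    """Angular disparity (radians) for an object at a given distance,
    using the small-angle approximation: disparity ~ baseline / distance."""
    return BASELINE_M / distance_m

def distance_from_disparity(disparity_rad: float) -> float:
    """Invert the relation: recover distance from the measured disparity."""
    return BASELINE_M / disparity_rad

if __name__ == "__main__":
    for d in (0.5, 2.0, 10.0):  # object distances in meters (made up)
        disp = disparity_from_distance(d)
        print(f"{d:>4} m away -> disparity ~ {math.degrees(disp) * 60:.1f} arcmin, "
              f"recovered distance ~ {distance_from_disparity(disp):.2f} m")
```

The point of the sketch is only that the computation is simple trigonometry, which the brain performs continuously and without conscious effort.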

Studies show that as we gain expertise, even reasoned judgments often become automatic. Rather than wend their way through a decision tree, experienced car mechanics and physicians will often, after a quick listen and look, diagnose problems. After barely more than a glance at a chess board, chess masters, who may have 50,000 board layouts stored in memory, intuitively know the right move. Japan’s chicken-sexers use complex pattern-recognition to separate newborn pullets and cockerels with near-perfect accuracy, although they are hard pressed to explain how they do it.

In each case, intuition is learned expertise that’s instantly accessible. Driving initially requires concentration. With practice, it becomes automatic. One’s hands and feet intuitively do it, while one’s mind is elsewhere. At every moment, skilled violin players know, without thinking, just where to place the bow, at what angle, with what pressure.

Athletes, like chess masters, develop intuitive expertise at reading patterns in developing plays. In his article “The Physical Genius,” Malcolm Gladwell explains how Wayne Gretzky liked to keep the game in front of him, enabling him to anticipate events. “When he sends a pass to what to the rest of us appears an empty space on the ice,” Gladwell writes, “and when a teammate magically appears in that space to collect the puck, he has in reality simply summoned up from his bank account of knowledge the fact that in a particular situation, someone is likely to be in a particular spot, and if he is not there now he will be there presently.”

Learned expertise that becomes automatically intuitive facilitates a smart self-reliance. When experienced gourmet cooks say they “just use experience and intuition” in mixing ingredients, they do not mean novice intuition. They are stating “the theory of expert performance that has emerged in recent years,” noted psychologist Herbert Simon. “In everyday speech, we use the word intuition to describe a problem-solving or question-answering performance that is speedy and for which the expert is unable to describe in detail the reasoning or other process that produced the answer. The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.”

.....

So, is our intuition-guided president smart to harness the powers of his inner wisdom? Or should he and we be subjecting our intuitive hunches to skeptical scrutiny?

Intuition drives much of our behavior, and in domains ranging from routine perceptions to our developed expertise, it has powers upon which we can rely. Yet often we underestimate intuition’s perils. My geographical intuition tells me that Reno is east of Los Angeles, that Rome is south of New York, and that Atlanta is east of Detroit. But I am wrong, wrong, and wrong.

As I explain in my book Intuition: Its Powers and Perils, hundreds of experiments have shown that people greatly overestimate their lie-detection accuracy, their eyewitness recollections, their interviewee assessments, their psychic powers, and their stock-picking talents. It’s humbling to realize how often, relying on our own gut instincts, we misjudge and mispredict reality, and then display “belief perseverance” when facing disconfirming information. Two examples:

Overconfidence. By giving people all sorts of factual questions, and then asking them to state their confidence in their answers, researchers have identified a potent “overconfidence phenomenon.” Is absinthe a liqueur or a precious stone? Which is longer, the Panama Canal or the Suez Canal? In the United States, which claims more lives each year, homicide or suicide? The routine result: When the questions are challenging, people are usually more confident than correct. On questions where 60 percent of folks answer correctly, they typically feel 75 percent sure. (The answers, by the way, are: a liqueur; the Suez, which is twice as long as the Panama; and suicide, which takes nearly twice as many lives.)
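(For readers who want the arithmetic behind such claims: the overconfidence gap is simply mean stated confidence minus actual accuracy. Here is a minimal sketch with invented quiz responses, not data from the studies cited.)

```python
# Hypothetical calibration check: each tuple is (stated confidence, answered correctly).
# The data are invented for illustration only.
responses = [
    (0.9, True), (0.8, False), (0.7, True), (0.6, False), (0.85, True),
    (0.75, False), (0.95, True), (0.65, True), (0.8, True), (0.7, False),
]

mean_confidence = sum(conf for conf, _ in responses) / len(responses)
accuracy = sum(correct for _, correct in responses) / len(responses)

print(f"mean confidence:    {mean_confidence:.0%}")  # how sure people said they were
print(f"actual accuracy:    {accuracy:.0%}")         # how often they were right
print(f"overconfidence gap: {mean_confidence - accuracy:+.0%}")
```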

People also display overconfidence when predicting others’ behavior, and even when predicting their own. Robert Vallone and his colleagues had Stanford University students predict in September whether they would drop a course, declare a major, elect to live off campus next year, and so forth. Although the students felt, on average, 84 percent sure of these self-predictions, they erred nearly twice as often as they expected. Even when feeling 100 percent sure, they erred 15 percent of the time.

People also routinely err when predicting the intensity and duration of their future emotions, which have a shorter half-life than we typically suppose. Get what you long for and you’ll be happy, but likely for a shorter period than you expect. Then again, suffer a disability, a romantic breakup, exam failure, tenure denial, or team defeat, and you may discover a resilience you didn’t realize you had (thanks to what Daniel Gilbert and Timothy Wilson call our “psychological immune system”).

Other studies show that, because of inflated estimates of how self-reliant they are, people are similarly overconfident in predicting how they’re going to change — by losing weight, studying harder, quitting smoking, or exercising daily. For most, the typical pattern is to start well, then to regress to old habits. (One of psychology’s maxims is that the best predictor of future behavior is past behavior, not present intentions.) Change can happen, but it’s most likely for those who realistically appreciate the challenge and the needed discipline and mental energy.

At its worst, overconfidence breeds catastrophes. It was an overconfident Hitler who invaded the countries of Europe. It was an overconfident Lyndon Johnson who sent the U.S. Army to salvage democracy in South Vietnam. People sustain such overconfidence by seeking information that confirms their decisions, and also — when failure can’t be denied — by recalling their mistaken judgments as near misses.

University of California professor Philip Tetlock observed this phenomenon after inviting various academic and government experts to project — from their viewpoint in the late 1980s — the future governance of the Soviet Union, South Africa, and Canada. Five years later communism had collapsed, South Africa had become a multiracial democracy, and Canada continued undivided. Experts who had felt more than 80 percent confident were correct in predicting these turns of events less than 40 percent of the time. Yet, thinking back on their judgments, those who erred felt they were still basically right. “I was almost right,” said many. “The hard-liners almost succeeded in their coup attempt against Gorbachev.” “The Quebecois separatists almost won the secessionist referendum.” “But for the coincidence of de Klerk and Mandela, there would have been a much bloodier transition to black majority rule in South Africa.” Among political experts — and stock market forecasters, mental health workers, and sports prognosticators — judgments persist partly because overconfidence is hard to dislodge.

Fearing the wrong things. With images of 9/11 etched on their memories, and periodic terrorist plots and warnings, travelers worry. “I’m going Greyhound rather than fly to California,” one of my relatives explained. “Al Qaeda’s not so likely to target a bus.” But people’s fears often misalign with the facts. The National Safety Council reports that from 2000 to 2002 Americans were, mile for mile, 39.5 times more likely to die in a vehicle crash than on a commercial flight. For most air travelers, the most dangerous part of the journey is the drive to the airport. In an essay written just after the World Trade Center attacks, I calculated that if we flew 20 percent less and instead drove half of those unflown miles, about 800 more people would die in traffic accidents in the next year as indirect terrorist casualties. In a follow-up article, German psychologist Gerd Gigerenzer confirmed that the last three months of 2001 indeed produced 350 more American traffic fatalities than normal.
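(The structure of that back-of-the-envelope estimate is easy to lay out. The sketch below uses round-number stand-ins for the mileage and fatality-rate inputs; they are assumptions for illustration, not the exact figures behind the original calculation.)

```python
# Back-of-the-envelope sketch of the "drive instead of fly" risk shift.
# Every figure below is an illustrative assumption, not the essay's exact input.
ANNUAL_AIR_PASSENGER_MILES = 500e9  # assumed U.S. commercial air passenger-miles per year
ROAD_DEATHS_PER_MILE = 1.5e-8       # assumed rate, roughly 1.5 deaths per 100 million vehicle-miles

flying_cut = 0.20             # fly 20 percent less...
share_driven_instead = 0.50   # ...and drive half of those unflown miles

extra_road_miles = ANNUAL_AIR_PASSENGER_MILES * flying_cut * share_driven_instead
extra_road_deaths = extra_road_miles * ROAD_DEATHS_PER_MILE

print(f"extra miles driven: {extra_road_miles:,.0f}")
print(f"additional expected traffic deaths: {extra_road_deaths:,.0f}")
# With these stand-in numbers the estimate lands in the high hundreds,
# the same order of magnitude as the figure cited above.
```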

Why, when we rely on our own sense of what endangers us, do we so often fear the wrong things? Why do so many smokers, whose habit shortens their lives on average by about five years, fret before flying, which, averaged across people, shortens life by one day? Why do we fear violent crime more than obesity? Why do most women fear breast cancer more than heart disease, which kills five times as many women? Why do we fear terrorism more than accidents — which kill nearly as many people per week in the United States alone as did worldwide terrorism during the 1990s? Even with the horrific scale of 9/11, more Americans in 2001 died of food poisoning (which scares few) than terrorism (which scares many).

Psychological science has identified four influences on our intuitions about risk.

We fear what our ancestral history has prepared us to fear, which includes confinement and heights.

We fear what we cannot control. When behind the wheel of a car, we feel self-reliant, and hence confident.

We fear what is immediate. Flying telescopes its risk into short time frames, while driving or smoking diffuses the danger over time. Global warming may be a potent future weapon of mass destruction, but we live in the present.

We fear threats most readily available in memory. Horrific images of terrorist acts form indelible memories. And the availability of various threats in our memory provides our intuitive rule for judging risks. If a surface-to-air missile brings down a single American commercial airliner, the result will be traumatic for the airline industry. Given the difficulties in grasping the infinitesimal odds of its being a plane you’ll be on, probabilities won’t reassure people. Images and emotions will hijack the mind.

For these reasons, we fear too little those threats that claim lives undramatically, one by one, rather than in bunches. Smoking gradually and quietly kills 400,000 Americans a year, while terrorists kill many fewer in ways that cause more terror. It is normal to fear terrorism from those who hate us, and when terrorists strike again we will understandably recoil in horror. But smart thinkers will not be too self-reliant when judging risk. They will check their intuitive fears against the facts and will resist political fear-mongering. They will be mindful of the realities of how humans die. And with such mindfulness they will take away the terrorists’ omnipresent weapon: exaggerated fear that diverts our attention from even bigger dangers.

Intuition — automatic, effortless, unreasoned thinking — guides our lives. Intuition feeds our automatic behaviors and grows from learned expertise. But intuition is also perilous. In realms from sports to business to risk assessment, we now understand how self-reliant intuitions may go before a fall, and why and when we should check our intuitions against the facts. So, let us welcome the creative whispers of the unseen mind, but as the beginning of inquiry. Smart thinking, critical thinking, often begins with self-reliant hunches, but continues as one examines assumptions, evaluates evidence, invites critique, and tests conclusions.