Seeing is Not Believing:

Insights for Intelligence Analysis from Professional Magicians

William Lawhead
University of Mississippi

Does magic really have something to contribute to the task of intelligence? This seems like an unlikely thesis. However, professional magicians have often contributed to their nations’ interests in war. Jean Eugène Robert-Houdin, the 19th-century French magician, is considered the father of modern magic. In 1856 the French government sent Robert-Houdin to Algiers to quell a potential rebellion. Robert-Houdin’s “miracles” impressed the Algerians with the supernatural power of the French, and the rebellion was avoided. Jasper Maskelyne was a British stage magician of the 1930s and 1940s who came from a family of magicians. During World War II he worked for the British military, applying his skill at developing stage illusions to the cause of wartime camouflage, ruses, and deception.1

Magicians have also been involved with espionage. The best-known case is that of John Mulholland, a prominent American magician in the mid-twentieth century. In 1953 he retired from the magic scene, supposedly for health reasons. We now know that he was working on a manual for CIA agents, which has since been declassified and published as The Official CIA Manual of Trickery and Deception.

While battlefield activities and espionage receive support from intelligence analysis and contribute to it, the three activities are not the same. Hence, I will have to go further in supporting the claim that magic has relevance for intelligence analysis.

Deception, Conception, and Misperception

The magician and the strategic deceiver are similar in that they are both trying to manage the perceptions and the cognitive processes of their targets. On the other hand, the magician’s audience and the intelligence analyst are similar in that they both are trying to figure out what is really going on and to detect and penetrate any attempts to deceive them.

What enables us to be deceived (whether in a magic show or in an intelligence context) is that certain features of the human visual and cognitive apparatus allow us to misperceive what is going on and/or to draw the wrong inferences from our observations. In magic, the attempt to deceive is deliberate and calculated. The same is true for strategic deception. In both cases there is a deceiver and there is the deceived. However, the same visual and cognitive mechanisms that make these sorts of deception possible also make self-deception possible. Magician Roberto Giobbi says, “[O]ur spectators fool themselves. All you need to do is avoid any words, thoughts or actions that interrupt this tendency.”2 Similarly, because of these same cognitive mechanisms, an intelligence analyst can make faulty judgments even if he or she is not the victim of purposeful deception. Hence, this paper is not concerned exclusively with counterdeception.

While I will discuss principles of deception that are of common interest to magicians and analysts, there is a deeper and more interesting point of contact between these two professions. The effectiveness of the magician’s deception is a function of how successful the performer is at using applied psychology. This is more important to the experience of magic than are the secret moves and gimmicks the magician may use. The main theme of this paper is that understanding the psychological principles underlying magical performances can be helpful to the analyst. Hence, the major point of contact between magicians and intelligence analysts could be termed perception management. This is a term created by the military and is not found in magic literature even though it exactly describes what magicians do. The following is the definition of “perception management” provided by the U.S. Department of Defense, but stripped of the terms that are specific to the military.

Perception Management: Actions to convey and/or deny selected information and indicators to . . . audiences to influence their emotions, motives, and objective reasoning . . . ultimately resulting in . . . behaviors and . . . actions favorable to the originator’s objectives. . . .3

The contribution magicians can make to our understanding of perception is indicated by the fact that in 2007 the Association for the Scientific Study of Consciousness held its annual meeting in Las Vegas. The location was chosen to accommodate a special symposium on the “Magic of Consciousness.” At this meeting, well-known professional magicians lectured to neuroscientists, psychologists, and cognitive scientists on what they know about human perception. The neuroscientists discovered that the magicians understood quirks in human perception that brain scientists and psychologists had not yet identified or investigated. Teller, the silent one of the magic team of Penn and Teller, explains: “Neuroscientists are novices at deception. Magicians have done controlled testing in human perception for thousands of years.”4 Since the conference, there has been a flurry of experiments and published research in which scientists attempt to explain scientifically why our eyes and brains are fooled. These experiments involve eye-tracking technology and brain scans of subjects watching a magic performance.5

Magic and the Brain

Paradoxically, the challenge facing our brains is both too little information and too much information. At any given time we are dealing with only a very limited slice of the world around us, from which we must draw conclusions about the whole. From our fragmented and partial sensory input the brain must construct a model of the reality it is confronting. In perception, for example, we can see only a few sides of an object. We have to fill in the details by making assumptions about what lies outside of what we directly perceive, drawing on our memory of what we have experienced before. At the same time, we are bombarded with more stimuli than we can handle. Hence, the brain takes in and processes what it decides is relevant and ruthlessly ignores 95 percent of what is happening.6

The fact that the brain necessarily interprets the data it receives, and the fact that it must select what data it deems relevant and block out the rest, are important features of experience for both magicians and analysts. Political psychologist Robert Jervis observes, “Facts can be interpreted, and indeed identified, only with the aid of hypotheses and theories. Pure empiricism is impossible. . . .”7 It is the sneaky and self-serving goal of magicians to provide that interpretive framework for their audience. With respect to analysts, if they ignore the fact that the brain selects and organizes the data, then the interpretive scheme they employ will be unconscious, unexamined, and based on preconceptions and biases.

Misdirection

There is no doubt that misdirection plays a central role in magic. It also has a role in strategic deception. In both cases, misdirection is an attempt to control what the audience or a target perceives and how they interpret what is perceived. The prefix “mis-” comes from the Old English for “wrong” or “bad,” while “direction” can mean “the act of pointing out the proper route.” Hence, psychologist and magician Gustav Kuhn says, “Misdirection can therefore literally be defined as pointing out the wrong way.”8 A more complex definition of “misdirection” that could apply to both magic and strategic deception is:

Any number of actions, words, events, or set of conditions that direct the target’s attention away from what the deceiver wants concealed or ignored and that focuses attention on what the deceiver wants the target to perceive or to think.

Most magic theorists say there are two kinds of misdirection: physical and psychological.9 Physical misdirection, which is usually visual, is an attempt to control the spectator’s gaze, by influencing where the spectator looks.10 To do this, the deceiver needs to create areas of primary (or high) interest and areas of secondary (or low) interest. Some magicians refer to these two subjective divisions of the perceptual field as “illuminated areas” and “dark areas.” In reality, the area of secondary interest is what should be of maximum interest to the spectator, for this is where the sneaky moves are made. However, if the magician is successful, the area of secondary interest appears to have minimal relevance and the area designed to be of high interest attracts all the attention. Hence, there is a slogan in magic which says, “Big actions cover little actions.”

On the other hand, psychological or cognitive misdirection is an attempt to control the spectator’s attention and to shape what the spectator thinks. As with physical misdirection, areas of high interest and low interest are created in the data provided, so that the focus of attention is misplaced. According to some theories, attention is like a spotlight: it is very focused but can miss what is in the periphery. In psychological misdirection, the deceiver pivots the spotlight of the mind toward the wrong place at the right time. As Richards Heuer, a former CIA analyst, says: “Deception is, above all, a cognitive phenomenon; it occurs in our minds.”11

The rest of this paper discusses various ways in which magicians carry out misdirection and “hack” our visual and cognitive systems. In each case, an example from magic will be discussed and then an application to intelligence will be made. The final section on “Anomaly Analysis” will present some of the limitations of both the magician’s craft and strategic deception that make efforts at counterdeception possible.

Assumptions: Dangerous but Essential

As was mentioned previously, our brain has to fill in the gaps in experience. It does this by making assumptions about what it is observing. Hence, it is a mistake to suppose that assumptions are avoidable. The problem lies with assumptions that are taken for granted, implicit, unexamined, and insufficiently grounded.

Assumptions play a central role in all deception. The magician or strategic deceiver will suggest assumptions to us or exploit assumptions they know we have. However, even if no one external is attempting to deceive us, our own preconceptions and biases can be the source of unwarranted assumptions.

In 1907 magician David Abbott invented the now-famous trick of causing a golden ball to float around in the air. After entertaining his friends in his parlor, he would casually leave the ball on a bookshelf while he went to get drinks. The guests would take advantage of the situation to inspect the ball. Lifting it up, they discovered that it was enormously heavy, obviously too heavy to be supported by a thread. What they failed to realize was that Abbott’s deception continued long after he left the room. In other words, his friends had accepted the planted assumption that the ball he left on the shelf was the same one he used in his performance.

The magician Teller published an article in the Smithsonian magazine in which he reveals the secret of a trick in order to illustrate some of the principles magicians use in perception management.12 In his own words, the effect is as follows:

I cut a deck of cards a couple of times, and you glimpse flashes of several different cards. I turn the cards facedown and invite you to choose one, memorize it and return it. Now I ask you to name your card. You say (for example), “The queen of hearts.” I take the deck in my mouth, bite down and groan and wiggle to suggest that your card is going down my throat, through my intestines, into my bloodstream and finally into my right foot. I lift that foot and invite you to pull off my shoe and look inside. You find the queen of hearts. You’re amazed. If you happen to pick up the deck later, you’ll find it’s missing the queen of hearts.

Here’s the secret to Teller’s trick. Before the performance he slips a queen of hearts in his right shoe, an ace of spades in his left shoe, and a three of clubs in his wallet. Then he creates an entire deck out of duplicates of those three cards. It requires 18 decks to acquire that many duplicates. The left column of the following table steps through each phase of the trick and the right column shows the assumptions that the spectator is induced to make.

The Steps of the Trick / The Spectator’s Assumptions

Step 1. Teller removes a deck of cards from his pocket.
Assumption 1. The deck is a normal deck. (Default assumption)

Step 2. He cuts the deck several times, flashing several faces, but too quickly for the spectator to notice anything but their color.
Assumption 2. If I see several different cards, that means that all 52 cards are different. (Hasty generalization)

Step 3. The spectator chooses a card, looks at it, and returns it to the deck.
Assumption 3. If I am given a choice, then I acted freely. (However, the choice is limited to three cards.)

Step 4. Teller pretends to swallow a card and reveals that it has landed in his right shoe (or left shoe or wallet).
Assumption 4. The card I chose is the same physical object as the one in Teller’s shoe (or wallet).

Step 5. Where the card is revealed depends on which of the three forced cards is chosen.
Assumption 5. If I had chosen a different card, then it too would have ended up in Teller’s shoe. (False assumption)

Step 6. While the spectator is examining the revealed card, Teller swaps the rigged deck for a normal one. He sets this deck down to tempt the spectator to examine it.
Assumption 6. If I examine the deck afterwards, I can be sure that it was the same one used in the trick.

Step 7. The spectator finds that his card is missing from the deck. (But Teller has removed all three possible choices from the otherwise normal deck.)
Assumption 7. The card I chose, which ended up in Teller’s shoe, is the only card missing from the deck.
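As an aside, Teller’s figure of 18 decks is straightforward arithmetic. Since an ordinary deck contains exactly one of each card, a 52-card deck built from only three values forces some value to appear at least ceil(52/3) = 18 times, and a split such as 18 + 17 + 17 shows that 18 decks also suffice. A minimal check of that arithmetic:

```python
import math

# Sanity check on the "18 decks" figure: each ordinary deck supplies exactly
# one queen of hearts, one ace of spades, and one three of clubs, so some
# value must appear at least ceil(52 / 3) times in the 52-card gimmicked deck.
min_decks = math.ceil(52 / 3)
print(min_decks)           # -> 18
print(18 + 17 + 17 == 52)  # -> True, so 18 decks are also sufficient
```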

During a routine, magicians try to facilitate the creation of assumption chains in the minds of the audience members, each assumption reinforcing the others, so that when the apparent miracle occurs, the spectator does not know which assumption to question. For example, in Teller’s trick Assumption 4 (the card that is revealed is the same physical card that was chosen) and Assumption 7 (the spectator’s card is the only one missing from the deck) reinforce each other.

In intelligence, as in magic, reliance on unexamined assumptions can make one the subject of deception or even self-deception. One way in which this happens is the phenomenon of “layering.” This is the mistake of basing conclusions on previous assumptions without carrying forward the uncertainties of the previous layer.13 A good example of valid inferences and true premises layered upon an unexamined assumption is found in the 1973 Yom Kippur War. A leading assumption of the Israelis and the Americans was so well-entrenched that the Israeli post-mortem referred to it as “the Concept.” This was the assumption that Egypt would go to war only for the sake of a military victory. A military victory, however, would require Egypt to have sufficient air power to neutralize Israel’s air force. The following layers of reasoning were built on these assumptions.

1. If it is not possible for Egypt to have a military victory, then Egypt will not attack. (Unexamined assumption = “the Concept”)
2. If Egypt has insufficient air power, then it is not possible for them to have a military victory. (Reasonable military judgment)
3. Egypt has insufficient air power. (Well-established observation)
4. So, it is not possible for Egypt to have a military victory. (Inference from 2, 3)
5. Therefore, Egypt will not attack. (Inference from 1, 4)

This reasoning was deceptive because the two arguments are valid and several of the premises were definitely true. However, this chain of reasoning was built upon a questionable assumption. Rather than pursuing a military victory, Anwar Sadat went to war to accomplish political goals.14
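The cost of layering can be made vivid with a toy calculation. The sketch below is a minimal model with invented probabilities (not the Israelis’ actual estimates): it treats each layer of the reasoning above as merely probable and, assuming for simplicity that the layers are independent, multiplies the uncertainties forward.

```python
# A toy model of "layering": carrying each layer's uncertainty forward
# instead of silently treating earlier conclusions as certain.
# All probabilities are invented for illustration.

layers = [
    ("The Concept: Egypt attacks only for a military victory", 0.70),
    ("Victory would require neutralizing Israel's air force",  0.90),
    ("Egypt lacks sufficient air power",                       0.95),
]

cumulative = 1.0
for claim, p in layers:
    cumulative *= p  # crude independence assumption
    print(f"{claim}: p = {p:.2f}, cumulative = {cumulative:.2f}")

print("\nNaive confidence in 'Egypt will not attack': 1.00")
print(f"Confidence with uncertainty carried forward: {cumulative:.2f}")
```

Even with generous numbers, the conclusion retains only about 60 percent confidence, a gap that question 4 below asks the analyst to confront.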

Critical Questions about Assumptions

1. What are the assumptions I am bringing to the data?
2. What grounds do I have for adopting them?
3. What information should cause me to rethink my assumptions?
4. Is the final level of certitude attached to my conclusions consistent with the certitude levels of the assumptions on which my conclusions are built?

The Seductiveness of the Familiar

Our sense of what is familiar, routine, repetitive, or “normal” can be an important weapon in the hands of a magician or strategic deceiver. Seemingly familiar actions can cause the target to miss the novel, the unusual, and the sneaky that is embedded within routine actions. Repetition has four psychological effects. If you see something done repeatedly in a particular way: (a) you will expect it to continue to happen in the future; (b) you will suppose that what happens in the future will happen in the same way it did in the past; (c) since what is happening now resembles what happened before, you will infer that they are exactly the same type of event; and (d) since repetition tends to be both reassuring and boring, it leads to complacency, which causes us to lose focus and diminishes our attention. We tell ourselves, “Nothing to be suspicious about here; I’ve seen this before.” Hence, repeated actions that are familiar, consistent, and uniform create the feeling that things are normal, which reduces the target’s sense of suspicion. Magicians sometimes refer to such a repeated action as a “conditioning action.”

Typically, the magician will lure us into a relaxed mode by repeating an action several times before making a sneaky move. For example, he will toss a ball back and forth between his two hands, to get us used to the idea of the ball being passed from his right hand to his left. Then he will do a “fake transfer,” apparently placing the ball into his left hand as he had done before, only to secretly retain it in his right, thus leaving the audience deceived as to the location of the ball. Some spectators will report that they definitely saw the ball pass through the air into the left hand, when in fact this was a false memory on their part, created by repetition.15

One way in which repetition can serve deception is in the case of a series of false positives that are a prelude to a genuine positive. This is an example of the “crying wolf” effect.

The close-up magician Tony Slydini (who has been called “the master of misdirection”) perfected this type of strategy. For example, while handling a card chosen by a spectator, Slydini would suddenly make a very quick, suspicious maneuver. The spectator would be alarmed, thinking that something sneaky had been done. Slydini then showed that the spectator’s card was still where it was supposed to be. The spectator would feel sheepish for jumping to conclusions and would relax his guard. While the spectator’s attention was occupied with trying to figure out what had just happened, Slydini would then make the sneaky move, completely undetected.

The Egyptian attack in the 1973 war took place in a context that was carefully designed to exude a sense of normalcy and business-as-usual. For years the Egyptians had engaged in annual autumn military exercises. In 1973, these annual exercises were highlighted in the newspapers. Ironically, drawing attention to the exercises actually served to diminish their importance.

In late September, troops began to move forward toward the Suez Canal, consistent with the annual autumn exercises. However, to give the impression that this was nothing to be concerned about, the soldiers were forbidden to wear helmets. Furthermore, they could regularly be seen without shirts or weapons. Special units known as “lazy squads” sat on the bank of the canal fishing and eating oranges. The atmosphere was casual and low-key. The Egyptians knew their enemy well. They exploited the Israeli prejudice that the Egyptians lacked military discipline.

Similar to the magician’s technique of wearing down the audience’s defenses by arousing suspicions only to defuse them, the Egyptian attack was preceded by a series of false alarms. Since 1971 there had been three major mobilizations and several minor ones, none of which culminated in an offensive strike. These unfulfilled threats reinforced Israeli preconceptions of the Egyptians as incompetent and indecisive. It was thought that Anwar Sadat was capable only of posturing. Prime Minister Golda Meir said: “No one in this country realizes how many times during the past year we received information from the same source that war would break out on this or that day, without war breaking out.”16
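In probabilistic terms, the “crying wolf” effect can be sketched as a Bayesian update in which the deceiver has deliberately made the warning sign common. The numbers below are invented for illustration, not drawn from the historical record.

```python
# A toy Bayesian reading of the "crying wolf" effect: once deceptive
# mobilizations are routine, a real mobilization carries little weight.

def posterior(prior_attack, p_warn_given_attack, p_warn_given_quiet):
    """P(attack | warning sign observed), by Bayes' rule."""
    joint_attack = prior_attack * p_warn_given_attack
    joint_quiet = (1 - prior_attack) * p_warn_given_quiet
    return joint_attack / (joint_attack + joint_quiet)

p_warn_given_attack = 1.0  # a real attack is always preceded by mobilization
p_warn_given_quiet = 0.75  # but, since 1971, mobilization usually means nothing

for prior in (0.5, 0.2, 0.05):
    post = posterior(prior, p_warn_given_attack, p_warn_given_quiet)
    print(f"prior P(attack) = {prior:.2f} -> posterior = {post:.2f}")
```

The mobilization signal is diluted not because it fails to precede an attack, but because the false alarms have made it almost equally consistent with business as usual, which is exactly what the Egyptian deception accomplished.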

Critical Questions about the Familiar

1. Is this really something I have seen before, something that is routine? Or is it something new that is disguised as the familiar?

2. Have I been conditioned by previous false positives to not pay attention to what is happening now?

Premature Pruning of Hypotheses

It is rare that a correct hypothesis is consistent with all the data. Scientists realize that every large-scale theory conflicts with some observations. However, it is tempting to lighten our task by eliminating a hypothesis prematurely, simply because some information cannot be reconciled with it. The problem is that once a hypothesis has been eliminated, it is hard to put it back on the table.
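In Bayesian terms, pruning a hypothesis amounts to assigning it a prior probability of zero, and a zero prior can never be revived by later evidence; statisticians call the remedy “Cromwell’s rule.” A minimal sketch, with invented likelihoods:

```python
# Why a pruned hypothesis never comes back: a prior of exactly zero is
# unrecoverable, no matter how strong the subsequent evidence is.

def update(priors, likelihoods):
    """One round of Bayesian updating over competing hypotheses."""
    joint = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(joint.values())
    return {h: p / total for h, p in joint.items()}

# "Missiles in Cuba" has been pruned outright.
beliefs = {"missiles": 0.0, "no_missiles": 1.0}

# Eyewitness reports are far more likely if there really are missiles.
likelihoods = {"missiles": 0.9, "no_missiles": 0.1}

for n in range(1, 4):
    beliefs = update(beliefs, likelihoods)
    print(f"after report {n}: P(missiles) = {beliefs['missiles']:.2f}")
# P(missiles) stays at 0.00 forever; with even a 0.01 prior, three such
# reports would have pushed the posterior close to 0.9.
```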

One of the most effective ways to deceive an audience or an adversary is to get them to prune the hypothesis that is actually true and that provides the explanation they seek. For example, the ancient magic trick of the cups and balls involves small balls disappearing from the magician’s hand and mysteriously appearing underneath previously empty cups. From the audience’s perspective, the trick involves three cups and only three balls. Some magicians will directly confront the suspicions the audience has by saying “I have often been accused of using a fourth ball.” (In reality, a secret extra ball is essential to the working of the trick.) Having introduced this hypothesis, the magician then says, “Actually, this is true. In addition to the three balls you are aware of, I do use an extra fourth ball and here it is.” He then lifts up one of the cups to reveal a ball the size of a tennis ball, which is quite unlike the one-inch balls used in the trick. This manipulates the audience’s thinking in the following two ways.

1. The audience concludes: “Since the magician mentioned the extra ball hypothesis, it must be false.”

2. Instead of mentally reconstructing the previous tricks using the extra ball hypothesis, the audience now has been psychologically misdirected to ponder how the large ball was secretly placed in the cup.

Similarly, during the 1960s, prior to any Soviet missiles arriving in Cuba, there were persistent rumors of the presence of Russian missiles. It is possible that the Soviets used this technique of inducing U.S. intelligence to prune the true hypothesis. According to some reports, the Soviets planted rumors among anti-Castro Cuban refugees in Florida that the Russians were delivering missiles to Cuba.17 When the refugees reported these rumors to the U.S. authorities, it was assumed that, since the claim was so unbelievable, the refugees must be making it out of anti-Castro sentiment and a desire to provoke the U.S. into attacking Cuba. Because the Soviets had poisoned the well in this way, when actual eyewitness reports of missiles in Cuba began surfacing, U.S. intelligence dismissed them as biased and naïve.

A classic example of a magician getting the audience to prematurely eliminate a hypothesis is the case of “the tuned deck,” an invention of Ralph Hull, an early 20th-century magician. The trick was a typical pick-a-card-shuffle-it-back-into-the-deck-and-I-will-find-it kind of trick. What made it different from most tricks is that its secret became even more elusive the more it was performed. Hull performed the trick over and over again to hundreds of professional magicians and none of them could figure out how he did it. The psychology of the trick was perfect. Hull would systematically eliminate each possible method of doing the trick in the minds of his fellow magicians. The first time the trick was performed, the knowledgeable observer would suspect that the card had been subtly forced on the spectator. However, the second time, the trick was performed in such a way that a force was impossible. Therefore, the audience of magician hypothesis-testers then suspected that Hull used a false shuffle in order to control the location of the selected card. This hypothesis was exploded in the next round when Hull allowed a spectator to shuffle the deck. The next hypothesis proposed was that Hull used a trick deck. But this hypothesis was eliminated when Hull used a borrowed deck. At this point, Hull could go back to using any of the methods that had previously been ruled out.

Magicians call this type of routine “Canceling Methods.” The same effect is repeated multiple times, using a different method each time. Each method has its weakness, but the weakness of each method is canceled out in the next phase by the strengths of another. This technique suggests two points. (1) In magic there are always multiple methods for achieving a given effect; in intelligence, there may be multiple possible explanations for any given data. (2) In magic and intelligence, the situation may be dynamic: what was true at time T1 may not be true at time T2.

An example of the premature pruning of a hypothesis is the surprise attack on Pearl Harbor. Initially, analysis indicated that Japan would not attack American territory, because it would not be in Japan’s interest to draw the United States into the war. That was probably true at the time. This strategic assumption meant that Pearl Harbor was in no danger. But the situation changed, and American intelligence began to worry that Japan might attack the Philippines. This meant that one of the major reasons for thinking that Pearl Harbor was not threatened had been removed. However, the original hypothesis that Pearl Harbor was not in danger was never reexamined.18

Critical Questions about Pruning Hypotheses:

1. What are my reasons for rejecting this hypothesis?
2. Do I have a principled or evidential reason for rejecting it?
3. Is it possible that the hypothesis being rejected and the evidence that is contrary to it can be reconciled in some way?
4. What conditions would require me to reconsider the viability of this hypothesis?
5. Have any conditions changed that would affect the initial assessment of the plausibility of this hypothesis?

Premature Closure

The opposite of prematurely pruning a hypothesis is the cognitive pitfall of premature closure on one particular hypothesis. A subtle technique of highly experienced magicians is known as “the method of false solutions.” In this technique, the magician plants in the minds of the audience a false solution of how the trick is done. This leads the audience to construct a bogus interpretation of what is happening, which leads them away from the real method. Feeding the audience a false solution satisfies reason and logic, because a rational explanation of the trick is now in hand. At the end of the trick, however, the magician shows that the solution they have in mind is incorrect. By then it is too late for the audience to reconstruct an alternative story of what happened.

For example, a classic trick is the dove vanish box. A live dove is placed in a box that rests on a table. The magician makes the appropriate magical gestures and then, one by one, the sides of the box are disassembled and taken away by the assistant. However, the audience notices that the fringed apron of the table is wide enough to conceal the dove. To reinforce this false solution, some feathers are left peeking out from underneath the side of the table. The audience members are smug in the belief that they have discovered the secret. But then the magician pulls out a feather duster and dusts the table, showing that the feathers were a diversion. The apron of the table is then ripped away and the table is disassembled. At that point, the audience realizes that their theory about the method is incorrect, but by then it is too late to reconstruct what actually happened.

The danger of premature closure is abundantly illustrated in the history of intelligence. Take, for example, the initial skepticism about offensive missiles in Cuba. Prior to 1962 the Soviet Union had never placed nuclear weapons outside its borders, not even in the communist satellite states of Eastern Europe. When there were fears that this might happen in Cuba, the U.S. Special National Intelligence Estimate of September 19 declared that Soviet missiles in Cuba “would be incompatible with Soviet practice to date and with Soviet policy as we presently estimate it.”19 When a situation is dynamic, or when an event violates previous paradigms, previously settled conclusions can make us blind to the unique features of the present situation. The Soviets had kept missiles out of Eastern Europe for fear that they could one day be used against the USSR. The situation of missiles placed in Cuba, however, was unique: medium-range or intermediate-range missiles placed there could be used against Washington or New York, but they could not reach Moscow.

Critical Questions about Selecting a Hypothesis:

1. Even though my preferred hypothesis is consistent with the data, could the appeal of this solution result from deception or my own preconceptions?

2. In embracing this one hypothesis, am I ignoring evidence that would support other hypotheses?

3. In settling on the one solution, are there sufficient reasons for dismissing the other hypotheses?

4. What evidence would cause me to reject this hypothesis? Am I open to recognizing this evidence should it appear?

5. What if my conclusions should turn out to be completely wrong? How then would I reconceptualize the situation or data? Should I do that now? (Pre-mortem analysis)

Limits of the Imagination: Mistaking the Unbelievable for the Impossible

In contrast to the problem of prematurely pruning hypotheses, this analytic problem concerns hypotheses that are not even on the horizon of possibilities. In the history of science, there have been a number of world-changing theories, such as those of Copernicus, Galileo, Newton, and Einstein, as well as Pauli’s postulation of the neutrino. In every single case, the claims put forth were considered by their scientific contemporaries to be not simply false but absolutely impossible. Eventually, these theories climbed up the cognitive ladder from “impossible,” to “unbelievable but possible,” to “believable but highly unlikely,” to “believable and worthy of consideration,” and finally to acceptance by the scientific community. This confusion between the impossible and the merely unbelievable takes place in both magic and intelligence.

In one of their books, magicians Penn and Teller describe a trick that involves destroying a banknote and magically restoring it. However, the secret method requires the magician to really destroy one bill and, instead of miraculously restoring it, to substitute another bill for the original. Penn and Teller point out that the trick works fine with, say, a one-pound note, but it is really impressive and deceptive when done with a twenty-pound note (or higher), because no one can believe you would destroy twenty pounds just to do a trick. Hence, because the audience refuses to entertain what is possible but unbelievable, they are induced to reject an essential feature of the secret.

In an article on “Intelligence Epistemology: Dealing with the Unbelievable,” Mark Lowenthal describes the category of the “unbelievable” as “facts that are true but are so staggering, so far from the norm or the predictable, as to not be believable.”20 Lowenthal’s most important point is that the unbelievable is too often identified with the “impossible,” whereas very few things belong in the latter category.

Similarly, in an article titled “Capturing the Potential of Outlier Ideas in the Intelligence Community,” Clint Watts and John Brennan describe outliers as data and hypotheses that are too quickly dismissed because they are “outlandish, unthinkable, and wholly anomalous.”21 Among the many examples they give of outlier hypotheses that turned out to be true are the case of Soviet missiles in Cuba and the thesis that Saddam Hussein had abandoned his weapons of mass destruction program. The problem is that some outliers end up being the “seeds of surprise.” This is illustrated in the insight of German Field Marshal Helmuth von Moltke:

In war you will generally find that the enemy has at any time three courses of action open to him. Of those three, he will invariably choose the fourth.

The magician or strategic deceiver has succeeded if they can keep our minds on a false alternative that is seemingly plausible and obvious while we ignore the true conclusion because it is too “unbelievable” to consider.

Critical Questions about the Unbelievable:

1. Have I exhaustively considered all the hypotheses?
2. Am I limiting my alternatives to what is typical and familiar?
3. Am I failing to consider a hypothesis because of my unexamined preconceptions concerning what is possible, likely, or believable?
4. Have I underestimated my opponent’s willingness to take risks, choose less than optimal alternatives, or accept short-term losses in order to achieve their long-term goals?

Anomaly Analysis

This final section concerns not methods for deception but a method for uncovering deception. R. V. Jones, the British intelligence expert, wrote, “No imitation can be perfect without being the real thing.”22 Barton Whaley (an expert on strategic deception as well as a magician himself) expanded on Jones’s insight with his “Plus-Minus Rule.”23 He points out that every imitation must lack some characteristic that the original has (a minus), and it usually has some characteristic not found in the original (a plus). A magician is engaged in imitation. As Robert-Houdin famously said, “A magician is an actor, playing the part of a magician.” In other words, no one is really a magician. But when magicians invent a trick they sometimes ask themselves, “If I really had supernatural powers, how would this routine go? Given that I do not have such powers, how can I use deception to come as close as possible to imitating someone who does?” However, if Jones and Whaley are correct, every magic routine will necessarily contain some incongruities. Hence, when a professional magician attempts to imitate someone with genuine supernatural powers, his routine will typically involve some unnatural actions and some gratuitous actions. The unnatural actions will have to be disguised and made to look normal in some way, and the gratuitous actions will have to be artificially motivated or justified.
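Whaley’s rule lends itself to a simple checklist discipline: compare what is observed against the signature the genuine activity would present, and flag both what is missing (the minuses) and what is gratuitous (the pluses). The sketch below is a minimal illustration; the indicator names are invented, loosely echoing the 1973 example discussed below.

```python
# A minimal sketch of Whaley's Plus-Minus Rule as set comparison.
# Indicator names are invented examples, not a real collection taxonomy.

def plus_minus(expected, observed):
    """Return (minuses, pluses) for an observed activity vs. its claimed identity."""
    minuses = expected - observed  # the genuine article would show these
    pluses = observed - expected   # the imitation adds these gratuitously
    return minuses, pluses

# Hypothetical signature of a routine autumn training exercise...
exercise = {"troop movement", "bridging drills", "radio traffic",
            "relaxed discipline", "equipment returned to depots"}

# ...versus what is actually observed on the ground.
observed = {"troop movement", "bridging drills", "radio traffic",
            "relaxed discipline", "forward ammunition stockpiles",
            "unusually realistic rehearsals"}

minuses, pluses = plus_minus(exercise, observed)
print("Minuses (expected but absent):  ", sorted(minuses))
print("Pluses (present but gratuitous):", sorted(pluses))
```

The items that surface in each column are exactly the unnatural and gratuitous actions described above.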

On the other hand, when a spectator, such as another magician, wants to figure out a trick, she can ask, “In what ways does this performer’s behavior deviate from what it would be if he really had the powers he is pretending to have?” and “What is he trying to make me believe?” The anomalies, the gratuitous actions, and the weakly justified actions will be the key to the secret. For example, I once saw a magician place his pretty assistant in a cabinet, from which she promptly disappeared only to appear again in another cabinet at the far side of the stage. The assistant wore a garish pink wig and sunglasses. The sunglasses were anomalous and lacked justification. However, their presence, while odd, was necessary to the trick. One can disguise the second assistant to look like the first, but the eyes are a dead giveaway, and so they had to be concealed. To cite another example, a magician may need to ditch a palmed coin. He could do so by putting his hand in his pocket and leaving the coin there. However, that would be a gratuitous, unmotivated action and, hence, suspicious. Instead, he may claim that he needs some “magic dust” and reach into his pocket to obtain the invisible dust while leaving the coin behind. While this is an improvement over the gratuitous action, it is still only weakly justified.

Apparently, the Egyptian preparations for an attack contained some detectable, non-normal, anomalous characteristics. On the morning of October 6, 1973, as the attack was taking place, the U.S. President’s daily briefing for that day was being read. (It had been prepared before the news of the attack.) It said: “The Egyptians and Syrians have engaged in military exercises every fall. This year their military exercises are unusually realistic” (emphasis added).24 Unfortunately, the analysts did not dwell on this anomaly. The Egyptian exercises were “unusually realistic” because this year they were different: they were preparation for an imminent attack.

Another case of a suspicious anomaly was that of the large Soviet ships traveling to Cuba that were riding too high in the water to be carrying the farm equipment or conventional weapons claimed in the cover story. Instead, the ships had to be carrying extremely long but unusually light cargo. However, the strategic assumption that the Soviets would not place offensive missiles in Cuba prevented intelligence analysts from initially giving much attention to this revealing anomaly.

Critical Questions about Anomalies:

1. What is anomalous in what I am observing? What are the actions that are not sufficiently motivated?

2. Is there something missing that should be there if things were normal?

3. Is there something extra that should not be there if things were normal?

I will close with a quote from John McLaughlin, former Deputy Director of Central Intelligence in the U.S. In addition to his career in intelligence, McLaughlin is recognized as an accomplished amateur magician. He once said to an audience of magicians, “Magic and Intelligence are really Kindred Arts.” I hope that my discussion has helped support that point.

References

1 In the case of both Robert-Houdin and Jasper Maskelyne, the legends of their wartime exploits come mainly from their own memoirs. In both cases there may have been some exaggeration and literary embellishment. Nevertheless, even if all the details are not correct, it is clear that both applied their magical skills to the war effort of their respective nations.

2 Roberto Giobbi, Card College, vol. 2 (Seattle: Hermetic Press, 1996).

3 Department of Defense Dictionary of Military and Associated Terms, Joint Publication 1-02, 12 April 2001 (As Amended Through 17 December 2003).

4 “Teller Reveals His Secrets,” Smithsonian, March 2012. Also at http://www.smithsonianmag.com/arts-culture/Teller-Reveals-His-Secrets.html.

5 See the account of this conference by the two neuroscientists who organized it in Stephen Macknik and Susana Martinez-Conde, Sleights of Mind: What the Neuroscience of Magic Reveals about Our Everyday Deceptions (New York: Henry Holt, 2010).

6 Macknik and Martinez-Conde, pp. 8-15.

7 Robert Jervis, “Hypotheses on Misperception,” World Politics, vol. 20, no. 3 (April 1968), p. 457.

8 Gustav Kuhn and Luis M. Martinez, “Misdirection—Past, Present, and the Future,” Frontiers in Human Neuroscience, vol. 5, no. 172 (January 2012), http://www.frontiersin.org/human_neuroscience/10.3389/fnhum.2011.00172/full

9 Peter Lamont and Richard Wiseman, Magic in Theory: An Introduction to the Theoretical and Psychological Elements of Conjuring (Hatfield: University of Hertfordshire Press, 1999), p. 36.

10 Although most physical misdirection is visual, it could include manipulating the other senses. For example, an explosion that captures the audience’s attention would be auditory misdirection. Magicians who do pickpocketing in their act will grab a spectator’s arm with a firm grip, causing him not to notice the light touch of the other hand which is lifting his wallet. This would be tactile misdirection.

11 Richards J. Heuer, Jr., “Strategic Deception and Counterdeception: A Cognitive Process Approach,” International Studies Quarterly, vol. 25, no. 2 (June 1981), p. 321.

12 “Teller Reveals His Secrets,” Smithsonian, March 2012. Also at http://www.smithsonianmag.com/arts-culture/Teller-Reveals-His-Secrets.html.

13 Jeffrey R. Cooper, Curing Analytic Pathologies: Pathways to Improved Intelligence Analysis (Washington, DC: CIA Center for the Study of Intelligence, 2005), p. 33. See also Senate Select Committee on Intelligence, S. Report 108-301, Report on the U.S. Intelligence Community’s Prewar Intelligence Assessments on Iraq, S. 2386 (Washington, DC: 108th Congress, July 2004), conclusion 4, pp. 22-23. Available at: http://www.intelligence.senate.gov/pub108thcongress.html.

14 By initiating an attack, Sadat not only hoped to reestablish his credibility, but he calculated (correctly) that the superpowers would intervene and force Israel into a settlement.

15 On a similar note, see the research of psychologist and magician Gustav Kuhn and his colleagues on misdirection and false memories in magic in Gustav Kuhn, Alym A. Amlani, and Ronald A. Rensink, “Towards a Science of Magic,” Trends in Cognitive Sciences, vol. 12, no. 9 (September 2008), pp. 349-354.

16 Avi Shlaim, “Failures in National Intelligence Estimates: The Case of the Yom Kippur War,” World Politics, vol. 28, no. 3 (April 1976), p. 356.

17 Domingo Amuchastegui, “Cuban Intelligence and the October Crisis,” in James Blight and David Welch, Intelligence and the Cuban Missile Crisis (London/Portland, OR: Frank Cass Publishers, 1998), p. 101. Amuchastegui is a former official in Cuban intelligence. Two of the authors in this volume, Aleksandr Fursenko and Timothy Naftali, claim that Khrushchev did not inform the KGB of the missile deployment, which contradicts the account of their deception scheme. Whether there was a deception scheme or not, the U.S. intelligence community’s dismissal of early reports of Soviet missiles in Cuba led them to prune this hypothesis prematurely.

18 Robert Jervis, Perception and Misperception in International Politics (Princeton, NJ: Princeton University Press, 1976), p. 412.

19 Special National Intelligence Estimate 85-3-62: "The Military Buildup in Cuba."

20 Mark M. Lowenthal, “Intelligence Epistemology: Dealing with the Unbelievable,” The International Journal of Intelligence and Counterintelligence, vol. 6, no. 3 (Fall 1993), pp. 319-325.

21 Clint Watts and John E. Brennan, “Capturing the Potential of Outlier Ideas in the Intelligence Community,” Studies in Intelligence, vol. 55, no. 4 (December 2011), pp. 1-10.

22 B. Whaley and J. Busby, “Detecting Deception: Practice, Practitioners, and Theory,” in Godson, R., and J. Wirtz (eds.), Strategic Denial and Deception: The Twenty-First Century Challenge (New Brunswick, NJ: Transaction Publishers, 2002), p. 197.

23 Ibid., p. 192.

24 Audio recording of the keynote presentation given by Lt. Gen. Brent Scowcroft on May 15, 2009 at a National Research Council workshop sponsored by the Committee on Behavioral and Social Science Research to Improve Intelligence Analysis for National Security. http://www7.nationalacademies.org/bbcss/DNI_Scowcroft_Audio.mp3