All Web and email links can be clicked to visit the URL, retrieve a resource, view an online article, or send an email to the author.


Abstract

Information security often overlooks what motivates people. Everything people do is for a reason – we may not agree with the reason, or even fully realize it, but the reason exists. The fact is we have to contend with thousands of years of instinct and basic human nature. Psychology can help us understand how best to work with our users to improve security.

Quick question: How many of your users use good passwords? According to the analysis done by Imperva on the 32 million passwords1 that were exposed in a recent database intrusion at RockYou Inc., the answer is probably not enough.

Imperva discovered that about 30% of the passwords were six characters or smaller, while 60% were passwords created from a limited set of alphanumeric characters. Nearly 50% were easily guessable names, common slang words, adjacent keyboard keys, and consecutive digits. In fact, the most common password was “123456,” followed by “12345” and “123456789.” Rounding out the top five were the passwords “password” and “iloveyou.”
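These patterns are mechanical enough that a short script can flag them. The sketch below is purely illustrative – the category names and checks are mine, drawn from the statistics above, not from Imperva’s actual methodology:

```python
# Illustrative password audit based on the weaknesses described above.
# The thresholds and category names are my own, not Imperva's.
COMMON_PASSWORDS = {"123456", "12345", "123456789", "password", "iloveyou"}

def weaknesses(pw):
    """Return the list of weakness categories a password falls into."""
    found = []
    if len(pw) <= 6:
        found.append("short")                  # ~30% of the RockYou set
    if pw.islower() or pw.isdigit():
        found.append("limited character set")  # ~60% used a narrow alphabet
    if pw.lower() in COMMON_PASSWORDS:
        found.append("common password")        # the article's top five
    if pw.isdigit() and pw in "0123456789":
        found.append("consecutive digits")
    return found

print(weaknesses("123456"))
# ['short', 'limited character set', 'common password', 'consecutive digits']
print(weaknesses("Tr0ub4dor&3"))  # []
```

Even this crude classifier would have flagged roughly half of the RockYou passwords, which is the point: the users were not defeated by sophistication, they simply chose convenience.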

I propose, for the most part, these users did not consider their passwords to be risky or unsafe. And even for those users who might have had momentary qualms about their password selection, they were willing to accept the risks posed by their weak passwords. As Bruce Schneier put it, “Security is both a feeling and a reality. And they’re not the same.”2

1 Jaikumar Vijayan, “Users still make hacking easy with weak passwords,” Computer World, January 21, 2010 – http://www.computerworld.com/s/article/9147138/Users_still_make_hacking_easy_with_weak_passwords.

So what makes us feel secure? Actually a better question is: Where does convenience of use cross the line of risk we are willing to accept? The most secure computer system is one encased in five feet of concrete, powered off, disconnected, and at the bottom of the ocean, but that is not a very useful computer system (or a very convenient one).

How the brain assesses risk

Deciding where to draw the line between convenience and security is something we do constantly, whether deciding what route to take to get to the office or whether to allow our teenager to drive to the movies alone. In fact, our brains have two different systems which assess risk: the amygdala, which handles the processing of immediate risk, and the neocortex, which handles the processing of future risks.

“The brain is a beautifully engineered get-out-of-the-way machine that constantly scans the environment for things out of whose way it should right now get. That’s what brains did for several hundred million years – and then, just a few million years ago, the mammalian brain learned a new trick: to predict the timing and location of dangers before they actually happened.

2 Bruce Schneier, “The Psychology of Security,” January 21, 2008 – http://www.schneier.com/essay-155.html.

The Psychology Behind Security

By Greg Sternberg – ISSA member, Denver, USA Chapter

ISSA Journal | April 2010


Our ability to duck that which is not yet coming is one of the brain’s most stunning innovations, and we wouldn’t have dental floss or 401(k) plans without it. But this innovation is in the early stages of development. The application that allows us to respond to visible baseballs is ancient and reliable, but the add-on utility that allows us to respond to threats that loom in an unseen future is still in beta testing.” – Daniel Gilbert3

Information and sensory input is sent to both the amygdala and the neocortex. The amygdala does a quick scan of a small portion of the input and rapidly comes to a conclusion based on instinct and previous history. It gathers any emotional context which might have been previously experienced and sends the entire “data packet” to the neocortex. It also prepares the body for a response if it feels the input represented a threat.

The neocortex, on the other hand, looks everything over in a logical manner and takes a correspondingly longer time to draw a conclusion. This delay could result in the amygdala’s conclusion overriding the neocortex.4 It is this override mechanism which makes us jump at loud sounds or feel uncomfortable walking down dark alleyways.

Both the amygdala and neocortex need data before they can evaluate the situation. This data must be accurate and timely or our risk evaluation results may be flawed. Both the amygdala and the neocortex reference past events in coming to their conclusions, but it is the amygdala which references the emotional context and the neocortex which references abstract concepts like how reliable the data is or how reasonable the initial response is.5

Unfortunately we are flooded with so much information (some 3.6 zettabytes6 worth in 20087) from a host of sources such as TV, radio, Internet, newspapers, books, blogs, Twitter, movies, Facebook, other people, etc., that we have trouble remembering where we heard a particular piece of information and how reputable the source was. Fully and completely processing the information we receive is beyond the capabilities of either the amygdala or the neocortex, so we have developed mental “short-cuts,” called heuristics, to keep up.

These heuristics are our way of handling the immensely complex world we live in. In fact, we use these heuristics without even realizing it. For example, count the number of Fs in the following text:

Finished files are the result of years of scientific study combined with the experience of years.

3 Daniel Gilbert, “If Only Gay Sex Caused Global Warming,” Los Angeles Times, July 2, 2006 – http://www.commondreams.org/views06/0702-26.htm.

4 Daniel Kahneman, “A Perspective on Judgment and Choice,” American Psychologist, 2003.

5 Amos Tversky and Daniel Kahneman, “Judgment under Uncertainty: Heuristics and Biases,” Science, 1974.

6 1 zettabyte = 1 billion terabytes.

7 Doug Ramsey, “How Much Information Americans Consume,” UC San Diego News, 2008 – http://ucsdnews.ucsd.edu/newsrel/general/12-09Information.asp.

If you are like most people, you answered “3” – but there are actually six. Or to use the example in Don Norman’s essay “Being Analog”:

How many animals of each type did Moses take on the Ark?

Most people would say two but the actual answer is none. Moses did not take animals into the ark, Noah did.

This “mental adjustment” happens because our brains are wired to notice big differences like day vs. night or bears vs. blueberries since it is important we notice those – our lives might depend on it. However, if the difference is subtle (i.e., Moses and Noah are both biblical) we may miss the difference entirely.

When these heuristics and biases are accurate, they are incredibly useful. After all, it really does not matter how many Fs are in the sentence, since being close is usually good enough and much faster to determine. It also does not matter that Moses was used in the question above, since most of us mentally substitute Noah and in most situations that is the question that was really being asked.


When these heuristics and biases are inaccurate or based on faulty information, we make dangerous or invalid risk assessments such as hiding under a tree or thinking lightning does not strike the same place twice.8 While we use many heuristics and biases in our daily lives, the following ones have a direct relation to how secure we feel.

Risking gains and accepting losses

When it comes to evaluating gains or losses, people have a built-in heuristic against risking gains or accepting losses. Called prospect theory, this is best demonstrated by an experiment9 put together by Daniel Kahneman and Amos Tversky in which they gave one group the following gain-related alternatives:

• Alternative A: A sure gain of $500

• Alternative B: A 50% chance of gaining $1,000

and another group loss-related alternatives:

• Alternative C: A sure loss of $500

• Alternative D: A 50% chance of losing $1,000

The results of the experiment were that 84% chose the sure gain of A over the risky gain of B, but when faced with loss only 30% chose the sure loss of C over the risky loss of D. This translates into our users being more likely to risk a larger security loss (i.e., a break-in) than accept the certainty of a small security loss (i.e., forgetting a password) – especially when one considers that a break-in will likely have little direct effect on the user, but forgetting a password certainly will, and carries the emotional context of embarrassment.
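What makes the result striking is that each pair of alternatives has the same expected value; only the framing differs. A quick calculation (my illustration, using the figures from the experiment above) makes this explicit:

```python
# Expected value = outcome x probability, for each of the four alternatives
# in Kahneman and Tversky's experiment.
def expected_value(outcome, probability):
    return outcome * probability

alt_a = expected_value(500, 1.0)      # sure gain of $500
alt_b = expected_value(1000, 0.5)     # 50% chance of gaining $1,000
alt_c = expected_value(-500, 1.0)     # sure loss of $500
alt_d = expected_value(-1000, 0.5)    # 50% chance of losing $1,000

# Each pair is mathematically equivalent...
assert alt_a == alt_b == 500.0
assert alt_c == alt_d == -500.0
# ...yet 84% preferred the sure gain A, while only 30% chose the sure loss C.
```

In other words, the preference reversal cannot be explained by the math; it is the feeling of a certain loss that people avoid.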

“It won’t happen to me”

While this heuristic enables us to strive in the face of adversity or continue when others have failed, it is also the cause of much frustration for security personnel. The optimistic bias is best described as “...the demonstrated systematic tendency for people to be over-optimistic about the outcome of planned actions. This includes over-estimating the likelihood of positive events and under-estimating the likelihood of negative events.”10 And the more control we have over an event, the more optimistic we tend to be – i.e., “I would not let it happen that way.”

Research suggests this bias is largely due to people overestimating how skilled they are relative to other people.11 The most interesting effects of this bias are that:

• individuals who had the least experience rated themselves to be far more capable than they actually were

8 wikiHow – http://www.wikihow.com/Avoid-Being-Struck-by-Lightning-When-Caught-Unawares.

9 Daniel Kahneman and Amos Tversky, “Prospect Theory: An Analysis of Decision Under Risk,” Econometrica, 1979.

10 Wikipedia – http://en.wikipedia.org/wiki/Optimism_bias.

11 David Dunning and Justin Kruger, “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments,” Journal of Personality and Social Psychology, 1999 – http://www.scirp.org/Journal/PaperDownload.aspx?paperID=883&fileName=Psych.20090100004_39584049.pdf.

• individuals who had the most experience rated themselves to be slightly less capable than they actually were

Or as one researcher put it, “The more you know the less you think you know.” This heuristic goes a long way in explaining why our most naive users feel safer than those of us who have decades of experience on our side.

The trust factor

People instinctively trust other people.12 We trust people to follow the rules of the road when we drive; we trust people to take care of our children; we trust taxi drivers to take us to our destination in a strange city. Coleman described some fundamental components of trust which have direct correlations to security:13

1. Placement of trust allows actions which are otherwise not possible – The vast majority of people turn their computers on with the expectation their computers will “just work” and have very little idea or care how they work.

2. If the person being trusted is trustworthy, then the person doing the trusting is better off; conversely if the person being trusted is untrustworthy, then the person doing the trusting is worse off – Our users trust the sites they visit and the programs they run. When they work, we are better off; but when they do something unexpected or undesired, then we are worse off – especially if our identity is compromised.

3. Trust is an action involving the voluntary placement of resources at the disposal of the person being trusted with no real commitment from the trustee – As we know, allowing webpages to install programs requires a significant amount of trust not only of the webpage but in the software being installed, the connection between computers, the writers of the webpage, etc. Our users, however, only see a popup that they click on so it will go away.

Small change blindness

As long as the changes in our environment occur slowly, we adapt and are unlikely to detect the change. Eventually, if the change is cumulative, like the number of minutes the sun is up from day to day, we will notice, but it may be weeks or months before the accumulation of changes is great enough.14
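The cumulative effect described above can be sketched numerically. This toy detector is my own illustration (not from the cited research): no single day's change is anywhere near a noticing threshold, yet the running total crosses it eventually:

```python
# Toy model of "small change blindness": tiny daily changes accumulate
# until their running total finally exceeds a noticing threshold.
def first_noticeable_day(daily_change, threshold):
    """Return the day on which accumulated change first reaches the threshold."""
    total, day = 0.0, 0
    while abs(total) < threshold:
        total += daily_change
        day += 1
    return day

# Day length shifts by roughly a minute per day near the equinoxes;
# a 30-minute difference takes about a month to become obvious.
print(first_noticeable_day(1.0, 30.0))  # 30
```

The same logic is one reason slow data exfiltration or gradually escalating privileges so often go unnoticed: no single day’s delta trips an alarm.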

Sitting in front of a computer we are blissfully unaware of what is happening “behind the curtains.” We notice when someone breaks into our house because the damage and/or missing property is visible. We are unlikely to notice when someone breaks into our computer because the damage is invisible15 (unless you know exactly how and where to look) and nothing is missing – our data is still there. In fact, unless something noticeable happens, usually outside the realm of the computer, like money being siphoned out of our bank account, our system could remain infected for years.16

12 Niklas Luhmann, “Trust: A Mechanism For the Reduction of Social Complexity,” Trust and Power: Two Works by Niklas Luhmann, New York: John Wiley & Sons, 1979.

13 James Coleman, “Foundations of Social Theory,” Belknap Press, Cambridge, MA, 1990.

14 Ronald Rensink, J. Kevin O’Regan, and James Clark, “To See or Not To See,” Psychological Science, 1997 – http://www.psych.ubc.ca/~rensink/publications/download/PsychSci97-RR.pdf.

What this means

In the general IT industry there is the misconception that end users are dumb and all the training in the world will not help them. This mindset is prevalent even in the information security field – hence the decals and T-shirts which say “Social Engineering Specialist: Because There Are No Patches For Human Stupidity.” If, however, we view users’ decisions (right or wrong) in the light of psychology, we begin to realize why they made those decisions, and even better, how we might effect a change in those decisions.

We know there is no magic potion to keep users from clicking on unknown email attachments or smart pills for recognizing phishing scams, but it is our job to develop the tools of awareness, training, and education17 which our users need to make our jobs easier.

Awareness

To some, security is the use of a username and password and a vague impression of a group that makes it difficult to get things done. We need to raise the awareness of both the board and our users by evangelizing security. This evangelism needs to be in the spirit of “security can help you” and not in the spirit of “repent, sinner” for it to be successful. From NIST Special Publication 800-16:

“Awareness is not training. Security awareness is a blended solution of activities that promote security, establish accountability, and inform the workforce of security news. Awareness seeks to focus an individual’s attention on an issue or a set of issues. The purpose of awareness presentations is simply to focus attention on security. Awareness presentations are intended to allow individuals to recognize information security concerns and respond accordingly.”18

NIST goes even further, suggesting some activities to raise the awareness of security:

• Host an information security day

• Conduct briefings on current issues, like the dangers of social networking

• Distribute promotional items with motivational slogans (think coffee mugs, mouse pads)

15 Jeffrey Carr, “Under attack from invisible enemies,” The Independent, 2010.

16 Kelly Higgins, “The World’s Biggest Botnets,” DarkReading, 2007 – http://www.darkreading.com/security/management/showArticle.jhtml?articleID=208808174.

17 I don’t know what it is about triads in security but there certainly are a lot of them.

18 NIST Special Publication 800-16, “Information Security Training Requirements: A Role- and Performance-Based Model,” Section 2.2.1.

• Provide login banners serving as security reminders

• Show awareness videos (Computer Security Awareness Poster & Video Contest 2009)

• Distribute posters and flyers

Whatever approach is chosen, it is important to make it a regular event and to be repetitive not in form but in message. A bored audience will remember little other than boredom; an interested and engaged audience will remember the message.

Training

It is important for our users to develop the skills necessary to be secure. In most companies security has to place a particular level of trust in its users (i.e., security cannot be everywhere at once). You have to trust users to adhere to security policies, use secure coding practices, take appropriate precautions when around secret or classified hardware and software, etc.

This requires an understanding of how secure the systems need to be, what materials and processes will be needed to bring the users up to the appropriate level of “secure,” and what materials and processes will be needed to keep them there. Processes within security need to be implemented to ensure this training is appropriate, accurate, frequent, and up-to-date. Training which is not relevant or empowering to the user will be ignored at best and subverted at worst.

Education

Chris Christensen describes ten essential principles which can be used to teach adults in “How Do People Learn?”19 Four of them are discussed below.

Education is only one part of a behavior change program

For education to be effective it must be reiterated on a regular basis and reinforced by other things such as an awareness program, visual reminders, or even inclusion in quarterly performance goals.

Apply learning immediately

Learning is highly dependent on reinforcement. Reinforcement can come in the form of repetition or practice. For example, we remember that two plus two equals four because we have heard it so many times. Reinforcement can also occur via an emotional context. For example, people remember where they were when they heard about the World Trade Center because of the highly emotional context surrounding that event.

Education must improve the business

Sometimes in security we fail to understand the risks our users face. We see our users not following proper security procedures and assume the user does not understand the risks. To a user the biggest risk is not a security breach; rather it is not getting his job done. Almost all users, when faced with a choice between not getting their job done and not following a security policy, will opt to not follow the security policy, especially if the penalties are not very severe. However, if the business sees security as a vital component, then users will be far more inclined to follow the security policy since it is a natural component of their jobs.

19 Chris Christensen, “How Do People Learn?” 2008 – http://www.camcinc.com/library/HowDoPeopleLearn.pdf.

Education must entertain

As previously mentioned, emotional context is a key to learning. While either positive or negative reinforcement can be used, researchers have found positive reinforcement results in lasting behavioral modification, whereas punishment only temporarily changes behavior.20

Conclusion

We all have commiserated with our colleagues about the seemingly unbelievable risks our users have taken and still take. We express disbelief when people hold open secured doors for their colleagues (or total strangers) and circumvent the million-dollar card-access system we fought so hard to have installed. Feeling frustrated by the very people we are trying to protect, we begin to rely on expensive technology, impossible-to-enforce policies, or over-zealous watch-dogging.

We need to step back and consider the why behind what people do or do not do. Everything people do is for a reason. We may not agree with the reason, or even fully realize it, but the reason exists. We need to understand that while our policies and procedures are intended to keep the company safe, they may be, and likely are, viewed as impediments by the very people we are trying to protect.21 The fact is we have to contend with thousands of years of instinct and basic human nature.

As technologists, we often overlook what motivates people; we focus only on the symptoms. The research into human psychology and motivation has much to teach us about our users and their reactions. Improving security involves changing beliefs, attitudes, and behavior, both of individuals and of groups. Psychology can help us understand how best to work with our users to achieve the goal of improving security:

• Understanding how people think helps us understand how to craft training

• Knowing why people think the way they do helps us overcome improper ingrained (or instinctual) reactions

• Looking from the user’s point of view and being aware of what motivates him shows us how to persuade and change attitudes

• Understanding how groups think helps us enhance our security procedures by protecting users from social pressures which might encourage risky behavior

In short, we in security must understand why our users make the decisions they do and how we need to influence our users so they “buy into security.” To paraphrase a cliché: If you make a user secure, you are safe for today. If you convince a user to be secure, you are safe for a lifetime.

About the Author

My name is Greg Sternberg, and I have worked in nearly every aspect of software engineering over almost three decades. My involvement in security started on the wrong side very early in my career, but I soon “moved into the light.” Since then I have worked on integrating security into the development life cycle, secure coding practices, compliance, and security architecture. I currently hold a CISSP and am TOGAF-certified. I can be reached at [email protected].

20 B. F. Skinner, Walden Two, Macmillan, Toronto, 1970.

21 Larry Greenemeier, “The Threat Within: Employees Pose The Biggest Security Risk,” Information Week, 2007 – http://www.informationweek.com/news/security/showArticle.jhtml?articleID=201001449.
