Cognitive Biases, Fallacies, and Public Relations


List of cognitive biases

Cognitive biases are tendencies to think in certain ways. Cognitive biases can lead to

systematic deviations from a standard of rationality or good judgment, and are often studied

in psychology and behavioral economics.

Although the reality of these biases is confirmed by replicable research, there are often

controversies about how to classify these biases or how to explain them.[1] Some are effects of

information-processing rules (i.e. mental shortcuts), called heuristics, that the brain uses to

produce decisions or judgments. Such effects are called cognitive biases.[2][3] Biases in judgment

or decision-making can also result from motivation, such as when beliefs are distorted

by wishful thinking. Some biases have a variety of cognitive ("cold") or motivational ("hot")

explanations. Both effects can be present at the same time.[4][5]

There are also controversies as to whether some of these biases count as truly irrational or

whether they result in useful attitudes or behavior. For example, when getting to know others,

people tend to ask leading questions which seem biased towards confirming their assumptions

about the person. This kind of confirmation bias has been argued to be an example of social

skill: a way to establish a connection with the other person.[6]

The research on these biases overwhelmingly involves human subjects. However, some of the

findings have appeared in non-human animals as well. For example, hyperbolic discounting has

also been observed in rats, pigeons, and monkeys.[7]

Decision-making, belief, and behavioral biases

Many of these biases affect belief formation, business and economic decisions, and human

behavior in general. They arise as replicable results under specific conditions: when confronted

with a particular situation, the deviation from what is normally expected can be characterized by:

Name Description

Ambiguity effect

The tendency to avoid options for which missing information

makes the probability seem "unknown."[8]

Anchoring or focalism The tendency to rely too heavily, or "anchor," on one trait or

piece of information when making decisions.[9][10]


Attentional bias

The tendency of our perception to be affected by our recurring

thoughts. [11]

Availability heuristic

The tendency to overestimate the likelihood of events with

greater "availability" in memory, which can be influenced by

how recent the memories are or how unusual or emotionally

charged they may be.[12]

Availability cascade

A self-reinforcing process in which a collective belief gains

more and more plausibility through its increasing repetition in

public discourse (or "repeat something long enough and it will

become true").[13]

Backfire effect

When people react to disconfirming evidence by strengthening

their beliefs.[14]

Bandwagon effect

The tendency to do (or believe) things because many other

people do (or believe) the same. Related

to groupthink and herd behavior.[15]

Base rate fallacy or base rate neglect

The tendency to ignore base rate information (generic, general

information) and focus on specific information (information

only pertaining to a certain case).[16]
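A quick Bayes calculation makes the pull of base rate neglect concrete. The sensitivity, false-positive rate, and prevalence below are hypothetical figures chosen for illustration:

```python
# Hypothetical screening test: 99% sensitive, 5% false-positive rate,
# for a condition with a 1% base rate.
base_rate = 0.01        # P(condition)
sensitivity = 0.99      # P(positive | condition)
false_positive = 0.05   # P(positive | no condition)

# P(positive) by the law of total probability.
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)

# Bayes' theorem: P(condition | positive).
p_condition_given_positive = sensitivity * base_rate / p_positive

print(round(p_condition_given_positive, 3))  # → 0.167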

Belief bias

An effect where someone's evaluation of the logical strength

of an argument is biased by the believability of the

conclusion.[17]

Bias blind spot

The tendency to see oneself as less biased than other people,

or to be able to identify more cognitive biases in others than in

oneself.[18]


Cheerleader effect The tendency for people to appear more attractive in a group

than in isolation.[19]

Choice-supportive bias

The tendency to remember one's choices as better than they

actually were.[20]

Clustering illusion

The tendency to over-expect small runs, streaks, or clusters in

large samples of random data (that is, seeing phantom

patterns).[10]
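A short simulation shows why streaks in random data invite phantom patterns; the flip count and seed are arbitrary:

```python
import random

random.seed(0)  # arbitrary seed, for reproducibility
flips = [random.choice("HT") for _ in range(1000)]

# Find the longest run of identical outcomes in purely random flips.
longest = current = 1
for prev, cur in zip(flips, flips[1:]):
    current = current + 1 if cur == prev else 1
    longest = max(longest, current)

print(longest)  # long streaks routinely appear despite pure randomness
```

Runs of eight or more identical flips are typical in a thousand fair flips, yet observers tend to read such streaks as meaningful.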

Confirmation bias

The tendency to search for, interpret, focus on and remember

information in a way that confirms one's preconceptions.[21]

Congruence bias

The tendency to test hypotheses exclusively through direct

testing, instead of testing possible alternative hypotheses.[10]

Conjunction fallacy

The tendency to assume that specific conditions are more

probable than general ones.[22]
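The underlying probability rule is easy to verify numerically. The figures below are hypothetical, in the spirit of the classic "Linda" problem:

```python
# Hypothetical probabilities for the classic "Linda" setup.
p_bank_teller = 0.05            # P(Linda is a bank teller)
p_feminist_given_teller = 0.60  # assumed P(feminist | bank teller)

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_teller_and_feminist = p_bank_teller * p_feminist_given_teller

# A conjunction can never be more probable than either conjunct alone.
print(p_teller_and_feminist <= p_bank_teller)  # → True
```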

Conservatism or regressive bias

A certain state of mind wherein high values and high

likelihoods are overestimated while low values and low

likelihoods are underestimated.[23][24][25][unreliable source?]

Conservatism (Bayesian)

The tendency to insufficiently revise one's belief when

presented with new evidence.[23][26][27]
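A sketch of a proper Bayesian update shows how large the warranted revision can be; the urn setup and numbers are hypothetical:

```python
from math import comb

# An urn is either 70% red (H) or 30% red (not H); we draw 8 red in 12 draws.
prior = 0.5
p_red_h, p_red_not_h = 0.7, 0.3
reds, draws = 8, 12

# Binomial likelihood of the observed draws under each hypothesis.
like_h = comb(draws, reds) * p_red_h**reds * (1 - p_red_h)**(draws - reds)
like_not_h = comb(draws, reds) * p_red_not_h**reds * (1 - p_red_not_h)**(draws - reds)

# Bayes' theorem: the evidence warrants a large revision away from the prior.
posterior = like_h * prior / (like_h * prior + like_not_h * prior)
print(round(posterior, 3))  # → 0.967
```

The evidence moves the rational probability from 0.5 to roughly 0.97; conservative updaters report values much closer to the original 0.5.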

Contrast effect

The enhancement or reduction of a certain perception's stimuli

when compared with a recently observed, contrasting

object.[28]


Curse of knowledge

When better-informed people find it extremely difficult to think

about problems from the perspective of lesser-informed

people.[29]

Decoy effect

Preferences between options A and B change in favor of option

B when option C is presented, which is similar to option B but

in no way better.

Denomination effect

The tendency to spend more money when it is denominated in

small amounts (e.g. coins) rather than large amounts (e.g.

bills).[30]

Distinction bias

The tendency to view two options as more dissimilar when

evaluating them simultaneously than when evaluating them

separately.[31]

Duration neglect

The neglect of the duration of an episode in determining its

value

Empathy gap

The tendency to underestimate the influence or strength of

feelings, in either oneself or others.

Endowment effect

The fact that people often demand much more to give up an

object than they would be willing to pay to acquire it.[32]

Essentialism Categorizing people and things according to their essential

nature, in spite of variations.[dubious – discuss][33]

Exaggerated expectation

Based on the estimates, real-world evidence turns out to be

less extreme than our expectations (conditionally inverse of

the conservatism bias).[unreliable source?][23][34]


Experimenter's or expectation bias

The tendency for experimenters to believe, certify, and publish

data that agree with their expectations for the outcome of an

experiment, and to disbelieve, discard, or downgrade the

corresponding weightings for data that appear to conflict with

those expectations.[35]

Functional fixedness

Limits a person to using an object only in the way it is

traditionally used.

Focusing effect

The tendency to place too much importance on one aspect of

an event.[36]

Forer effect or Barnum effect

The observation that individuals will give high accuracy ratings

to descriptions of their personality that supposedly are tailored

specifically for them, but are in fact vague and general enough

to apply to a wide range of people. This effect can provide a

partial explanation for the widespread acceptance of some

beliefs and practices, such as astrology, fortune telling,

graphology, and some types of personality tests.

Framing effect

Drawing different conclusions from the same information,

depending on how or by whom that information is presented.

Frequency illusion

The illusion in which a word, a name or other thing that has

recently come to one's attention suddenly seems to appear

with improbable frequency shortly afterwards (see

also recency illusion).[37]

Gambler's fallacy

The tendency to think that future probabilities are altered by

past events, when in reality they are unchanged. Results from

an erroneous conceptualization of the law of large numbers.

For example, "I've flipped heads with this coin five times

consecutively, so the chance of tails coming out on the sixth


flip is much greater than heads."
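A simulation confirms that a fair coin has no memory; the flip count and seed are arbitrary:

```python
import random

random.seed(1)  # arbitrary seed, for reproducibility
flips = [random.random() < 0.5 for _ in range(200_000)]  # True = heads

# After every run of five consecutive heads, record the very next flip.
next_after_streak = [flips[i + 5] for i in range(len(flips) - 5)
                     if all(flips[i:i + 5])]

rate = sum(next_after_streak) / len(next_after_streak)
print(round(rate, 2))  # stays near 0.5: past flips do not alter the next one
```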

Hard-easy effect

Based on a specific level of task difficulty, the confidence in

judgments is too conservative and not extreme

enough[23][38][39][40]

Hindsight bias

Sometimes called the "I-knew-it-all-along" effect, the tendency

to see past events as being predictable[41] at the time those

events happened.

Hostile media effect

The tendency to see a media report as being biased, owing to

one's own strong partisan views.

Hot-hand fallacy

The "hot-hand fallacy" (also known as the "hot hand

phenomenon" or "hot hand") is the fallacious belief that a

person who has experienced success has a greater chance of

further success in additional attempts.

Hyperbolic discounting

The tendency for people to have a stronger preference for

more immediate payoffs relative to later payoffs, where the

tendency increases the closer to the present both payoffs

are.[42] Also known as current moment bias, present-bias, and

related to Dynamic inconsistency.
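The standard hyperbolic discount function, V = A / (1 + kD), reproduces the characteristic preference reversal; the discount rate k and the dollar amounts are assumed for illustration:

```python
# Hyperbolic discounting: V = A / (1 + k * delay); k is an assumed rate.
k = 0.05  # per day

def value(amount, delay_days):
    return amount / (1 + k * delay_days)

# Near choice: $50 today vs. $100 in 30 days.
prefers_sooner_now = value(50, 0) > value(100, 30)
# Same pair pushed a year out: $50 in 365 days vs. $100 in 395 days.
prefers_sooner_later = value(50, 365) > value(100, 395)

print(prefers_sooner_now, prefers_sooner_later)  # → True False
```

The smaller-sooner reward wins when it is immediate but loses when both payoffs are pushed a year out, even though the pair is otherwise identical; this is the dynamic inconsistency the entry refers to.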

Identifiable victim effect

The tendency to respond more strongly to a single identified

person at risk than to a large group of people at risk.[43]

IKEA effect

The tendency for people to place a disproportionately high

value on objects that they partially assembled themselves,

such as furniture from IKEA, regardless of the quality of the

end result.


Illusion of control

The tendency to overestimate one's degree of influence over

other external events.[44]

Illusion of validity

Belief that further acquired information generates additional

relevant data for predictions, even when it evidently does

not.[45]

Illusory correlation

Inaccurately perceiving a relationship between two unrelated

events.[46][47]

Impact bias

The tendency to overestimate the length or the intensity of the

impact of future feeling states.[48]

Information bias

The tendency to seek information even when it cannot affect

action.[49]

Insensitivity to sample size The tendency to under-expect variation in small samples.
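A simulation in the spirit of the classic hospital problem makes the variation visible; the birth counts, threshold, and seed are illustrative:

```python
import random

random.seed(2)  # arbitrary seed, for reproducibility

def frac_days_over_60_percent_boys(n_births, days=10_000):
    """Fraction of simulated days on which more than 60% of births are boys."""
    over = 0
    for _ in range(days):
        boys = sum(random.random() < 0.5 for _ in range(n_births))
        if boys / n_births > 0.6:
            over += 1
    return over / days

small = frac_days_over_60_percent_boys(15)   # small hospital
large = frac_days_over_60_percent_boys(100)  # large hospital

print(round(small, 2), round(large, 2))  # small hospital deviates far more often
```

Small samples swing past the 60% mark many times more often than large ones, yet people tend to judge both hospitals equally likely to record such days.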

Irrational escalation

The phenomenon where people justify increased investment

in a decision, based on the cumulative prior investment,

despite new evidence suggesting that the decision was

probably wrong.

Just-world hypothesis

The tendency for people to want to believe that the world is

fundamentally just, causing them to rationalize an otherwise

inexplicable injustice as deserved by the victim(s).

Less-is-better effect

The tendency to prefer a smaller set to a larger set judged

separately, but not jointly


Loss aversion

"the disutility of giving up an object is greater than the utility

associated with acquiring it".[50] (see also Sunk cost

effects and endowment effect).

Ludic fallacy The misuse of games to model real-life situations.

Mere exposure effect

The tendency to express undue liking for things merely

because of familiarity with them.[51]

Money illusion

The tendency to concentrate on the nominal (face value) of

money rather than its value in terms of purchasing power.[52]

Moral credential effect

The tendency of a track record of non-prejudice to increase

subsequent prejudice.

Negativity effect

The tendency of people, when evaluating the causes of the

behaviors of a person they dislike, to attribute their positive

behaviors to the environment and their negative behaviors to

the person's inherent nature; also, the tendency to weight

negative information more heavily in descriptions of others.

Negativity bias

Psychological phenomenon by which humans have a

greater recall of unpleasant memories compared with positive

memories.[53]

Neglect of probability

The tendency to completely disregard probability when making

a decision under uncertainty.[54]

Normalcy bias

The refusal to plan for, or react to, a disaster which has never

happened before.


Observation selection bias

The effect of suddenly noticing things that were not noticed

previously – and as a result wrongly assuming that the

frequency has increased.

Observer-expectancy effect

When a researcher expects a given result and therefore

unconsciously manipulates an experiment or misinterprets

data in order to find it (see also subject-expectancy effect).

Omission bias

The tendency to judge harmful actions as worse, or less

moral, than equally harmful omissions (inactions).[55]

Optimism bias

The tendency to be over-optimistic, overestimating favorable

and pleasing outcomes (see also wishful thinking, valence

effect, positive outcome bias).[56][57]

Ostrich effect Ignoring an obvious (negative) situation.

Outcome bias

The tendency to judge a decision by its eventual outcome

instead of based on the quality of the decision at the time it

was made.

Overconfidence effect

Excessive confidence in one's own answers to questions. For

example, for certain types of questions, answers that people

rate as "99% certain" turn out to be wrong 40% of the

time.[23][58][59][60]

Pareidolia

A vague and random stimulus (often an image or sound) is

perceived as significant, e.g., seeing images of animals or

faces in clouds, the man in the moon, and hearing non-

existent hidden messages on records played in reverse.


Pessimism bias

The tendency for some people, especially those suffering

from depression, to overestimate the likelihood of negative

things happening to them.

Planning fallacy The tendency to underestimate task-completion times.[48]

Post-purchase rationalization

The tendency to persuade oneself through rational argument

that a purchase was a good value.

Pro-innovation bias

The tendency to have an excessive optimism towards an

invention or innovation's usefulness throughout society, while

often failing to identify its limitations and weaknesses.

Pseudocertainty effect

The tendency to make risk-averse choices if the expected

outcome is positive, but make risk-seeking choices to avoid

negative outcomes.[61]

Reactance

The urge to do the opposite of what someone wants you to do

out of a need to resist a perceived attempt to constrain your

freedom of choice (see also Reverse psychology).

Reactive devaluation

Devaluing proposals only because they are purportedly

originated with an adversary.

Recency illusion

The illusion that a word or language usage is a recent

innovation when it is in fact long-established (see also

frequency illusion).

Restraint bias

The tendency to overestimate one's ability to show restraint in

the face of temptation.


Rhyme as reason effect

Rhyming statements are perceived as more truthful. A famous

example is the O.J. Simpson trial, in which the defense used

the phrase "If it doesn't fit, you must acquit."

Risk compensation / Peltzman

effect

The tendency to take greater risks when perceived safety

increases.

Selective perception The tendency for expectations to affect perception.

Semmelweis reflex

The tendency to reject new evidence that contradicts a

paradigm.[27]

Social comparison bias

The tendency, when making hiring decisions, to favour

potential candidates who don't compete with one's own

particular strengths.[62]

Social desirability bias

The tendency to over-report socially desirable characteristics

or behaviours in oneself and under-report socially undesirable

characteristics or behaviours.[63]

Status quo bias

The tendency to like things to stay relatively the same (see

also loss aversion, endowment effect, and system

justification).[64][65]

Stereotyping

Expecting a member of a group to have certain characteristics

without having actual information about that individual.

Subadditivity effect

The tendency to judge probability of the whole to be less than

the probabilities of the parts.[66]


Subjective validation

Perception that something is true if a subject's belief demands

it to be true. Also assigns perceived connections between

coincidences.

Survivorship bias

Concentrating on the people or things that "survived" some

process and inadvertently overlooking those that didn't

because of their lack of visibility.

Time-saving bias

Underestimations of the time that could be saved (or lost)

when increasing (or decreasing) from a relatively low speed

and overestimations of the time that could be saved (or lost)

when increasing (or decreasing) from a relatively high speed.
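The arithmetic behind this bias is simple: over a fixed distance, the same speed increase saves far more time at low speeds. The 10-mile trip and the speeds below are illustrative figures:

```python
# Time for a fixed 10-mile trip at various speeds (figures are illustrative).
def minutes(distance_miles, speed_mph):
    return distance_miles / speed_mph * 60

saved_low = minutes(10, 20) - minutes(10, 30)   # speeding up 20 -> 30 mph
saved_high = minutes(10, 80) - minutes(10, 90)  # speeding up 80 -> 90 mph

# The same +10 mph saves 10 minutes at low speed, under 1 minute at high speed.
print(round(saved_low, 1), round(saved_high, 1))  # → 10.0 0.8
```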

Unit bias

The tendency to want to finish a given unit of a task or an

item. Strong effects on the consumption of food in

particular.[67]

Well travelled road effect

Underestimation of the duration taken to traverse oft-traveled

routes and overestimation of the duration taken to traverse

less familiar routes.

Zero-risk bias

Preference for reducing a small risk to zero over a greater

reduction in a larger risk.

Zero-sum heuristic

Intuitively judging a situation to be zero-sum (i.e., that gains

and losses are correlated). Derives from the zero-sum game

in game theory, where wins and losses sum to zero.[68][69] The

frequency with which this bias occurs may be related to

the social dominance orientation personality factor.

Social biases

Most of these biases are labeled as attributional biases.


Name Description

Actor-observer

bias

The tendency for explanations of other individuals' behaviors to

overemphasize the influence of their personality and underemphasize the

influence of their situation (see also Fundamental attribution error), and for

explanations of one's own behaviors to do the opposite (that is, to

overemphasize the influence of our situation and underemphasize the

influence of our own personality).

Defensive

attribution

hypothesis

Attributing more blame to a harm-doer as the outcome becomes more

severe or as personal or situational similarity to the victim increases.

Dunning–Kruger

effect

An effect in which incompetent people fail to realise they are incompetent

because they lack the skill to distinguish between competence and

incompetence. Actual competence may weaken self-confidence, as

competent individuals may falsely assume that others have an equivalent

understanding.[70]

Egocentric bias

Occurs when people claim more responsibility for themselves for the results

of a joint action than an outside observer would credit them with.

Extrinsic

incentives bias

An exception to the fundamental attribution error: viewing others as having

(situational) extrinsic motivations while attributing (dispositional) intrinsic

motivations to oneself.

False

consensus

effect

The tendency for people to overestimate the degree to which others agree

with them.[71]

Forer effect (aka Barnum effect)

The tendency to give high accuracy ratings to descriptions of their

personality that supposedly are tailored specifically for them, but are in fact

vague and general enough to apply to a wide range of people. For


example, horoscopes.

Fundamental

attribution error

The tendency for people to over-emphasize personality-based explanations

for behaviors observed in others while under-emphasizing the role and

power of situational influences on the same behavior (see also actor-

observer bias, group attribution error, positivity effect, and negativity

effect).[72]

Group

attribution error

The biased belief that the characteristics of an individual group member are

reflective of the group as a whole or the tendency to assume that group

decision outcomes reflect the preferences of group members, even when

information is available that clearly suggests otherwise.

Halo effect

The tendency for a person's positive or negative traits to "spill over" from one

personality area to another in others' perceptions of them (see also physical

attractiveness stereotype).[73]

Illusion of

asymmetric

insight

People perceive their knowledge of their peers to surpass their peers'

knowledge of them.[74]

Illusion of

external agency

When people view self-generated preferences as instead being caused by

insightful, effective and benevolent agents

Illusion of

transparency

People overestimate others' ability to know them, and they also overestimate

their ability to know others.

Illusory

superiority

Overestimating one's desirable qualities, and underestimating undesirable

qualities, relative to other people. (Also known as "Lake Wobegon effect,"

"better-than-average effect," or "superiority bias").[75]

Ingroup bias

The tendency for people to give preferential treatment to others they


perceive to be members of their own groups.

Just-world

phenomenon

The tendency for people to believe that the world is just and therefore people

"get what they deserve."

Moral luck

The tendency for people to ascribe greater or lesser moral standing based

on the outcome of an event

Naive cynicism Expecting more egocentric bias in others than in oneself

Outgroup

homogeneity

bias

Individuals see members of their own group as being relatively more varied

than members of other groups.[76]

Projection bias

The tendency to unconsciously assume that others (or one's future selves)

share one's current emotional states, thoughts and values.[77]

Self-serving

bias

The tendency to claim more responsibility for successes than failures. It may

also manifest itself as a tendency for people to evaluate ambiguous

information in a way beneficial to their interests (see also group-serving

bias).[78]

Shared

information bias

The tendency for group members to spend more time and energy

discussing information that all members are already familiar with (i.e., shared

information), and less time and energy discussing information that only some

members are aware of (i.e., unshared information).[79]

System

justification

The tendency to defend and bolster the status quo. Existing social,

economic, and political arrangements tend to be preferred, and alternatives

disparaged sometimes even at the expense of individual and collective self-

interest. (See also status quo bias.)


Trait ascription

bias

The tendency for people to view themselves as relatively variable in terms of

personality, behavior, and mood while viewing others as much more

predictable.

Ultimate

attribution error

Similar to the fundamental attribution error, in this error a person is likely to

make an internal attribution to an entire group instead of the individuals

within the group.

Worse-than-average effect

A tendency to believe ourselves to be worse than others at tasks which are

difficult[80]

Memory errors and biases

Main article: List of memory biases

In psychology and cognitive science, a memory bias is a cognitive bias that either enhances or

impairs the recall of a memory (either the chances that the memory will be recalled at all, or the

amount of time it takes for it to be recalled, or both), or that alters the content of a reported

memory. There are many types of memory bias, including:

Name Description

'Bizarreness effect' Bizarre material is better remembered than common material.

Choice-supportive bias

In a self-justifying manner retroactively ascribing one's choices

to be more informed than they were when they were made.

Change bias

After an investment of effort in producing change,

remembering one's past performance as more difficult than it

actually was[81][unreliable source?]

Childhood amnesia The retention of few memories from before the age of four.


Conservatism or regressive bias

Tendency to remember high values and high

likelihoods/probabilities/frequencies lower than they actually

were and low ones higher than they actually were. Based on

the evidence, memories are not extreme enough[24][25]

Consistency bias Incorrectly remembering one's past attitudes and behaviour as

resembling present attitudes and behaviour.[82]

Context effect

That cognition and memory are dependent on context, such

that out-of-context memories are more difficult to retrieve than

in-context memories (e.g., recall time and accuracy for a work-

related memory will be lower at home, and vice versa).

Cross-race effect

The tendency for people of one race to have difficulty

identifying members of a race other than their own.

Cryptomnesia

A form of misattribution where a memory is mistaken for

imagination, because there is no subjective experience of it

being a memory.[81]

Egocentric bias

Recalling the past in a self-serving manner, e.g., remembering

one's exam grades as being better than they were, or

remembering a caught fish as bigger than it really was.

Fading affect bias

A bias in which the emotion associated with unpleasant

memories fades more quickly than the emotion associated with

positive events.[83]

False memory

A form of misattribution where imagination is mistaken for a

memory.

Generation effect (Self-generation effect)

That self-generated information is remembered best. For

instance, people are better able to recall memories of

statements that they have generated than similar statements

generated by others.

Google effect

The tendency to forget information that can be found readily

online by using Internet search engines.

Hindsight bias

The inclination to see past events as being more predictable

than they actually were; also called the "I-knew-it-all-along"

effect.

Humor effect

That humorous items are more easily remembered than non-

humorous ones, which might be explained by the

distinctiveness of humor, the increased cognitive processing

time to understand the humor, or the emotional arousal caused

by the humor.[citation needed]

Illusion of truth effect

That people are more likely to identify as true statements those

they have previously heard (even if they cannot consciously

remember having heard them), regardless of the actual validity

of the statement. In other words, a person is more likely to

believe a familiar statement than an unfamiliar one.

Illusory correlation

Inaccurately remembering a relationship between two

events.[23][47]

Lag effect See spacing effect.

Leveling and Sharpening

Memory distortions introduced by the loss of details in a

recollection over time, often concurrent with sharpening or

selective recollection of certain details that take on

exaggerated significance in relation to the details or aspects of

the experience lost through leveling. Both biases may be

reinforced over time, and by repeated recollection or re-telling


of a memory.[84]

Levels-of-processing effect

That different methods of encoding information into memory

have different levels of effectiveness.[85]

List-length effect

A smaller percentage of items are remembered in a longer list,

but as the length of the list increases, the absolute number of

items remembered increases as well.[86][further explanation needed]

Misinformation effect

Memory becoming less accurate because of interference

from post-event information.[87]

Modality effect

That memory recall is higher for the last items of a list when

the list items were received via speech than when they were

received through writing.

Mood-congruent memory

bias

The improved recall of information congruent with one's

current mood.

Next-in-line effect

That a person in a group has diminished recall for the words of

others who spoke immediately before himself, if they take

turns speaking.[88]

Part-list cueing effect

That being shown some items from a list makes it harder to

retrieve the other items.[89]

Peak-end rule

That people seem to perceive not the sum of an experience

but the average of how it was at its peak (e.g. pleasant or

unpleasant) and how it ended.
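A minimal sketch of this rule models remembered discomfort as the mean of the worst and final moments, ignoring duration entirely; the model form and scores are simplifying assumptions:

```python
# Remembered discomfort modeled as the mean of the worst and final moments,
# ignoring duration (the model form is a simplification).
def peak_end_score(discomfort):
    return (max(discomfort) + discomfort[-1]) / 2

short_trial = [2, 8, 7]        # ends close to its painful peak
long_trial = [2, 8, 7, 4, 3]   # same peak, more total discomfort, milder ending

print(peak_end_score(short_trial), peak_end_score(long_trial))  # → 7.5 5.5
```

Despite containing strictly more discomfort, the longer trial receives the better retrospective score, consistent with the duration-neglect entry above.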

Persistence The unwanted recurrence of memories of a traumatic


event.[citation needed]

Picture superiority effect

The notion that concepts that are learned by viewing pictures

are more easily and frequently recalled than are concepts that

are learned by viewing their written word form

counterparts.[90][91][92][93][94][95]

Positivity effect

That older adults favor positive over negative information in

their memories.

Primacy effect, recency effect & serial position effect

That items near the end of a sequence are the easiest to

recall, followed by the items at the beginning of a sequence;

items in the middle are the least likely to be remembered.[96]

Processing difficulty effect That information which takes longer to read and is thought

about more (processed with more difficulty) is more easily remembered.[97]

Reminiscence bump

The recalling of more personal events from adolescence and

early adulthood than personal events from other lifetime

periods[98]

Rosy retrospection

The remembering of the past as having been better than it

really was.

Self-relevance effect That memories relating to the self are better recalled than

similar information relating to others.

Source confusion

Confusing episodic memories with other information, creating

distorted memories.[99]

Spacing effect

That information is better recalled if exposure to it is repeated


over a long span of time rather than a short one.

Stereotypical bias

Memory distorted towards stereotypes (e.g. racial or gender),

e.g. "black-sounding" names being misremembered as names

of criminals.[81][unreliable source?]

Suffix effect

Diminishment of the recency effect when a sound item that

the subject is not required to recall is appended to the

list.[100][101]

Suggestibility

A form of misattribution where ideas suggested by a

questioner are mistaken for memory.

Telescoping effect

The tendency to displace recent events backward in time and

remote events forward in time, so that recent events appear

more remote, and remote events, more recent.

Testing effect

The fact that you more easily remember information you have

read if you rewrite it instead of rereading it.[102]

Tip of the tongue phenomenon

When a subject is able to recall parts of an item, or related

information, but is frustratingly unable to recall the whole item.

This is thought to be an instance of "blocking", where multiple similar

memories are being recalled and interfere with each other.[81]

Verbatim effect

That the "gist" of what someone has said is better

remembered than the verbatim wording.[103] This is because

memories are representations and not carbon copy clones.

Von Restorff effect

That an item that sticks out is more likely to be remembered

than other items[104]


Zeigarnik effect

That uncompleted or interrupted tasks are remembered better

than completed ones.

Common theoretical causes of some cognitive biases

Bounded rationality – limits on optimization and rationality

Prospect theory

Mental accounting

Adaptive bias – basing decisions on limited information and biasing them based on the

costs of being wrong.

Attribute substitution – making a complex, difficult judgment by unconsciously substituting

an easier judgment for it[105]

Attribution theory

Salience

Naïve realism

Cognitive dissonance, and related:

Impression management

Self-perception theory

Heuristics in judgment and decision making, including:

Availability heuristic – estimating what is more likely by what is more available in

memory, which is biased toward vivid, unusual, or emotionally charged examples[46]

Representativeness heuristic – judging probabilities on the basis of resemblance[46]

Affect heuristic – basing a decision on an emotional reaction rather than a calculation

of risks and benefits[106]

Some theories of emotion such as:

Two-factor theory of emotion

Somatic markers hypothesis

Introspection illusion

Misinterpretations or misuse of statistics; innumeracy.

A 2012 Psychological Bulletin article suggested that at least eight seemingly unrelated biases

can be produced by the same information-theoretic generative mechanism that assumes noisy

information processing during storage and retrieval of information in human memory.[23]

List of fallacies

A fallacy is an incorrect argument in logic and rhetoric resulting in a lack of validity, or more

generally, a lack of soundness. Fallacies are either formal fallacies or informal fallacies.

Formal fallacies

A formal fallacy is an error in logic that can be seen in the argument's form.[1] All formal fallacies

are specific types of non sequiturs.

Appeal to probability – a statement that takes something for granted because it would probably be the case (or might be the case).[2][3]

Argument from fallacy – assumes that if an argument for some conclusion is fallacious, then

the conclusion itself is false.[4]

Base rate fallacy – making a probability judgment based on conditional probabilities,

without taking into account the effect of prior probabilities.[5]
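The base rate fallacy can be made concrete with a short Bayes' rule calculation. The numbers below (a test with 99% sensitivity and a 1% false-positive rate, for a condition with a 1-in-1000 base rate) are illustrative assumptions, not figures from the source:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' rule."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Ignoring the 1-in-1000 base rate suggests a positive test means ~99%
# certainty; the true posterior is only about 9%.
p = posterior(prior=0.001, sensitivity=0.99, false_positive_rate=0.01)
print(round(p, 3))  # 0.09
```

The gap between the test's accuracy (99%) and the posterior (about 9%) is exactly the effect of the prior probability that the fallacy ignores.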

Conjunction fallacy – assumption that an outcome simultaneously satisfying multiple

conditions is more probable than an outcome satisfying a single one of them.[6]
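The inequality behind the conjunction fallacy, P(A and B) ≤ P(A), can be checked by brute force on a toy sample space. Three fair coin flips are assumed here purely for illustration:

```python
from itertools import product

outcomes = list(product("HT", repeat=3))  # 8 equally likely flip sequences

def prob(pred):
    """Probability of an event under the uniform distribution."""
    return sum(1 for o in outcomes if pred(o)) / len(outcomes)

p_a = prob(lambda o: o[0] == "H")                   # first flip heads: 0.5
p_ab = prob(lambda o: o[0] == "H" and o[1] == "H")  # first AND second: 0.25
assert p_ab <= p_a  # a conjunction is never more probable than a conjunct
```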

Masked man fallacy (illicit substitution of identicals) – the substitution of identical

designators in a true statement can lead to a false one.[7]

Propositional fallacies

A propositional fallacy is an error in logic that concerns compound propositions. For a

compound proposition to be true, the truth values of its constituent parts must satisfy the

relevant logical connectives which occur in it (most commonly: <and>, <or>, <not>, <only if>, <if

and only if>). The following fallacies involve inferences whose correctness is not guaranteed by

the behavior of those logical connectives, and hence, which are not logically guaranteed to yield

true conclusions.

Types of propositional fallacies:

Affirming a disjunct – concluding that one disjunct of a logical disjunction must be false

because the other disjunct is true; A or B; A; therefore not B.[8]

Affirming the consequent – the antecedent in an indicative conditional is claimed to be true

because the consequent is true; if A, then B; B, therefore A.[8]

Denying the antecedent – the consequent in an indicative conditional is claimed to be false

because the antecedent is false; if A, then B; not A, therefore not B.[8]
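Each of these invalid forms can be exposed mechanically: enumerate every truth assignment and look for one that makes all premises true while the conclusion is false. A minimal sketch (the helper names are ours, not standard terminology):

```python
from itertools import product

def implies(p, q):
    """Material conditional: 'if p then q'."""
    return (not p) or q

def counterexamples(premises, conclusion):
    """Assignments (A, B) where every premise holds but the conclusion fails."""
    return [(a, b) for a, b in product([True, False], repeat=2)
            if all(prem(a, b) for prem in premises) and not conclusion(a, b)]

# Affirming the consequent: if A, then B; B; therefore A.
aff = counterexamples([lambda a, b: implies(a, b), lambda a, b: b],
                      lambda a, b: a)
# Denying the antecedent: if A, then B; not A; therefore not B.
den = counterexamples([lambda a, b: implies(a, b), lambda a, b: not a],
                      lambda a, b: not b)
print(aff, den)  # both find A=False, B=True as a counterexample
```

A single counterexample is enough to show a form invalid, which is why both fallacies fail even though the premises and conclusion are related.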

Quantification fallacies

A quantification fallacy is an error in logic where the quantifiers of the premises are in

contradiction to the quantifier of the conclusion.

Types of quantification fallacies:

Existential fallacy – an argument has a universal premise and a particular conclusion.[9]

Formal syllogistic fallacies

Syllogistic fallacies – logical fallacies that occur in syllogisms.

Affirmative conclusion from a negative premise (illicit negative) – when a

categorical syllogism has a positive conclusion, but at least one negative premise.[9]

Fallacy of exclusive premises – a categorical syllogism that is invalid because both of its

premises are negative.[9]

Fallacy of four terms (quaternio terminorum) – a categorical syllogism that has four terms.[10]

Illicit major – a categorical syllogism that is invalid because its major term is

not distributed in the major premise but distributed in the conclusion.[9]

Illicit minor – a categorical syllogism that is invalid because its minor term is not distributed

in the minor premise but distributed in the conclusion.[9]

Negative conclusion from affirmative premises (illicit affirmative) – when a categorical

syllogism has a negative conclusion but affirmative premises.[9]

Fallacy of the undistributed middle – the middle term in a categorical syllogism is not

distributed.[11]

Informal fallacies

Main article: Informal fallacy

Informal fallacies – arguments that are fallacious for reasons other than structural (formal) flaws

and which usually require examination of the argument's content.[12]

Argument from ignorance (appeal to ignorance, argumentum ad ignorantiam) – assuming

that a claim is true because it has not been or cannot be proven false, or vice versa.[13]

Argument from (personal) incredulity (divine fallacy, appeal to common sense) – I cannot

imagine how this could be true, therefore it must be false.[14][15]

Argument from repetition (argumentum ad nauseam) – repeating an argument until it has been discussed so extensively that nobody cares to discuss it anymore.

Argument from silence (argumentum e silentio) – where the conclusion is based on the

absence of evidence, rather than the existence of evidence.

Argument to moderation (false compromise, middle ground, fallacy of the

mean, argumentum ad temperantiam) – assuming that the compromise between two

positions is always correct.[16]

Argumentum ad hominem – the evasion of the actual topic by directing the attack at your

opponent.

Argumentum verbosium – See Proof by verbosity, below.

Begging the question (petitio principii) – providing what is essentially the conclusion of the

argument as a premise.

(shifting the) Burden of proof (see – onus probandi) – I need not prove my claim, you must

prove it is false.

Circular reasoning (circulus in demonstrando) – when the reasoner begins with what he or

she is trying to end up with; sometimes called assuming the conclusion.

Circular cause and consequence – where the consequence of the phenomenon is claimed

to be its root cause.

Continuum fallacy (fallacy of the beard, line-drawing fallacy, sorites fallacy, fallacy of the

heap, bald man fallacy) – improperly rejecting a claim for being imprecise.[17]

Correlative-based fallacies

Correlation proves causation (cum hoc ergo propter hoc) – a faulty assumption that

correlation between two variables implies that one causes the other.[18]

Suppressed correlative – where a correlative is redefined so that one alternative is

made impossible.[19]

Equivocation – the misleading use of a term with more than one meaning (by glossing over

which meaning is intended at a particular time).[20]

Ambiguous middle term – a common ambiguity in syllogisms in which the middle term is

equivocated.[21]

Ecological fallacy – inferences about the nature of specific individuals are based solely upon

aggregate statistics collected for the group to which those individuals belong.[22]

Etymological fallacy – which reasons that the original or historical meaning of a word or

phrase is necessarily similar to its actual present-day meaning.[23]

Fallacy of composition – assuming that something true of part of a whole must also be true

of the whole.[24]

Fallacy of division – assuming that something true of a thing must also be true of all or some

of its parts.[25]

False dilemma (false dichotomy, fallacy of bifurcation, black-or-white fallacy) – two

alternative statements are held to be the only possible options, when in reality there are

more.[26]

Fallacy of many questions (complex question, fallacy of presupposition, loaded

question, plurium interrogationum) – someone asks a question that presupposes something

that has not been proven or accepted by all the people involved. This fallacy is often used

rhetorically, so that the question limits direct replies to those that serve the questioner's

agenda.

Fallacy of the single cause (causal oversimplification[27]) – it is assumed that there is one,

simple cause of an outcome when in reality it may have been caused by a number of only

jointly sufficient causes.

False attribution – an advocate appeals to an irrelevant, unqualified, unidentified, biased or

fabricated source in support of an argument.

Fallacy of quoting out of context (contextomy) – refers to the selective excerpting of

words from their original context in a way that distorts the source's intended meaning.[28]

False authority (single authority) – using an expert of dubious credentials and/or using only

one opinion to sell a product or idea. Related to the appeal to authority fallacy.

Gambler's fallacy – the incorrect belief that separate, independent events can affect the

likelihood of another random event. If a coin flip lands on heads 10 times in a row, the belief

that tails is now "due" is incorrect.[29]
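The independence claim can be verified by exhaustive enumeration rather than argument. The sketch below (a fair coin and 11 flips, assumed for illustration) conditions on a 10-heads streak and computes the probability of heads on the next flip:

```python
from itertools import product

# All 2**11 equally likely sequences of 11 fair coin flips.
sequences = list(product("HT", repeat=11))

# Condition on the first ten flips all being heads.
streak = [s for s in sequences if all(flip == "H" for flip in s[:10])]
p_next_heads = sum(1 for s in streak if s[10] == "H") / len(streak)

# The streak carries no information about the next flip.
print(p_next_heads)  # 0.5
```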

Hedging – using words with ambiguous meanings, then changing the meaning of them later.

Historian's fallacy – occurs when one assumes that decision makers of the past viewed

events from the same perspective and had the same information as those subsequently

analyzing the decision.[30] (Not to be confused with presentism, which is a mode of historical

analysis in which present-day ideas, such as moral standards, are projected into the past.)

Homunculus fallacy – where a "middle-man" is used for explanation; this sometimes leads

to regressive middle-men. Explains without actually explaining the real nature of a function

or a process. Instead, it explains the concept in terms of the concept itself, without first

defining or explaining the original concept. Explaining thought as something produced by a

little thinker, a sort of homunculus inside the head, merely explains it as another kind of

thinking (as different but the same).[31]

Inflation of conflict – The experts of a field of knowledge disagree on a certain point, so the

scholars must know nothing, and therefore the legitimacy of their entire field is called into

question.[32]

If-by-whiskey – an argument that supports both sides of an issue by using terms that are

selectively emotionally sensitive.

Incomplete comparison – in which insufficient information is provided to make a complete

comparison.

Inconsistent comparison – where different methods of comparison are used, leaving one

with a false impression of the whole comparison.

Ignoratio elenchi (irrelevant conclusion, missing the point) – an argument that may in itself

be valid, but does not address the issue in question.[33]

Kettle logic – using multiple inconsistent arguments to defend a position.

Ludic fallacy – the belief that the outcomes of non-regulated random occurrences can be

encapsulated by a statistic; a failure to take into account unknown unknowns in determining

the probability of events taking place.[34]

Mind projection fallacy – when one considers the way one sees the world as the way the

world really is.

Moral high ground fallacy – in which one assumes a "holier-than-thou" attitude in an attempt

to make oneself look good to win an argument.

Moralistic fallacy – inferring factual conclusions from purely evaluative premises in violation

of fact–value distinction. For instance, inferring is from ought is an instance of moralistic

fallacy. Moralistic fallacy is the inverse of naturalistic fallacy defined below.

Moving the goalposts (raising the bar) – argument in which evidence presented in response

to a specific claim is dismissed and some other (often greater) evidence is demanded.

Naturalistic fallacy – inferring evaluative conclusions from purely factual premises[35] in

violation of fact–value distinction. For instance, inferring ought from is (sometimes referred

to as the is-ought fallacy) is an instance of naturalistic fallacy. Also naturalistic fallacy in a

stricter sense as defined in the section "Conditional or questionable fallacies" below is an

instance of naturalistic fallacy. Naturalistic fallacy is the inverse of moralistic fallacy.

Naturalistic fallacy fallacy[36] (anti-naturalistic fallacy[37]) – inferring an impossibility to infer any

instance of ought from is from the general invalidity of is-ought fallacy mentioned above. For

instance, "is p ∨ ¬p" does imply "ought p ∨ ¬p" for any proposition p, although

the naturalistic fallacy fallacy would falsely declare such an inference invalid. Naturalistic

fallacy fallacy is an instance of argument from fallacy.

Nirvana fallacy (perfect solution fallacy) – when solutions to problems are rejected because

they are not perfect.

Onus probandi – from Latin "onus probandi incumbit ei qui dicit, non ei qui negat" – the burden of proof is on the person who makes the claim, not on the person who denies (or questions) the claim. It is a particular case of the "argumentum ad ignorantiam" fallacy; here the burden is shifted to the person defending against the assertion.

Petitio principii – see begging the question.

Post hoc ergo propter hoc – Latin for "after this, therefore because of this" (faulty cause/effect,

coincidental correlation, correlation without causation) – X happened then Y happened;

therefore X caused Y. The Loch Ness Monster has been seen in this loch. Something tipped

our boat over; it's obviously the Loch Ness Monster.[38]

Proof by verbosity (argumentum verbosium, proof by intimidation) – submission of others to

an argument too complex and verbose to reasonably deal with in all its intimate details.

(See also Gish Gallop and argument from authority.)

Prosecutor's fallacy – a low probability of false matches does not mean a low probability

of some false match being found.
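The prosecutor's fallacy is ultimately an arithmetic mistake, which a two-line calculation makes visible. The numbers below (a 1-in-a-million random-match probability searched against a database of 5 million unrelated profiles) are assumed for illustration:

```python
match_prob = 1e-6          # probability an unrelated profile matches by chance
database_size = 5_000_000  # profiles searched

expected_false_matches = match_prob * database_size
p_at_least_one = 1 - (1 - match_prob) ** database_size

# A tiny per-person match probability still makes *some* false match
# almost certain once millions of people are tested.
print(round(expected_false_matches))  # 5
print(round(p_at_least_one, 3))       # 0.993
```

The per-comparison probability (0.0001%) and the probability of finding at least one match somewhere in the database (about 99%) answer different questions; conflating them is the fallacy.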

Proving too much – using a form of argument that, if it were valid, could be used more

generally to reach an absurd conclusion.

Psychologist's fallacy – an observer presupposes the objectivity of his own perspective

when analyzing a behavioral event.

Red herring – a speaker attempts to distract an audience by deviating from the topic at hand

by introducing a separate argument which the speaker believes will be easier to speak to.[39]

Referential fallacy[40] – assuming all words refer to existing things and that the meaning of words resides within the things they refer to, as opposed to words possibly referring to no real object, or the meaning of words often coming from how we use them.

Regression fallacy – ascribes cause where none exists. The flaw is failing to account for

natural fluctuations. It is frequently a special kind of the post hoc fallacy.

Reification (hypostatization) – a fallacy of ambiguity, when an abstraction (abstract belief or

hypothetical construct) is treated as if it were a concrete, real event or physical entity. In

other words, it is the error of treating as a "real thing" something which is not a real thing,

but merely an idea.

Retrospective determinism – the argument that because some event has occurred, its

occurrence must have been inevitable beforehand.

Shotgun argumentation – the arguer offers such a large number of arguments for their

position that the opponent can't possibly respond to all of them. (See "Argument by

verbosity" and "Gish Gallop", above.)

Special pleading – where a proponent of a position attempts to cite something as an

exemption to a generally accepted rule or principle without justifying the exemption.

Wrong direction – cause and effect are reversed. The cause is said to be the effect and vice

versa.[41]

Faulty generalizations

Faulty generalizations – reach a conclusion from weak premises. Unlike fallacies of relevance,

in fallacies of defective induction, the premises are related to the conclusions yet only weakly

buttress the conclusions. A faulty generalization is thus produced.

Accident – an exception to a generalization is ignored.[42]

No true Scotsman – when a generalization is made true only when a counterexample is

ruled out on shaky grounds.[43]

Cherry picking (suppressed evidence, incomplete evidence) – act of pointing at individual

cases or data that seem to confirm a particular position, while ignoring a significant portion

of related cases or data that may contradict that position.[44]

False analogy – an argument by analogy in which the analogy is poorly suited.[45]

Hasty generalization (fallacy of insufficient statistics, fallacy of insufficient sample, fallacy of

the lonely fact, leaping to a conclusion, hasty induction, secundum quid, converse accident)

– basing a broad conclusion on a small sample.[46]

Inductive fallacy – a more general name for some fallacies, such as hasty generalization. It occurs when a conclusion is drawn from premises that only lightly support it.

Misleading vividness – involves describing an occurrence in vivid detail, even if it is an

exceptional occurrence, to convince someone that it is a problem.

Overwhelming exception – an accurate generalization that comes with qualifications which

eliminate so many cases that what remains is much less impressive than the initial

statement might have led one to assume.[47]

Pathetic fallacy – when an inanimate object is declared to have characteristics of animate

objects.[48]

Thought-terminating cliché – a commonly used phrase, sometimes passing as folk wisdom, used to quell cognitive dissonance, conceal lack of thought-entertainment, move on to other topics, etc., but in any case, end the debate with a cliché, not a point.

Red herring fallacies

A red herring fallacy is an error in logic where a proposition is, or is intended to be, misleading in

order to make irrelevant or false inferences. In the general case, it is any logical inference based on fake arguments, intended to compensate for the lack of real arguments or to implicitly replace the subject of the discussion.

Red herring – argument given in response to another argument, which is irrelevant and draws

attention away from the subject of argument. See also irrelevant conclusion.

Ad hominem – attacking the arguer instead of the argument.

Poisoning the well – a type of ad hominem where adverse information about a target is

presented with the intention of discrediting everything that the target person says.[49]

Abusive fallacy – a subtype of "ad hominem" when it turns into verbal abuse of the

opponent rather than arguing about the originally proposed argument.

Argumentum ad baculum (appeal to the stick, appeal to force, appeal to threat) – an

argument made through coercion or threats of force to support a position.[50]

Argumentum ad populum (appeal to widespread belief, bandwagon argument, appeal to the

majority, appeal to the people) – where a proposition is claimed to be true or good solely

because many people believe it to be so.[51]

Appeal to equality – where an assertion is deemed true or false based on an assumed

pretense of equality.

Association fallacy (guilt by association) – arguing that because two things share a property

they are the same.

Appeal to authority (argumentum ab auctoritate) – where an assertion is deemed true

because of the position or authority of the person asserting it.[52][53]

Appeal to accomplishment – where an assertion is deemed true or false based on the

accomplishments of the proposer.

Appeal to consequences (argumentum ad consequentiam) – the conclusion is supported by

a premise that asserts positive or negative consequences from some course of action in an

attempt to distract from the initial discussion.[54]

Appeal to emotion – where an argument is made due to the manipulation of emotions,

rather than the use of valid reasoning.[55]

Appeal to fear – a specific type of appeal to emotion where an argument is made by

increasing fear and prejudice towards the opposing side.

Appeal to flattery – a specific type of appeal to emotion where an argument is made due

to the use of flattery to gather support.[56]

Appeal to pity (argumentum ad misericordiam) – an argument attempts to induce pity to

sway opponents.[57]

Appeal to ridicule – an argument is made by presenting the opponent's argument in a

way that makes it appear ridiculous.

Appeal to spite – a specific type of appeal to emotion where an argument is made

through exploiting people's bitterness or spite towards an opposing party.

Wishful thinking – a specific type of appeal to emotion where a decision is made

according to what might be pleasing to imagine, rather than according to evidence or

reason.[58]

Appeal to motive – where a premise is dismissed by calling into question the motives of its

proposer.

Appeal to novelty (argumentum ad novitatem) – where a proposal is claimed to be

superior or better solely because it is new or modern.[59]

Appeal to poverty (argumentum ad Lazarum) – supporting a conclusion because the arguer

is poor (or refuting because the arguer is wealthy). (Opposite of appeal to wealth.)[60]

Appeal to tradition (argumentum ad antiquitam) – a conclusion supported solely because it

has long been held to be true.[61]

Appeal to nature – wherein judgment is based solely on whether the subject of judgment is

'natural' or 'unnatural'.

Appeal to wealth (argumentum ad crumenam) – supporting a conclusion because the

arguer is wealthy (or refuting because the arguer is poor).[62] (Sometimes taken together

with the appeal to poverty as a general appeal to the arguer's financial situation.)

Argument from silence (argumentum ex silentio) – a conclusion based on silence or lack of

contrary evidence.

Bulverism (Psychogenetic Fallacy) – inferring why an argument is being used, associating it

to some psychological reason, then assuming it is invalid as a result. It is wrong to assume

that if the origin of an idea comes from a biased mind, then the idea itself must also be false.[32]

Chronological snobbery – where a thesis is deemed incorrect because it was commonly

held when something else, clearly false, was also commonly held.

Fallacy of relative privation – dismissing an argument due to the existence of more

important, but unrelated, problems in the world.

Genetic fallacy – where a conclusion is suggested based solely on something or someone's

origin rather than its current meaning or context.[63]

Judgmental language – insulting or pejorative language to influence the recipient's

judgment.

Naturalistic fallacy (is–ought fallacy,[64] naturalistic fallacy[65]) – claims about what ought to

be on the basis of statements about what is.

Reductio ad Hitlerum (playing the Nazi card) – comparing an opponent or their argument to

Hitler or Nazism in an attempt to associate a position with one that is universally reviled.

(See also – Godwin's law)

Straw man – an argument based on misrepresentation of an opponent's position.[66]

Texas sharpshooter fallacy – improperly asserting a cause to explain a cluster of data.[67]

Tu quoque ("you too", appeal to hypocrisy, I'm rubber and you're glue) – the argument

states that a certain position is false or wrong and/or should be disregarded because its

proponent fails to act consistently in accordance with that position.[68]

Two wrongs make a right – occurs when it is assumed that if one wrong is committed,

another wrong will cancel it out.[69]

Conditional or questionable fallacies

Broken window fallacy – an argument which disregards lost opportunity costs (typically non-

obvious, difficult to determine or otherwise hidden) associated with destroying property of

others, or other ways of externalizing costs onto others. For example, an argument that

states breaking a window generates income for a window fitter, but disregards the fact that

the money spent on the new window cannot now be spent on new shoes.[70]

Definist fallacy – involves the confusion between two notions by defining one in terms of the

other.[71]

Naturalistic fallacy – attempts to prove a claim about ethics by appealing to a definition of

the term "good" in terms of either one or more claims about natural properties (sometimes

also taken to mean the appeal to nature) or God's will.

Slippery slope (thin edge of the wedge, camel's nose) – asserting that a relatively small first

step inevitably leads to a chain of related events culminating in some significant

impact/event that should not happen, thus the first step should not happen. While this

fallacy is a popular one, it is, in its essence, an appeal to probability fallacy. (e.g. if person x

does y then z would (probably) occur, leading to q, leading to w, leading to e.)[72] This is also

related to the Reductio ad absurdum.

List of memory biases

In psychology and cognitive science, a memory bias is a cognitive bias that either enhances or impairs

the recall of a memory (either the chances that the memory will be recalled at all, or the amount of time it

takes for it to be recalled, or both), or that alters the content of a reported memory. There are many

different types of memory biases, including:

Choice-supportive bias: remembering chosen options as having been better than rejected options

(Mather, Shafir & Johnson, 2000)

Change bias: after an investment of effort in producing change, remembering one's past

performance as more difficult than it actually was.[1]

Childhood amnesia: the retention of few memories from before the age of four.

Conservatism or Regressive Bias: tendency to remember high values and high

likelihoods/probabilities/frequencies lower than they actually were and low ones higher than they

actually were. Based on the evidence, memories are not extreme enough.[2][3]

Consistency bias: incorrectly remembering one's past attitudes and behaviour as resembling

present attitudes and behaviour.

Context effect: that cognition and memory are dependent on context, such that out-of-context

memories are more difficult to retrieve than in-context memories (e.g., recall time and accuracy for a

work-related memory will be lower at home, and vice versa).

Cross-race effect: the tendency for people of one race to have difficulty identifying members of a

race other than their own.

Cryptomnesia: a form of misattribution where a memory is mistaken for imagination, because there

is no subjective experience of it being a memory.[1]

Egocentric bias: recalling the past in a self-serving manner, e.g., remembering one's exam grades

as being better than they were, or remembering a caught fish as bigger than it really was.

Fading affect bias: a bias in which the emotion associated with unpleasant memories fades more

quickly than the emotion associated with positive events.[4]

Gender differences in eyewitness memory: the tendency for a witness to remember more details

about someone of the same gender.

Generation effect (Self-generation effect): that self-generated information is remembered best. For

instance, people are better able to recall memories of statements that they have generated than

similar statements generated by others.

Google effect: the tendency to forget information that can be easily found online.

Hindsight bias: the inclination to see past events as being predictable; also called the "I-knew-it-all-

along" effect.

Humor effect: that humorous items are more easily remembered than non-humorous ones, which

might be explained by the distinctiveness of humor, the increased cognitive processing time to

understand the humor, or the emotional arousal caused by the humor.

Illusion-of-truth effect: that people are more likely to identify as true statements those they have

previously heard (even if they cannot consciously remember having heard them), regardless of the

actual validity of the statement. In other words, a person is more likely to believe a familiar statement

than an unfamiliar one.

Illusory correlation: inaccurately seeing a relationship between two events related by coincidence.[5]

Lag effect: see spacing effect.

Leveling and Sharpening: memory distortions introduced by the loss of details in a recollection over

time, often concurrent with sharpening or selective recollection of certain details that take on

exaggerated significance in relation to the details or aspects of the experience lost through leveling.

Both biases may be reinforced over time, and by repeated recollection or re-telling of a memory.[6]

Levels-of-processing effect: that different methods of encoding information into memory have

different levels of effectiveness (Craik & Lockhart, 1972).

List-length effect: a smaller percentage of items are remembered in a longer list, but as the length

of the list increases, the absolute number of items remembered increases as well.

Misinformation effect: that misinformation affects people's reports of their own memory.

Misattribution of memory: when information is retained in memory but the source of the memory is

forgotten. One of Schacter's (1999) Seven Sins of Memory, Misattribution was divided into Source

Confusion, Cryptomnesia and False Recall/False Recognition.[1]

Memory inhibition: that being shown some items from a list makes it harder to retrieve the other

items (e.g., Slamecka, 1968).

Modality effect: that memory recall is higher for the last items of a list when the list items were

received via speech than when they were received via writing.

Mood congruent memory bias: the improved recall of information congruent with one's current

mood.

Next-in-line effect: that a person in a group has diminished recall for the words of others who spoke

immediately before or after this person.

Peak-end effect: that people seem to perceive not the sum of an experience but the average of how

it was at its peak (e.g. pleasant or unpleasant) and how it ended.

Persistence: the unwanted recurrence of memories of a traumatic event.

Picture superiority effect: that concepts are much more likely to be remembered experientially if

they are presented in picture form than if they are presented in word form.[7]

Placement bias: tendency to remember ourselves to be better than others at tasks at which we rate ourselves above average (also Illusory superiority or Better-than-average effect)[8] and tendency to remember ourselves to be worse than others at tasks at which we rate ourselves below average (also Worse-than-average effect).[9]

Positivity effect: that older adults favor positive over negative information in their memories.

Primacy effect, Recency effect & Serial position effect: that items near the end of a list are the easiest to recall, followed by the items at the beginning of a list; items in the middle are the least likely to be remembered.[10]

Processing difficulty effect: that information that takes longer to read and is thought about more (processed with more difficulty) is more easily remembered.

Reminiscence bump: the recalling of more personal events from adolescence and early adulthood

than personal events from other lifetime periods (Rubin, Wetzler & Nebes, 1986; Rubin, Rahhal &

Poon, 1998).

Rosy retrospection: the remembering of the past as having been better than it really was.

Self-reference effect: the phenomenon that memories encoded with relation to the self are better

recalled than similar information encoded otherwise.

Self-serving bias: perceiving oneself responsible for desirable outcomes but not responsible for

undesirable ones.

Source Confusion: misattributing the source of a memory, e.g. misremembering that one saw an

event personally when actually it was seen on television.

Spacing effect: that information is better recalled if exposure to it is repeated over a longer span of

time.

Stereotypical bias: memory distorted towards stereotypes (e.g. racial or gender), e.g. "black-

sounding" names being misremembered as names of criminals.[1]

Suffix effect: the weakening of the recency effect in the case that an item is appended to the list that

the subject is not required to recall (Morton, Crowder & Prussin, 1971).

Suggestibility: a form of misattribution where ideas suggested by a questioner are mistaken for

memory.

Subadditivity effect: the tendency to estimate that the likelihood of a remembered event is less than

the sum of its (more than two) mutually exclusive components.[11]

Telescoping effect: the tendency to displace recent events backward in time and remote events

forward in time, so that recent events appear more remote, and remote events, more recent.

Testing effect: that frequent testing of material that has been committed to memory improves

memory recall.

Tip of the tongue: when a subject is able to recall parts of an item, or related information, but is

frustratingly unable to recall the whole item. This is thought to be an instance of "blocking" where

multiple similar memories are being recalled and interfere with each other.[1]

Verbatim effect: that the "gist" of what someone has said is better remembered than the verbatim

wording (Poppenk, Walia, Joanisse, Danckert, & Köhler, 2006).

Von Restorff effect: that an item that sticks out is more likely to be remembered than other items

(von Restorff, 1933).

Zeigarnik effect: that uncompleted or interrupted tasks are remembered better than completed ones.


Outline of public relations

The following outline is provided as an overview of and topical guide to public relations:

Public relations – actions of an organization or individual in promoting goodwill between itself and the organization's publics. (PR rarely has the resources to focus on the "general public", and so segments populations into "target publics": specific groups of like-minded people with the ability to "help" or "hurt" the organization's efforts to sustain itself within its operating environment.) These publics include the community (e.g. neighbours), media, government (all three tiers), competitors, staff and employees, suppliers and contractors, investors, and so on.

Nature of public relations[edit]

Public relations can be described as all of the following:

Academic discipline – branch of knowledge that is taught and researched at the college or

university level. Disciplines are defined (in part), and recognized by the academic journals in

which research is published, and the learned societies and academic departments or

faculties to which their practitioners belong.

Communication – activity of conveying information.

Marketing – process which creates, communicates, and delivers value to the customer, and

maintains the relationship with customers.

Essence of public relations[edit]

To create and sustain "shared meaning" or "common understanding" - NB this may be, and usually is, different from "shared beliefs"

Propaganda: the general propagation of information for a specific purpose

Psychological warfare:

Psyops

Public relations: techniques used to influence the publics' perception of an organization

Publicity: PR techniques used to promote a specific product or brand

Spin (public relations)

Spin: both the objective of a PR campaign and the act of obtaining that objective


Airborne leaflet propaganda is a form of psychological warfare in

which leaflets (flyers) are scattered in the air. Military forces have used aircraft to drop leaflets to

attempt to alter the behavior of combatants and civilians in enemy-controlled territory,

sometimes in conjunction with air strikes. Humanitarian air missions, in cooperation with leaflet

propaganda, can turn civilians against their leadership while preparing them for the arrival of

enemy troops.

Astroturfing is the practice of masking the sponsors of a message (e.g. political,

advertising, or public relations) to give the appearance of it coming from a

disinterested, grassroots participant. Astroturfing is intended to give the statements the credibility

of an independent entity by withholding information about the source's financial connection. The

term astroturfing is a derivation of AstroTurf, a brand of synthetic carpeting designed to look like

natural grass.

On the Internet, astroturfers use software to mask their identity. Sometimes one individual

operates many personas to give the impression of widespread support for their client's

agenda.[1][2] Some studies suggest astroturfing can alter public viewpoints and create enough

doubt to inhibit action.

The term atrocity story (also referred to as atrocity tale) as defined by

the American sociologists David G. Bromley and Anson D. Shupe refers to the symbolic

presentation of action or events (real or imaginary) in such a context that they are made

flagrantly to violate the (presumably) shared premises upon which a given set of social

relationships should be conducted. The recounting of such tales is intended as a means of

reaffirming normative boundaries. By sharing the reporter's disapproval or horror, an audience

reasserts normative prescription and clearly locates the violator beyond the limits of public

morality. The term was coined in 1979 by Bromley, Shupe, and Joseph Ventimiglia.[1]

Bromley and others define an atrocity as an event that is perceived as a flagrant violation of a

fundamental value. It contains the following three elements:

1. moral outrage or indignation

2. authorization of punitive measures

3. mobilization of control efforts against the apparent perpetrators.

The veracity of the story is considered irrelevant.[2]


The bandwagon effect is a well documented form of groupthink in behavioral

science and has many applications. The general rule is that conduct or beliefs spread

among people, as fads and trends clearly do, with "the probability of any individual adopting it

increasing with the proportion who have already done so".[1] As more people come to believe in

something, others also "hop on the bandwagon" regardless of the underlying evidence.

The tendency to follow the actions or beliefs of others can occur because individuals directly

prefer to conform, or because individuals derive information from others. Both explanations

have been used for evidence of conformity in psychological experiments. For example, social

pressure has been used to explain Asch's conformity experiments,[2] and information has been

used to explain Sherif's autokinetic experiment.[3]
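The adoption rule quoted above ("the probability of any individual adopting it increasing with the proportion who have already done so") can be sketched as a toy simulation. This is a hypothetical model, not from the source: the population size, starting proportion, and one-adoption-trial-per-round scheme are all illustrative assumptions.

```python
import random

def simulate_bandwagon(population=1000, initial_adopters=50, rounds=20, seed=42):
    """Toy bandwagon model: each round, every current non-adopter adopts
    with probability equal to the current proportion of adopters."""
    random.seed(seed)
    adopters = initial_adopters
    history = [adopters]
    for _ in range(rounds):
        proportion = adopters / population
        # Each non-adopter independently "hops on" with that probability.
        converts = sum(
            1 for _ in range(population - adopters) if random.random() < proportion
        )
        adopters += converts
        history.append(adopters)
    return history

history = simulate_bandwagon()
```

Running this shows the self-reinforcing dynamic: each wave of converts raises the proportion, which raises the next wave's adoption probability, regardless of any underlying evidence.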

The Big Lie (German: Große Lüge) is a propaganda technique. The expression was

coined by Adolf Hitler, when he dictated his 1925 book Mein Kampf, about the use of a lie so

"colossal" that no one would believe that someone "could have the impudence to distort

the truth so infamously." Hitler asserted the technique was used by Jews to unfairly blame

Germany's loss in World War I on German Army officer Erich Ludendorff.

Black propaganda is false information and material that purports to be from a

source on one side of a conflict, but is actually from the opposing side. It is typically used to

vilify, embarrass or misrepresent the enemy.[1] Black propaganda contrasts with grey

propaganda, the source of which is not identified, and white propaganda, in which the real

source is declared and usually more accurate information is given, albeit slanted, distorted and

omissive. Black propaganda is covert in nature in that its aims, identity, significance, and

sources are hidden.

The major characteristic of black propaganda is that the people are not aware that someone is

trying to influence them, and do not feel that they are being pushed in a certain

direction.[2] Black propaganda purports to emanate from a source other than the true source.

This type of propaganda is associated with covert psychological operations.[3] Sometimes the

source is concealed or credited to a false authority and spreads lies, fabrications, and

deceptions. Black propaganda is the "big lie," including all types of creative deceit.[4] Black

propaganda relies on the willingness of the receiver to accept the credibility of the source. If the

creators or senders of the black propaganda message do not adequately understand their

intended audience, the message may be misunderstood, seem suspicious, or fail altogether.[4]


Governments will generally conduct black propaganda operations for two different reasons.

First, by using black propaganda a government is more likely to succeed in influencing its target audience, since the source of the information is disguised and its motivations are not apparent. Second, there are diplomatic reasons behind the use

of black propaganda. Black propaganda is necessary to obfuscate a government's involvement

in activities that may be detrimental to its foreign policies.[5]

A buzzword is a word or phrase used to impress, or one that is fashionable.

Buzzwords often originate in jargon or are neologisms.[1]

The term was first used in 1946 as student slang.[2]

Card stacking is a propaganda technique that seeks to manipulate audience

perception of an issue by emphasizing one side and repressing another.[1] Such emphasis may

be achieved through media bias or the use of one-sided testimonials, or by simply censoring the

voices of critics. The technique is commonly used in persuasive speeches by political

candidates to discredit their opponents and to make themselves seem more worthy.[2]

The term originates from the magician's gimmick of "stacking the deck", which involves

presenting a deck of cards that appears to have been randomly shuffled but which is, in fact,

'stacked' in a specific order. The magician knows the order and is able to control the outcome of

the trick. In poker, cards can be stacked so that certain hands are dealt to certain players.[3]

The phenomenon can be applied to any subject and has wide applications. Whenever a broad

spectrum of information exists, appearances can be rigged by highlighting some facts and

ignoring others. Card stacking can be a tool of advocacy groups or of those groups with specific

agendas.[4] For example, an enlistment poster might focus upon an impressive picture, with

words such as "travel" and "adventure", while placing the words "enlist for two to four years" at the bottom in a smaller and less noticeable point size.

In communication, a code word is an element of a standardized code or protocol.

Each code word is assembled in accordance with the specific rules of the code and assigned a

unique meaning. Code words are typically used for reasons of reliability, clarity, brevity, or

secrecy.


Communist propaganda is propaganda aimed to advance

the ideology of communism, communist worldview and interests of the communist movement.

The Bolshevik theoretician Nikolai Bukharin wrote in The ABC of Communism:[1]

The State propaganda of communism becomes in the long run a means for the eradication of

the last traces of bourgeois propaganda dating from the old régime; and it is a powerful

instrument for the creation of a new ideology, of new modes of thought, of a new outlook on the

world.

A corporate identity is the overall image of a corporation, firm, or business in the minds of diverse publics, such as customers, investors, and employees. It is a primary

task of the corporate communications department to maintain and build this identity to accord

with and facilitate the attainment of business objectives. It is usually visibly manifested by way

of branding and the use of trademarks.[1]

Corporate identity comes into being when there is a common ownership of an organizational

philosophy that is manifest in a distinct corporate culture — the corporate personality. At its

most profound, the public feel that they have ownership of the philosophy. Corporate identity

helps organizations to answer questions like "who are we?" and "where are we going?"

Corporate identity also allows consumers to denote their sense of belonging with particular

human aggregates or groups.[2]

In general, this amounts to a corporate title, logo (logotype and/or logogram) and supporting

devices commonly assembled within a set of guidelines. These guidelines govern how the

identity is applied, and confirm the approved colour palettes, typefaces, page layouts, and the like.

A corporation is a separate legal entity that has been incorporated either directly

through legislation or through a registration process established by law. Incorporated entities

have legal rights and liabilities that are distinct from their employees and shareholders,[1] and

may conduct business as either a profit-seeking business or not for profit business. Early

incorporated entities were established by charter (i.e. by an ad hoc act granted by a monarch or

passed by a parliament or legislature). Most jurisdictions now allow the creation of new


corporations through registration. In addition to legal personality, registered corporations tend to

have limited liability, be owned by shareholders[2][3] who can transfer their shares to others, and be controlled by a board of directors who are normally elected or appointed by the shareholders.

In American English the word corporation is widely used to describe large business

corporations.[4] In British English and in Commonwealth countries, the term company is more widely used to describe the same sort of entity, while the word corporation encompasses

all incorporated entities. In American English, the word company can include entities such

as partnerships that would not be referred to as companies in British English as they are not

a separate legal entity.

Despite not being human beings, corporations, as far as the law is concerned, are legal

persons, and have many of the same rights and responsibilities as natural people do.

Corporations can exercise human rights against real individuals and the state,[5][6] and they can

themselves be responsible for human rights violations.[7] Corporations can be "dissolved" either

by statutory operation, order of court, or voluntary action on the part of

shareholders. Insolvency may result in a form of corporate failure, when creditors force the

liquidation and dissolution of the corporation under court order,[8] but it most often results in a

restructuring of corporate holdings. Corporations can even be convicted of criminal offenses,

such as fraud and manslaughter. However, corporations are not considered living entities in the

way that humans are.[9]

A cult of personality arises when an individual uses mass media, propaganda,

or other methods, to create an idealized, heroic, and at times, god-like public image, often

through unquestioning flattery and praise. Sociologist Max Weber developed a tripartite

classification of authority; the cult of personality holds parallels with what Weber defined as

"charismatic authority". A cult of personality is similar to hero worship, except that it is

established by mass media and propaganda.

Demonization is the reinterpretation of polytheistic deities as evil, lying demons by

other religions, generally monotheistic and henotheistic ones. The term has since been expanded to refer

to any characterization of individuals, groups, or political bodies as evil.


Disinformation is intentionally false or inaccurate information that is spread

deliberately. It is an act of deception and false statements to convince someone of untruth.

Disinformation should not be confused with misinformation, information that is unintentionally

false.

Unlike traditional propaganda techniques designed to engage emotional support, disinformation

is designed to manipulate the audience at the rational level by either discrediting conflicting

information or supporting false conclusions. A common disinformation tactic is to mix some truth

and observation with false conclusions and lies, or to reveal part of the truth while presenting it

as the whole (a limited hangout).

Another technique of concealing facts, or censorship, is also used if the group can affect such

control. When channels of information cannot be completely closed, they can be rendered

useless by filling them with disinformation, effectively lowering their signal-to-noise ratio and

discrediting the opposition by association with many easily disproved false claims.

Dog-whistle politics is political messaging employing coded language that

appears to mean one thing to the general population but has an additional, different or more

specific resonance for a targeted subgroup. The phrase is only ever used as a pejorative,

because of the inherently deceptive nature of the practice and because the dog-whistle

messages are frequently themselves distasteful, for example by empathising with racist or

revolutionary attitudes. It is an analogy to dog whistles, which are built in such a way that their

high-frequency whistle is heard by dogs, but is inaudible to humans.

The term can be distinguished from "code words" used by hospital staff or other specialist

workers, in that dog-whistling is specific to the political realm, and the messaging referred to as

the dog-whistle has an understandable meaning for a general audience, rather than being

incomprehensible.

Doublespeak is language that deliberately disguises, distorts, or reverses

the meaning of words. Doublespeak may take the form of euphemisms (e.g., "downsizing"

for layoffs, "servicing the target" for bombing[1]), in which case it is primarily meant to make the

truth sound more palatable. It may also refer to intentional ambiguity in language or to actual

inversions of meaning (for example, naming a state of war "peace"). In such cases,

doublespeak disguises the nature of the truth. Doublespeak is most closely associated with

political language.[2][3]


Enterperience: fusing entertainment and experience together

A euphemism is a generally innocuous word or expression used in place of one that

may be found offensive or suggest something unpleasant.[1] Some euphemisms are intended to

amuse; while others use bland, inoffensive, and often misleading terms for things the user

wishes to dissimulate or downplay. Euphemisms are used for dissimulation, to refer to taboo

topics (such as disability, sex, excretion, and death) in a polite way, and to

mask profanity.[2] The opposite of euphemism roughly equates to dysphemism.

Euphemisms may be used to avoid words considered rude, while still conveying their meaning;

words may be replaced by similar-sounding words, gentler words, or placeholders. Some

euphemisms have become accepted in certain societies for uncomfortable information; for

example, in the United States, a doctor is likely to say "the patient passed away" rather than

"the patient died". They can be used to downplay or conceal unpalatable facts, such as

"collateral damage" for "civilian casualties" in a military context, or "redacted" for "censored".

A factoid is a questionable or spurious (unverified, false, or fabricated) statement

presented as a fact, but without supporting evidence. The word can also be used to describe a

particularly insignificant or novel fact, in the absence of much relevant context.[1] The word is

defined by the Compact Oxford English Dictionary as "an item of unreliable information that is

repeated so often that it becomes accepted as fact".[2]

Factoid was coined by Norman Mailer in his 1973 biography of Marilyn Monroe. Mailer

described a factoid as "facts which have no existence before appearing in a magazine or

newspaper",[3] and created the word by combining the word fact and the ending -oid to mean

"similar but not the same". The Washington Times described Mailer's new word as referring to

"something that looks like a fact, could be a fact, but in fact is not a fact".[4]

Factoids may give rise to, or arise from, common misconceptions and urban legends.

In monetary policy of the United States, the term Fedspeak (also known

as Greenspeak) is what Alan Blinder called "a turgid dialect of English" used by Federal

Reserve Board chairmen in making intentionally wordy, vague, and ambiguous


statements.[2][3][4] The strategy, which was used most prominently by Alan Greenspan, was used

to prevent financial markets from overreacting to the chairman's remarks.[3] The coinage is an

intentional parallel to Newspeak of Nineteen Eighty-Four, a novel by George Orwell.

The deliberately confusing and carefully rehearsed cryptic language described as an

"indecipherable, Delphic dialect" is meant to "give people a sense that there's no way they could

understand economics and finance" and thus allow the Federal Reserve and government to

manage the economy with less interference from the general public.[5][6][7]

Although Ben Bernanke has stated that Fedspeak now means clear and extensive communication of the Fed's actions, on the principle that explaining more is understanding better, the original interpretation is still prominently used.[8] Bernanke's own speech has also been described as Fedspeak.[9]

It has been noted that the nuanced nature of fedspeak poses interpretation problems to

automated trading algorithms.[10]

A front organization is any entity set up by and controlled by another

organization, such as intelligence agencies, organized crime groups, banned organizations,

religious or political groups, advocacy groups, or corporations. Front organizations can act for

the parent group without the actions being attributed to the parent group.

Front organizations that appear to be independent voluntary associations or charitable

organizations are called front groups. In the business world, front organizations such as front

companies or shell corporations are used to shield the parent company from legal

liability. In international relations, a puppet state is a state which acts as a front (or surrogate) for

another state.

A glittering generality (also called glowing generality) is an emotionally

appealing phrase so closely associated with highly valued concepts and beliefs that it carries

conviction without supporting information or reason. Such highly valued concepts attract general

approval and acclaim. Their appeal is to emotions such as love of country and home, and desire

for peace, freedom, glory, and honor. They ask for approval without examination of the reason.

They are typically used by politicians and propagandists.


Homophobic propaganda (or anti-gay propaganda)

is propaganda based on negative attitudes and homophobia towards homosexual and sometimes

other non-heterosexual people. Such propaganda supports anti-gay prejudices and stereotypes,

and promotes social stigmatization and/or discrimination. The term homophobic

propaganda was used by the historian Stefan Micheler in his work Homophobic Propaganda

and the Denunciation of Same-Sex-Desiring Men under National Socialism,[1] as well as other

works treating the topic.[2]

In some countries, some forms of homophobic propaganda are considered hate speech and are

prohibited by law. In Russia, such propaganda can also be treated as illegal, because laws in

Russia explicitly prohibit hate speech against any social group (without explicitly mentioning sexual orientation), and LGBT people can be considered a distinct social group.[3]

Indoctrination is the process of inculcating ideas, attitudes, cognitive strategies or

a professional methodology (see doctrine).[1] It is often distinguished from education by the fact

that the indoctrinated person is expected not to question or critically examine the doctrine they

have learned.[2] As such the term may be used pejoratively, often in the context of

education, political opinions, theology or religious dogma. The term is closely linked

to socialization; in common discourse, indoctrination is often associated with

negative connotations, while socialization refers to cultural or educational learning.

The term Information Warfare (IW) is primarily an American concept

involving the use and management of information technology in pursuit of a competitive

advantage over an opponent. Information warfare may involve collection of tactical

information, assurance(s) that one's own information is valid, spreading

of propaganda or disinformation to demoralize or manipulate[1] the enemy and the public,

undermining the quality of opposing force information and denial of information-collection

opportunities to opposing forces. Information warfare is closely linked to psychological warfare.

The American focus tends to favor technology, and hence tends to extend into the realms of

Electronic Warfare, Cyber Warfare, Information Assurance and Computer Network Operations /

Attack / Defence.

Most of the rest of the world uses the much broader term "Information Operations" which,

although making use of technology, focuses on the more human-related aspects of information


use, including (amongst many others) social network analysis, decision analysis and the human

aspects of Command and Control.

In the United States, junk science is any scientific data, research, or analysis

considered to be spurious or fraudulent. The concept is often invoked in political and legal

contexts where facts and scientific results have a great amount of weight in making a

determination. It usually conveys a pejorative connotation that the research has been

untowardly driven by political, ideological, financial, or otherwise unscientific motives.

The concept was first invoked in relation to expert testimony in civil litigation.[citation needed] More

recently, invoking the concept has been a tactic to criticize research on the

harmful environmental or public health effects of corporate activities, and occasionally in

response to such criticism. In these contexts, junk science is counterposed to the "sound

science" or "solid science" that favors one's own point of view.[1] This dichotomy has been

particularly promoted by Steven Milloy and the Advancement of Sound Science Center. This is

somewhat different from issues around pseudoscience and controversial science.

The lesser of two evils principle (or lesser evil principle) is the idea

in politics and political science that of two bad choices, one is not as bad as the other and

should therefore be chosen over the one that is the greater threat.[citation needed]

Originally, "lesser evil" was a Cold War-era pragmatic foreign policy principle used by the United

States and, to a lesser extent, several other countries. The principle dealt with the United States

of America's attitude regarding how dictators of third-world nations ought to be handled, and

was closely related to the Kirkpatrick Doctrine of Jeane Kirkpatrick. By contrast, the lesser of

two evils principle is today most commonly used in reference to electoral politics, particularly

in Western nations, and perhaps in the United States more than anywhere else. When popular

opinion in the United States is confronted with what is often seen as two main candidates—

normally Democratic and Republican in the modern era—that are substantially

similar ideologically, politically, and/or in their economic programmes, a voter is often advised to

choose the "lesser of two evils" to avoid having the supposedly "greater evil" get into office and

wreak havoc on society.


In rhetoric, loaded language (also known as loaded terms or emotive

language) is wording that attempts to influence an audience by using appeal to

emotion or stereotypes.[1][2][3] Such wording is also known as high-inference

language or language persuasive techniques.

Loaded words and phrases have strong emotional implications and involve strongly positive or

negative reactions beyond their literal meaning. For example, the phrase tax relief refers literally

to changes that reduce the amount of tax citizens must pay. However, use of the emotive

word relief implies that all tax is an unreasonable burden to begin with. Examples of loaded

language are "You want to go to the mall, don't you?" and "Do you really want to associate with

those people?".

The appeal to emotion is often seen as being in contrast to an appeal to logic and reason.

However, emotion and reason are not necessarily always in conflict, nor is it true that an

emotion cannot be a reason for an action. Murray and Kujundzic distinguish "prima

facie reasons" from "considered reasons" when discussing this. A prima facie reason for, say,

not eating mushrooms is that one does not like mushrooms. This is an emotive reason.

However, one may still have a considered reason for not eating mushrooms: one might already obtain, from other sources, the relevant minerals and vitamins one could get from eating mushrooms. An emotion, elicited via emotive language, may form a prima facie reason for action, but further work is required before one can obtain a considered reason.[3]

Emotive arguments and loaded language are particularly persuasive because they exploit the

human weakness for acting immediately based upon an emotional response, without such

further considered judgment. Due to such potential for emotional complication, it is generally

advised to avoid loaded language in argument or speech when fairness and impartiality is one

of the goals. Anthony Weston, for example, admonishes students and writers: "In general, avoid

language whose only function is to sway the emotions".[1][3]

Marketing is the process of communicating the value of a product or service

to customers, for selling that product or service.

From a societal point of view, marketing is the link between a society’s material requirements

and its economic patterns of response. Marketing satisfies these needs and wants through

exchange processes and building long term relationships. Marketing can be looked at as an

organizational function and a set of processes for creating, delivering and communicating value

to customers, and managing customer relationships in ways that also benefit the organization

and its shareholders. Marketing is the science of choosing target markets through market


analysis and market segmentation, as well as understanding consumer buying behavior and

providing superior customer value.

There are five competing concepts under which organizations can choose to operate their

business: the production concept, the product concept, the selling concept, the marketing

concept, and the holistic marketing concept.[1] The four components of holistic marketing are

relationship marketing, internal marketing, integrated marketing, and socially responsive

marketing. The set of engagements necessary for successful marketing management includes

capturing marketing insights, connecting with customers, building strong brands, shaping the

market offerings, delivering and communicating value, creating long-term growth, and

developing marketing strategies and plans.[2]

Media bias is the bias or perceived bias of journalists and news producers within

the mass media in the selection of events and stories that are reported and how they are

covered. The term "media bias" implies a pervasive or widespread bias contravening the

standards of journalism, rather than the perspective of an individual journalist or article. The

direction and degree of media bias in various countries is widely disputed.

Practical limitations to media neutrality include the inability of journalists to report all available

stories and facts, and the requirement that selected facts be linked into a coherent

narrative.[1] Government influence, including overt and covert censorship, biases the media in

some countries, for example North Korea and Burma.[2] Market forces that result in a biased

presentation include the ownership of the news source, concentration of media ownership, the

selection of staff, the preferences of an intended audience, and pressure from advertisers.

There are a number of national and international watchdog groups that report on bias in the

media.

Media manipulation is a series of related techniques in which partisans

create an image or argument that favours their particular interests. Such tactics may include the

use of logical fallacies and propaganda techniques, and often involve the suppression of

information or points of view by crowding them out, by inducing other people or groups of

people to stop listening to certain arguments, or by simply diverting attention elsewhere.

In Propaganda: The Formation of Men's Attitudes, Jacques Ellul writes that public opinion can

only express itself through channels which are provided by the mass media of communication,

without which there could be no propaganda.[1] It is used within public

relations, propaganda, marketing, etc. While the objective for each context is quite different, the

broad techniques are often similar. As illustrated below, many of the more modern mass

media manipulation methods are types of distraction, on the assumption that the public has a

limited attention span.

A misuse of statistics occurs when a statistical argument asserts a

falsehood. In some cases, the misuse may be accidental. In others, it is purposeful and for the

gain of the perpetrator. When the statistical reason involved is false or misapplied, this

constitutes a statistical fallacy.

The false statistics trap can be quite damaging to the quest for knowledge. For example, in

medical science, correcting a falsehood may take decades and cost lives.

Misuses can be easy to fall into. Professional scientists, even mathematicians and professional

statisticians, can be fooled by quite simple methods, even if they are careful to check

everything. Scientists have been known to fool themselves with statistics due to lack of

knowledge of probability theory and lack of standardization of their tests.
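One mechanism behind such self-deception is the multiple-comparisons problem: each test run at a 5% significance level carries a 5% false-positive risk, so running many tests and reporting only the "significant" ones virtually guarantees spurious findings. A minimal sketch of the arithmetic (the function name and figures are illustrative, not drawn from the text above):

```python
# With m independent tests, each at significance level alpha, the chance
# of at least one false positive is 1 - (1 - alpha)**m, not alpha.
def familywise_error(alpha: float, m: int) -> float:
    """Probability of at least one false positive across m independent tests."""
    return 1 - (1 - alpha) ** m

print(round(familywise_error(0.05, 1), 3))       # 0.05  (a single test)
print(round(familywise_error(0.05, 20), 3))      # 0.642 (twenty tests)

# A Bonferroni correction tests each hypothesis at alpha/m instead,
# holding the familywise rate near the intended alpha.
print(round(familywise_error(0.05 / 20, 20), 3)) # 0.049
```

Under this arithmetic, a researcher who runs twenty unplanned comparisons has better-than-even odds of finding at least one "significant" result by chance alone.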

Managing the news refers to acts that are intended to influence the

presentation of information within the news media. The expression "managing the news" is often

used in a negative sense. For example, people or organizations that wish to lessen the publicity

concerning bad news may choose to release the information late on a Friday, giving journalists

less time to pursue the story. Staying "on message" is a technique intended to limit questions

and attention to a narrow scope favorable to the subject.

An example cited by the Communication, Cultural and Media Studies infobase concerns the

February 1996 Scott Report on arms sales to Iraq: in the United Kingdom, the report was given

early to certain officials.

News propaganda is a type of propaganda covertly packaged as

credible news, but without sufficient transparency concerning the news item's source and the

motivation behind its release. Transparency of the source is one parameter critical to distinguish

between news propaganda and traditional news press releases and video news releases.

As with any propaganda, news propaganda may be spread for widely different reasons

including governance, political or ideological motivations, partisan agendas, religious or

ethnic reasons, and commercial or business motivations; their purposes are not always clear. News

propaganda also can be motivated by national security reasons, especially in times of war or

domestic upheaval.

Newspeak is the fictional language in the novel Nineteen Eighty-Four, written

by George Orwell. It is a controlled language created by the totalitarian state as a tool to

limit freedom of thought and to suppress concepts that pose a threat to the regime, such as

freedom, self-expression, individuality, and peace. Any form of thought alternative to the party's construct is

classified as "thoughtcrime."

Newspeak is explained in chapters 4 and 5 of Nineteen Eighty-Four, and in an appendix to the

book. The language follows, for the most part, the same grammatical rules as English, but has a

much more limited and constantly shifting vocabulary. Any synonyms or antonyms, along with

undesirable concepts, are eradicated. The goal is for everyone to be speaking this language by

the year 2050 (the story is set in the year 1984, hence the title). In the

meantime, Oldspeak (current English) is still spoken among the Proles, the working-class citizens

of Oceania.

Orwell was inspired to invent Newspeak by the constructed language Basic English, which he

promoted from 1942 to 1944 before emphatically rejecting it in his essay "Politics and the

English Language".[1] In this paper he deplores the bad English of his day, citing dying

metaphors, pretentious diction or rhetoric, and meaningless words, all of which he claimed

encouraged unclear thought and reasoning. Towards the end of the essay, Orwell states: "I said

earlier that the decadence of our language is probably curable. Those who deny this would

argue, if they produced an argument at all, that language merely reflects existing social

conditions, and that we cannot influence its development by any direct tinkering with words or

constructions."

Newspeak's contracted forms, such as Ingsoc and Minitrue, are inspired by the Russian syllabic

abbreviations used for concepts relating to the government and society of the USSR, such

as politburo, Comintern, kolkhoz (collective farm) and Komsomol (Young Communists' League),

many of which found their way into the speech of Communists in other countries.

"Plain Folks" is a form of propaganda and is also a fallacy.[1]

A Plain Folks argument is one in which the speaker presents him or herself as an Average Joe,

a common person who can understand and empathize with a listener's concerns. The most

important part of this appeal is the speaker's portrayal of themselves as someone who has had

a similar experience to the listener, and who knows why they may be skeptical or cautious about

accepting the speaker's point of view. In this way, the speaker gives the audience a sense of

trust and comfort, believing that the speaker and the audience share common goals and that

they thus should agree with the speaker. Using an "ordinary" background, such as a park

or a building suited to the item being advertised, likewise tends to make the appeal more

relatable and attract more customers.

A propaganda film is a film that involves some form of propaganda. Propaganda

films may be packaged in numerous ways, but are most often documentary-style productions or

fictional screenplays that are produced to convince the viewer of a specific political point or

influence the opinions or behavior of the viewer, often by providing subjective content that may

be deliberately misleading.[1]

Propaganda can be defined as the ability "to produce and spread fertile messages that, once

sown, will germinate in large human cultures".[2] However, in the 20th century, a "new"

propaganda emerged, which revolved around political organizations and their need to

communicate messages that would "sway relevant groups of people in order to accommodate

their agendas".[3] First developed by the Lumière brothers in 1896, film provided a unique means

of accessing large audiences at once. Film was the first universal mass medium in that it could

simultaneously influence viewers as individuals and members of a crowd, which led to it quickly

becoming a tool for governments and non-state organizations to project a desired ideological

message.[4] As Nancy Snow stated in her book, Information War: American Propaganda, Free

Speech and Opinion Control Since 9-11, propaganda "begins where critical thinking ends."[5]

A public service announcement (PSA), or public service ad, is a

message in the public interest disseminated by the media without charge, with the objective of

raising awareness and changing public attitudes and behavior towards a social issue. In the UK,

they are generally called Public Information films.

Revolutionary propaganda means dissemination

of revolutionary ideas.

While the term propaganda bears a mostly negative connotation in modern English,

this connotation had not yet taken hold in the early 20th century, when the word was still used in a neutral sense.

"Revolutionary propaganda" is supposed to carry a positive connotation, something along the

lines of "dissemination of ideas that will help people win their freedom".

Self-propaganda is a form of propaganda and indoctrination performed by

an individual or a group on oneself.

Social marketing seeks to develop and integrate marketing concepts with other

approaches to influence behaviors that benefit individuals and communities for the greater

social good. It seeks to integrate research, best practice, theory, audience and partnership

insight, to inform the delivery of competition-sensitive and segmented social change programs

that are effective, efficient, equitable and sustainable.[1]

Although "social marketing" is sometimes seen only as using standard commercial marketing

practices to achieve non-commercial goals, this is an oversimplification. The primary aim of

social marketing is "social good", while in "commercial marketing" the aim is primarily "financial".

This does not mean that commercial marketers cannot contribute to the achievement of social

good.

Increasingly, social marketing is being described as having "two parents"—a "social parent",

including social science and social policy approaches, and a "marketing parent", including

commercial and public sector marketing approaches.[2]

In the United States, junk science is any scientific data, research, or analysis considered to

be spurious or fraudulent. The concept is often invoked in political and legal contexts where facts and

scientific results have a great amount of weight in making a determination. It usually conveys

a pejorative connotation that the research has been untowardly driven by political, ideological, financial,

or otherwise unscientific motives.

The concept was first invoked in relation to expert testimony in civil litigation.[citation needed]

More recently,

invoking the concept has been a tactic to criticize research on the harmful environmental or public

health effects of corporate activities, and occasionally in response to such criticism. In these contexts,

junk science is counterposed to the "sound science" or "solid science" that favors one's own point of

view.[1]

This dichotomy has been particularly promoted by Steven Milloy and the Advancement of Sound

Science Center. This is somewhat different from issues around pseudoscience and controversial science.

In law, the rebuttal is a form of evidence that is presented to contradict or nullify other

evidence that has been presented by an adverse party. By analogy the same term is used

in politics and public affairs to refer to the informal process by which statements, designed to

refute or negate specific arguments put forward by opponents, are deployed in the media. [1]

In law, special rules apply to rebuttal. Rebuttal evidence or rebuttal witnesses must be confined

solely to the subject matter of the evidence rebutted. New evidence on other subjects may not

be brought in rebuttal. However, rebuttal is one of the few vehicles whereby a party may

introduce surprise evidence or witnesses. The basic process is as follows: Both sides of a

controversy are obliged to declare in advance of trial what witnesses they plan to call, and what

each witness is expected to testify to. When either a plaintiff (or prosecutor) or defendant brings

direct evidence or testimony which was not anticipated, the other side may be granted a specific

opportunity to rebut it. In rebuttal, the rebutting party may generally bring witnesses and

evidence which were never declared before, so long as they serve to rebut the prior evidence.

Rhetoric is the art of discourse, an art that aims to improve the capability of writers or

speakers that attempt to inform, persuade, or motivate particular audiences in specific

situations.[1] As a subject of formal study and a productive civic practice, rhetoric has played a

central role in the Western tradition.[2] Its best known definition comes from Aristotle, who

considers it a counterpart of both logic and politics, and calls it "the faculty of observing in any

given case the available means of persuasion."[3] Rhetoric typically provides heuristics for

understanding, discovering, and developing arguments for particular situations, such as

Aristotle's three persuasive audience appeals, logos, pathos, and ethos. The five canons of

rhetoric, which trace the traditional tasks in designing a persuasive speech, were first codified in

classical Rome: invention, arrangement, style, memory, and delivery. Along

with grammar and logic (or dialectic—see Martianus Capella), rhetoric is one of the three

ancient arts of discourse.

From Ancient Greece to the late 19th century, it was a central part of Western education, filling

the need to train public speakers and writers to move audiences to action with arguments.[4] The

word is derived from the Greek ῥητορικός (rhētorikós), "oratorical",[5] from ῥήτωρ (rhḗtōr), "public

speaker",[6] related to ῥῆμα (rhêma), "that which is said or spoken, word, saying",[7] and

ultimately derived from the verb ἐρῶ (erō), "say, speak".[8]

A slogan is a memorable motto or phrase used in a political, commercial, religious, and other

context as a repetitive expression of an idea or purpose. The word slogan is derived from slogorn which

was an Anglicisation of the Scottish Gaelic sluagh-ghairm (sluagh "army", "host"

+ gairm "cry").[1]

Slogans vary from the written and the visual to the chanted and the vulgar. Their

simple rhetorical nature usually leaves little room for detail and a chanted slogan may serve more as

social expression of unified purpose than as communication to an intended audience.

Marketing slogans are often called taglines in the United States or straplines in the UK. Europeans use

the terms baselines, signatures, claims or pay-offs.[2]

"Sloganeering" is a mostly derogatory term for activity which degrades discourse to the level of slogans.

Transfer is a technique used in propaganda and advertising. Also known as association,

this is a technique of projecting positive or negative qualities (praise or blame) of a person,

entity, object, or value (an individual, group, organization, nation, patriotism, etc.) to another in

order to make the second more acceptable or to discredit it. It evokes an emotional response,

which stimulates the target to identify with recognized authorities. Often highly visual, this

technique often utilizes symbols (for example, the Swastika used in Nazi Germany, originally a

symbol for health and prosperity) superimposed over other visual images. An example of

common use of this technique in the United States is for the President to be filmed or

photographed in front of the country's flag.[1] Another technique used is celebrity endorsement.[2]

A video news release (VNR) is a video segment made to look like a news

report, but is instead created by a PR firm, advertising agency, marketing firm, corporation,

or government agency. They are provided to television newsrooms to shape public opinion,

promote commercial products and services, publicize individuals, or support other

interests. News producers may air VNRs, in whole or in part, at their discretion or incorporate

them into news reports if they contain information appropriate to a story or of interest to viewers.

Critics of VNRs have called the practice deceptive or a propaganda technique, particularly when

the segment is not identified to the viewers as a VNR. Firms producing VNRs disagree and

equate their use to a press release in video form, pointing out that editorial judgement on

the worthiness, in part or whole, of a VNR's content is still left in the hands of journalists, program

producers, or the like. The United States Federal Communications Commission is currently

investigating the practice of VNRs.

A weasel word (also, anonymous authority) is an informal

term[1] for equivocating words and phrases aimed at creating an impression that something

specific and meaningful has been said, when in fact only a vague or ambiguous claim has been

communicated.

For example, an advertisement may use a weasel phrase such as "up to 50% off on all

products". This is misleading because the audience is invited to imagine many items reduced by

the proclaimed 50%, but the words taken literally mean only that no discount will exceed 50%.

In an extreme misrepresentation, the advertiser need not reduce any prices at all, which would still

be consistent with the wording of the advertisement, since "up to 50" most literally means "any

number less than or equal to 50".
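The literal reading can be made concrete with a small sketch (the function name and figures are illustrative assumptions, not part of the claim above): a set of discounts satisfies "up to 50% off" so long as none exceeds 50%, including the degenerate case where every discount is zero.

```python
def satisfies_up_to(discounts, cap=50):
    """True if every discount (in percent) is within [0, cap]."""
    return all(0 <= d <= cap for d in discounts)

# A store that discounts nothing at all still satisfies the literal wording.
print(satisfies_up_to([0, 0, 0]))   # True
# A single 60% discount would actually violate "up to 50% off".
print(satisfies_up_to([10, 60]))    # False
```

The gap between the imagined meaning ("many items near 50% off") and the literal meaning ("no discount above 50%") is exactly what makes the phrase a weasel construction.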

In other cases, words with a particular subjective effect are chosen. For example, one person

may speak of "resistance fighters" or "freedom fighters", while another may call the same

subjects "terrorists". The underlying facts are the same, but a quite different impression is given.

The use of weasel words to avoid making an outright assertion is synonymous

with tergiversating.[2] Weasel words can imply meaning far beyond the claim actually being

made.[3] Some weasel words may also have the effect of softening the force of a potentially

loaded or otherwise controversial statement through some form of understatement, for example

using detensifiers such as "somewhat" or "in most respects".[4]

White propaganda is propaganda which truthfully states its origin.[1][2] It is the

most common type of propaganda. It generally comes from an openly identified source, and is

characterized by gentler methods of persuasion than black propaganda (which purports to come

from the opposite side to that which actually produced it) and grey propaganda (which has no

identifiable source or author). It typically uses standard public relations techniques and one-

sided presentation of an argument. Jacques Ellul, in one of the major books on the subject of

propaganda, Propaganda: The Formation of Men's Attitudes, describes white propaganda as

operating with the public's awareness of attempts to influence it. There is a Ministry of

Propaganda; one admits that propaganda is being made; its source is known; its aims and

intentions are identified.[3] Throughout the course of a propaganda campaign, white propaganda

may serve as a cover for black propaganda when the propagandist seeks to mask the latter.

Yellow journalism, or the yellow press, is a type of journalism that presents

little or no legitimate well-researched news and instead uses eye-catching headlines to sell

more newspapers.[1] Techniques may include exaggerations of news events,

scandal-mongering, or sensationalism.[1] By extension, the term yellow journalism is used today as a

pejorative to decry any journalism that treats news in an unprofessional or unethical fashion.[2]

Campbell defines yellow press newspapers as having daily multi-column front-page headlines

covering a variety of topics, such as sports and scandal, using bold layouts (with large

illustrations and perhaps color), heavy reliance on unnamed sources, and unabashed self-

promotion. The term was extensively used to describe certain major New York City newspapers

about 1900 as they battled for circulation.

Frank Luther Mott defines yellow journalism in terms of five characteristics:[3]

1. scare headlines in huge print, often of minor news

2. lavish use of pictures, or imaginary drawings

3. use of faked interviews, misleading headlines, pseudoscience, and a parade of false

learning from so-called experts

4. emphasis on full-color Sunday supplements, usually with comic strips

5. dramatic sympathy with the "underdog" against the system.