The Conceptualization and Measure of Creativity:
Implications for Research in Marketing and Consumer Behavior
Joseph R. Priester and Monique A. Fleming
University of Southern California
Accepted for publication in Review of Marketing Research
Author Note
Joseph R. Priester ([email protected]) is an associate professor of Marketing at the
University of Southern California Marshall School of Business and the Dornsife Department of
Psychology. Monique A. Fleming ([email protected]) is a Research Scientist at the Dornsife
Department of Psychology. Correspondence can be addressed to either author.
Abstract
The phenomenon of creativity spans research topics across Marketing and Consumer Behavior.
Interest in, and research on, creativity has grown over the past several decades. With this
heightened attention comes the question of how best to conceptualize and measure creativity.
This question is addressed by reviewing the conceptualizations and measures used in the
psychological study of creativity. From this review we build a framework by which to analyze
papers from the Journal of Consumer Research and the Journal of Marketing Research. Based
upon this analysis, we provide recommendations and best practices for future research. Of
particular importance, we recommend the use of convergent problem-solving tasks in
combination with ratings of novelty and usefulness reported separately. Such measures allow one
to distinguish between instances of effective-creativity (when an idea is both novel and useful)
and instances of quasi-creativity (when an idea is novel but lacks usefulness). The importance of
the framework to research and analysis beyond the experimental paradigm is discussed.
Keywords: creativity, innovation, problem-solving, divergent, convergent, new product design
Creativity lies at the heart of Marketing and Consumer Behavior. Its importance spans
from the development of new products and advertising campaigns to understanding the
antecedents and consequences of consumer creativity. And yet the topic of creativity in
Marketing and Consumer Behavior has, until relatively recently, received little empirical
examination. This lack has begun to change. The number of papers explicitly mentioning or
examining creativity has burgeoned in the 21st century (see table 1). As the amount of research
on creativity increases, a fundamental question arises: How does one best conceptualize and
measure creativity?
This question motivates the present paper. To address this question, we first review the
conceptualizations and measures used in the psychological study of creativity. From this review
we develop a framework by which to examine the measure of creativity. We use this framework
to analyze a set of papers from the Journal of Consumer Research and the Journal of Marketing
Research. Based on this analysis, we offer recommendations for future research in Marketing
and Consumer Behavior. We conclude by considering the usefulness of the proposed
conceptualization and measure of creativity within the broader field of Marketing, specifically, in
those paradigms that explore creativity beyond an experimental approach.
The Measure of Creativity
How to best measure creativity is a question not unique to Marketing and Consumer
Behavior. Indeed, this question has regularly surfaced throughout much of the historical (Guilford,
1950) and contemporary (Mumford, 2003; Plucker, Makel, & Qin, 2019) theory and research on
creativity. The question is especially important in that starkly different approaches exist. Before
turning to the measures themselves, it is helpful to consider the theoretical constructs that they are intended to capture. E. Paul
Torrance is considered the father of contemporary creativity research. He advanced a broad
conceptualization of creativity:
I tried to describe creative thinking as the process of sensing difficulties, problems, gaps in information, missing elements, something askew, making guesses and formulating hypotheses; possibly revising and retesting them, and finally communicating the results. (Torrance, 1988)
To capture this multi-dimensional construct, Torrance introduced (1966) and revised (1999)
the Torrance Test of Creative Thinking (TTCT). The test takes several hours to complete, and is
administered and scored by individuals trained specifically for the test. It comprises a
verbal and a nonverbal (figural) section. There are six verbal tasks. The first three tasks use a
picture as a stimulus, and then prompt the individual to answer 1) what questions they would ask
in order to understand the picture, 2) what guesses they would make in order to understand the
cause of the picture, and 3) what they imagine the consequences following from the picture
would be. The other verbal tasks, which are in greatest use in contemporary research, ask
individuals to 4) improve upon an existing product, 5) generate a list of unusual uses for a
common object (commonly referred to as the “alternative uses” task), and 6) predict the possible
consequences of a hypothetical situation (commonly referred to as the “just suppose” task). The
responses to these tasks are evaluated along three dimensions: fluency (the number of relevant
ideas), flexibility (the number of different categories represented in the ideas), and originality
(the number of statistically infrequent ideas). As is clear in the measures above, each measure
includes a task to elicit ideas, and dimensions along which these ideas are evaluated. Herein,
these will be referred to as tasks and evaluative dimensions.1
1 Contemporary creativity research relies almost exclusively upon the verbal portion of the TTCT. The figural portion of the TTCT includes three tasks: drawing lines within an array of circles, creating a picture containing a specific element, and adding lines to existing figures. The results of these tasks are evaluated along five dimensions: fluency, originality, elaboration, abstractness, and resistance to premature closure. The TTCT in its entirety is currently used primarily as a test to identify gifted children.
Note that these tasks and dimensions reflect the rich conceptualization of creativity advanced
by Torrance. The TTCT has received extensive empirical scrutiny and validation (e.g., Torrance
& Wu, 1981). For example, the verbal portion of the TTCT has been found to predict both
personal and public achievement (Runco, Millar, Acar, & Cramond, 2010), as well as later
recognition of artistic achievement and participation in creative activities. Indeed, the verbal
portion of the TTCT predicted such outcomes approximately three times better than standard IQ
tests, and accounted for 50% of the variance in such outcomes (Plucker, 1999).2
The conceptualization and measure of creativity have evolved in several ways since
Torrance’s seminal work. First, theorists have introduced usefulness as a component of the
conceptualization of creativity. With few exceptions, contemporary theorists and researchers
(e.g., Hennessey & Amabile, 2010; Sternberg, 2012; Vernon, 1989) have adopted the definition
that creativity is the generation of products or ideas that are both novel and useful (for example,
appropriate to the situation, effective, and/or practical). Though the TTCT captures the notion of
novelty with the originality scoring of responses, it lacks an assessment of usefulness. Thus,
assessments of usefulness have been developed and are frequently used. Second, theorists have
broadened the types of thinking conceptualized to underlie the process of creativity. Originally,
idea generation (as often assessed by tasks from the TTCT) was considered key to creativity.
This type of thinking is referred to as divergent thinking. Recently, however, another aspect of
thought has come to be integrated into the conceptualization and measure of creativity. This type
of thought, referred to as convergent thinking, expands upon the “possibly revising and retesting”
portion of Torrance’s definition by positing that creativity also encompasses an individual’s
ability to evaluate and refine those ideas generated by divergent thinking. Measures intended to
capture convergent thinking have been developed and are coming to be more frequently used.
And third, contemporary measures have become relatively simpler and shorter, for example
using only one or two tasks from the TTCT.
2 Interestingly, the figural portion of the TTCT was not found to be a significant predictor of later adult outcomes (Plucker, 1999).
Commonly Used Creativity Measures
As implied by the previous discussion, the types of creativity tasks most commonly used
can be roughly grouped into either tests of divergent or convergent thinking (Cropley, 2006;
Runco & Acar, 2012). Divergent thinking is conceptualized as the ability to generate as many
different responses to a situation, idea, or problem as possible. These tasks are explicitly or
implicitly based upon the tasks first advanced by Torrance in the TTCT. In contrast, convergent
thinking is conceptualized as being relatively more disciplined than divergent thinking. The goal
of convergent thinking is to evaluate and revise ideas in order to arrive at the best possible
solution to a problem (Cropley, 2006).
Further, within each type of task (divergent and convergent), we suggest that there are
two different approaches. Divergent thinking is measured both by tasks that focus on generating
ideas relatively free from a specified solution or need (subsequently referred to as generation-
focused divergent thinking) and also by tasks that focus on generating multiple solutions to
specific problems (referred to as problem-focused divergent thinking). Convergent thinking is
measured both by tasks that require individuals to come up with what they think is the one best
solution to a problem (referred to as convergent problem-solving) and also by tasks that require
insight to discern the one objectively correct solution (referred to as convergent insight
problems).
Divergent Thinking
Divergent thinking as a measure of creativity. Divergent thinking is an individual’s
ability to generate free-flowing, uncensored ideas (Hennessey & Amabile, 2010; Runco, 2014).
Until recently, creativity research conceptualized divergent thinking as the fundamental basis of
creativity (based, in part, upon its emphasis in the TTCT). As such, tests of divergent thinking
were most often used to measure creativity (Acar, 2019; Mumford, 2003).
Measures of divergent thinking. Tests of divergent thinking are constructed so as to
elicit ideas (see Runco, 1991). To do so, researchers provide a prompt to the individual. Though
all such prompts share the same goal, they can be conceptualized as belonging to one
of two distinct approaches: generation- and problem-focused.
Generation-focused divergent thinking. The goal of generation-focused tests is to evoke
relatively non-directed ideas. The prompt is provided as a starting point, and there are no correct
or incorrect answers. Further, there is no specific focus or goal, other than to generate ideas in
response to the prompt. Over time, the alternative uses task (taken directly from the TTCT) has
emerged as the most commonly used prompt of this kind (Mumford, 2003). In it, individuals are
asked to provide as many uses as possible for a common, everyday object, such as a brick or box.
Prompts other than alternative uses are also utilized. Individuals can be asked to list as many
instances of common concepts as they can think of, such as things that are round. Relatedly,
individuals can be asked to list as many similarities between common concepts or objects as
possible, such as how milk and meat are similar. In a slightly different approach, individuals can
be asked to list consequences to hypothetical events, such as clouds having string attached to
them (i.e., the TTCT “just suppose” task). Research has demonstrated that generation-focused
divergent thinking tasks predict creative extracurricular activities and accomplishments (e.g.,
Runco, 1986; see also Runco, Plucker, & Lim, 2000).
Problem-focused divergent thinking. Another approach used to measure divergent
thinking is to elicit as many ideas as possible for solving a problem (Chand & Runco, 1993;
Mumford, Baughman, Threlfall, Supinski, & Costanza, 1996; Runco & Okuda, 1988; Wallach &
Kogan, 1965; see Runco, 1994). The problems used are often based upon real-world problems
and have many legitimate answers. For example, one prompt described a dilemma:
“Your friend Rick sits next to you in class. Rick really likes to talk to you and often bothers you while you are doing your work. Sometimes he distracts you and you miss an important part of the lecture, and many times you don’t finish your work because he is bothering you. What should you do? How would you solve this problem? Remember to list as many ideas and solutions as you can.” (from Chand & Runco, 1993, p. 158, adapted from Okuda et al. 1991)
Research has explored problem-solving using two different tasks: problems presented to
individuals, such as the Rick example above, and self-generated problem-solving, wherein
individuals generate a list of possible solutions to problems that they themselves have identified
(e.g., Chand & Runco, 1993). Recent research has focused primarily upon problems presented to
individuals (see Paek, Cramond, & Runco, 2018).3 Research has provided validation for
problem-focused divergent thinking tasks. For example, they have been found to be more
predictive of creativity-related criteria than generation-focused divergent thinking (Kaufman,
Baer, Cole, & Sexton, 2008).
How ideas are evaluated. As novelty and usefulness have come to define creativity,
these are the dimensions by which responses (e.g., ideas, solutions) have typically come to be
evaluated (Mumford, 2003; Mumford & Gustafson, 1988; Rothenberg & Hausman, 1976;
Runco, 1988; Runco, Illies, & Eisenman, 2005).4 Specifically, peers or experts evaluate the
novelty and usefulness of each idea. By averaging across ideas for each dimension, separate
scores are derived for how novel and useful an individual’s thoughts are. These two dimensions
are analyzed separately or are combined to create a single composite measure.
Occasionally, novelty absent usefulness is used. The number of ideas generated (i.e., fluency) is
an additional (or at times sole) evaluative dimension. And ideas are sometimes
evaluated according to how creative peers or experts believe each to be, referred to herein as
subjective creativity (e.g., Amabile, 1982).
3 Related research has also examined problem-finding, wherein the individual generates a list of problems that they face (Abdulla, Paek, Cramond, & Runco, 2018). Of note is that problem-solving and problem-finding abilities are relatively unrelated to one another, and each is predictive of creative activities and interests (Runco, 1994).
Convergent Thinking
Convergent thinking as a measure of creativity. The goal of convergent thinking is to
evaluate and revise ideas in order to arrive at the best possible solution to a problem (Cropley,
2006). As such, some have understood it to be the opposite of divergent thinking. Divergent
thinking strives for an expansion of unique and useful ideas, whereas convergent thinking strives
for evaluation and revision of those ideas to converge on a solution. In part because of its
apparent opposition to divergent thinking, convergent thinking was long considered to be distinct
and separate from creative thinking (Eysenck, 2003; see Runco, 2014). Since the introduction of the TTCT,
tests of divergent thinking have been the most commonly used creativity measures (Runco, 1994),
to the point that some believe the field over-relies on them (Mumford, 2003). For example,
Runco and Acar (2012) explicitly argue that divergent thinking does not equal creative thinking,
but captures only a part (see also, Guilford, 1960). Similarly, many suggest that divergent
thinking is limited and artificial, and not reflective of real-world creativity, in which one solution
is often required (e.g., a specific solution to a problem, a single product, or a work of art; see
Cropley, 2006; Sternberg, 2018).
4 The original dimensions proposed by Torrance are still used (e.g., Kaufman, Plucker, & Baer, 2008). However, such an approach to measurement is primarily used for educational purposes to identify gifted children.
Given its apparent opposition to divergent thinking, how is it that convergent thinking
relates to creativity? At its most basic, convergent thinking involves evaluation and revision. In
order to come to a judgment, the implications of the possible solutions must be considered, after
which the ideas are revised (Basadur, Runco, & Vega, 2000). Mumford, Lonegan, and Scott
(2002) propose a specific process that underlies convergent thinking. The first step is the
evaluation of ideas, in which the outcome and consequences of the ideas are forecast. The second
step is an appraisal of the forecasts in order to determine whether an idea will adequately solve
the problem. Given that many initial ideas will not clearly satisfy the problem, a third step is to
revise an idea such that it has a better probability of success. Based upon this process, Mumford
et al. (2002) argue that
Idea evaluation is an inherently creative activity in which the implications of ideas must be explored and ideas must be restricted to ensure their successful implementation. (p. 233)
Measures of convergent thinking. Given that convergent thinking is coming to be
appreciated as an important aspect of creativity, the question arises as to how to best measure it
(Mumford, Giorgini, Gibson, & Mecca, 2013; Runco, 2008). Two disparate approaches are used.
Convergent problem-solving. Convergent problem-solving is similar to problem-focused
divergent thinking in that individuals are provided with a problem and asked to provide a
solution. Recall that in problem-focused divergent thinking, the goal is to generate as many
solutions as possible. In contrast, convergent problem-solving tasks ask an individual to provide
what they think is the single best solution (e.g., Reiter-Palmon, Illies, Cross, & Nimps, 2009).
The distinction between the two is critical. It is not just the number of solutions that
differentiates the two tasks. Rather, convergent problem-solving prompts the evaluation and
revision of the ideas. Thus, the psychological processes underlying the two differ significantly.
To construct tests of convergent problem-solving, one could simply modify the prompts used
in problem-focused divergent thinking tasks by asking for the one best response, rather than as
many as possible.5 However, the nature of convergent problem-solving tasks affords the
opportunity to develop and use richer, more complex prompts than those used in problem-
focused divergent thinking tasks. As an example, Lonergan, Scott, and Mumford (2004) had
individuals provide a plan to implement an advertising campaign for a specific product. In the
plan, individuals were instructed to describe the actions to be taken and long-term considerations
influencing their idea implementation.
Similarly, Sternberg, a leading creativity scholar, provides a list of tasks that he (and others)
have used (Sternberg, 2002, pp. 6-9). These tasks include, but are not limited to:
- Presenting individuals with five New Yorker cartoons, each of which omits the caption. The individual’s task is to choose three of the five and provide a caption for each.6
- Providing individuals with unusual titles for short stories, for example, “2983,” “Beyond the Edge,” “Not Enough Time,” “a Fifth Chance,” “It’s Moving Backwards.” The individual’s task is to choose two titles and write a story for each.
- Giving individuals a short story. The individual’s task is to create an alternative ending.
- Asking individuals to invent a dialogue between two speakers of different languages. For example, an English speaker asks a Frenchman for directions.
- Posing different problems that vary on superficial features but share at least one underlying principle. The individual’s task is to discover a unifying fundamental principle.
- Inviting participants to imagine what the future consequences would be for specific scenarios. For example, if the Chinese government continues on its current path, what will China be like in 20 years?
- Inviting individuals to design one additional instrument for the symphony. The individual’s task is to explain what that instrument would be and why they chose it.
- Asking participants to draw pictures with unusual themes, such as the earth from an insect’s perspective.
- Having participants create advertisements for boring products (e.g., cufflinks).
- Asking individuals to solve unusual scientific problems, such as how we could tell whether our planet has been visited by aliens.
5 As an example, one task developed in marketing research, the shoeshine dilemma, has been used both as a convergent problem-solving task (Burroughs & Mick, 2004; Mehta, Dahl, & Zhu, 2017) and as a problem-focused divergent task (Mehta, Zhu, & Cheema, 2012), by changing the instructions as to the number of solutions.
6 This particular task is facilitated by the New Yorker itself, in that each issue has a cartoon caption contest. The initial cartoon, absent a caption, is presented and individuals are invited to submit captions. The best three submissions are then published, and from these, a final, winning caption is determined.
Of note, these tasks elicit a single, typically elaborate response, a response that will vary as to its
creativity, but not as to whether it is correct or incorrect. In order to assess an answer’s creativity,
responses are evaluated by peers or experts who use the same dimensions as those used to
evaluate divergent thinking responses.
Research has provided validation for the use of convergent problem-solving tasks. For
example, training in convergent thinking leads to more original and higher quality solutions
(Osborn & Mumford, 2006). Further, variables related to convergent thinking, such as time spent
reading information pertaining to key facts and anomalies, also lead to more creative solutions
(Mumford, Baughman, Supinski, & Maher, 1996).7
7 Relatedly, Sternberg has investigated the ability of convergent problem-solving tasks to predict academic success above and beyond traditional intelligence tests. Though not direct validation of convergent problem-solving’s influence on creativity, these findings are important. Sternberg and his associates have found wide-ranging support on a variety of academic outcomes. Such tests predicted overall academic success (e.g., first-year grade-point average), as well as extra-curricular and leadership activities. Such tests were relatively superior to traditional admission tests for two reasons. First, they predicted these academic outcomes above and beyond traditional admission tests. Second, ethnic group differences did not influence performance on the convergent problem-solving tasks, whereas they did influence performance on traditional admission tests.
Convergent insight problems. There is a set of creativity measures that does use problems for
which there are correct or incorrect answers, and thus has no need for evaluation or scoring
of responses beyond whether they are correct. These are insight problems, which are
objectively simple, yet difficult to solve (Kershaw & Ohlsson, 2004). More specifically, these
tasks are easily within one’s competence, and yet at the same time are likely to lead to an
impasse. With sustained effort, however, individuals are often able to break through the impasse
in order to arrive at the correct solution. And this breakthrough is often associated with an
“Aha!” experience (Schooler, Ohlsson, & Brooks, 1993).
Insight problems were first studied by Gestalt researchers (e.g., Kohler, 1959; Wertheimer,
1959). They understood insight problems to be the result of functional
fixedness. Functional fixedness refers to the phenomenon that an individual’s preexisting
understanding of the use or function of an object impedes one’s ability to perceive other possible
uses or functions (e.g., Weisberg & Suls, 1973). Thus, the creative aspect of insight problems is
the cognitive flexibility to perceive objects and/or situations anew, affording novel and effective
possible solutions. Research has suggested that insight is the result of non-reportable processes
(Schooler, Ohlsson, & Brooks, 1993) and that impasses can be caused by multiple obstacles,
such as prior knowledge/experience and perceptual factors (consistent with functional fixedness),
as well as processing demands (Kershaw & Ohlsson, 2004). Though non-reportable, the process
of solving insight problems is considered to be one of generating, evaluating, and revising
possible solutions until the correct one is uncovered. As such, most consider insight problems to
be an example of convergent thinking (e.g., Baas, De Dreu, & Nijstad, 2008; Benedek & Jauk,
2019; Kounios & Beeman, 2014; Smith & Kounios, 1996).
There are a myriad of insight problems (Runco, 2014). However, only a relatively limited
number are frequently used as creativity measures. The most frequently used include the nine-dot
problem (Kershaw & Ohlsson, 2004; Weisberg & Suls, 1973), Duncker’s candle problem
(Duncker, 1945), and the remote associates test (RAT; Mednick, 1962).8
The nine-dot problem presents an individual with a 3 x 3 array of dots. The individual’s
goal is to connect all nine of the dots using four straight, continuous lines without lifting the
pencil from the paper. The problem is presented in figure 1 (and the solution is provided in the
appendix). The Duncker candle problem provides an individual with a candle, book of matches,
and box of thumbtacks. The individual’s task is to affix the candle to the wall. The solution lies
in the ability to perceive that the box containing the thumbtacks can also be used as a candle
holder. Specifically, one can accomplish the task by using the box to rest the candle on and then
attaching the box to the wall by the thumbtacks. The RAT is a task that presents three words to
an individual. The individual’s goal is to find a fourth word that relates to all three words.9 For
example, for the words Swiss, Cottage, and Cake, the correct answer would be cheese. Examples
of other word sets include High, Book, Foot; and Fork, Man, Dark.10 Of greatest importance
is that all of these tasks are intended to capture cognitive flexibility, which is considered to be an
aspect of creativity.
The Relationship of Divergent and Convergent Thinking
Divergent and convergent thinking are conceptualized as two distinct processes.
Evidence consistent with this position exists across a diverse set of findings. For example, the
hallucinogen Ayahuasca enhances divergent thinking while at the same time it diminishes
convergent thinking (Kuypers, Riba, De La Fuente, Barker, Theunissen, & Ramaekers, 2016).11
Divergent and convergent thinking are often considered as opposite ends of a continuum of
creative cognition, with divergent representing creative thought and convergent representing
conventional thought (Eysenck, 2003). As is most likely evident in the elucidation of convergent
thinking above, we, and others, believe that such a conceptualization does not fully describe the
creative process.
8 Other insight problems not as commonly used include the hat-rack problem (Maier, 1933), the six-coin problem (e.g., Lu, Hafenbrack, Eastwick, Wang, Maddux, & Galinsky, 2017), and anagram tasks (Smith & Kounios, 1996).
9 Some have argued that the RAT is a measure of divergent thinking (e.g., Datta, 1964; Shapiro, 1965; Dietrich & Kanso, 2010). However, the RAT is correlated with other insight problems and convergent measures (Mednick, 1962; Schooler & Melcher, 1995; Taft & Rossiter, 1966).
10 The solutions to these two examples are provided in the appendix.
In short, divergent and convergent thinking are complementary, rather than opposing,
processes. Process models of creativity, encompassing both divergent and convergent thinking, have
been advanced by several creativity theorists (Basadur, 1995; Finke, Ward, & Smith, 1992;
Lonergan, Scott, & Mumford, 2004; Runco & Acar, 2012). These models have in common the
notion that both types of thinking are critical in creativity. Creativity is a process wherein both
types of thought operate sequentially: Upon awareness of a problem, an individual first
generates possible solutions. Following such generation, the individual then evaluates and
revises possible solutions until a satisfactory solution is found.
Considerations Regarding Creativity Measures and Recommendations for Best Practices
In sum, a wonderful breadth and richness of conceptualizations and measures of creativity
have been developed. How should researchers choose amongst these tasks and evaluative
dimensions? Over time there have emerged several concerns regarding some measures of
creativity and their ability to accurately capture the full conceptualization of creativity. Of
greatest interest to the present paper are two concerns: how to interpret results obtained on
ratings of novelty, usefulness, fluency, and subjective creativity; and the relative ability of the
four types of tasks to fully capture the construct of creativity. A consideration of these concerns
leads to several recommendations.
11 Participants completed a generation-focused divergent thinking task in which they generated as many meanings as possible for a figure, and completed a task akin to the RAT to measure convergent thinking. They completed these tasks twice, once several hours before ingesting Ayahuasca and once again an hour into the hallucinogenic trip.
Interpreting fluency, novelty, usefulness, and subjective creativity ratings: Insight
from their interrelationships. With all of the tasks, save convergent insight problems,
creativity is assessed by the evaluation of responses along dimensions, of which several are
commonly used. Of these dimensions, fluency is attractive in that it relies upon a simple count
of the number of responses, thereby overcoming the need to have peers or experts evaluate the
responses. Fluency has long been used to evaluate responses to divergent thinking tests;
at the height of its popularity, Wallach and Kogan (1965) advocated for its use in conjunction with
originality as the most appropriate measure of creativity.
Research since then, however, has developed a more complex understanding of fluency.
Perhaps the greatest concern is that fluency is highly correlated with other dimensions of
creativity, most especially originality (e.g., Hocevar, 1979; Silvia, 2008; Silvia, Winterstein,
Willse, Barona, Cram, Hess et al, 2008). At first glance this relationship might appear to suggest
that fluency is an alternative indicator of novelty. However, other research has provided
conflicting results as to fluency’s predictive validity: Plucker, Qian, and Wang (2011) found no
relationship between fluency and creativity criteria, whereas Plucker, Qian, and Schmalensee
(2014) found significant, though small, relationships. At best, fluency is not considered a
strong indicator of creativity: “For most purposes, fluency should not be used alone. There is
unique and reliable variance in the other indices” (p. 67; Runco & Acar, 2012). Thus, the use of
fluency in isolation as an evaluative dimension of creativity is not recommended.12
12 The use of fluency may be helpful in indicating other constructs, such as the amount of effort expended on a task. But note that such use does not support inferences about creativity.
If fluency is not a valid evaluative dimension of creativity, on what dimensions should
one rate responses in order to assess their level of creativity? These possible dimensions are
novelty, usefulness, and subjective creativity. The question of how they relate to each other can
provide guidance as to their validity. Research consistently demonstrates that judgments of
novelty and usefulness are either not correlated (Runco, Illies, & Eisenman, 2003) or negatively correlated (Diedrich, Benedek, Jauk, & Neubauer, 2015; Runco & Charles, 1993). Ideas
evaluated as high in usefulness are typically evaluated as low in novelty. Further, ideas
evaluated as low in usefulness but high in novelty are evaluated as high in subjective creativity
(Runco & Charles, 1993). And what of evaluations of subjective creativity, in which peers or
experts are simply asked to rate how creative each idea is? Subsequent research has consistently
found that subjective creativity captures the extent to which an idea is novel, but does not capture
the extent to which an idea is useful. For example, Diedrich, Benedek, et al. (2015) found a
significant correlation between novelty and subjective creativity (r = .60), whereas the correlation between usefulness and subjective creativity was either not significant (r = -.05) or significantly negative (r = -.10; see also Runco & Charles, 1993; Runco et al., 2003). As such, the use of
subjective creativity as a measure of creativity is analogous to relying on novelty absent
usefulness.
Recall that creativity is defined as the generation of ideas that are both novel and useful.
Clearly, useful ideas devoid of novelty lack creativity. But what of novel ideas devoid of
usefulness? Cropley (2006) considered this question in depth and concluded that
Mere novelty… involves what Cattell and Butcher (1968, p. 271) called “pseudocreativity”: The novelty derives only from nonconformity, lack of discipline, blind rejection of what already exists, and simply letting oneself go… I have added to this
“quasicreativity” which has many of the elements of genuine creativity – such as a high level of fantasy – but only a tenuous connection with reality (p. 392).
Cropley thus considered mere novelty (i.e., when an idea is novel but not useful or
connected to reality) to be quasi-creativity, which he contrasted with genuine creativity or, in his
term, effective-novelty (when an idea is both novel and useful, referred to herein as effective-
creativity).
As such, if one measures novelty or subjective creativity without usefulness, it is
ambiguous as to whether one is finding quasi-creativity or effective-creativity. To overcome this
ambiguity, it is critical to use both dimensions of novelty and usefulness to evaluate ideas.
Consequently, we recommend measuring both, and reporting the results for novelty and
usefulness separately. Such an approach allows one to distinguish quasi- from effective-
creativity. If a variable has an influence only on novelty and not usefulness, such a variable is
best interpreted as an antecedent of quasi-, rather than effective-creativity.13
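This inference rule can be expressed as a small sketch. The 1-7 rating scale and the cutoff of 4 for calling a rating "high" are illustrative assumptions, not values drawn from the creativity literature:

```python
def classify_idea(novelty: float, usefulness: float, threshold: float = 4.0) -> str:
    """Label an idea using Cropley's (2006) distinction, given mean rater
    scores for novelty and usefulness (assumed here to be on a 1-7 scale).

    threshold: illustrative cutoff for treating a rating as "high".
    """
    high_novelty = novelty >= threshold
    high_usefulness = usefulness >= threshold
    if high_novelty and high_usefulness:
        return "effective-creativity"      # novel AND useful
    if high_novelty:
        return "quasi-creativity"          # novel but lacking usefulness
    if high_usefulness:
        return "useful-but-not-creative"   # useful but devoid of novelty
    return "neither"
```

Note that the classification is only possible because novelty and usefulness enter as separate arguments; a single composite score could not distinguish the middle two cases.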
A final note on evaluative dimensions, in light of these findings. If a single composite
creativity measure based upon both novelty and usefulness is needed, we recommend creation of
a multiplicative composite (i.e., multiplying the novelty and usefulness ratings). When a
composite measure is created by summing or averaging novelty and usefulness, as is common, it
is difficult to interpret whether the results reflect ideas that are useful but not novel (not
creative), novel but not useful (quasi-creativity), or both novel and useful (effective-creativity).
A multiplicative measure better reflects the necessity of both novelty and usefulness, and will
reduce (but not eliminate) the interpretation difficulties associated with composite measures
created by summing or averaging the dimensions. For example, an idea that is rated zero on usefulness and 5 on novelty yields a multiplicative score of zero, in contrast to an average score of 2.5. As such, the multiplicative composite measure penalizes quasi-creativity, in accordance with the definition of creativity requiring both novelty and usefulness.
13 Such a recommendation applies to associations of the dimensions with other constructs. If other constructs are associated only with novelty, and not usefulness, one is most likely capturing quasi-creativity, rather than effective-creativity. This point is considered in greater detail below, in the discussion of creativity and new product success.
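The arithmetic contrast between the two composites can be made concrete with a short sketch, using the zero-usefulness example from the text:

```python
def composite_scores(novelty: float, usefulness: float) -> dict:
    """Contrast the two composite measures discussed in the text."""
    return {
        "average": (novelty + usefulness) / 2,   # common, but ambiguous
        "multiplicative": novelty * usefulness,  # recommended: penalizes quasi-creativity
    }

# The example from the text: an idea rated zero on usefulness and 5 on novelty.
scores = composite_scores(novelty=5, usefulness=0)
# averaging masks the absent usefulness (2.5); multiplying does not (0)
```

Because the multiplicative score is zero whenever either dimension is zero, it encodes the definitional requirement that creativity needs both novelty and usefulness.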
Fully capturing the construct of creativity. Finally, the type of task used to prompt
responses will affect one’s ability to fully capture the construct of creativity. We are persuaded
by the viewpoint that creativity encompasses both divergent and convergent thinking. Thus, we recommend choosing a task that can capture both types of creative thinking fully: convergent problem-solving tasks, which are better able to capture both the generation and evaluation of ideas.
In contrast, divergent thinking measures are excellent at capturing the generation
component of creativity. However, measuring an individual’s ability to generate creative ideas
does not capture the equally important and necessary component of evaluation and revision.
This concern applies to generation-focused and problem-focused divergent thinking tasks alike. As such, their use in isolation should be weighed against the
benefits of using a convergent task.
Similarly, convergent insight problems (e.g., the candle problem) are excellent at
capturing the cognitive flexibility that is an aspect of creativity. And indeed, by the very nature
of insight problems, individuals must come up with a seemingly novel and useful idea to
successfully solve such problems. However, it remains unclear whether these tests fully capture
the broader creative process. Thus, a concern akin to that of divergent tasks arises: Convergent
insight problems may under- or mis-measure creativity.
A Framework for Evaluating the Measure of Creativity
This review suggests a framework by which to evaluate the measure of creativity.
Specifically, the concerns and recommendations can be categorized according to the four
different types of tasks and the five most commonly used evaluative dimensions. Table 2
presents this framework. Of note is that it is possible to describe the cells of the table in terms of
the extent to which they capture creativity. The most informative approach is to use convergent
problem-solving task(s) in conjunction with the evaluative dimensions of novelty and usefulness
reported separately. Less informative are the two types of divergent thinking in conjunction with
the evaluative dimensions of novelty and usefulness reported separately, as well as the use of
convergent insight problems. On the other hand, the use of a composite measure of novelty and
usefulness; as well as the use of novelty, and/or subjective creativity, and/or fluency, absent
usefulness; provide results that are ambiguous.
Creativity Measures in Marketing and Consumer Behavior
One goal of this review and resulting framework is to consider how creativity has been
measured in Marketing and Consumer Behavior. To do so, we examined all papers published in
the Journal of Consumer Research and the Journal of Marketing Research that 1) included a task
to elicit a response, and 2) measured creativity as a dependent variable. As such, this is an
illustrative, rather than exhaustive, review. This approach resulted in 12 papers, with a total of 35
relevant experiments.14 These papers and experiments are described in table 3.
14 Of the fourteen other papers, twelve did not include a psychometric measure of creativity. The remaining two papers (Andrews & Smith, 1996; Sethi, Smith, & Park, 2001) used evaluative measures but did not include a task. These two papers will be returned to in the discussion. Finally, one study (Moreau & Engeset, 2016, Study 1) used a figural completion task drawn from the TTCT that is difficult to categorize in terms of our framework. Specifically, the task requires one solution, of which there is no correct answer, and thus would seem to qualify as a Convergent Problem-Solving Task. However, it is uniformly conceptualized as a Divergent Thinking task (e.g., Plucker, 1999), and thus was not included.
We used the framework presented in table 2 to categorize the 35 experiments. The results
of this categorization are presented in table 4. One caution in interpreting this table is that there
do not exist analogous tables in general or in other fields to which to compare. As such,
inferences as to the relative performance of Marketing and Consumer Behavior versus other fields are not possible. Instead, table 4 serves as a guide to potential refinement in creativity
measures used in future research.
Inspection of table 4 reveals that 23% (N = 8) of all studies used the most informative
approach, convergent problem-solving task(s) in conjunction with the evaluative dimensions of
novelty and usefulness reported separately. Another 26% (N = 9) used less informative
approaches: divergent thinking tasks in conjunction with the evaluative dimensions of novelty
and usefulness reported separately (20%, N = 7), or convergent insight problems (6%, N = 2).
Finally, 51% (N = 18) used ambiguous approaches (composite novelty and usefulness scores
were used by 14%, N = 5; and novelty, and/or subjective creativity, and/or fluency scores absent
usefulness by 37%, N = 13).
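The tallies above can be reproduced with a short script; the counts are taken directly from the text, with category labels abbreviated for readability:

```python
# Tallies from table 4 (35 experiments total), as reported in the text.
counts = {
    "most informative: convergent PS + separate novelty/usefulness": 8,
    "less informative: divergent + separate novelty/usefulness": 7,
    "less informative: convergent insight problems": 2,
    "ambiguous: composite novelty+usefulness": 5,
    "ambiguous: novelty/subjective creativity/fluency absent usefulness": 13,
}
total = sum(counts.values())  # 35 experiments
percentages = {label: round(100 * n / total) for label, n in counts.items()}
```

Rounding to whole percentages recovers the figures reported in the text (23%, 20%, 6%, 14%, and 37%).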
Recall that one of the most important issues is the question of whether ideas reflect quasi-
or effective-creativity. Inspection of table 4 reveals that there are 15 studies (43%) in which
inferences can be made as to the type of creativity (i.e., those that measured and reported novelty
and usefulness separately). Of those 15 studies, five reflected effective-creativity, whereas the
remaining ten reflected quasi-creativity. This information is presented in the last column of table
3.
Lastly, this analysis highlights the breadth of research topics in Marketing and Consumer
Behavior that are informed by the study of creativity. Inspection of the Operationalization
column in table 3 reveals that many of the tasks used are at the heart of Marketing and Consumer
Behavior. Ten studies (29%) used a task involving new product development (e.g., designing a
toy, creating new ice cream flavors, developing a device to help one eat while driving).
Seventeen studies (49%) used a task involving product improvement or new uses (generating
design ideas for an improved keyboard; new features for products such as apps, Facebook, or
drinking glasses; or alternative uses for various products such as bubble wrap, a new mattress, a
brick). And two studies (6%) used a task involving advertising (generating advertising slogans
for a new polo bike). Impressively, one study even used a task that involved both branding and
new product design (creating new ice cream flavors and brand names for each, Mehta, Dahl, &
Zhu, 2017, Study 3). All of these topics are central to our field.
Analysis thus far has been at the study level. However, it is important to note that all 12
papers included multiple studies.15 Four of the 12 papers used multiple task types across studies.
Further, of the eight papers which did not include multiple tasks, five used solely the optimal
type – a convergent problem-solving task. And across papers, seven included a convergent
problem-solving task in at least one study. Regarding evaluative dimensions, of the seven
experiments that used the ambiguous dimensions of novelty, and/or subjective creativity, and/or
fluency, absent usefulness, two were contained in a paper that also included a study in which
novelty and usefulness were rated and reported separately. Of the five experiments that used the
ambiguous composite measure comprised of novelty and usefulness scores, one paper included
the most informative approach of reporting each dimension separately.16
15 Moreau and Engeset (2016) have an additional study not included in our analysis. This study used a figural completion task from the TTCT. See footnote 14.
16 A review of the independent variables examined and how they influence creativity is beyond the scope of this paper. However, it is crucial to note that the papers included in this review, and others not included, have contributed many important advances to our understanding of the determinants of creativity in marketing and consumer behavior. For example, constraints (Mehta & Zhu, 2016; Moreau & Dahl, 2005) and priming of creative brands (Fitzsimons et al., 2008) can increase creativity, whereas different incentives (social versus financial) have differential effects on creativity (Mehta, Dahl, & Zhu, 2017).
Recommendations for Future Creativity Research in Marketing and Consumer Behavior
Marketing and Consumer Behavior stands at an advantageous position to not only further
study creativity in our own field, but to make advances to our understanding of how best to
conceptualize and measure creativity more broadly. As shown, creativity is central to many of
the research questions that span the field of Marketing and Consumer Behavior. In addition, we
argue that Marketing and Consumer Behavior have particular expertise in sophisticated
experimental methods (multi-method and measure, examining moderating and mediating
variables), and modeling approaches, approaches which have the potential to make important
advances in the field of creativity. To do so, however, it is beneficial to use the most powerful
creativity measures possible, those that are both clearly interpretable and comprehensive. To that
end, the paper offers specific recommendations for the use of creativity measures in the future.
Further, it highlights study designs and methods used in Marketing and Consumer Behavior that
offer particular advantages in the study of creativity and thus can help to advance the field more
generally.
Use Convergent Problem-solving Tasks
Just as in the larger field of creativity, there exists in Marketing and Consumer Behavior an overreliance on divergent thinking tasks: seventeen (49%) of all experiments used divergent thinking tasks, and an additional two (6%) relied upon convergent insight problems. Moving away from the use of these tasks in isolation will
allow for greater power in our ability to understand creativity more fully. That is, to the extent
that they are used, they should be used in conjunction with other tasks, ideally convergent
problem-solving tasks, and ideally in the same study.
There are several excellent examples in the sample of the use of convergent problem-
solving tasks. To highlight one, Burroughs and Mick (2004) developed a specific, real-world
problem in order to elicit a single solution. They adapted the ‘just suppose’ task of the TTCT to
create a shoeshine problem. Specifically, individuals read:
Just suppose that you are going out to dinner one evening. You have just moved into the area to take a new job. It is the annual company banquet held by your new employer and you are probably going to be called up front and introduced to the rest of the company by your new boss. You put on a black outfit and think that you are all ready for the dinner when, as you go to put on your shoes, you discover they are all scuffed up and the scuffs are definitely noticeable. You go to the utility closet only to discover that you are almost completely out of shoe polish. This is the only pair of shoes you have to go with this outfit and there is really no other outfit you can wear. You have 2 minutes before you must head to the dinner if you are to be on time. Since you live in a residential area, all of the stores in your part of town have already closed for the evening. You know of one shopping mall that is open but it means an extra 5 minutes of freeway driving.
After reading the scenario, individuals were instructed to write down how they would
respond to the problem. This rich, complex task has been subsequently used by other researchers
(e.g., Mehta, Dahl, & Zhu, 2017, experiment one).
Use Ratings of Novelty and Usefulness and Report Results Separately
Apart from the issue of overreliance on divergent thinking tasks, there also exists some
overreliance on relatively disadvantageous evaluative dimensions. Over half of all measures used
evaluative dimensions that were ambiguous. Five experiments (14%) used an averaged
composite measure, and thirteen experiments (37%) used novelty, and/or subjective creativity,
and/or fluency, absent usefulness. All of these measures make inferences as to the type of
creativity impossible. If such measures are used, we recommend that they be used in conjunction
with ratings of novelty and usefulness (scored and reported independently), ideally in the same
study, and any composites should be multiplicative rather than additive or an average.
The Gold Standard for Creativity Measurement in a Single Study
The most informative measures of creativity will thus consist of a convergent problem-
solving task, in combination with reporting results for novelty and usefulness separately. The
most effective approach in the sample, in our view, was implemented by Moreau and Dahl
(2005). They used a convergent problem-solving task in which they had individuals “design a
toy, anything a child (age 5-11) can use to play with.” Experts (senior design professionals)
evaluated the product according to novelty and usefulness, and these scores were analyzed and
reported separately. This investigation is an excellent example of the use of a convergent
problem-solving task, in combination with reporting results for novelty and usefulness separately
(see also Sellier & Dahl, 2011, Study 1). Such an approach affords the opportunity to explore
which factors lead to quasi- versus effective-creativity, as well as to fully capture the creative
process.
Multiple Methods and Measures
In summarizing the field of creativity, Mumford (2003) warned of the limits of
“methodological isolationism,” in which researchers adopt and use only one method, and often,
one measure. Marketing and Consumer Behavior are expert at using multiple methods and
measures in order to better understand processes and phenomena, and this expertise is beautifully
illustrated in the papers reviewed herein. Inspection of table 3 reveals exemplary use of multiple
measures across studies in a single paper. Mehta, Dahl, and Zhu (2017) used convergent problem-solving, generation-focused divergent thinking, and problem-focused divergent thinking tasks, all within one paper. A similar multi-measure approach was taken by Mehta and Zhu (2015), who used
all four types of tasks. Inspection of table 3 also reveals the exemplary practice of using multiple
evaluative dimensions within one study. For example, Mehta, Zhu, and Cheema (2012),
examined the evaluative dimensions of novelty and usefulness separately, in addition to
subjective creativity and fluency. That all of these findings replicated across these different tasks
and evaluative dimensions provides convergent validity.
Also informative can be the instances in which the results within one study (or set of
studies) reveal differences on tasks and/or evaluative dimensions. This discriminant validity
provides the potential for insight into the specific processes that variables affect, and which they
do not. For example, cognitive demand has been found to influence insight problems (Kershaw
& Ohlsson, 2004). To what extent does such an antecedent influence the other types of tasks and
evaluative dimensions? Discovering how different variables influence different tasks and
evaluative dimensions has the potential to provide a richer and deeper understanding of the
antecedents of and processes underlying creativity. It is important to note that within such an approach, null findings are as valuable as significant ones. The current emphasis in Psychology and
Consumer Behavior on reporting the results for all measures, significant or not, dovetails nicely
with advocating such a multi-method, multi-measure approach to creativity research.
The Use of Moderators
As implied by the previous discussion, the use of moderators is important in order to
understand the processes underlying any phenomena. Moderators allow one to understand under
what conditions independent variables affect dependent variables to a greater or lesser degree.
Such understanding affords inferences as to the psychological processes underlying phenomena.
Marketing and Consumer Behavior researchers commonly use moderators. The use of
moderators is not common in the broader field of creativity, though evidence suggests their potential importance. As but one example, the use of realistic versus
unrealistic tasks moderates the nature of the ideas generated. Realistic tasks decrease the number
of novel ideas by over half, as compared to unrealistic tasks. And at the same time, realistic tasks
almost double the number of useful ideas as compared to unrealistic tasks (Runco, Illies, &
Eisenman, 2005). Though beyond the scope of this paper, many of the articles in the sample of
papers from the Journal of Consumer Research and the Journal of Marketing Research
examined variables that moderate the effect of another variable on creativity (see Footnote 16).
Marketing and Consumer Behavior can use such moderation in order to understand
questions such as the conditions under which quasi- versus effective-creativity are likely to
emerge, and as such, to advance our understanding of the antecedents of different types of
creativity. For example, Moreau and Dahl (2005) used the same convergent problem-solving
task (i.e., design a toy), in combination with reporting results for novelty and usefulness
separately, across three studies. As seen in table 3, this approach makes clear that whereas
quasi-creativity was found in Studies 1 and 2, effective-creativity was found in Study 3. Such
moderation affords the opportunity for further theorizing, analysis, and research to explore which
factors lead to quasi- versus effective-creativity, by first asking how Studies 1 and 2 differ from
Study 3 and thus what variable may have moderated the results.
Creativity Beyond the Experimental Paradigm
Of note is that the review upon which this paper is based relies primarily upon studies
that employed an experimental paradigm in which creativity is the dependent measure. One
might wonder to what extent the findings and guidance explicated herein are relevant to those
areas of the Marketing field that do not use such an experimental approach. Researchers in both
qualitative (more specifically, consumer culture theory) and quantitative (such as analytic and
game theory) paradigms have explored creativity. For example, qualitative studies have explored
collective creativity (Weijo, Martin, & Arnould, 2018), cultures of creativity (Fox, 2019),
creativity and identity (Jones, 2017), and creativity in motherhood (McCabe & Malefyt, 2015).
Quantitative studies have explored creativity’s role in advertising (Yang & Smith, 2009; Smith,
MacKenzie, Yang, Buchholz, & Darley, 2007), new product development (Dean, Griffith, &
Calantone, 2016; Im & Workman, 2004; Sethi, Smith, & Park, 2001), marketing programs
(Andrews & Smith, 1996), and organizational memory (Moorman & Miner, 1997).
Most of this research examines naturally-occurring creativity (e.g., the creativity of
advertisements broadcast on television). Consequently, the choice of tasks to elicit creative
responses in these instances is of little concern. However, the evaluative dimensions used to
assess creativity (of ads, ideas, products, etc.) should be of great concern. As such, of greatest
potential relevance to the broader research on creativity within Marketing is most likely the
distinction between quasi- and effective-creativity. Interestingly, two papers (Andrews & Smith,
1996; Sethi, Smith, & Park, 2001) in Journal of Marketing Research asked Marketing managers
to evaluate the creativity of products that had been recently launched by their own companies.
The managers provided ratings of novelty and usefulness. These ratings were combined to create
a composite measure of creativity. Our framework suggests that both papers could have provided greater insight by analyzing and reporting results for novelty and usefulness separately.
In a similar vein, we end with two real-world examples to demonstrate the power and usefulness
of the approach to creativity proposed herein.
Creativity helps better solve problems, and as such, should be evidenced in more
successful ideas, solutions, and products.17 Consider the role of creativity in new product success.
How might the proposed framework, and more specifically the concern of quasi- versus
effective-creativity, be useful in understanding new product success? To illustrate, consider two products, both of which were launched in 2001, and both of which had mysterious trajectories: the Segway personal transporter and the Apple iPod.
17 For example, Dahl and Moreau (2002) find that product creativity influences how much consumers are willing to pay for new products.
The Segway personal transporter was a technological breakthrough. Its novelty
impressed, even dazzled, experts in the field of innovation. Steve Jobs judged that the Segway
was, perhaps, as important as the personal computer (Heilemann, 2001). Similarly, John Doerr
lavished praise on the Segway, predicting that it would be bigger than the internet and quickly
achieve over one billion dollars in sales (McFarland, 2018). Sales of the Segway, however, did
not achieve these predictions. Rather, the Segway, in whose development over $100 million was invested, sold only 30,000 units from 2001 to 2007 (Golson, 2015). Indeed, the Segway
is considered to be one of the major technological failures of the 21st century (The 10 Biggest
Tech Failures of the Last Decade, 2009). Why?
In contrast, the iPod did not offer a comparable technological breakthrough. Instead, the
iPod offered a scroll wheel for greater ease of use in line with its sleek, small design.
Technologically, it provided faster file transfer and larger storage. However, there already
existed an array of MP3 players, and its price ($399) was higher than others in its category.
Thus, its usefulness was clear, but its novelty was not. Upon its launch, Apple fans reacted with
disappointment, bemoaning Apple’s decision to “enter the world of gimmicks and toys.” A
representative reaction was, “I still can’t believe this! All this hype for something so ridiculous!
Who cares about an MP3 player? I want something new! I want them to think differently! Why
oh why would they do this?! It’s so wrong! It’s so stupid!” (Heisler, 2014). The lack of positive reviews was not restricted to consumers. Common were judgments such as that offered by
Stephen Baker, a technology expert and analyst at NPD. He concluded that the iPod had “good
features, but this is a pretty competitive category. The question is whether people want that
robust of a feature set with that high of a price” (Cnet staff, 2006; see also Garber, 2013).
Favorable reviews were rare, and very few were able to discern the novelty inherent in the
iPod.18 However, in stark contrast to the Segway, the iPod succeeded. From 2001-2007,
consumers purchased over 100 million iPods (100 million iPods sold, 2007). And more
importantly, many have argued that the success of the iPod afforded Apple’s rise to dominance
today (Van Buskirk, 2012). Again, why?
This paper offers a new perspective by which to interpret the differential trajectories of
these two products. Recall two findings. First, perceptions of novelty drive perceptions of
creativity, regardless of usefulness. Second, useful ideas are often perceived to lack novelty
(Runco & Charles, 1993). In other words, obviously novel products are likely to be perceived as
more creative, even when those products lack usefulness, while at the same time, obviously
useful products are likely to be perceived as less novel.
We propose that the Segway was perceived to be creative because of its technological
novelty. And this perception of creativity led to predictions of success. However, the Segway
possessed novelty but lacked usefulness, and thus is an instance of quasi-creativity. And without
usefulness, it was not successful. We call this the novelty trap, in which the novelty of a new
product overwhelms and obscures its limited usefulness. Novelty traps, we propose, will often be associated with products that possess quasi-creativity. And this quasi-creativity may often result in an overestimation of a product’s future success.
18 One review was exceptional in its prescient understanding of the novelty and usefulness of the iPod. Former CNET editor Eliot Van Buskirk wrote, “But a few things make me wonder if the iPod is not the harbinger of a new type of device, unrelated to its function as an MP3 player… If you add all of these disparate facts together and look at the whole picture, you'll see where I'm going with this. The iPod is more than an MP3 player; it's a prototype of the data wallets that we'll all carry around within the decade. These devices will sync info between multiple machines and allow for music and video collections to be carried around everywhere. They won't have a complicated interface, but they will include a variety of ports for connection to keyboards, Webcams, monitors, networks, cell phones, PDAs, stereos, headphones, video goggles, GPS modules -- whatever peripheral you can think of… If a more secure identification technology were added, the device could even act as some sort of secure digital ID for activities such as boarding planes or filling prescriptions.” (Heisler, 2014).
In contrast, the iPod was perceived to be useful, and this perception of usefulness most
likely led it to be perceived as lacking novelty, and as a result, creativity. This perception led to
predictions of at best modest success. However, the iPod did not lack novelty, in part because of
its much more user-friendly design, and thus this is an instance in which effective-creativity was
overlooked. We call this the usefulness trap, in which a product’s usefulness penalizes the
perception of its novelty, resulting in an underestimation of its creativity. Such an
underestimation of creativity may often result in the underestimation of the product’s future
success.
Though anecdotal, these examples raise intriguing and potentially important questions:
To what extent do perceptions of a product’s creativity, and potential for adoption, rest upon the
dimensions of novelty and/or usefulness? That is, do these dimensions differentially shape the
predicted versus actual adoption of new products, as we hypothesize? More broadly, these
examples demonstrate the power and usefulness of the approach to creativity proposed herein.
A Framework of Creativity Measures for the Future of Creativity Research
Given the recent increase in interest in creativity, combined with the methodological
strengths which can be brought to bear on the study of creativity, Marketing and Consumer
Behavior stand in a unique position to move the entire field of creativity forward. The
framework and recommendations proposed herein can help the field to reach this potential. As
such, the nature, antecedents, and consequences of creativity provide a fruitful area of continued
interest for research going forward.
References
Abdulla, A. M., Paek, S. H., Cramond, B., & Runco, M. A. (2018). Problem finding and
creativity: A meta-analytic review. Psychology of Aesthetics, Creativity, and the Arts.
Acar, S., Runco, M. A., & Park, H. (2019). What should people be told when they take a
divergent thinking test? A meta-analytic review of explicit instructions for divergent
thinking. Psychology of Aesthetics, Creativity, and the Arts.
Amabile, T. M. (1982). Social psychology of creativity: A consensual assessment
technique. Journal of Personality and Social Psychology, 43(5), 997.
Andrews, J., & Smith, D. C. (1996). In search of the marketing imagination: Factors affecting the
creativity of marketing programs for mature products. Journal of Marketing
Research, 33(2), 174-187.
Baas, M., De Dreu, C. K., & Nijstad, B. A. (2008). A meta-analysis of 25 years of mood-
creativity research: Hedonic tone, activation, or regulatory focus? Psychological
Bulletin, 134(6), 779.
Basadur, M. (1995). Optimal ideation-evaluation ratios. Creativity Research Journal, 8(1), 63-
75.
Basadur, M., Runco, M. A., & Vega, L. A. (2000). Understanding how creative
thinking skills, attitudes and behaviors work together: A causal process model. The
Journal of Creative Behavior, 34(2), 77-100.
Brown, R. T. (1989), Creativity: What Are We to Measure? In Handbook of Creativity (Vol. 1,
pp. 41-66). Cresskill, NJ: Hampton.
Burnham, C. A., & Davis, K. G. (1969). The nine-dot problem: Beyond perceptual
organization. Psychonomic Science, 17(6), 321-323.
Burroughs, J. E., & Mick, D. G. (2004). Exploring antecedents and consequences of consumer
creativity in a problem-solving context. Journal of Consumer Research, 31(2), 402-411.
Cattell, R. B., & Butcher, H. J. (1968). The Prediction of Achievement and Creativity.
Indianapolis, IN: Bobbs-Merrill.
Chand, I., & Runco, M. A. (1993). Problem finding skills as components in the creative
process. Personality and Individual differences, 14(1), 155-162.
Chen, F., & Sengupta, J. (2014). Forced to be bad: The positive impact of low-autonomy vice
consumption on consumer vitality. Journal of Consumer Research, 41(4), 1089-1107.
Cnet Staff (2006, October 19). Apple’s iPod spurs mixed reactions. Cnet. Retrieved from
https://www.cnet.com/news/apples-ipod-spurs-mixed-reactions/
Cropley, A. J. (2000). Defining and measuring creativity: Are creativity tests worth
using? Roeper Review, 23(2), 72-79.
Dahl, D. W., & Moreau, P. (2002). The influence and value of analogical thinking during new
product ideation. Journal of Marketing Research, 39(1), 47-60.
Dahl, D. W., & Moreau, C. P. (2007). Thinking inside the box: Why consumers enjoy
constrained creative experiences. Journal of Marketing Research, 44(3), 357-369.
Datta, L. E. (1964). Remote associates test as a predictor of creativity in engineers. Journal of
Applied Psychology, 48(3), 183.
de Vries, H. B., & Lubart, T. I. (2019). Scientific creativity: divergent and convergent thinking
and the impact of culture. The Journal of Creative Behavior, 53(2), 145-155.
Dean, T., Griffith, D. A., & Calantone, R. J. (2016). New product creativity: Understanding
contract specificity in new product introductions. Journal of Marketing, 80(2), 39-58.
Diedrich, J., Benedek, M., Jauk, E., & Neubauer, A. C. (2015). Are creative ideas novel and
useful? Psychology of Aesthetics, Creativity, and the Arts, 9(1), 35.
Dietrich, A., & Kanso, R. (2010). A review of EEG, ERP, and neuroimaging studies of creativity
and insight. Psychological Bulletin, 136(5), 822.
Duncker, K. (1945). The structure and dynamics of problem-solving processes. Psychological
Monographs, 58, 1-12.
Eysenck, H. J. (2003). Creativity, personality and the convergent-divergent continuum. In M. A.
Runco (Ed.), Perspectives on creativity research. Critical creative processes (pp. 95-
114). Cresskill, NJ, US: Hampton Press.
Finke, R. A., Ward, T. B., & Smith, S. M. (1992). Creative cognition: Theory, research, and
applications. Cambridge, MA: MIT Press.
Fitzsimons, G. M., Chartrand, T. L., & Fitzsimons, G. J. (2008). Automatic effects of brand
exposure on motivated behavior: how apple makes you “think different”. Journal of
Consumer Research, 35(1), 21-35.
Fox, S. (2019). Mass imagineering, mass customization and mass production: Complementary
cultures for creativity, choice and convenience. Journal of Consumer Culture, 19(1), 67-
81.
Garber, M. (2013, October 23). 12 years ago: 'Apple's iPod spurs mixed reactions' hindsight,
but little else, is 20/20. The Atlantic. Retrieved from
https://www.theatlantic.com/technology/archive/2013/10/12-years-ago-apples-ipod-spurs-mixed-reactions/280795/
Goldenberg, J., Mazursky, D., & Solomon, S. (1999). Toward identifying the inventive templates
of new products: A channeled ideation approach. Journal of Marketing Research, 36(2),
200-210.
Golson, J. (2015, January 16). Well, that didn't work: The Segway is a technological marvel.
Too bad it doesn't make any sense. Wired. Retrieved from
https://www.wired.com/2015/01/well-didnt-work-segway-technological-marvel-bad-doesnt-make-sense/
Guilford, J. P. (1950). Creativity. American Psychologist, 5, 444-454.
Guilford, J. P. (1960). Basic conceptual problems in the psychology of thinking. Annals of the
New York Academy of Sciences, 91(1), 6-21.
Hayes, A. F. (2017). Introduction to mediation, moderation, and conditional process analysis: A
regression-based approach. Guilford Publications.
Heilemann, J. (2001, December 2). Reinventing the wheel. Time. Retrieved from
http://content.time.com/time/business/article/0,8599,186660,00.html
Heisler, Y. (2014, January 21). An incredible 2001 iPod review accurately predicted the iPod's
impact on computing. Engadget. Retrieved from
https://www.engadget.com/2014/01/21/an-incredible-2001-ipod-review-accurately-predicted-the-ipods-i/
Hennessey, B. A. & Amabile, T. M. (2010). Creativity. Annual Review of Psychology, 61, 569-
598.
Hocevar, D. (1979). Ideational fluency as a confounding factor in the measurement of
originality. Journal of Educational Psychology, 71(2), 191.
Im, S., & Workman Jr, J. P. (2004). Market orientation, creativity, and new product performance
in high-technology firms. Journal of Marketing, 68(2), 114-132.
Jones, I. (2017). ‘He’s still the winner in my mind’: Maintaining the collective identity in sport
through social creativity and group affirmation. Journal of Consumer Culture, 17(2),
303-320.
Kaufman, J. C., Baer, J., Cole, J. C., & Sexton, J. D. (2008). A comparison of expert and
nonexpert raters using the consensual assessment technique. Creativity Research
Journal, 20(2), 171-178.
Kershaw, T. C., & Ohlsson, S. (2004). Multiple causes of difficulty in insight: The case of the
nine-dot problem. Journal of Experimental Psychology: Learning, Memory, and
Cognition, 30(1), 3.
Köhler, W. (1959). Gestalt psychology today. American Psychologist, 14(12), 727.
Kounios, J., & Beeman, M. (2014). The cognitive neuroscience of insight. Annual Review of
Psychology, 65, 71-93.
Kuypers, K. P. C., Riba, J., De La Fuente Revenga, M., Barker, S., Theunissen, E. L., &
Ramaekers, J. G. (2016). Ayahuasca enhances creative divergent thinking while
decreasing conventional convergent thinking. Psychopharmacology, 233(18), 3395-3403.
Lonergan, D. C., Scott, G. M., & Mumford, M. D. (2004). Evaluative aspects of creative
thought: Effects of appraisal and revision standards. Creativity Research Journal, 16(2-
3), 231-246.
Lu, J. G., Hafenbrack, A. C., Eastwick, P. W., Wang, D. J., Maddux, W. W., & Galinsky, A. D.
(2017). “Going out” of the box: Close intercultural friendships and romantic relationships
spark creativity, workplace innovation, and entrepreneurship. Journal of Applied
Psychology, 102(7), 1091.
Maier, N. R. (1933). An aspect of human reasoning. British Journal of Psychology, 24(2), 144.
McCabe, M., & de Waal Malefyt, T. (2015). Creativity and cooking: Motherhood, agency and
social change in everyday life. Journal of Consumer Culture, 15(1), 48-65.
McFarland, M. (2018, October 30). Segway was supposed to change the world. Two decades
later, it just might. CNN Business. Retrieved from
https://www.cnn.com/2018/10/30/tech/segway-history/index.html
Mednick, S. (1962). The associative basis of the creative process. Psychological Review, 69(3),
220.
Mednick, S. A. (1968). The remote associates test. The Journal of Creative Behavior, 2(3), 213-
214.
Mehta, R., Dahl, D. W., & Zhu, R. J. (2017). Social-recognition versus financial incentives?
Exploring the effects of creativity-contingent external rewards on creative
performance. Journal of Consumer Research, 44(3), 536-553.
Mehta, R., & Zhu, M. (2015). Creating when you have less: The impact of resource scarcity on
product use creativity. Journal of Consumer Research, 42(5), 767-782.
Mehta, R., Zhu, R. J., & Cheema, A. (2012). Is noise always bad? Exploring the effects of
ambient noise on creative cognition. Journal of Consumer Research, 39(4), 784-799.
Moorman, C., & Miner, A. S. (1997). The impact of organizational memory on new product
performance and creativity. Journal of Marketing Research, 34(1), 91-106.
Moreau, C. P., & Dahl, D. W. (2005). Designing the solution: The impact of constraints on
consumers' creativity. Journal of Consumer Research, 32(1), 13-22.
Moreau, C. P., & Engeset, M. G. (2016). The downstream consequences of problem-solving
mindsets: How playing with LEGO influences creativity. Journal of Marketing
Research, 53(1), 18-30.
Mumford, M. D. (2003). Where have we been, where are we going? Taking stock in creativity
research. Creativity Research Journal, 15(2-3), 107-120.
Mumford, M. D., Baughman, W. A., Threlfall, K. V., Supinski, E. P., & Costanza, D. P. (1996).
Process-based measures of creative problem-solving skills: I. Problem
construction. Creativity Research Journal, 9(1), 63-76.
Mumford, M. D., Baughman, W. A., Supinski, E. P., & Maher, M. A. (1996). Process-based
measures of creative problem-solving skills: II. Information encoding. Creativity
Research Journal, 9(1), 77-88.
Mumford, M. D., Giorgini, V., Gibson, C., & Mecca, J. (2013). Creative thinking: Processes,
strategies and knowledge. In Handbook of research on creativity. Edward Elgar
Publishing.
Mumford, M. D., & Gustafson, S. B. (1988). Creativity syndrome: Integration, application, and
innovation. Psychological Bulletin, 103(1), 27.
Mumford, M. D., Lonergan, D. C., & Scott, G. (2002). Evaluating creative ideas: Processes,
standards, and context. Inquiry: Critical Thinking Across the Disciplines, 22(1), 21-30.
Okuda, S. M., Runco, M. A., & Berger, D. E. (1991). Creativity and the finding and solving of
real-world problems. Journal of Psychoeducational Assessment, 9(1), 45-53.
Osburn, H. K., & Mumford, M. D. (2006). Creativity and planning: Training interventions to
develop creative problem-solving skills. Creativity Research Journal, 18(2), 173-190.
Over 100 Million iPods Sold (2007, April 9). Retrieved from
https://www.apple.com/newsroom/2007/04/09100-Million-iPods-Sold/
Plucker, J. A. (1999). Is the proof in the pudding? Reanalyses of Torrance's (1958 to present)
longitudinal data. Creativity Research Journal, 12(2), 103-114.
Plucker, J. A., Makel, M. C., & Qian, M. (2019). Assessment of creativity. In J. C. Kaufman &
R. J. Sternberg (Eds.), The Cambridge Handbook of Creativity. NY, NY:
Cambridge University Press.
Plucker, J. A., Qian, M., & Schmalensee, S. L. (2014). Is what you see what you really get?
Comparison of scoring techniques in the assessment of real-world divergent
thinking. Creativity Research Journal, 26(2), 135-143.
Plucker, J. A., Qian, M., & Wang, S. (2011). Is originality in the eye of the beholder?
Comparison of scoring techniques in the assessment of divergent thinking. The Journal
of Creative Behavior, 45(1), 1-22.
Reiter-Palmon, R., Illies, M. Y., Kobe-Cross, L., Buboltz, C., & Nimps, T. (2009). Creativity and
domain specificity: The effect of task type on multiple indexes of creative problem-
solving. Psychology of Aesthetics, Creativity, and the Arts, 3(2), 73.
Rothenberg, A., & Hausman, C. R. (Eds.). (1976). The Creativity Question. Durham, NC: Duke
University Press.
Runco, M. A. (1988). Creativity research: Originality, utility, and integration. Creativity
Research Journal, 1, 1-7.
Runco, M. A. (1991). Divergent thinking. Norwood, NJ: Ablex Publishing Corporation.
Runco, M. A. (Ed.). (1994). Problem finding, problem solving, and creativity. Santa Barbara,
CA: Greenwood Publishing Group.
Runco, M. A. (2008). Commentary: Divergent thinking is not synonymous with creativity.
Psychology of Aesthetics, Creativity, and the Arts, 2(2), 93-96.
Runco, M. A. (2014). Creativity: Theories and themes: Research, development, and practice.
NY, NY: Elsevier.
Runco, M. A., & Acar, S. (2012). Divergent thinking as an indicator of creative
potential. Creativity Research Journal, 24(1), 66-75.
Runco, M. A., & Chand, I. (1995). Cognition and creativity. Educational psychology
review, 7(3), 243-267.
Runco, M. A., & Charles, R. E. (1993). Judgments of originality and appropriateness as
predictors of creativity. Personality and Individual Differences, 15(5), 537-546.
Runco, M. A., Illies, J. J., & Eisenman, R. (2005). Creativity, originality, and appropriateness:
What do explicit instructions tell us about their relationships? The Journal of Creative
Behavior, 39(2), 137-148.
Runco, M. A., Millar, G., Acar, S., & Cramond, B. (2010). Torrance tests of creative thinking as
predictors of personal and public achievement: A fifty-year follow-up. Creativity
Research Journal, 22(4), 361-368.
Runco, M. A., & Okuda, S. M. (1988). Problem discovery, divergent thinking, and the creative
process. Journal of Youth and Adolescence, 17(3), 211-220.
Runco, M. A., Plucker, J. A., & Lim W. (2000). Development and psychometric integrity of a
measure of ideational behavior. Creativity Research Journal, 13, 391-398.
Schooler, J. W., & Melcher, J. (1994). The ineffability of insight. In S. M. Smith, T. B. Ward, &
R. A. Finke (Eds.), The Creative Cognition Approach (pp. 97-133). Cambridge, MA:
MIT Press.
Schooler, J. W., Ohlsson, S., & Brooks, K. (1993). Thoughts beyond words: When language
overshadows insight. Journal of Experimental Psychology: General, 122(2), 166.
Sellier, A. L., & Dahl, D. W. (2011). Focus! Creative success is enjoyed through restricted
choice. Journal of Marketing Research, 48(6), 996-1007.
Sethi, R., Smith, D. C., & Park, C. W. (2001). Cross-functional product development teams,
creativity, and the innovativeness of new consumer products. Journal of marketing
research, 38(1), 73-85.
Shapiro, R. J. (1965). The integrating of remotely associated concepts as a process in scientific
creativity. Psychologia Africana.
Silvia, P. J. (2008). Discernment and creativity: How well can people identify their most creative
ideas? Psychology of Aesthetics, Creativity, and the Arts, 2(3), 139.
Silvia, P. J., Winterstein, B. P., Willse, J. T., Barona, C. M., Cram, J. T., Hess, K. I., ... &
Richard, C. A. (2008). Assessing creativity with divergent thinking tasks: Exploring the
reliability and validity of new subjective scoring methods. Psychology of Aesthetics,
Creativity, and the Arts, 2(2), 68.
Smith, R. E., MacKenzie, S. B., Yang, X., Buchholz, L. M., & Darley, W. K. (2007). Modeling
the determinants and effects of creativity in advertising. Marketing Science, 26(6), 819-
833.
Smith, R. W., & Kounios, J. (1996). Sudden insight: All-or-none processing revealed by speed–
accuracy decomposition. Journal of Experimental Psychology: Learning, Memory, and
Cognition, 22(6), 1443.
Stephen, A. T., Zubcsek, P. P., & Goldenberg, J. (2016). Lower connectivity is better: The
effects of network structure on redundancy of ideas and customer innovativeness in
interdependent ideation tasks. Journal of Marketing Research, 53(2), 263-279.
Sternberg, R. J. (2012). The assessment of creativity: An investment-based approach. Creativity
Research Journal, 24(1), 3-12.
Sternberg, R. J. (2018). A triangular theory of creativity. Psychology of Aesthetics, Creativity,
and the Arts, 12(1), 50.
Taft, R., & Rossiter, J. R. (1966). The remote associates test: Divergent or convergent
thinking? Psychological Reports, 19(3, Suppl.), 1313-1314.
The 10 Biggest Tech Failures of the Last Decade (2009). Time. Retrieved from
http://content.time.com/time/specials/packages/0,28757,1898610,00.html
Torrance, E. P. (1966). The Torrance Tests of Creative Thinking: Norms and technical manual.
Princeton, NJ: Personnel Press.
Torrance, E. P. (1999). The Torrance Tests of Creative Thinking: Norms and technical manual.
Bensenville, IL: Scholastic Testing Service.
Torrance, E. P., & Wu, T. (1981). A comparative longitudinal study of the adult creative
achievements of elementary school children identified as highly intelligent and as highly
creative. Creative Child and Adult Quarterly, 6(2), 71-76.
Van Buskirk, E. (2012, September, 14). Without music, Apple would be nothing. Time.
Retrieved from http://business.time.com/2012/09/14/without-music-apple-would-be-
nothing/
Vernon, P. E. (1989). The nature-nurture problem in creativity. In Handbook of creativity (pp.
93-110). Springer, Boston, MA.
Wallach, M. A., & Kogan, N. (1965). Modes of thinking in young children: A study of the
creativity–intelligence distinction. New York: Holt, Rinehart, & Winston.
Weijo, H. A., Martin, D. M., & Arnould, E. J. (2018). Consumer movements and collective
creativity: The case of restaurant day. Journal of Consumer Research, 45(2), 251-274.
Weisberg, R., & Suls, J. M. (1973). An information-processing model of Duncker's candle
problem. Cognitive Psychology, 4(2), 255-276.
Wertheimer, M. (1959). On discrimination experiments: I. Two logical
structures. Psychological Review, 66, 252–266.
Yang, X., & Smith, R. E. (2009). Beyond attention effects: Modeling the persuasive and
emotional effects of advertising creativity. Marketing Science, 28(5), 935-949.
Table 1.
Number of Papers Mentioning Creativity, by Decade and Journal

Decade       Total   JCP   JCR   JM   JMR   MS
1961-1970        1     0     0    1     0    0
1971-1980        1     0     1    0     0    0
1981-1990        0     0     0    0     0    0
1991-2000        8     0     0    2     5    1
2001-2010       15     1     4    5     2    3
2011-2018       31     6     8    8     6    3
Total           56     7    13   16    13    7

Note. JCP = Journal of Consumer Psychology; JCR = Journal of Consumer Research; JM = Journal of Marketing; JMR = Journal of Marketing Research; MS = Marketing Science.
Table 2.
Framework for best practices in the measure of creativity. Each evaluative dimension is crossed with the four task types.

Evaluative dimension: Novelty and Usefulness scored separately
    Convergent problem-solving: Informative; use if possible.
    Convergent insight problems: N/A.
    Problem-focused divergent thinking: Less informative; if used, couple with convergent problem-solving.
    Generation-focused divergent thinking: Less informative; if used, couple with convergent problem-solving.

Evaluative dimension: Novelty and Usefulness combined into a composite measure
    Convergent problem-solving: Ambiguous interpretation; use a multiplicative composite if a composite measure is needed.
    Convergent insight problems: N/A.
    Problem-focused divergent thinking: Ambiguous interpretation; use a multiplicative composite if a composite measure is needed.
    Generation-focused divergent thinking: Ambiguous interpretation; use a multiplicative composite if a composite measure is needed.

Evaluative dimension: Whether the response is correct
    Convergent problem-solving: N/A.
    Convergent insight problems: Less informative; if used, couple with convergent problem-solving.
    Problem-focused divergent thinking: N/A.
    Generation-focused divergent thinking: N/A.

Evaluative dimension: Novelty, and/or subjective creativity, and/or fluency, absent Usefulness
    Convergent problem-solving: Ambiguous interpretation; if used, couple with the Usefulness dimension.
    Convergent insight problems: N/A.
    Problem-focused divergent thinking: Ambiguous interpretation; if used, couple with the Usefulness dimension.
    Generation-focused divergent thinking: Ambiguous interpretation; if used, couple with the Usefulness dimension.
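As an illustration of the scoring practice this framework recommends, the sketch below scores each idea on novelty and usefulness separately and forms a multiplicative composite only when a single index is required. The judge ratings, 1-7 scale, and threshold are hypothetical choices for illustration; none of them come from the studies reviewed here.

```python
from statistics import mean

def score_idea(novelty_ratings, usefulness_ratings, threshold=4.0):
    """Score one idea from several judges' 1-7 ratings.

    Novelty and usefulness are reported separately, per the framework;
    the multiplicative composite is only a supplementary single index.
    The threshold is an arbitrary illustrative cutoff, not one proposed
    by the authors.
    """
    novelty = mean(novelty_ratings)
    usefulness = mean(usefulness_ratings)
    composite = novelty * usefulness  # multiplicative, not additive
    if novelty >= threshold and usefulness >= threshold:
        label = "effective-creativity"  # novel AND useful
    elif novelty >= threshold:
        label = "quasi-creativity"      # novel but lacking usefulness
    else:
        label = "not creative"
    return {"novelty": novelty, "usefulness": usefulness,
            "composite": composite, "label": label}

# Example: judges find an idea novel but not useful
print(score_idea([6, 7, 6], [2, 3, 2]))
```

Reporting the two dimensions separately is what lets a study distinguish quasi-creativity from effective-creativity; a composite alone cannot make that distinction.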
Table 3
Papers Published in the Journal of Consumer Research (JCR) and the Journal of Marketing Research (JMR) That (1) Included a Task to Elicit Responses, and (2) Measured Creativity as a Dependent Variable

Mehta, Dahl, & Zhu (2017), JCR
    Study 1. Convergent problem-solving: shoeshine problem. Measures: novelty and usefulness. Result: critical variable influenced novelty, but did not influence usefulness. Type of creativity: quasi-.
    Study 2. Generation-focused divergent thinking: "Just suppose clouds had strings attached to them which hang down to the earth." Measures: novelty and usefulness. Result: critical variable influenced novelty, but did not influence usefulness. Type: quasi-.
    Study 3. Problem-focused divergent thinking: create (1) new ice cream flavors, (2) brand names for each, and (3) ingredients for each. Measures: novelty and usefulness. Result: critical variable influenced novelty, but did not influence usefulness. Type: quasi-.

Moreau & Engeset (2016), JMR
    Study 2. Generation-focused divergent thinking: alternative uses for a paper clip. Measures: originality (statistical) and fluency. Result: critical variable influenced originality, but did not influence fluency. Type: N/A.

Stephen, Zubcsek, & Goldenberg (2016), JMR
    Study 1. Problem-focused divergent thinking: generate new features for a mobile banking app. Measure: novelty. Result: critical variable influenced novelty. Type: N/A.
    Studies 2-4. Problem-focused divergent thinking: generate new features for Facebook. Measure: novelty. Result: critical variable influenced novelty. Type: N/A.
    Study 5. Problem-focused divergent thinking: generate features for an existing product. Measure: novelty. Result: critical variable influenced novelty. Type: N/A.

Mehta & Zhu (2015), JCR
    Study 1. Convergent problem-solving: use building blocks to build a creative prototype of a toy with which a typical child between the ages of 5 and 7 could play. Measures: novelty and usefulness. Result: critical variable influenced novelty, but did not influence usefulness. Type: quasi-.
    Study 2. Convergent insight problem: candle problem. Measure: whether the solution is correct. Result: critical variable influenced the number of correct answers. Type: N/A.
    Study 3. Generation-focused divergent thinking: alternative uses for a brick. Measures: novelty and usefulness. Result: critical variable influenced novelty, but did not influence usefulness. Type: quasi-.
    Studies 4-5. Convergent problem-solving: alternative use for bubble wrap (for use by a school). Measures: novelty and usefulness. Result: critical variable influenced novelty, but did not influence usefulness. Type: quasi-.
    Study 6. Problem-focused divergent thinking: generate design ideas for an improved keyboard. Measures: novelty and usefulness. Result: critical variable influenced novelty and usefulness in opposite directions. Type: quasi-.

Chen & Sengupta (2014), JCR
    Studies 3-4. Convergent problem-solving: generate one advertising slogan for a new polo bike. Measure: composite (novelty and usefulness). Result: critical variable influenced the composite measure. Type: N/A.

Mehta, Zhu, & Cheema (2012), JCR
    Study 1. Convergent insight problem: Remote Associates Test. Measure: number of correct responses. Result: critical variable influenced the number of correct responses. Type: N/A.
    Study 2. Generation-focused divergent thinking: alternative uses for a new mattress. Measures: fluency and subjective creativity. Result: critical variable influenced subjective creativity and fluency. Type: N/A.
    Study 3. Generation-focused divergent thinking: alternative uses for a brick. Measures: fluency and subjective creativity. Result: critical variable influenced subjective creativity, but not fluency. Type: N/A.
    Study 4. Problem-focused divergent thinking: shoeshine problem. Measures: novelty and usefulness. Result: critical variable influenced both novelty and usefulness. Type: effective-.

Sellier & Dahl (2011), JMR
    Study 1. Convergent problem-solving: knit a scarf. Measures: novelty, usefulness, and subjective creativity. Result: critical variable influenced both novelty and subjective creativity, and marginally influenced usefulness. Type: effective-.
    Study 2. Convergent problem-solving: create a Christmas tree ornament. Measure: composite (subjective creativity, novelty, and usefulness). Result: critical variable influenced the creativity measure. Type: N/A.

Fitzsimons, Chartrand, & Fitzsimons (2008), JCR
    Studies 1 and 3. Generation-focused divergent thinking: alternative uses for a brick. Measures: fluency and subjective creativity. Result: critical variable influenced subjective creativity and fluency. Type: N/A.

Moreau & Dahl (2005), JCR
    Studies 1-2. Convergent problem-solving: design a toy. Measures: novelty and usefulness. Result: critical variable influenced novelty, but did not influence usefulness. Type: quasi-.
    Study 3. Convergent problem-solving: design a toy. Measures: novelty and usefulness. Result: critical variable influenced novelty and usefulness. Type: effective-.

Burroughs & Mick (2004), JCR
    Studies 1-2. Convergent problem-solving: shoeshine problem. Measure: composite (novelty and usefulness). Result: critical variable influenced the composite measure. Type: N/A.

Dahl & Moreau (2002), JMR
    Studies 1-3. Convergent problem-solving: design a new product to solve the problem of eating while driving. Measure: composite (original, innovative, and creative). Result: critical variable influenced the composite measure. Type: N/A.

Goldenberg, Mazursky, & Solomon (1999), JMR
    Study 1. Generation-focused divergent thinking: generate ideas for new product features for ointment and a mattress. Measures: novelty and usefulness. Result: critical variable influenced both novelty and usefulness. Type: effective-.
    Study 2. Generation-focused divergent thinking: generate ideas for new product features for drinking glasses. Measures: novelty and usefulness. Result: critical variable influenced both novelty and usefulness. Type: effective-.
Table 4.
Experiments categorized according to framework.

Task totals: convergent problem-solving, N = 16 (46%); convergent insight problems, N = 2 (6%); problem-focused divergent thinking, N = 8 (23%); generation-focused divergent thinking, N = 9 (26%).

Novelty and Usefulness scored separately, N = 15 (43%)
    Convergent problem-solving (informative): N = 8 (23%); 6 reveal quasi-creativity, 2 reveal effective-creativity.
    Convergent insight problems: N/A.
    Problem-focused divergent thinking (less informative): N = 3 (9%); 2 reveal quasi-creativity, 1 reveals effective-creativity.
    Generation-focused divergent thinking (less informative): N = 4 (11%); 2 reveal quasi-creativity, 2 reveal effective-creativity.

Novelty and Usefulness combined into a composite measure, N = 5 (14%)
    Convergent problem-solving (ambiguous interpretation): N = 5 (14%).
    Convergent insight problems: N/A.
    Problem-focused divergent thinking (ambiguous interpretation): N = 0.
    Generation-focused divergent thinking (ambiguous interpretation): N = 0.

Whether the response is correct, N = 2 (6%)
    Convergent insight problems (less informative): N = 2 (6%).
    All other tasks: N/A.

Novelty, and/or subjective creativity, and/or fluency, absent Usefulness, N = 13 (37%)
    Convergent problem-solving (ambiguous interpretation): N = 3 (9%).
    Convergent insight problems: N/A.
    Problem-focused divergent thinking (ambiguous interpretation): N = 5 (14%).
    Generation-focused divergent thinking (ambiguous interpretation): N = 5 (14%).
Figure 1. The nine-dot problem
Appendix
Solutions to the insight problems
Nine-dot problem
Remote Associates Test
High, Book, Foot = Note
Fork, Man, Dark = Pitch