Transcript

COMMUNICATING RISK

First, learn what people know and believe

often a code [word] for brainwashing by experts or industry.” Clearly, there are ethical considerations in risk communication (3-7).

Within the community of risk professionals the phrase has come to mean communication that supplies lay people with the information they need to make informed independent judgments about risks to health, safety, and the environment (3-20). Lay people make personal decisions about their exposures to risks, such as those associated with radon and diet, over which they exercise considerable individual control. Lay people also participate in democratic government processes by which decisions are made about risk issues, such as building a nuclear power plant, over which individuals can exercise relatively little control. To quote Thomas Jefferson about these processes, “diffusion of knowledge among the people” is the only sure strategy “for the preservation of freedom and happiness.”

The research reported here seeks to present people with information they need in a form that fits their intuitive ways of thinking. It is intended to support the social and political processes of managing risks in a democratic society. If risks were better understood, some conflicts would be avoided. Other risks that have received too little scrutiny might become the focus of informed debate (3-7, 12).

If lay people were trained decision analysts, then it would be straightforward to determine what information they need. A decision analysis would be constructed for the decisions that they face, their current knowledge would be assessed, and the additional information they need to help them distinguish among the available options could be calculated. For example, homeowners deciding whether to test for radon would need to know the likelihood that their house has a high radon level, the health risk of various radon levels, the cost and accuracy of testing procedures, and the cost and efficacy of possible remediation measures (8-20).

However, people sometimes do not need to know much in order to make an informed decision. For example, the probability of having a radon problem might be small enough or the cost of remediation large enough that individuals would gain nothing by testing.

The information they will require for decisions they face is the minimum content for communications directed at lay people. Remarkably few communications include any numbers at all regarding the magnitude of risks or the confidence that can be placed in risk estimates. In their stead are recommendations such as “practice safe sex” or “if your measured radon level is above the standard, hire an approved contractor.” The implicit assumption of these communications is that people will let others do the decision analysis for them, trusting some expert to apply the best scientific evidence toward identifying the course of action in their best interests. That trust could, however, be strained whenever the expert has a vested interest in which actions are taken, has values different from the client’s, or disagrees with other experts.

Even when trust is complete, however, numbers alone may not suffice. Especially when they refer to very small quantities or are expressed in unfamiliar units, the numbers simply may not “speak” to people. To get an intuitive feeling for the nature and magnitude of a risk, people may need some understanding of the physical processes that create and regulate it. Moreover, independent knowledge of the substance of an issue provides one basis for evaluating experts’ pronouncements.

Substantive information may be even more important in pre- and post-decision activities. Long before they make any decisions, people may be monitoring public discussion of a hazard, trying to establish some competence in the issues, and formulating options for future action. After an option has been chosen, implementing it (or making midcourse corrections) can require further knowledge of how things work.

Analogous issues arise when control over hazards is exercised through political processes. Lay people must decide whether to support or oppose a technology, as well as how to express those beliefs. A substantive understanding of risk processes may be important for evaluating the competence of those responsible for a hazard.

A “mental models” approach

People process new information within the context of their existing beliefs. If they know nothing about a topic, then a new message will be incomprehensible. If they have erroneous beliefs, then they may misconstrue the message. For example, even science students who get good grades will graft new knowledge onto fundamentally incorrect naive “mental models” for a long time, before replacing them with technically correct models (22-26). Such mental models play significant roles in how people acquire new skills, operate equipment, and follow instructions (27-23). As a result, communicators need to know the nature and extent of a recipient’s knowledge and beliefs if they are to design messages that will not be dismissed, misinterpreted, or allowed to coexist with misconceptions (see box “Four steps for risk communication”).

The influence diagram

As an organizing device, we construct an expert influence diagram, a directed network showing the relationships among the factors relevant to a hazard-related decision (25). Figure 1 shows a representative portion of such a diagram for managing the risk of radon in a house’s crawl space. This diagram was developed iteratively with a group of experts who reviewed successive drafts. In it, knowledge about exposure and effects processes is represented hierarchically; the higher levels are more general. An arrow indicates that the value of the variable at its head depends on the value of the variable at its tail. Although they can be mapped into decision trees, influence diagrams are more convenient for displaying the functional relationships among variables.
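A minimal way to represent such a directed network in code is a mapping from each variable to the variables it depends on (the tails of its incoming arrows). The node names below are illustrative stand-ins for a radon exposure chain, not the actual labels of Figure 1.

```python
# Sketch of an influence diagram as a directed acyclic graph: each
# variable maps to its parents (the variables at the tails of its
# incoming arrows). Node names are illustrative, not Figure 1's labels.

influences = {
    "radon_in_soil": [],
    "crawl_space_concentration": ["radon_in_soil"],
    "living_area_concentration": ["crawl_space_concentration"],
    "inhaled_dose": ["living_area_concentration"],
    "lung_cancer_risk": ["inhaled_dose"],
}

def topological_order(deps):
    """Return the variables ordered so that every node follows its
    parents, the order needed to propagate values through the diagram."""
    order, seen = [], set()

    def visit(node):
        if node in seen:
            return
        seen.add(node)
        for parent in deps[node]:
            visit(parent)
        order.append(node)

    for node in deps:
        visit(node)
    return order
```

Evaluating the diagram in this order mirrors the text's point that the value at an arrow's head depends on the value at its tail.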

No lay person would have this mental model. However, it provides a template for characterizing a layperson’s mental model. That characterization can be performed in terms of the appropriateness of people’s beliefs, their specificity (i.e., level of detail), and category of knowledge. We distinguished among five categories: exposure processes, effects processes (i.e., health and physiology), mitigation behaviors, evaluative beliefs (e.g., radon is bad), and background information (e.g., radon is a gas). In evaluating appropriateness, we characterized beliefs as accurate, erroneous, peripheral (correct, but not relevant), or indiscriminate (too imprecise to be evaluated).

Open-ended procedure

Elicitation. In the design of our interview protocol, a primary objective was to minimize the extent to

2050 Environ. Sci. Technol., Vol. 26, No. 11, 1992

Four steps for risk communication

1. Open-ended elicitation of people's beliefs about a hazard, allowing expression of both accurate and inaccurate concepts.

2. Structured questionnaires designed to determine the prevalence of these beliefs.

3. Development of communications based on what people need to know to make informed decisions (as determined by decision analysis) and a psychological assessment of their current beliefs.

4. Iterative testing of successive versions of those communications using open-ended, closed-form, and problem-solving instruments, administered before, during, and after the receipt of messages.

Illustration of how the four-step approach to risk communication, based on people's mental models of risk processes (24), fits within the broader process of risk management.

FIGURE 1. Expert influence diagram for health effects of radon.


which the investigator's perspective is imposed on the respondent. Instead of asking directed questions, we began with an open-ended approach: "Tell me about radon." To ensure that respondents had ample opportunities to address all aspects of the influence diagram, we provided increasingly directed prompts. Specifically, we asked respondents to elaborate on each comment that they had made in the "tell me about" stage. Then, we encouraged them to describe exposure, effects, risk-assessment, and risk-management processes. These basic categories seemed so essential that mentioning them would correct an oversight rather than introduce a foreign concept.

A part of our protocol for radon is


presented (see box “Radon interview protocol”). The protocol was followed assiduously. Interview transcripts were reviewed periodically to ensure that they conformed to the protocol. These controls were needed to prevent the interviewer from helping respondents with their answers. A single trained interviewer conducted all the interviews reported here.

In the final stage of the interview,

respondents were asked to describe what each of several dozen photographs showed and to explain why it was either relevant or irrelevant to radon. The session began with two examples whose status seemed obvious (a photo of EPA’s radon brochure and a photo of Mickey Mouse). The other photographs covered a wide range of topics. In general, beliefs evoked by this task should be less central to respondents’ thought processes than those produced spontaneously. When previously unmentioned beliefs appear here, they are likely to represent latent portions of people’s mental models, the sort that might emerge in everyday life if they had cause to consider specific features of their own radon situation. For example, when shown a supermarket produce counter, some respondents told us that these plants might have become contaminated by taking up radon from the soil in which they grew. In cases in which photos evoked erroneous beliefs, respondents likely had labile mental models to begin with.

Representation. Once elicited, beliefs must be represented in a way that is sensitive, neither omitting nor distorting beliefs; practical, in terms of the resources needed for analysis; reducible to summary statistics; reliable across investigators; comparable across studies; and informative regarding the design of communications. To fulfill these requirements, we applied a coding scheme comprised of the expert influence diagram supplemented by the erroneous, peripheral, and background beliefs emerging in the interviews. Using relatively heterogeneous opportunity samples, we found that the number of different concepts elicited by this procedure approaches its asymptotic limit after about a dozen interviews. Figure 2 illustrates this result for two different risks: “radon in homes” and “space launch of nuclear energy sources” (26).
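The saturation check behind Figure 2 amounts to tracking how many distinct concepts have appeared after each coded interview; when the running count flattens, further interviews add little. The sketch below assumes each interview has already been coded into a set of concept labels (the labels here are invented).

```python
# Sketch of the concept-saturation tally described above. Each interview
# is assumed to be pre-coded into a set of concept labels; the sample
# data are invented, not from the study.

def cumulative_concepts(coded_interviews):
    """coded_interviews: list of sets of concept labels, one per interview.
    Returns the running count of distinct concepts after each interview."""
    seen, counts = set(), []
    for concepts in coded_interviews:
        seen |= concepts          # add any concepts not seen before
        counts.append(len(seen))  # cumulative number of distinct concepts
    return counts
```

The returned sequence is non-decreasing; a long flat tail is the asymptotic limit the authors report reaching after about a dozen interviews.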

Results. Most subjects knew that radon concentrates indoors (92%

Radon interview protocol

What I'd like to ask you to do is just talk to me about radon; that is, tell me what you know about radon and any risks it poses.

Basic prompts:
Anything else? Can you tell me more? Anything else? Don't worry about whether it's right; just tell me what comes to mind. Can you explain why?

Exposure processes

Source of radon
Can you tell me (more) about where radon comes from?
Can you tell me (more) about how radon gets into homes?
You told me that ____ (e.g., radon leaks in through the basement); can you tell me more about that?

Concentration and movement in home
Can you tell me (more) about the things that determine how much radon there is in a home?
Can you tell me (more) about how radon moves around in a home once it gets in?
Is the level of radon usually the same in all parts of a house?

Uncertainty about exposure
Is radon found in all homes?
Can you tell me (more) about how much variation there is in the amount of radon in different homes?

The protocol continues similarly for the other parts of the problem.

mentioned), is detectable with a test kit (96%), is a gas (88%), and comes from underground (83%). Most knew that radon causes cancer (63%). However, many also believed erroneously that radon affects plants (58%), contaminates blood (38%), and causes breast cancer (29%). Only two subjects (8%) mentioned that radon decays. During the interviews, subjects mentioned, on average, less than one (0.67) misconception out of 14 concepts mentioned. During the photograph-sorting sessions, they produced, on average, 2.5 misconceptions out of 15 concepts.

Discussion. Respondents expressed many accurate beliefs regarding radon, a hazard for which they may have received little direct education. Unfortunately, some of the misconceptions that did emerge could undermine the value of their correct beliefs. In particular, believing that radon is a permanent contaminant, like other radioactive hazards in the news, could make it seem like an insoluble problem, at least for those who cannot afford extensive remodeling. For instance, we encountered one respondent who had been persuaded by a contractor to replace all the rugs, paint, and wallpaper in her home.

In related research on perceptions of climate change, we and Kempton (27) have found confusion between stratospheric ozone depletion and the greenhouse effect. In fact, some of our U.S. interviewees suggested that giving up hairspray (which no longer contains chlorofluorocarbon [CFC] propellant) will slow global warming. Potentially more serious was many respondents’ failure to mention any link between the greenhouse effect and energy consumption.

Structured procedures

Design. Open-ended interviews are essential for allowing the structure of people’s mental models to emerge and, in particular, for identifying the set of possible misconceptions. However, the labor intensity of our interview procedure makes it difficult to use for estimating the frequency of each belief in a general population. As a result, the next step in developing a risk communication is to create a structured questionnaire for estimating the prevalence of different beliefs. Such a questionnaire should address all significant expert and nonexpert concepts, translating abstract technical material into concrete language appropriate for lay respondents. To satisfy that requirement, there is no substitute for iteratively testing successive drafts with subjects similar to the eventual respondents. For example, the test that we developed for radon included 58 statements. Respondents could answer “true,” “maybe true,” “don’t know,” “maybe false,” or “false.”

Results. In three small, diverse samples (total n = 73), our structured test produced results similar to those from the open-ended interview. For most test items that correspond to single concepts in the expert influence diagram (augmented by nonexpert concepts from the interviews), similar proportions of subjects stated that those propositions were true as had mentioned those concepts in the previous study. For example, 29% of the interviewees and 32% of the questionnaire respondents said that radon can come from water; 21% and 18%, respectively, stated that radon comes from garbage. Thirty-nine percent of questionnaire respondents agreed that “Radon-contaminated surfaces stay contaminated unless they are cleaned or renovated,” and only 13% agreed that “radon decays over a few days.”
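The per-statement proportions quoted above can be computed with a tally of this kind. How "maybe true" answers were scored is not specified in the text, so counting them together with "true" as endorsements is an illustrative assumption.

```python
# Sketch of tallying endorsement of one questionnaire statement.
# Treating "maybe true" as endorsement is an assumed coding choice
# for illustration; the article does not say how the five response
# options were scored.

ENDORSE = {"true", "maybe true"}

def endorsement_rate(responses):
    """Fraction of respondents endorsing the statement.
    responses: list of answers drawn from the five-option scale
    ("true", "maybe true", "don't know", "maybe false", "false")."""
    return sum(1 for r in responses if r in ENDORSE) / len(responses)
```

Comparing these rates against the open-ended mention rates, item by item, is the consistency check the paragraph describes.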

Figure 3 summarizes results from similar studies of lay beliefs about 60-Hz fields (28). Results are shown for knowledge of 20 basic concepts drawn from a 54-statement test by three groups of respondents. Each circle represents one concept, characterized by the percentages of right, wrong, and “don’t know” responses. The space itself is divided into four regions, representing subjects’ typical performance. Consider, for example, Concept 2, the fact that moving charges make currents. Approximately 3% of subjects disagreed with this statement, 73% agreed, and 23% said that they did not know whether it was true. Overall, although there is much confusion, the centers of mass for more than 75% of the concepts lie on the left-hand (correct) side of the plot. The same was true for two other groups of respondents in the study. In this case, in contrast with the radon case, correcting misconceptions would not be as high a priority as building on people’s generally correct beliefs about fields.
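The four regions of Figure 3 can be mimicked by a toy classifier over each concept's (right, wrong, don't-know) percentages. The 25% "don't know" cutoff separating "confused" from "consistent" responses below is an assumption made for illustration; the figure's actual region boundaries are not given in the text.

```python
# Toy version of the four Figure 3 regions (K, C1, C2, W). The 25%
# "don't know" threshold is an assumed boundary, not one taken from
# the figure itself.

def classify_concept(pct_right, pct_wrong, pct_dont_know):
    """Classify one concept from its response percentages."""
    confused = pct_dont_know >= 25
    if pct_right >= pct_wrong:
        return "C1" if confused else "K"   # tends / consistently right
    return "C2" if confused else "W"       # tends / consistently wrong
```

Counting how many concepts land in K or C1 gives the "centers of mass on the correct side" summary the paragraph draws from the plot.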

FIGURE 2. Number of concepts elicited as a function of the number of subjects interviewed, for open-ended mental model interviews on radon in homes and nuclear power in space. Respondent groups: technology-oriented subjects (nuclear power in space), general lay public (nuclear power in space), environmentalists (nuclear power in space), and general lay public (radon in homes).

One weakness of these interview procedures is in revealing beliefs about quantitative relationships. It would be most uncommon for a layperson to say “electric fields fall off with the inverse square of the distance from the source.” It is difficult even to formulate structured questions about such topics in lay terms. In other studies, we have used questions involving pictures and diagrams to tap such beliefs (28, 29). There, we found that lay respondents could rank the intensity of fields from transmission and distribution lines. However, they did not understand the vast range in the strengths of the fields produced by different appliances. Similarly, their estimates of field strength at different distances from sources suggested an intuitive inverse-power law, but one with a greatly reduced exponent. Given this pattern of results, communications about fields should focus on sharpening beliefs that are correct qualitatively, but not quantitatively.
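The "greatly reduced exponent" finding can be made concrete by fitting strength ≈ a·d^(−b) to a respondent's distance-strength judgments via least squares on logarithms. The routine below is a generic curve-fitting sketch, not the authors' actual analysis, and the sample data are invented.

```python
import math

# Sketch: estimate the exponent b in strength ~ a * distance**(-b)
# from (distance, judged strength) pairs by ordinary least squares on
# log-transformed values. Data points here are invented examples.

def fit_power_exponent(pairs):
    """pairs: list of (distance, strength) tuples, all values > 0.
    Returns the fitted positive exponent b."""
    xs = [math.log(d) for d, s in pairs]
    ys = [math.log(s) for d, s in pairs]
    n = len(pairs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope  # slope is -b on the log-log scale
```

A physical inverse-square falloff yields b ≈ 2; fitting lay estimates this way would reveal the much smaller exponent the text describes.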

Communication materials

Development. Informative materials such as brochures can attempt to refine mental models in five ways: by adding parts, deleting parts, replacing parts, generalizing parts, and refining parts of people’s beliefs (30, 31). The need for each of these strategies can be illustrated with findings from our radon interviews.

Important pieces of the basic model of indoor radon exposure and effects processes were often missing from our respondents’ mental models (e.g., radon decays quickly, radon causes lung cancer). Adding these high-level concepts might in itself delete or replace erroneous beliefs. Other erroneous beliefs (e.g., radon causes breast cancer), peripheral beliefs (e.g., radon comes from industrial wastes), and indiscriminate beliefs (e.g., radon makes you sick) seem to be derived from mental models of various hazardous processes rather than a core mental model for radon. As a result, they need to be addressed individually.

Based on these results, we designed two brochures. A hierarchical structure for each brochure was derived from a decision-analytic perspective. One (Carnegie Mellon University-DN, or CMU-DN) traced the Directed Network of the influence diagram. The other (CMU-DT) adopted a Decision-Tree framework, stressing the choices that people had to make. Both used higher level organizers that have been found to improve the comprehension and retention of textual material (32). These organizers included a table of contents, clear section headings, and a summary. Both brochures contained identical illustrations, a glossary, and a boxed section discussing the assumptions underlying EPA’s recommended exposure levels and the attendant risks.

These two brochures were tested against EPA’s widely distributed


FIGURE 3. Lay beliefs about 60-Hz fields, plotted by percentages of right, wrong, and “don’t know” responses. K = subjects consistently know the right answer; C1 = subjects are confused but tend to know the right answer; C2 = subjects are confused and tend to provide the wrong answer; W = subjects consistently give the wrong answer.

“Citizen’s Guide to Radon” (33). CMU-DN included all basic exposure concepts in the expert influence diagram and CMU-DT included 89%; EPA included 78%. Each brochure covered 80% of the basic effects concepts. EPA covered a much higher percentage of specific effects concepts (50% versus 13%). The only higher level organizers that EPA used were section headings.

Testing. The three brochures were compared on a battery of measures, including our open-ended interview, our true-false test, a multiple-choice test commissioned by EPA (34), a short problem-solving task, and verbal protocols of individuals reading the text. In addition to exploiting the respective strengths of these different procedures, this battery allowed our brochures and EPA’s to be evaluated with questionnaires developed by both groups.

In general, subjects reading the two CMU brochures performed similarly to each other, and significantly better than those reading the EPA brochure (35, 36). The greatest superiority of performance was observed with questions requiring inferences on topics not mentioned explicitly in the brochures; these dealt predominantly with detection and mitigation. CMU subjects also gave more detailed recommendations when asked to produce advice for a neighbor with a radon problem. On the other hand, respondents were equally able to recall or recognize material mentioned explicitly in their brochure. Each group performed significantly better than a control group in all respects.

Although subjects who read the EPA brochure did more poorly on the tests derived from the mental models perspective, there was no overall difference in performance on the EPA-commissioned test. Performance on two individual questions deserves note. More subjects who read the EPA brochure knew that health effects from radon were delayed. However, when asked what homeowners could do to reduce high radon levels in their home, 43% of EPA subjects answered “don’t know” and 9% answered, “There is no way to fix the problem.” This contrasts with the 100% of CMU-DN and 96% of CMU-DT subjects who answered, “Hire a contractor to fix the problem.”

Risk communications are complex entities; it is hard to discern which features cause which impacts. We believe that the advantage of the CMU brochures lies in several common features not shared by the EPA brochure: their decision-analytic structure emphasizes action-related information, which facilitates inferences; our preparatory descriptive research focused the content of our brochures on gaps and flaws in recipients’ mental models; and principles from research in reading comprehension directed the technical design. One possible additional advantage was that each CMU brochure was written by a single individual, aided by others’ critiques. EPA’s brochure, on the other hand, was written by a committee consisting of members from diverse backgrounds; perhaps that compromised its coherence.

As a caution, we note that all these results were obtained with relatively small, albeit quite heterogeneous, populations in western Pennsylvania. We anticipate that the prevalence of particular beliefs will vary more across population groups than will the repertoire of thought processes involved in making inferences or absorbing new material.

Conventional wisdom

Although their approaches differed, the projects producing the EPA and CMU brochures both showed a commitment to empirical validation. By contrast, much of the advice about risk communication available in the literature or offered by consultants lacks such commitment. Perhaps the most carefully prepared and widely circulated guidance is a manual for plant managers produced for the Chemical Manufacturers Association (37). It focuses on the pitfalls of comparing risks and concludes with 14 paragraph-length illustrations of risk comparisons described with labels ranging from “very acceptable” to “very unacceptable.” We asked four diverse groups of subjects to judge these paragraphs on seven scales intended to capture the manual’s notion of acceptability (38). Using a variety of analytical strategies, we found no correlation between the acceptability judgments predicted by the manual and those produced by our subjects.
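One of the simplest ways to test for the kind of (absent) correlation described above is a rank correlation between the manual's predicted acceptability ordering and subjects' mean judgments. The minimal Spearman coefficient below is a generic statistics sketch, not the authors' actual analysis; it assumes no tied values, and the example orderings are invented.

```python
# Minimal Spearman rank correlation (assumes no tied values). Generic
# statistics sketch for illustration, not the study's actual method.

def ranks(values):
    """Rank each value from 1 (smallest) to n (largest)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for position, index in enumerate(order):
        r[index] = position + 1
    return r

def spearman(xs, ys):
    """Spearman correlation via the classic sum-of-squared-rank-differences
    formula: 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d_squared = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d_squared / (n * (n * n - 1))
```

A coefficient near zero between predicted and observed acceptability rankings would correspond to the "no correlation" result reported in the text.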

One possible reason for the failure of these predictions is that the manual’s authors knew too much (from their own previous research) to produce truly unacceptable comparisons. More important than identifying the specific reasons for this failure is the general cautionary message: because we all have experience in dealing with risks, it is tempting to assume that our intuitions are shared by others. Often they are not. Effective risk communication requires careful empirical research. A poor risk communication can often cause more public health (and economic) damage than the risks that it attempts to describe. One should no more release an untested communication than an untested product (11).

Risk professionals often complain that lay people do not know the magnitude of risks (39, 40). They point to cases in which people apparently ignore mundane hazards that pose significant chances of injury or death but get upset about exotic hazards that impose a very low chance of death or injury. However, there is counterevidence on both scores.

The earliest studies of technological risk perception demonstrated disagreements in the meaning of “risk” between lay people and experts (and even among different groups of experts) (41, 42). As a result, lay people order “risks” differently than do experts. However, if asked to order hazards by their annual fatalities, lay people perform quite credibly (43-45). Moreover, differences in the definitions of “risk” reflect political and ethical concerns, such as the respective weights to be given to deaths and injuries to various classes of people (e.g., the young, nonbeneficiaries, those who expressly consent to their exposure). Ignoring these differing definitions poses several perils: neglecting the role of values in defining and managing risks, unfairly deprecating lay people’s risk priorities, and failing to provide information on critical dimensions (46, 47).

Moreover, even studies that claim to demonstrate inappropriate concerns often use questionable methods. For example, lay people may be asked to rank risks that are hard to compare and are formulated in unfamiliar terms. We recently asked three generations of subjects (high school students, parents, grandparents) to manufacture their own lists of concerns and then to answer questions about the five risks that most concerned them (48). Although our samples were small (n = 87), subjects’ self-nominated concerns differed with age to focus on their (self-described) life circumstances.

Lay people often have little opportunity to consider complex risk issues. However, we found that lay opinion leaders dealt well with the risks of the 60-Hz electric and magnetic fields produced by high-voltage power transmission (49) when we provided them with the necessary facts and time. Modest analytical assistance probably would have improved their performance further. Poor lay decision making may reflect inadequate time, information, and institutional arrangements, rather than cognitive limitations. When risk communication materials adopt jargon or compressed formats that are not familiar to lay people, understanding can be poor (50).

Critics argue that all risk communication is manipulative, designed to sell unsuspecting recipients on the communicator’s political agenda. We believe that, with careful design and evaluation, it is possible to develop balanced materials that provide lay audiences with the information they need to make informed decisions about the risks they face. That design must start with an examination of what choices people face, what beliefs they hold, and what expert knowledge exists.

Research on risk communication has just begun. Much “conventional wisdom” withers when subjected to empirical examination. As a result, when developing communications for lay audiences, we see no substitute for the kind of empirical exploration and validation that we proposed in the box, “Four steps for risk communication.” This process must be iterative, insofar as even the most careful risk communicators are unlikely to get things right the first few times around. Communicators are not to be trusted in their speculations regarding others’ perceptions. The legacy of undisciplined claims is miscommunication, whose price is paid in increased conflict and foregone technological and economic opportunities.

M. Granger Morgan is head of the Department of Engineering and Public Policy and also holds faculty appointments in the Department of Electrical and Computer Engineering and in the Heinz School at Carnegie Mellon University. His research has focused on problems in technology and public policy. His Ph.D. is in applied physics from the University of California at San Diego.

Baruch Fischhoff is a professor in the Department of Social and Decision Sciences and Engineering and Public Policy at Carnegie Mellon University. His research has focused on public perception of risk. His Ph.D. is in psychology from Hebrew University in Jerusalem.

Ann Bostrom is assistant professor of public policy at Georgia Institute of Technology. She has just completed a one-year postdoctoral appointment with the Bureau of Labor Statistics. Her Ph.D. is from the Heinz School at Carnegie Mellon University, where she did research on risk communication.

Lester Lave is James H. Higgins Professor of Economics in the Graduate School of Industrial Administration and holds appointments in Engineering and Public Policy and the Heinz School at Carnegie Mellon University. His Ph.D. is in economics from Harvard University.

Cynthia J. Atman is assistant professor of industrial engineering at the University of Pittsburgh. Her Ph.D. is from the Department of Engineering and Public Policy at Carnegie Mellon University, where she did research on risk communication.

Acknowledgments

We thank Conception Corlbs, Greg Fischer, Keith Florig, Max Henrion, Gordon Hester, Urbano Lopez, Michael Maharik, Kevin Marsh, Fran McMichael, Jon Merz, Denise Murrin-Macey, Indira Nair, Karen Pavlosky, Daniel Resendiz-Carrillo, Emilie Roth, Mitchell Small, Patti Steranchak, and Joel Tarr of Carnegie Mellon University; Paul Slovic of



Decision Research; and A n n Fisher of Pennsylvania State University for their assistance on this paper. The work was supported by grant SES-8715564 from the National Science Foundation, con- tract number RP 2955-3 from the Elec- t r ic Power Research Inst i tute , a n d a grant from the Carnegie Council for Ad- olescent Development. The authors are solely responsible for the contents.

References
(1) National Research Council. Improving Risk Communication; National Academy Press: Washington, DC, 1989.
(2) Jasanoff, S. Presented at the Symposium on Managing the Problem of Industrial Hazards: The International Policy Issues; National Academy of Sciences: Washington, DC, February 27, 1989.
(3) Fischhoff, B. American Psychologist 1990, 45, 57-63.
(4) To Breathe Freely: Risk, Consent, and Air; Gibson, M., Ed.; Allanheld: Totowa, NJ, 1985.
(5) Communicating with the Public about Major Accident Hazards; Gow, H.B.F.; Otway, H., Eds.; Elsevier: London, 1990.
(6) Morgan, M. G.; Lave, L. B. Risk Analysis 1990, 10, 355-58.
(7) Wynne, B. J. NIH Res. 1991, 3, 65-71.
(8) Evans, J. S.; Hawkins, N. C.; Graham, J. D. J. Air Pollut. Control Assoc. 1988, 38, 1380-85.
(9) Nazaroff, W. W.; Teichman, K. Environ. Sci. Technol. 1990, 24, 774-82.
(10) Svenson, O.; Fischhoff, B. J. Environ. Psychol. 1985, 5, 55-68.
(11) Fischhoff, B. Issues in Science and Technology 1985, 2, 83-96.
(12) Driver, R. The Pupil as Scientist?; The Open University Press: Milton Keynes/Philadelphia, PA, 1983.
(13) Nussbaum, J.; Novick, S. Instructional Science 1982, 11, 183-200.
(14) Posner, G. J. et al. Science Education 1982, 66, 211-27.
(15) Schauble, L. et al. Journal of the Learning Sciences 1991, 2, 201-38.
(16) Clement, J. In Mental Models; Gentner, D.; Stevens, A. L., Eds.; Erlbaum: Hillsdale, NJ, 1983; pp. 325-39.
(17) Mental Models in Human-Computer Interaction; Carroll, J. M.; Olson, J. R., Eds.; National Academy Press: Washington, DC, 1987.
(18) Craik, K. The Nature of Explanation; Cambridge University Press: Cambridge, 1943.
(19) Mental Models; Gentner, D.; Stevens, A. L., Eds.; Erlbaum: Hillsdale, NJ, 1983.
(20) Johnson-Laird, P. Mental Models; Harvard University Press: Cambridge, MA, 1983.
(21) Murphy, G. L.; Wright, J. C. J. Exper. Psychol.: Learning, Memory, and Cognition 1984, 10, 144-55.
(22) Norman, D. In Mental Models; Gentner, D.; Stevens, A. L., Eds.; Erlbaum: Hillsdale, NJ, 1983; pp. 7-14.
(23) Rouse, W. B.; Morris, N. M. Psychological Bulletin 1986, 100, 349-63.
(24) Bostrom, A.; Fischhoff, B.; Morgan, M. G. Journal of Social Issues, in press.
(25) Howard, R. Management Science 1989, 35, 903-22.
(26) Maharik, M. M.A. Thesis, Carnegie Mellon University, 1991.
(27) Kempton, W. Global Environmental Change: Human and Policy Dimensions 1991, 1, 183-208.
(28) Morgan, M. G. et al. Bioelectromagnetics 1990, 11, 313-24.
(29) Lopez, A. U. Ph.D. Dissertation, Carnegie Mellon University, 1990.
(30) Stevens, A. L.; Collins, A. In Aptitude, Learning, and Instruction, Vol. 2; Snow, R. E.; Federico, P.; Montague, W. E., Eds.; Erlbaum: Hillsdale, NJ, 1980; pp. 177-97.
(31) Katzeff, C. Ph.D. Dissertation, University of Stockholm, 1989.
(32) Krug, D. et al. Contemporary Educational Psychology 1989, 14, 111.
(33) U.S. Environmental Protection Agency; U.S. Department of Health and Human Services. A Citizen's Guide to Radon; OPA-86-004; U.S. Government Printing Office: Washington, DC, 1986.
(34) Desvousges, W. H.; Smith, V. K.; Rink, H. H., III. Communicating Radon Risk Effectively: Radon Testing in Maryland; Office of Policy Analysis, U.S. Environmental Protection Agency: Washington, DC, 1989.
(35) Atman, C. J. Ph.D. Dissertation, Carnegie Mellon University, 1990.
(36) Bostrom, A. Ph.D. Dissertation, Carnegie Mellon University, 1990.
(37) Covello, V. T.; Sandman, P. M.; Slovic, P. Risk Communication, Risk Statistics, and Risk Comparisons: A Manual for Plant Managers; Chemical Manufacturers Association: Washington, DC, 1988.
(38) Roth, E. et al. Risk Analysis 1990, 10, 375-87.
(39) U.S. Environmental Protection Agency. Unfinished Business: A Comparative Assessment of Environmental Problems; U.S. Environmental Protection Agency: Washington, DC, 1987.
(40) Wildavsky, A. Searching for Safety; Transaction Books: New Brunswick, NJ, 1988.
(41) Slovic, P.; Fischhoff, B.; Lichtenstein, S. Environment 1979, 21, 14-39.
(42) Slovic, P.; Fischhoff, B.; Lichtenstein, S. The Assessment and Perception of Risk; The Royal Society: London, 1981.
(43) Lichtenstein, S. et al. J. Exper. Psychol.: Human Learning and Memory 1978, 4, 551-78.
(44) Morgan, M. G. et al. Risk Analysis 1983, 3, 11-16.
(45) Romney, A. K.; Batchelder, W.; Weller, S. C. American Behavioral Scientist 1987, 31, 163-77.
(46) Fischhoff, B.; Svenson, O.; Slovic, P. In Handbook of Environmental Psychology; Stokols, D.; Altman, I., Eds.; Wiley: New York, 1987; pp. 1089-1133.
(47) Fischhoff, B.; Watson, S.; Hope, C. Policy Sciences 1984, 17, 123-39.
(48) Fischer, G. W. et al. Risk Analysis 1991, 11, 303-14.
(49) Hester, G. et al. Risk Analysis 1990, 10, 213-28.
(50) Lave, T. R.; Lave, L. B. Risk Analysis 1991, 11, 255-67.