
Disasters & Hazard Mitigation

living more safely on a restless planet

Nature is continually reshaping our world with volatile and often catastrophic power. NSF-funded research is helping to improve our understanding of the causes and effects of natural disasters while also making the world safer. Together, scientists and engineers are drawing from a wide variety of disciplines to mitigate the hazards—and answer the scientific questions—posed by nature’s most energetic events.

Natural disasters are complex and often global in their effects. That is why the hallmark of NSF’s long involvement in disaster research has been to encourage the exchange of ideas across national boundaries as well as scientific disciplines. To find answers in this high-stakes field, NSF programs marshal a wide range of researchers, including atmospheric scientists, engineers, geologists, sociologists, economists, seismologists, biologists, political scientists, and others. Their work takes them to wherever nature is in turmoil—earthquakes in Japan, volcanoes in the Philippines, hurricanes in the Atlantic, and floods on America’s Great Plains. The resulting discoveries about the inner workings of—and the human risks associated with—nature’s most extreme events are making both warning and mitigation increasingly possible.


The Forces Underlying the Fury

The economic cost of natural disasters in the United States has averaged as much as $1 billion a week since 1989—and is expected to rise, according to a 1999 NSF-supported study. Because natural disasters can have such brutal consequences, it’s easy to think of them in terms of human misery that, somehow, must be mitigated. But society cannot mitigate what it does not understand. Natural disasters are, after all, a part of nature, and though human activities can influence the impact of extreme events, researchers must first learn as much as possible about the basic physical forces underlying the fury. At NSF, most of the research into natural disasters and their mitigation takes place within the Directorate for Geosciences, the Directorate for Engineering, and the Directorate for Social, Behavioral, and Economic Sciences.

Take, for example, earthquakes and volcanoes. Almost from its inception, NSF has been a critical player in the global effort to understand and cope with these giant Earth-altering forces. NSF funded a series of explorations during 1957–58—dubbed the International Geophysical Year—and again in the 1960s. These explorations confirmed a wild idea that scientists had begun to suspect was true: the Earth’s seafloors, rather than being continuous like the rind of a melon, were actually disparate pieces that, at least in some places, were slowly moving away from each other. These findings pushed geophysicists toward the modern theory of plate tectonics. Researchers now know that the upper part of Earth’s crust is broken up into a number of rigid sections, or plates, and that these plates float atop soft, pliable rock kept hot by the planet’s searing interior. As the plates drift, they not only separate but also collide and slide past each other, forming valleys and mountain ranges. Occasionally, molten rock from below breaks through—and a volcano is born. When two plates grind past each other, the shuddering friction generates earthquakes.

Of the one million or so earthquakes that rattle the planet each year, only a few—about one each week—are large enough to grab our attention. Predicting when and where the next “big one” will take place is still far from a certainty. Short-term forecasts are sometimes pegged to swarms of smaller quakes that may signal mounting stress at a fault. Or a sudden change in underground water temperature or composition may be significant: this type of signal led to the successful evacuation of a million people before a major earthquake struck near the city of Haicheng, China, in 1975—the first earthquake to be scientifically foretold.

NSF-funded researchers are making headway on the difficult question of earthquake prediction by narrowing their focus to specific regions of the world. Because the behavior of seismic waves is so strongly affected by the different kinds of soil and geological structures through which the waves must travel, the effects of an earthquake can vary widely from place to place, even along the same fault. A soft-soil area such as a lakebed, for example, will shake more than a rocky hill. Knowing this, scientists and engineers at the NSF-sponsored Southern California Earthquake Center in Los Angeles have reassessed the consequences of earthquakes along faults in the surrounding region. The scientists were able to simulate the anticipated effects of future local quakes by using sophisticated computer models of the Los Angeles basin that accounted for fault geometry and motion, sediment composition, and other factors that can reflect, prolong, or

amplify quaking motion. Such modeling, supplemented with data from new digital seismic recorders capable of sensing a broad range of actual earthquake vibrations, can help researchers and residents of quake-prone areas to anticipate—at least in a general way—when and where the next big temblor will hit and what damage may result.

With funding from the National Science Foundation, scientists have made significant advances in the accuracy of storm prediction since the first tornado forecast in 1948.

Even as local efforts to understand earthquake activity improve, scientists are finding new ways to take another look at the big picture. In June 1999, NSF-funded researchers joined an international team headed to the east coast of Japan to establish long-term seafloor observatories in one of the world’s busiest earthquake zones: the so-called Japan Trench, where two of Earth’s biggest tectonic plates are colliding. The international team of scientists drilled holes about one kilometer deep into the ocean floor along the trench, which itself is two kilometers underwater. They then installed instruments at the bottom of these boreholes to monitor the amount of seismic activity there. Robotically controlled vehicles similar to those used to investigate the sunken Titanic will periodically travel to and from the seafloor observatories and help provide scientists with long-term observations of one of the planet’s most active quake regions.

Another way that NSF is helping researchers gather data close to the moment of seismic activity is through its funding of the Earthquake Engineering Research Institute (EERI) in Oakland, California. Besides encouraging regular communication among engineers, geoscientists, architects, planners, public officials, and social scientists concerned about natural disasters, EERI quickly assembles and deploys teams of researchers on fact-finding missions soon after earthquakes occur—anywhere in the world.

Reducing the Risk

Though researchers cannot yet precisely predict the timing and location of earthquakes, NSF has long recognized that more can be done to minimize—or mitigate—the damage that quakes can cause. Toward that end, in 1977 Congress passed the Earthquake Hazards Reduction Act, which put NSF in charge of a substantial part of earthquake mitigation research efforts in the United States. Earthquake-related studies, especially with regard to structural and geotechnical engineering, now make up the bulk of NSF’s natural disasters research under the guidance of the Natural Hazards Reduction Program in the Directorate for Engineering.

Why engineering? Because most of the immediate deaths from earthquakes occur when buildings collapse, and the huge economic losses associated with the biggest quakes stem from damage to the structures and infrastructures that make up cities and towns. In 1997, NSF officially charged three earthquake centers with a major portion of the responsibility for conducting and coordinating earthquake engineering research in the United States. The centers, each constituting a consortium of public and private institutions, are based at the University of California at Berkeley, the University of Illinois at Urbana-Champaign, and the State University of New York at Buffalo.

The NSF-funded earthquake centers are models of cooperation, including not only geoscientists and engineers but also economists, sociologists, political scientists, and contributors from a host of other disciplines. The Buffalo center, for example, recently studied the potential economic impact of an earthquake in the Memphis, Tennessee, area near the epicenter of several major quakes that struck in 1811–12. Participants in the study included researchers from the University of Delaware’s Disaster Research Center, who examined economic, political, and social elements of the hazard. The Delaware researchers have also studied the


impact that the Loma Prieta earthquake (1989) and Hurricane Andrew (1992) had on businesses in the Santa Cruz and Miami areas, respectively. Kathleen Tierney, a sociologist at the University of Delaware and a co-principal investigator for the Buffalo earthquake consortium, says the few previous studies of long-term disaster impacts focused on individuals and families rather than on businesses. The new Delaware research should help both policymakers and business owners better understand the economic impacts of disasters and devise more effective ways of coping with them.

Climate Change—Disaster in Slow Motion

Before 1950, climatologists spent most of their time describing and comparing the current-day climates of different regions. Even the relatively recent climatic past remained a mystery to them, and interactions between the atmosphere and the oceans that researchers now know drive global climate change were too complex to study with the mathematical tools at hand. But then came the computer revolution, funded to a significant degree by NSF, and today much of nature’s turbulence, past and present, is available for study.

With the advent of NSF-sponsored supercomputers, climatologists began building models of atmospheric change that now embrace millions of years of oceanic, atmospheric, biological, geological, and solar processes. For example, by the late 1980s NSF-supported researchers at the University of Washington were able to reconstruct the wide extremes of temperatures that existed 250 million years ago within the giant supercontinent of Pangaea.

In 1999, climate modelers at the NSF-funded National Center for Atmospheric Research in Boulder, Colorado, managed to accurately simulate a century of known climate history. The scientists then carried these simulations a century into the future. Their model suggests that if carbon dioxide emissions continue to rise at their current pace, there will likely be a boost in global temperatures as well as a 40 percent jump in winter rain and snow within the southwest region and Great Plains of the United States. The model also shows that the warming effect would be more severe in the United States than in Europe or Asia.

While global warming might not rival earthquakes and hurricanes for dramatic immediacy, such gradual but significant climate changes can indeed have disastrous consequences for human society. As ice caps melt, sea levels will rise, threatening coastal habitation and commerce. Warmer temperatures will also radically alter when, where, and whether farmers can grow certain crops. Climate models that can predict such events with a fair degree of certainty—and perhaps suggest what can be done to minimize their impact—will make an invaluable contribution to the field of natural hazards research. Accustomed to the urgency of saving lives, natural disaster researchers now face the challenge of preserving a way of life, as well.

While understanding the economic impact of disasters is important, the heart of the earthquake centers’ mission is to design safer buildings. In 1967, the University of California at Berkeley center installed what is still the nation’s largest “shake table.” The twenty-foot-by-twenty-foot platform reproduces the seismic waves of various earthquakes, allowing engineers to test model structures. After the Loma Prieta quake, NSF funded an upgrade of the table from two- to three-dimensional wave motions; additional digital controls and sensors will soon allow offsite researchers to monitor experiments at the shake table in real time via a computer network.

Ultimately, says William Anderson, senior advisor in NSF’s Division of Civil and Mechanical Systems, the research community may be able to conceptually link geophysical and geotechnical research—such as computer models of faults and soil liquefaction—to the engineering simulations of building parts, creating a unified, integrated mathematical model of disaster.

The need for such research has been underscored numerous times in the latter part of the twentieth century. Early in the morning on January 17, 1994, southern California suddenly heaved and swayed. Deep beneath the town of Northridge, less than 25 miles from downtown Los Angeles, one giant chunk of the Earth’s crust slipped over another, jolting the people and structures above with a 6.7 magnitude earthquake. (On the logarithmic Richter scale, 7.0 constitutes a major earthquake. Although the Richter scale has no upper limit, the largest known shocks have had magnitudes in the 8.8 to 8.9 range.) More than twelve thousand buildings were shaken so hard they collapsed or sustained serious damage, while many of the region’s vital freeways and bridges disintegrated or were rendered impassable. Sixty people died and Californians suffered more than $25 billion in economic losses.
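To make the logarithmic scale mentioned above concrete, the standard relations below (general seismological rules of thumb, not figures from the studies described here) compare the 6.7 Northridge shock with a magnitude 7.0 event: each whole unit of magnitude corresponds to a tenfold increase in recorded ground-motion amplitude and roughly a thirtyfold increase in radiated energy.

```latex
% Amplitude ratio between magnitudes M_2 and M_1 (Richter's definition):
%   A_2 / A_1 = 10^{M_2 - M_1}
% Radiated energy (Gutenberg-Richter relation): log10 E = 1.5 M + 4.8, so
%   E_2 / E_1 = 10^{1.5 (M_2 - M_1)}
\[
\frac{A_{7.0}}{A_{6.7}} = 10^{\,0.3} \approx 2,
\qquad
\frac{E_{7.0}}{E_{6.7}} = 10^{\,1.5 \times 0.3} \approx 2.8 .
\]
```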

One year later and halfway around the world, the city of Kobe, Japan, endured its first catastrophic earthquake in a century, a 6.9 magnitude temblor. More than six thousand people died and almost two hundred thousand buildings were destroyed or damaged. Fires spread across the city while helpless firefighters failed to draw a drop of water from the shattered pipes. Besides the horrific loss of life, the devastation in Kobe cost between $100 billion and $200 billion.

The widespread destruction from these disasters has been especially alarming to experts because both cities sit atop a seismically active coastal region known as the Pacific Rim, which is capable of producing earthquakes of even greater violence. Close inspection of the rubble from both earthquake sites revealed one of the main

contributing factors to the devastation: Buildings with steel frames exhibited cracks at the welded joints between columns and beams. Experts had expected old masonry and reinforced-concrete structures to crumble, but steel-framed buildings were supposed to be relatively safe. In Kobe, the steel frames failed catastrophically: more than one in eight simply collapsed. In Northridge, more than two-thirds of the multistory steel-framed buildings suffered damage.

This geological model of the 1994 Northridge earthquake was created by researchers at the NSF-funded Earthquake Engineering Research Center at the University of California at Berkeley. This three-dimensional view of the 6.7 magnitude earthquake gives scientists a better understanding of the geological forces behind earthquakes.

Immediately after these disasters, NSF-sponsored researchers put new emphasis on developing better connection designs. In five short years, researchers have learned to reduce stresses on welds by altering the joints in the frames, in some cases by perforating or trimming the projecting rims (i.e., flanges) of the steel I-beams. These safer construction techniques have been included in new building code recommendations issued by the Federal Emergency Management Agency for all U.S. buildings in earthquake-prone regions.

NSF-funded researchers are finding many other ways to make buildings safer during earthquakes. Shih Chih Liu, program director of NSF’s infrastructure and information systems program, says new high-performance concrete uses ash or small steel bars for better tensile strength and corrosion resistance. It took smart thinking to concoct the new concrete, but other work is aimed at making the buildings themselves “smart.” NSF-funded engineering professor Deborah Chung at the State University of New York at Buffalo recently invented a smart concrete that acts as a sensor capable of monitoring its own response to stress. The concrete contains short carbon fibers that lower the concrete’s tendency to resist the flow of electricity (a quality that researchers call “resistivity”). Deformations to the material—as can occur during earthquakes—cause resistivity to rise, a change that can be gauged by a simple electrical contact with the concrete. The greater the signal, the greater the presumed damage. NSF-funded engineers have also developed systems such as swinging counterweights, which dampen the oscillations of

buildings, and slippery foundations that are shaped like ball bearings in a bowl—the bearings allow the structure’s footings to shift sideways nearly independently of the structure above.
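Returning to Chung’s self-sensing concrete described two paragraphs above: the sketch below shows, in general terms, how a rise in measured resistivity over an undamaged baseline could be turned into a simple damage flag. The function name, baseline value, and 5 percent threshold are illustrative assumptions, not details of the actual sensor work.

```python
def damage_indicator(resistivity_readings, baseline, threshold=0.05):
    """Flag each reading whose fractional rise above the undamaged
    baseline exceeds `threshold` (here 5 percent, an illustrative value)."""
    flags = []
    for reading in resistivity_readings:
        fractional_rise = (reading - baseline) / baseline
        flags.append(fractional_rise > threshold)
    return flags

# Hypothetical readings (arbitrary units): resistivity creeping upward
# as microcracks open under load.
readings = [1.00, 1.01, 1.08, 1.20]
print(damage_indicator(readings, baseline=1.00))  # [False, False, True, True]
```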

Other NSF-supported advances include the development of smart shock absorbers for buildings, bridges, and other structures. As the structure shakes or sways, electrical signals from motion sensors in the structure cause a special fluid in the shock absorbers to become thicker or thinner (ranging from the consistency of light oil to one more like pudding), depending on what’s needed to slow or speed the movement of the shock absorbers’ pistons.
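The passage above does not specify how the thicken-or-thin decision is made; one common textbook strategy for semi-active dampers is on-off “skyhook” switching, sketched here with illustrative numbers rather than values from the NSF-funded work.

```python
def skyhook_damping(structure_velocity, piston_velocity,
                    c_soft=1.0e4, c_stiff=1.0e5):
    """On-off 'skyhook' switching: thicken the fluid (high damping) only
    when the damping force would oppose the structure's absolute motion.
    Coefficients are illustrative, in N*s/m."""
    if structure_velocity * piston_velocity > 0:
        return c_stiff
    return c_soft

# The structure sways at +0.3 m/s while the damper piston extends at +0.1 m/s:
print(skyhook_damping(0.3, 0.1))   # 100000.0 -> command the fluid to thicken
print(skyhook_damping(0.3, -0.1))  # 10000.0  -> let the fluid stay thin
```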

How well these efforts translate into saved lives and minimized economic losses depends on how widely they are shared. In the new millennium, NSF plans to develop the Network for Earthquake Engineering Simulation (NEES)—a kind of overarching cybersystem for earthquake engineering experimental research. Through NEES, researchers around the world can remotely access a complete system of laboratory and field experimentation facilities, of which there are currently more than thirty in the United States alone.

The 1995 earthquake that devastated Kobe, Japan, destroyed a section of the Nishinomiya-ko Bridge. The Kobe earthquake demonstrated the vital importance of NSF-funded research into “smart” materials and other earthquake-resistant construction techniques.


Hot Heads

Volcanoes are close cousins of earthquakes, arising as they do from the same powerful motions of the planet’s tectonic plates. Despite their fiery reputation for chaos and destruction, however, only about sixty volcanoes erupt each year, usually with more bravado than brawn. What’s more, most volcanoes are on the ocean floor where plate boundaries are converging or spreading over “hot spots”—large subterranean pools of magma.

This is not to say that volcanoes pose no peril. Over the last three hundred years more than 260,000 people have died from volcanic activity. The 1991 eruption of the Philippines’ Mount Pinatubo killed more than three hundred people and devastated the area’s economy. When Mount St. Helens blew its stack in the state of Washington in 1980, 57 people died, nearly 7,000 big game animals were killed, more than 12 million salmon perished, forests were devastated, and the economy took a nearly $1 billion hit.

All of this reinforces the need for better ways to model and predict volcanic activity. One way to both study and monitor a volcano is to place a gas-sensing device called COSPEC along the volcano’s flanks. COSPEC (for “correlation spectrometer”) measures how much sulfur dioxide gas is escaping from the volcano’s interior. A jump in the amount of sulfur dioxide suggests an imminent eruption. Still, a few hours’ or, at most, a few days’ warning is the best that scientists can manage with current knowledge and technology. And sometimes, of course, there is no discernible warning at all.

In 1993, nine members of a scientific expedition who were taking gas samples died when a sudden spasm of molten rock and ash erupted from the crater of a volcano called Galeras in Colombia. The tragedy prompted one of the survivors, Stanley Williams of Arizona State University, to organize a conference that would enable scientists to standardize their methods and make data consistent from one volcano observatory to another. The 1997 NSF-funded conference brought together virtually every scientist then working with COSPEC—some twenty-five volcanologists from fourteen countries. Williams has also developed a remote-access instrument called GASPEC that measures another early-warning gas, carbon dioxide.

Other volcano-monitoring efforts funded in part by NSF include a network of seismometers (instruments that measure ground vibrations caused by earthquakes) and an array of Earth-orbiting satellites called the Global Positioning System (GPS). The GPS can alert scientists to volcano-related ground deformations at the millimeter scale—deformations that might signal an imminent eruption.

On May 18, 1980, a 5.1 magnitude earthquake shook Mount St. Helens. The bulge and surrounding area slid away in a gigantic rockslide and debris avalanche, releasing pressure and triggering a major pumice and ash eruption of the volcano. Thirteen hundred feet of the peak collapsed. As a result, 24 square miles of valley were filled by a debris avalanche; 250 square miles of recreation, timber, and private lands were damaged by a lateral blast; and an estimated 200 million cubic yards of materials were deposited into the river channels.

How’s the Weather Up There?

In the spring of 1989, six million people in Canada, Sweden, and the United States lost electric power for up to nine hours thanks to stormy weather—not on Earth, but on the Sun. During particularly vigorous solar storms, billions of tons of plasma erupt from the Sun’s gaseous outer layer (called the corona), speed toward Earth at hundreds of miles per second, and disrupt the Earth’s magnetic field.

Although they also produce spectacularly beautiful auroras—those colorful atmospheric streamers known as the northern lights—“coronal mass ejections” constitute a poorly understood natural hazard of growing concern to the scientists at NSF’s National Center for Atmospheric Research (NCAR). That’s because the ejections are associated with features on the Sun known as sunspots, whose activity follows an eleven-year cycle. And not only is the most recent sunspot cycle expected to reach its maximum activity in the year 2000, but overall, these so-called solar maximums have become twice as powerful as they were in the early 1900s. With our civilization’s well-being tied ever more closely to power stations and satellite-based communication systems, the atmospheric disturbances triggered by solar storms pose a potentially significant threat.

From 1980 to 1989, the NSF-funded Solar Maximum Mission satellite collected the most detailed data yet on coronal mass ejections. NCAR researchers used this data to develop a new suite of observation tools that work in space and on the ground.

For example, a special electronic camera called CHIP (for “chromospheric helium imaging photometer”) perches on the volcanic flanks of Hawaii’s Mauna Loa and snaps highly detailed pictures of the solar disk and corona every three minutes. These pictures are frequent enough to provide scientists with a movie loop of ejections as they develop and burst forth. Other satellite-borne instruments—some launched and retrieved by the space shuttle Discovery to escape distortions caused by Earth’s dusty atmosphere—mine the Sun’s radiation for clues about its magnetic behavior. Along with piecing together the basic science behind solar storms, these instruments should help scientists do a better job of predicting the next serious bout of bad space weather.


Stormy Weather

While earthquakes and volcanoes capture much of the public’s imagination, weather-related disasters can wreak far more economic havoc. According to the Worldwatch Institute, 1998 set a new record for global economic losses related to extreme weather—$89 billion, a 48 percent increase over the previous record of $60 billion in 1996 and far more than the accumulated losses for the entire decade of the 1980s.

A major player in the world’s efforts to learn about and live with extreme weather is the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, funded by NSF’s Division of Atmospheric Sciences. Central to NCAR’s activities is the use of supercomputers to develop large-scale simulations of atmospheric and ocean dynamics. These models help to explain the formation of tornadoes, windstorms, and hurricanes, as well as more mundane climatic events. For example, in the late 1970s, NCAR researcher Joseph Klemp, working with Robert Wilhelmson of the NSF-funded supercomputing center at the University of Illinois, developed the first successful model of the most dangerous of all thunderstorms, the “supercell” storm. In a thunderstorm, air moves up and down in a turbulent mix. A single-cell storm means that there is just one updraft/downdraft component, which generally produces only moderately severe weather. A multicell storm can kick out the occasional tornado, but sometimes a main, intensely rotating updraft develops within a multicell storm and transforms it into a supercell storm capable of producing the most devastating weather, complete with violent tornadoes, raging winds, hail, and flooding.

The model developed by Klemp and Wilhelmson confirmed other researchers’ observations that this special, rotating brand of thunderstorm could develop by splitting into two separate storm cells. According to their simulation, the southern storm in the pair was the most likely to concentrate its powers to make a tornado. Meteorological modelers have since improved these simulations to the point where researchers can study the ability of rotations midway up in a thunderstorm to develop tornado-like whirls at the ground. Such work, coupled with NSF-sponsored ground reconnaissance of tornadoes, may eventually solve the mystery of how tornadoes are born, which, in turn, could lead to better warning systems.

Warning systems can save lives, but whether or not a building survives a tornado’s onslaught depends largely on how it was constructed. Since the 1970s, scientists and engineers at the NSF-funded Texas Tech University Institute for Disaster Research have been picking through the aftermath of tornadoes’ fury for clues about what predisposes a structure to survival. When the researchers first began their work, it was common for emergency preparedness manuals to recommend that building residents open their windows during a tornado so that pressure inside the building could equalize with the low-pressure interior of the approaching twister. But after much dogged detective work, the Texas Tech researchers were surprised to learn that rather than exploding from unequal pressure, the walls of homes destroyed by tornadoes appeared to flatten when winds pried up the roof, just as aerodynamic forces will lift up an airplane wing. Wind was also discovered to contribute to structural damage by blowing debris from poorly built homes into homes that were otherwise sound.

The key to survivable housing—at least in all but the worst cases of tornadoes—turns out to be roofs that are firmly anchored to walls and walls

that are firmly anchored to foundations. Wide eaves along the roofline, which can act as handles for powerful winds, should be avoided. And the researchers found that weak points in the structure, such as garage doors and opened windows, actually increase the risk of damage by inviting in winds that will blow down the opposing walls, exposing people to injury from breaking glass and flying wreckage. The advice might be simple—shut your windows during a tornado rather than open them—but it is rooted in the long investigation of complex physical forces.

Researchers at the NSF-funded National Center for Atmospheric Research in Boulder, Colorado, use computer models to learn more about tornadoes, hurricanes, and other weather events. These models enable atmospheric scientists to more accurately predict when and where severe storms will hit. Greater forecasting accuracy can save lives and minimize property damage.

The Human Factor

Most hurricanes kill and destroy with a surge of seawater. Not Hurricane Andrew—the monster of 1992—a storm that led to at least fifty deaths and more than $30 billion in property damage in southern Florida. Sufficient warning enabled people to evacuate from dangerous beaches even as the worst of the storm miraculously skirted downtown Miami. But Andrew pummeled south Florida with particularly intense air currents that leveled well-built homes and demolished Homestead Air Force Base. The damage from the winds was more severe than expected, given that the region’s building codes had been considered among the best in the country. As it turned out, however, enforcement of those codes had grown lax during the region’s recent building boom.

All the science-based predictions and warnings in the world will not mitigate a natural disaster made more devastating by human folly. Ironically, improved hazard warnings in the United States may be one of the factors encouraging more and more people to move to homes on the earthquake- and hurricane-prone coasts. As noted in a 1999 report by the National Research Council’s Board on Natural Disasters, 80 percent of Florida’s population now lives within 22 miles of the beach—a fivefold increase since 1950. A steady rise in the migration to cities has also made more people more vulnerable to the effects of natural disasters as they live amidst aging infrastructures increasingly susceptible to the slightest ill wind or tremor. Urban growth also translates into more pavement and less exposed soil, which forces rain to run off rather than soak into the ground and tends to increase flood damage. All of this means that the sharply upward trend in the costs of natural disasters is attributable not so much to the occurrence of more hazards but rather to human choices that place more of our structures and possessions at risk.

Sometimes, too, steps taken with the best of intentions to limit the dangers of natural hazards can turn out to amplify the problem. The intense rains and flooding that occurred in 1993 along the Mississippi River provide an example. The levees and dikes that had been built along the river to protect communities from the occasional mid-level flood allowed more development in the area and also effectively eliminated the flood plain, exacerbating the damage caused by the unusually massive surge of water in 1993.

“Disasters by Design: A Reassessment of Natural Hazards in the United States,” a five-year-long NSF-funded study, was released in the spring of 1999. The study compiled the thoughts of 132 experts on how communities can better prepare themselves by evaluating potential local threats up to two hundred years in the future, determining acceptable losses, and then planning for them.

Says study leader Dennis Mileti of the University of Colorado’s Natural Hazards Research and Applications Information Center, “We need to change the culture to think about designing communities for our great-grandchildren’s children’s children.”

Trustworthy Tools

Our ability to understand tornadoes and other natural forces is only as good as the tools researchers have to study them. One highlight in this regard is Doppler radar, developed in the mid-1970s with the help of NSF funds. Since the 1960s, meteorologists’ ability to predict violent weather patterns has depended largely on two kinds of technology: an array of orbiting space satellites that observe Earth’s environment on a scale not previously possible, and ground-based radar technology. Radar peers inside clouds for clues about their potential for severe weather by sending out electromagnetic pulses that bounce off particles and return with valuable information about the location and intensity of precipitation. Most weather radars send out signals with relatively short wavelengths that, while offering a precise picture of a cloud’s interior, can be absorbed by the very particles they’re supposed to measure. Doppler radar, on the other hand, uses longer wavelengths, so that even distant weather systems will appear on the radar screen with accurately rendered intensity. What’s more, Doppler radar provides additional information (such as the velocity at which precipitation is moving) that is critical to short-term forecasting.
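The velocity measurement rests on the Doppler shift itself: for a pulse of wavelength lambda reflected by precipitation moving toward or away from the radar at radial speed v_r, the returned signal is shifted in frequency by an amount proportional to v_r. The relation below is the standard two-way radar Doppler equation, included for orientation; the example numbers are illustrative, not drawn from the systems described here.

```latex
% Two-way (transmit and return) Doppler shift for a radar of wavelength \lambda:
\[
\Delta f = \frac{2 v_r}{\lambda}
\qquad\Longrightarrow\qquad
v_r = \frac{\lambda\,\Delta f}{2}.
\]
% Illustrative example: a 10 cm (S-band) weather radar measuring a 25 Hz shift
% implies v_r = (0.10 m)(25 s^{-1}) / 2 = 1.25 m/s along the beam.
```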

In the last decade, the National Weather Service has installed Doppler radar systems at fixed locations across the country, improving meteorologists’ ability to issue timely flash flood and severe thunderstorm warnings and cutting by more than 60 percent the number of tornadoes that strike without public notice. Recently, NSF-funded scientists have also begun experimenting with more advanced mobile Doppler instruments mounted on flatbed trucks, which allow the hardier breed of researcher to chase down storms for even more precise readings.

With Doppler radar, NCAR scientists helped a University of Chicago wind expert, the late T. Theodore Fujita, to confirm in the 1980s a whole new atmospheric hazard—the microburst. Microbursts are concentrated blasts of downdrafts from thunderstorms that have been responsible for airplane crashes killing more than five hundred people in the United States. And in 1999, NCAR researchers began testing whether Doppler radar systems installed on airplanes can detect so-called convective turbulence, associated with storms and clouds, which can rip sections off small planes and injure crew and passengers.

Another significant new observing technology developed at NCAR is a probe employed by hurricane-hunting aircraft. The probes are dropped from government planes into offshore hurricanes to profile previously hard-to-measure factors such as low-level winds, pressures, and temperatures around the storm’s eye. Data from these probes have greatly improved the National Weather Service’s ability to predict the course and intensity of hurricanes. Hurricanes develop over the warm tropical oceans and have sustained winds in excess of 75 miles per hour. One hundred years ago, coastal residents generally had less than a day’s warning before a hurricane struck. Today, thanks to satellites and radar, these same residents know days in advance that a hurricane is maturing and moving their way.

El Niño Bears Unwanted Gifts

Hurricanes are dramatic examples of how the atmosphere and the oceans interact to drive the course of Earth’s climate in sometimes perilous ways. Another example is El Niño, a weak warm current of water that appears for several weeks each Christmas off the coast of Ecuador and Peru. Every three to five years, however, this otherwise mild-mannered current becomes a real “hazard spawner,” says NCAR senior scientist Michael Glantz, by growing in size and strength and lasting for many months. Unusual weather conditions result as tropical monsoons that normally center over Indonesia shift eastward, influencing atmospheric wind patterns around the world. Massive fish kills, droughts, heavy rains: These are just some of the gifts that a robust El Niño can bear.

After a particularly devastating El Niño event in 1982–83, researchers vowed not to be caught off guard again. NSF coordinated a global scientific effort to set up a network of ocean-drifting, data-gathering buoys in the Pacific Ocean. In the spring of 1997, the investment paid off when the instruments began recording abnormally high temperatures off the coast of Peru, giving scientists and policymakers their first inkling of an El Niño event that would turn out to be the most devastating in fifty years. Supplemented with satellite observations, the advance warning from the buoys allowed farmers in Central and South America to steel themselves for record-breaking drought and Californians to fix their roofs before the onset of an unprecedented rainy season that also caused life-threatening floods and mudslides. Now NCAR researchers are incorporating what they’ve learned about this massive El Niño event into supercomputer-based climate models designed to simulate atmospheric circulation changes over the course of decades and even centuries. And in May 1999, NCAR began working with the United Nations Environment Programme to conduct a nineteen-month study of the impact of the 1997–98 El Niño, with the goal of developing programs to help countries better prepare themselves for the day when El Niño makes a muscular comeback.
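As a rough illustration of how “abnormally high temperatures” can be turned into an alert, the sketch below compares observed sea-surface temperatures against a long-term monthly average and flags a sustained warm anomaly. The 0.5 degree Celsius threshold and five-month run follow common practice for identifying El Niño conditions (NOAA’s Oceanic Niño Index uses a similar criterion); the function and data are hypothetical, not the buoy network’s actual algorithm.

```python
def el_nino_like(observed_sst, climatology, threshold=0.5, run_length=5):
    """Return True when the sea-surface temperature anomaly (observed minus
    long-term monthly average, in degrees C) stays above `threshold` for at
    least `run_length` consecutive months."""
    consecutive = 0
    for obs, normal in zip(observed_sst, climatology):
        if obs - normal > threshold:
            consecutive += 1
            if consecutive >= run_length:
                return True
        else:
            consecutive = 0
    return False

# Hypothetical monthly values (degrees C) for an eastern Pacific region:
observed    = [27.0, 27.2, 27.8, 28.1, 28.3, 28.6, 28.9]
climatology = [26.8, 26.9, 27.0, 27.1, 27.2, 27.3, 27.4]
print(el_nino_like(observed, climatology))  # True
```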

A Safer Future

In recognition of the rising dangers and costs associated with natural disasters around the world, the United Nations declared the 1990s the International Decade for Natural Disaster Reduction. At the close of the decade, NSF could look back on fifty years of sponsored programs whose aims have been—and continue to be—to better understand and prepare for the kinds of extreme natural events that can prove disastrous for human communities. While the world can never be absolutely safe, its human inhabitants can at least rest easier in the knowledge that nature’s violence holds less sway in an age where scientists and engineers are working so closely together to mitigate what cannot be controlled.

To Learn More

NSF Directorate for Engineering: www.eng.nsf.gov
NSF Directorate for Geosciences: www.geo.nsf.gov/
NSF Directorate for Social, Behavioral, and Economic Sciences: www.nsf.gov/sbe
Network for Earthquake Engineering Simulation (NEES): www.eng.nsf.gov/nees
Earthquake Engineering Research Institute (EERI): www.eeri.org
Federal Emergency Management Agency: www.fema.gov
Mid-America Earthquake Center (at the University of Illinois at Urbana-Champaign): http://mae.ce.uiuc.edu
Multidisciplinary Center for Earthquake Engineering Research (at the State University of New York at Buffalo): http://mceer.buffalo.edu
National Center for Atmospheric Research (NCAR): www.ncar.ucar.edu/ncar/
National Information Service for Earthquake Engineering (at the University of California at Berkeley): http://nisee.ce.berkeley.edu
National Weather Service: www.nws.noaa.gov
Pacific Earthquake Engineering Research Center: http://peer.berkeley.edu
Southern California Earthquake Center: www.scec.org
U.S. Geological Survey Hazards Research: www.usgs.gov/themes/hazard.html
U.S. Geological Survey Cascades Volcano Observatory: http://vulcan.wr.usgs.gov/home.html
University of Oklahoma Center for Analysis and Prediction of Storms (CAPS): www.caps.ou.edu