




A Performance-Based Earthquake Engineering Method to Estimate Downtime using Fault Trees

By Keith Porter

A method is offered to estimate downtime (the time to restore a facility to operability, not total repair time) for second-generation performance-based earthquake engineering (PBEE-2). It applies in cases where downtime is primarily controlled by nonstructural systems (although red-tagging, lifeline failure, and other offsite hazards are also addressed); restoration begins immediately after the earthquake; and components are repaired in parallel. It uses fault trees to relate component damage to a building’s post-earthquake operability. The restoration time for an upper event that occurs only if all lower events occur (i.e., an event above an “and” gate) is the minimum of the lower-event restoration times; for an “or” gate, the upper-event restoration time is the maximum of the lower-event times. With a system fault tree, structural response, and probability distributions of component capacity and component repair times, one can calculate probabilistic downtime either via Monte Carlo simulation or in closed form, without simulation.
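A minimal Monte Carlo sketch of these gate rules follows. The component names, fragility parameters (median capacity theta, log-standard deviation beta), demand level, and repair-time distributions are all hypothetical placeholders, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo samples

demand = 0.5  # hypothetical peak floor acceleration demand, g
components = {
    "power":   {"theta": 0.8, "beta": 0.4, "median_repair_days": 30.0},
    "hvac":    {"theta": 0.6, "beta": 0.5, "median_repair_days": 60.0},
    "ceiling": {"theta": 1.0, "beta": 0.4, "median_repair_days": 14.0},
}

def restoration_time(c):
    """0 days if the sampled capacity exceeds the demand (undamaged);
    otherwise a lognormal repair duration (log-std 0.5 assumed)."""
    capacity = c["theta"] * np.exp(c["beta"] * rng.standard_normal(N))
    repair = c["median_repair_days"] * np.exp(0.5 * rng.standard_normal(N))
    return np.where(capacity < demand, repair, 0.0)

t = {name: restoration_time(c) for name, c in components.items()}

# Gate rules: "or" gate -> maximum of lower-event restoration times;
# "and" gate -> minimum. This toy tree uses only "or" gates.
support = np.maximum(t["power"], t["hvac"])   # support systems fail (or)
downtime = np.maximum(support, t["ceiling"])  # operability lost (or)

print("median downtime:", np.median(downtime), "days")
print("P(downtime > 30 days):", (downtime > 30).mean())
```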

[Figure: downtime exceedance curve. Horizontal axis: downtime, days (0 to 720); vertical axis: exceedance probability in 50 years (0.1% to 100%).]

Keith Porter is Associate Research Professor at CU Boulder specializing in societal loss estimation, seismic vulnerability, and performance-based earthquake engineering. See http://spot.colorado.edu/~porterka or call +1 (626) 233-9758 for more information.


Managing the Risk of Earthquake-induced Failure of Critical Facilities

By Keith Porter

Business-continuity risk from earthquakes can be said to depend on the likelihood of strong shaking at the business’s building (called seismic hazard); on the building’s seismic fragility (which here means how likely the building is to become inoperative given how strongly it is shaken, and the degree to which it relies on utilities); and on how well staffed and trained its personnel are to deal with earthquakes. The work focuses on the first two issues and offers a method for estimating the probability of earthquake-induced operational failure at a single facility or at a number of primary and backup facilities. To deal with seismic hazard, the method uses new information from the US Geological Survey. To deal with seismic fragility, it uses a diagram called a fault tree, drawing on new information developed for FEMA and others about the seismic reliability of building components and utilities. With these, one can estimate the probability that earthquakes will cause a facility or facilities to become inoperative in some planning period such as 5 years, under as-is and what-if-mitigated conditions. One can use the method to identify mitigation measures that cost-effectively reduce risk to tolerable levels. An example fault tree appears below, followed by a numerical sketch of how such a tree is evaluated.

[Figure: example fault tree for the top event “Operations Fail.” Lower events include: Equipment Systems Fail; Uncontrolled Building Fire (Ignition Occurs and Fire Response Fails); Computers or Telecom Fail; Building Support Systems Fail (Power Fails; HVAC and Equipment Cooling Fails; Suspended Ceilings Collapse); Building Is Red-Tagged; Workers Absent; and Failure Caused by Other Perils (conflagration, hazmat, inundation, civil unrest, operator error). Legend: Or gate: the event connected above occurs if any event connected below occurs. And gate: the event connected above occurs if all events connected below occur. Transfer symbol: tree continues elsewhere. Event: something undesirable occurs. Basic event: an event whose probability is quantified without lower events. Undeveloped event: an event whose probability is not quantified.]
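Evaluating the probability of the top event of a tree like this is straightforward if the basic events are assumed independent. Below is a minimal sketch with purely illustrative 5-year probabilities, none of which come from the study.

```python
def or_gate(*p):
    """P(upper event) when it occurs if ANY independent lower event occurs."""
    q = 1.0
    for pi in p:
        q *= 1.0 - pi
    return 1.0 - q

def and_gate(*p):
    """P(upper event) when it occurs only if ALL independent lower events occur."""
    prod = 1.0
    for pi in p:
        prod *= pi
    return prod

# Hypothetical 5-year basic-event probabilities (illustrative only)
p_ignition, p_response_fails = 0.05, 0.20
p_red_tag, p_workers_absent = 0.02, 0.04
p_power, p_hvac, p_ceiling = 0.10, 0.08, 0.03

p_fire = and_gate(p_ignition, p_response_fails)   # uncontrolled building fire
p_support = or_gate(p_power, p_hvac, p_ceiling)   # building support systems fail
p_ops = or_gate(p_fire, p_red_tag, p_workers_absent, p_support)
print(f"P(operations fail within 5 years) = {p_ops:.3f}")
```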

Keith Porter is Associate Research Professor at CU Boulder specializing in societal loss estimation, seismic vulnerability, and performance-based earthquake engineering. See http://spot.colorado.edu/~porterka or call +1 (626) 233-9758 for more information.


Demand Surge

Research at the University of Colorado at Boulder Sponsored by the Willis Research Network

By Anna Olsen and Keith Porter

After large-scale natural disasters, the costs of repair and reconstruction are higher for a given group of damaged assets than if the same assets had been subjected to the same local excitation in a smaller-scale event. This phenomenon is often referred to as demand surge, and it has been observed after hurricanes, earthquakes, and other natural disasters. Following Hurricane Katrina, AIR, a catastrophe modeling company, recommended increasing loss estimates by 30% to account for demand surge. Hurricane Andrew (1992) was believed to have caused up to a 60% increase in costs relative to non-catastrophe conditions, and the 1994 Northridge Earthquake was believed to have caused a 20% increase. Demand surge is currently poorly documented and is addressed to varying degrees of sophistication in catastrophe models. An improved quantitative understanding of demand surge might enable government, insurers, and others to better plan for or reduce its effects.

Researchers at the University of Colorado at Boulder are studying the underlying mechanisms of demand surge. To generate a defensible predictive model of demand surge, the socioeconomic and engineering factors that drive post-disaster costs must be better understood. Some relevant issues include: the availability and substitution of materials, labor, and equipment; limits to emergency response capabilities; regional economic health (e.g., booming or in a recession); regulatory or government administrative action; and changes in insurance claims-adjustment practice. The first phase of this research program will identify the pertinent issues through a literature search of previous studies and through post-disaster reconnaissance, beginning with the 2008 floods in the Midwestern US. Later phases will include model development, validation with available data, and econometric modeling, possibly using computable general equilibrium.

Dr. Anna Olsen is the Willis Research Network Fellow at the University of Colorado at Boulder. She received her Ph.D. in civil engineering at the California Institute of Technology in 2008. She may be contacted at [email protected]. Prof. Keith Porter is an Associate Research Professor at CU-Boulder specializing in seismic vulnerability, loss estimation, and performance-based earthquake engineering. See http://spot.colorado.edu/~porterka for more information, or call (626) 233-9758.

Photo: Andrea Lynn Photography


THE ARKSTORM SCENARIO

By Keith Porter

The USGS Multi Hazards Demonstration Project (MHDP) uses hazards science to inform communities’ planning and preparedness decisions and to help them improve their resilience to natural disasters. The first public product of the MHDP was the ShakeOut Earthquake Scenario, published in May 2008. This detailed depiction of a hypothetical magnitude 7.8 earthquake on the San Andreas Fault in southern California served as the centerpiece of the largest earthquake drill in U.S. history, involving 5,000 emergency responders and 5.5 million citizens; it has since become an annual California preparedness and planning event, in 2010 involving almost 8 million participants.

The MHDP’s next major public project is a winter storm scenario, a hypothetical event striking the US West Coast similar to intense California winter storms of 1861-1862. ARkStorm is somewhat smaller than 6 real California storms of the past 2,000 years. This scenario, developed by 125 scientists, engineers, public-policy experts, insurance experts, and employees of the affected lifelines, produced new science, highlighted pressing research needs, and concluded that the following outcomes could realistically occur with roughly the same probability as the ShakeOut event. The hypothetical storm produces precipitation that in many places exceeds levels only experienced on average once every 500 years or more. Extensive flooding results, in many cases overwhelming the state’s flood-protection system. The Central Valley experiences flooding 300 miles long and 20 miles wide. Serious flooding occurs in Orange, Los Angeles, and San Diego counties, the San Francisco Bay area, and elsewhere.

Wind speeds reach 125 mph in places. Thousands of landslides occur, damaging roads, highways, and homes. Property damage exceeds $300 billion, mostly from flooding. Demand surge, agricultural losses, and business interruption bring the total loss to $725 billion, of which only $20 to $30 billion is recoverable through insurance. Power, water, sewer, and other lifelines experience damage taking weeks or more to restore. Flooding requires the evacuation of 1.5 million residents. The ARkStorm raises several public-policy issues: (1) whether existing disaster policies could handle an ARkStorm, or whether long-term harm to the California economy would occur; (2) whether and how to pay for risk mitigation; (3) the need for innovative recovery-financing solutions; (4) ARkStorm’s use by responders, managers, and planners in self-assessments and table-top exercises; (5) connecting federal, state, and local natural-hazards mapping and mitigation planning under the NFIP and the Disaster Mitigation Act of 2000; and (6) common messaging about the risk of such an extreme event.

Keith Porter is Associate Research Professor at CU Boulder specializing in societal loss estimation, seismic vulnerability, and performance-based earthquake engineering. See http://spot.colorado.edu/~porterka or call +1 (626) 233-9758 for more information.

K St, Sacramento, 1861-1862


The Great Southern California ShakeOut Earthquake Planning Scenario

Research sponsored by the United States Geological Survey

By Keith Porter

A large earthquake planning scenario was recently released by the USGS and the California Geological Survey. It was created by more than 300 earth scientists, engineers, and social scientists, and hypothesizes the occurrence and effects of an MW 7.8 earthquake on the southern San Andreas Fault. Keith Porter led the overall assessment of physical damage to buildings and lifelines. In the scenario, fault offsets reach 13 m. State-of-the-art mathematical models were used to generate maps of shaking intensity, with peak motions greatly exceeding those of the 1994 Northridge earthquake and strong shaking occurring over an area of 10,000 km². An analysis using the FEMA-funded risk software HAZUS®MH was performed, along with 18 special studies to characterize the effects of the earthquake on the built environment, including a substantial study by Charles Scawthorn of fire following earthquake.

The scenario is posited to result in 1,800 deaths and 53,000 injuries requiring emergency-room care. Approximately 1,600 fires are ignited, 1,200 of which are too large to be controlled by a single engine company, and despite the lack of Santa Ana winds, the fires destroy 200 million square feet of the building stock, equivalent to 130,000 single-family dwellings. Fire contributes $87 billion of the total $213 billion in economic loss, with most of the rest coming from shake-related building and content damage ($46 billion) and business interruption unrelated to fire ($74 billion). Emergency response activities are considered in detail. The losses would be much greater if not for steadily improving building codes and other efforts to mitigate seismic risk in existing buildings and lifelines. This is the most comprehensive analysis ever of what a major Southern California earthquake would mean. It is the scientific framework for the largest earthquake preparedness drill in California history, which occurred on November 13, 2008. The drill involved 5.3 million people in area schools, businesses, government agencies, and other organizations—roughly 1 in 7 Californians.

Keith Porter is an Associate Research Professor at the University of Colorado at Boulder. He specializes in seismic vulnerability and performance-based earthquake engineering. Further information can be found at http://spot.colorado.edu/~porterka, http://www.colorado.edu/hazards/shakeout.html, and http://www.shakeout.org.


Vulnerability of Soft-story Woodframe Construction

By Keith Porter

The City of San Francisco has an estimated 2,800 woodframe, soft-story buildings of 3 or more stories with 5 or more housing units. Soft-story construction refers to a building whose lowest story is significantly weaker, more flexible, or both, than the stories above. These buildings, a type that performed notoriously badly in the 1989 Loma Prieta earthquake, house 58,000 people—roughly 8% of the population—in 29,000 housing units. To inform City government mitigation policy, a study was performed for the City of San Francisco’s Community Action Plan for Seismic Safety (CAPSS) project to quantify the risk. The study estimates the impacts on these buildings of each of 4 large, hypothetical, but highly realistic scenario earthquakes on the San Andreas and Hayward faults. Study results are being used to inform City policy on seismic risk mitigation.

In collaboration with experts from the San Francisco Department of Building Inspection (DBI) and consulting structural engineers, the project team developed four prototypical designs, called index buildings, to represent the range of San Francisco buildings in this construction category. The HAZUS-MH model was considered for use in loss estimation, but it lacks built-in structure types that properly reflect the characteristics of these unusual buildings. Custom HAZUS-like vulnerability models were therefore created (using laboratory evidence, earthquake experience, and other information) to represent the relationship between shaking intensity, damage, and repair cost for each index building, along with 3 seismic retrofit alternatives for each. Earth scientists estimated seismic shaking intensities across San Francisco in terms of damped elastic spectral acceleration response. Fragility and vulnerability functions for the index buildings were assigned to the 2,800 buildings in DBI’s database, and site-specific shaking intensities were applied to the vulnerability models to estimate damage and loss, as sketched below. Average shaking citywide is 2 to 4 times stronger than in the Marina District in the 1989 Loma Prieta earthquake, where 15% of all corner soft-story apartment buildings were completely damaged or collapsed. In one scenario, 5 in 10 of these buildings are damaged to the point of being unsafe to occupy, and an additional 3 in 10 collapse. Retrofitting the ground floor with cantilever columns and new wood sheathing reduces collapses to fewer than 1 in 100, at a cost of $10,000 to $20,000 per housing unit. The methodology is validated against experience in the 1989 event.
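A minimal sketch of that inventory-level damage calculation follows. The vulnerability-function form, its parameters, and the building records are hypothetical placeholders; the study’s actual vulnerability functions are not reproduced here.

```python
import numpy as np

# Hypothetical mean damage factor (repair cost / replacement cost) versus
# spectral acceleration; the functional form and parameters are assumed.
def mean_damage_factor(sa, a, b):
    return 1.0 - np.exp(-((sa / a) ** b))

# (index-building vulnerability params (a, b), site Sa (g), replacement cost $)
inventory = [
    ((1.2, 2.0), 0.9, 1.5e6),
    ((1.0, 1.8), 1.1, 2.0e6),
    ((1.4, 2.2), 0.7, 1.2e6),
]

total_loss = sum(mean_damage_factor(sa, a, b) * cost
                 for (a, b), sa, cost in inventory)
print(f"Estimated repair cost for the inventory: ${total_loss:,.0f}")
```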

Keith Porter is Associate Research Professor at the University of Colorado at Boulder. Dr Porter created the fragility and vulnerability models used in the present study and calculated the losses. For more information, contact [email protected] or +1 (626) 233-9758.

Collapsed apartment building in the San Francisco Marina District after the 1989 Loma Prieta earthquake


Cost-Effectiveness of Woodframe Seismic Retrofit

By Keith Porter

A study for the CUREE-Caltech Woodframe Project examined the cost-effectiveness of improvements to woodframe buildings: retrofits, redesign measures, and improved construction quality in 19 hypothetical woodframe dwellings designed under the Woodframe Project. We estimated cost-effectiveness for each improvement and each California ZIP Code using second-generation performance-based earthquake engineering techniques (PBEE-2) pioneered by the author and colleagues at Stanford University, Caltech, and other institutions of the Pacific Earthquake Engineering Research (PEER) Center. Costs and seismic vulnerability were determined on a component-by-component basis, within a nonlinear time-history structural-analysis framework and using full-size test-specimen data for component fragility functions. Probabilistic site hazard was calculated by ZIP Code and site soil class, and integrated with vulnerability to determine expected annualized repair cost. The approach provides insight into the uncertainty of loss at varying shaking levels. We calculated the present value of benefit to determine cost-effectiveness in terms of benefit-cost ratio (BCR), and found that one retrofit exhibits BCRs as high as 8, with BCR exceeding 1 in half of California ZIP Codes. Four retrofit or redesign measures are cost-effective in at least some locations. Higher construction quality is estimated to save thousands of dollars per house. Results are illustrated by maps for the Los Angeles and San Francisco regions and are available for every ZIP Code in California. The essence of the calculation is sketched below.
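The following sketch shows the expected-annualized-loss and benefit-cost arithmetic with a hypothetical hazard curve, hypothetical mean damage factors, and an assumed continuous-discounting present-value formula; none of these numbers come from the study.

```python
import numpy as np

# Hypothetical hazard curve: mean annual frequency of exceeding Sa (g)
sa   = np.array([0.1, 0.2, 0.4, 0.8, 1.6])
mafe = np.array([0.2, 0.05, 0.01, 0.002, 0.0004])

# Hypothetical mean damage factors at those Sa levels, as-is vs retrofitted
mdf_asis  = np.array([0.02, 0.05, 0.15, 0.40, 0.80])
mdf_retro = np.array([0.01, 0.02, 0.06, 0.20, 0.50])
value = 400_000.0  # replacement cost, $

def eal(mdf):
    """Expected annualized loss: sum of loss times occurrence rate,
    approximating bin rates as differences of exceedance frequencies."""
    rates = -np.diff(np.append(mafe, 0.0))  # occurrence rate per Sa bin
    return value * np.sum(mdf * rates)

# Present value of a constant annual benefit over t years, discount rate r
r, t = 0.035, 50.0
pv_factor = (1.0 - np.exp(-r * t)) / r

benefit = (eal(mdf_asis) - eal(mdf_retro)) * pv_factor
cost = 12_000.0  # hypothetical retrofit cost
print(f"EAL as-is: ${eal(mdf_asis):,.0f}/yr, BCR = {benefit / cost:.1f}")
```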

Keith Porter is Associate Research Professor at CU-Boulder specializing in seismic vulnerability, societal loss estimation, and performance-based earthquake engineering. See http://spot.colorado.edu/~porterka or call (626) 233-9758 for more information.

[Figure: maps of the benefit of adding foundation bolts and bracing cripple walls in an older, small woodframe house, in two panels (a) and (b) covering the Los Angeles and San Francisco regions. Legend, benefit in 2002 US$ (with corresponding BCR range): 7 to 1,400 (0 < BCR < 1); 1,401 to 2,800 (1 < BCR < 2); 2,801 to 5,600 (2 < BCR < 4); 5,601 to 11,000 (4 < BCR < 8).]


Cost-Effectiveness of Seismic Risk Mitigation

By Keith Porter

In 1999, the US Congress instructed the Federal Emergency Management Agency (FEMA) to sponsor an independent study quantifying the future savings from natural-hazard mitigation efforts. The study was carried out by the Applied Technology Council (ATC) under the direction of the National Institute of Building Sciences’ (NIBS) Multihazard Mitigation Council (MMC). It examined mitigation activities related to earthquake, wind, and flood funded through three major FEMA grant programs. It found that FEMA’s mitigation activities from 1993 to 2003:

- Were cost-effective, reducing future losses from earthquakes, floods, and windstorms. The average benefit-cost ratio (BCR) for earthquake grants was 1.5, meaning a present value of $1.50 in avoided future losses for every $1 spent; earthquake project benefits exceed costs with 83% probability;
- Resulted in significant net benefits to society as a whole (individuals, states, and communities) in terms of reduced future losses, saving $16 billion in total; and
- Provided significant potential savings to the federal treasury in increased future tax revenues and reduced hazard-related costs.

This study involved two components. The first was the benefit-cost analysis of FEMA mitigation grants. It estimated the future savings from FEMA expenditures on mitigation activities. It was quantitative and considered a statistical sample of FEMA-funded mitigation activities selected from the National Emergency Management Information System (NEMIS) database. The unit of analysis was the individual FEMA-funded grant. The second study component, community studies, assessed the future savings from mitigation activities through empirical research on mitigation activities carried out in community contexts. The community studies were quantitative and qualitative and examined mitigation activities in a sample of communities. They provided insight into mitigation effectiveness by exploring how mitigation activities percolate throughout the community via synergistic activities — mitigation efforts that would not have occurred had it not been for the original FEMA grant. The unit of analysis was the individual community. The study was conducted by an ATC research team of more than 30 experts in a variety of disciplines. The benefit-cost analysis component was co-led by Keith Porter.

Keith Porter is Associate Research Professor at CU-Boulder specializing in seismic vulnerability, loss estimation, and performance-based earthquake engineering. See http://spot.colorado.edu/~porterka or call (626) 233-9758 for more information.

[Figure: savings from earthquake grants, in millions of dollars: casualties avoided, $863 (67 deaths, 1,399 injuries); building and contents, $376; business interruption and displacement, $139; environmental and historical, $14. Total earthquake benefit: $1,392 million against a cost of $947 million. Caption: $1.4 billion saved through seismic risk mitigation.]


ATC-58’s Fragility Methodologies

By Keith Porter

The Applied Technology Council (ATC) FEMA-funded project ATC-58 is bringing second-generation performance-based earthquake engineering (PBEE-2) to professional practice. The methodology was developed by the author and many colleagues at Stanford University, Caltech, and other institutions of the Pacific Earthquake Engineering Research (PEER) Center. PBEE-2 provides the means for an engineer to estimate the future probabilistic seismic performance of buildings and other facilities in terms of repair cost, life safety, and loss of functionality (“dollars, deaths, and downtime”). Though most of the basic principles of PBEE-2 have been established, ATC is addressing several remaining challenges. One is that PBEE-2 requires a large suite of relationships called fragility functions, which give the probability that a particular building component will be damaged given that it experiences some level of force or deformation. How are these fragility functions to be created from laboratory or other evidence in a consistent, defensible fashion? Another is that an individual building may contain several identical components, each with the same fragility function. How is the correlation of damage among multiple components to be addressed in PBEE-2?

The research summarized here developed a set of standard fragility methodologies, implemented them for a number of building components, and is addressing the question of correlation. The fragility methodologies deal with several kinds of damage data: (A) actual failure data, where several specimens have been tested to failure and the failure excitation of each is known; (B) bounding failure data, where several specimens have been observed, some failed and some not, and the maximum excitation to which each was subjected is known, but not the precise excitation that caused failure; (C) capable excitation, in which several specimens were tested and none failed; (D) derived capacity, in which a structural model is used to calculate capacity; (E) expert opinion, in which a group of experts is asked to judge failure capacity in a transparent, reviewable fashion; and (U) updating, in which an existing fragility function is revised based on new observations through Bayesian updating. The research has applied these methods to hydraulic and traction elevators; 15 classes of mechanical, electrical, and plumbing equipment; tile roofs; and various other nonstructural components. It is now examining the empirical evidence available to quantify the damage correlation of identical components subjected to identical seismic excitation.
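A minimal sketch of the simplest case, method (A), under the common assumption of a lognormal fragility function; the failure data below are hypothetical.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical observed failure excitations (g) of specimens tested to failure
failure_pga = np.array([0.21, 0.35, 0.38, 0.45, 0.52, 0.61, 0.74])

ln_x = np.log(failure_pga)
theta = np.exp(ln_x.mean())  # median capacity (g): geometric mean of the data
beta = ln_x.std(ddof=1)      # logarithmic standard deviation (sample estimate)

# The fitted fragility function: P(failure | x) = Phi(ln(x / theta) / beta)
x = 0.50  # example demand, g
p_fail = norm.cdf(np.log(x / theta) / beta)
print(f"theta = {theta:.2f} g, beta = {beta:.2f}, P(fail | {x} g) = {p_fail:.2f}")
```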

Keith Porter is Associate Research Professor at CU-Boulder specializing in seismic vulnerability, loss estimation, and performance-based earthquake engineering. See http://spot.colorado.edu/~porterka or call (626) 233-9758 for more information.

[Figure: fragility of 91 hydraulic elevators at 7 facilities in the Loma Prieta and Northridge earthquakes (Stanford; Cedars Sinai; USC; St Johns; Valley; Northridge). Lognormal fragility function with θ = 0.41 g, β = 0.28; horizontal axis: peak ground acceleration, g (0 to 0.75); vertical axis: elevator failure probability (0 to 1).]
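Using the parameters in the figure above (θ = 0.41 g, β = 0.28), the fitted fragility function can be evaluated at any shaking level; the demand value below is an arbitrary example.

```python
from math import log
from statistics import NormalDist

theta, beta = 0.41, 0.28  # from the elevator fragility figure above
pga = 0.30                # arbitrary example demand, g

# P(failure | PGA) = Phi(ln(PGA / theta) / beta)
p_fail = NormalDist().cdf(log(pga / theta) / beta)
print(f"P(elevator failure | PGA = {pga:g} g) = {p_fail:.2f}")
```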


Empirical Fragility Functions for ATC-58

By Keith Porter

A second-generation methodology for performance-based earthquake engineering (PBEE-2) has been developed by the author and colleagues working with the Pacific Earthquake Engineering Research (PEER) Center and with the Applied Technology Council (ATC) in its FEMA-funded ATC-58 project. PBEE-2 provides the means for a practicing engineer to estimate the future probabilistic seismic performance of buildings and other facilities in terms of repair cost, life safety, and loss of functionality, i.e., dollars, deaths, and downtime. A central challenge to bringing this methodology to professional practice has been creating a suite of fragility functions diverse enough to model a variety of buildings, fine enough to resolve the effect of common design and construction alternatives, and robust enough to survive close scrutiny.

[Figure: motor control center seismic fragility in three panels: (a) no deficiencies, (b) average, (c) 1 deficiency. Horizontal axes: base acceleration (geometric mean, g, 0 to 1.5); vertical axes: failure probability (0 to 1). Solid line is fit to the data points (circles); triangles show the overall failure rate at the weighted-average excitation; dashed line is fit to the overall failure rate and β = 0.4.]

Fragility functions can be developed from first principles, expert judgment, lab tests, earthquake experience, or a combination of these. Prior work by the author and colleagues presents several such methods. In current work, a number of fragility functions were derived empirically, especially for various nonstructural components such as building service equipment. Many draw on previously unavailable earthquake experience data compiled over 20 years for the Electric Power Research Institute. These data tend to focus on earthquake excitation with zero-period acceleration (ZPA) up to 1.0 g. Several draw on seismic qualification tests with ZPA up to 2.0 g and higher. While a substantial fragility library was developed, we found that several conventions for deriving fragility functions do not work well. ZPAs to which specimens were subjected tend to be highly uncertain. Many failures relate only indirectly to ZPA. Data often poorly fit the lognormal cumulative distribution function (or other CDFs). And assumptions for damage correlation, appropriate for nuclear-energy applications, do not seem to match experience well. The principles for deriving fragility functions have evolved because of this practical experience.

Keith Porter is an Associate Research Professor specializing in seismic vulnerability and PBEE-2. See http://spot.colorado.edu/~porterka or call (626) 233-9758 for more information.
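As one concrete illustration of the lognormal-fit problem noted above, here is a minimal sketch of fitting failure data and checking the fit with a Kolmogorov-Smirnov statistic; the data are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical failure ZPAs (g) with the kind of clustering that fits poorly
failure_zpa = np.array([0.12, 0.18, 0.22, 0.25, 0.55, 0.60, 0.95])

ln_x = np.log(failure_zpa)
theta, beta = np.exp(ln_x.mean()), ln_x.std(ddof=1)

# KS test against the fitted lognormal (shape=beta, scale=theta). Because the
# parameters were estimated from the same data, the p-value is only
# approximate; a Lilliefors-type correction would be stricter.
d, p = stats.kstest(failure_zpa, "lognorm", args=(beta, 0.0, theta))
print(f"theta = {theta:.2f} g, beta = {beta:.2f}, KS D = {d:.3f}, p = {p:.2f}")
```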


OpenRisk: Free Open-Source Risk Software

By Keith Porter

What are the potential impacts of new knowledge in earth science, geotechnical engineering, structural engineering, or social science, in terms of societal risk? How, for example, do next-generation attenuation relationships affect probabilistic societal or insurance loss? Natural-disaster researchers cannot maintain adequate expertise in all of the many domains of natural-hazard risk to answer these questions themselves, and partnering with the other disciplines can be time-consuming or prohibitively expensive. Employing commercial risk software by RMS, AIR, or EQECAT is likewise expensive. It is usually impossible for an outsider to insert new knowledge into the commercial models, and difficult to impossible to do so using public models such as HAZUS-MH. One possible solution to this problem is OpenRisk, a small but growing suite of object-oriented, web- and GUI-enabled, open-source, and freely available software for conducting multihazard risk analysis. OpenRisk extends the capabilities of the open-source seismic hazard analysis software OpenSHA (see www.opensha.org) developed by the US Geological Survey and the Southern California Earthquake Center. OpenSHA’s developers encode the state of the art in seismic-hazard knowledge as it develops, generally staying 1-2 years ahead of commercial risk software. OpenRisk adds vulnerability and risk capabilities to OpenSHA that enable a researcher to estimate loss-exceedance curves for a single asset, perform benefit-cost analysis for a retrofit or other change to a single asset, or calculate expected annualized loss for a portfolio of assets. The researcher can explore the sensitivity of the results to changes in the earthquake rupture forecast, ground-motion prediction equations, site soil conditions, or vulnerability model. In current development is the ability to estimate the loss-exceedance relationship for a portfolio of assets. Another OpenRisk application calculates fragility functions from empirical damage evidence of various types, and an open-source vulnerability model cracks the “open safe” of the HAZUS-MH vulnerability relationships for repair costs and indoor casualties for 128 combinations of model building type and code era. All the data and software can be downloaded for free from www.risk-agora.org.

Keith Porter is Associate Research Professor at CU-Boulder specializing in seismic vulnerability, loss estimation, and performance-based earthquake engineering. See http://spot.colorado.edu/~porterka or call (626) 233-9758 for more information.

Single-site loss exceedance curve calculator showing the sensitivity of loss to different ground-motion prediction equations
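A simplified sketch of the single-site loss-exceedance calculation behind a calculator like the one captioned above; the hazard rates, damage factors, and lognormal loss scatter are hypothetical placeholders, not OpenRisk outputs.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical site hazard: mean occurrence rate (events/yr) of shaking in
# each spectral-acceleration bin, plus the median damage factor given that
# shaking and a lognormal scatter beta_l on loss given shaking.
sa_bins   = np.array([0.1, 0.2, 0.4, 0.8, 1.6])           # Sa, g (bin labels)
bin_rate  = np.array([0.15, 0.04, 0.008, 0.0016, 0.0004])  # events/yr per bin
median_df = np.array([0.02, 0.05, 0.15, 0.40, 0.80])
beta_l = 0.6

def loss_exceedance_rate(loss_fraction):
    """Annual rate of loss exceeding the given fraction of asset value:
    sum over shaking bins of bin rate times P(loss > l | shaking)."""
    p_exceed = 1.0 - norm.cdf(np.log(loss_fraction / median_df) / beta_l)
    return float(np.sum(bin_rate * p_exceed))

for lf in (0.05, 0.20, 0.50):
    print(f"rate[loss > {lf:.0%} of value] = {loss_exceedance_rate(lf):.5f}/yr")
```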


End-to-End Seismic Risk Management Software

By Keith Porter

Data-related risk-management tasks. Owners of buildings in earthquake country have several data-related seismic risk-management concerns. Before an earthquake, they need to know the locations and seismic characteristics of their buildings to know which might pose a potential seismic risk. They need the ability to assess and prioritize risk-mitigation efforts and to develop emergency-response plans. Immediately after an earthquake, they need to know which buildings were most likely damaged in order to prioritize inspections, and they may need to carry out the safety inspections themselves and compile the damage data. The US Federal Emergency Management Agency (FEMA) and US Geological Survey (USGS) have developed a suite of free software to assist that effort: FEMA’s Rapid Observation of Vulnerability and Estimation of Risk (ROVER) is used to rapidly gather a pre-earthquake building inventory in the field and screen buildings for potential seismic risk. FEMA’s HAZUS-MH software can be used to assess probabilistic risk to the buildings. The USGS’s ShakeCast provides situational awareness during an earthquake. Then ROVER can be used for post-earthquake safety evaluation. Together they provide a system of software for end-to-end data acquisition, risk management, and post-earthquake response.

[Figure, panels (a)-(d):] Building vulnerability data are collected in the field with ROVER on a smartphone (a), with camera and sketch capability and geolocation captured via a Bluetooth GPS device. Data are sent wirelessly to a secure Internet-accessible server (b) located anywhere in the world. The data can be automatically imported to ShakeCast (c), which watches for relevant earthquakes, alerts the user when one has likely affected the buildings in the inventory, and lists the facilities in order of likely damage state. The user can then use ROVER on the same smartphone (d) to assist in safety tagging and data management.

ROVER adapts FEMA 154 and ATC-20, two de facto international-standard paper-based methodologies for pre-earthquake seismic risk screening and post-earthquake safety evaluation. Unlike paper forms, ROVER handles much of the data automatically, integrating pre- and post-earthquake data with ShakeCast and HAZUS. ROVER provides geolocation and automated soil and hazard lookup. The software is free, with no licensing costs. ROVER is open-source software, with human-readable source code available for examination and modification. ROVER was designed by Prof Porter, Instrumental Software Technologies, Inc., and the Applied Technology Council, with support from the Federal Emergency Management Agency. ShakeCast was developed by the US Geological Survey. The National Institute of Building Sciences developed HAZUS-MH for FEMA. For more information, contact [email protected].