"Review of major types of uncertainty in fisheries modeling and how to deal with them"


Randall M. Peterman

School of Resource and Environmental Management (REM)

Simon Fraser University, Burnaby, British Columbia, Canada

National Ecosystem Modeling Workshop II, Annapolis, Maryland, 25-27 August 2009

Outline

• Five sources of uncertainty - Problems they create - What scientists have done

• Adapting those approaches for ecosystem modeling

• Recommendations

My background

[Diagram, built up over several slides: single-species stock assessments and general risk assessment methods, with uncertainties considered, feed scientific advice (including risk communication) to decision makers and stakeholders for risk management. Multi-species ecosystem models (impressive!!) now enter the same chain.]

Purposes of ecosystem models from NEMoW I

1. Improve conceptual understanding

2. Provide broad strategic advice

3. Provide specific tactical advice

Uncertainties are pervasive ...

Sources of uncertainty

1. Natural variability

2. Observation error (bias and imprecision)

3. Structural complexity

4. Outcome uncertainty (deviation from target)

5. Inadequate communication among scientists, decision makers, and stakeholders

Result of 1-3: parameter uncertainty. Result of 1-4: imperfect forecasts of the system's dynamics. Result of all five: poorly informed decisions.

Uncertainties lead to risks:

Biological risks (ecosystems)

Economic risks (industry)

Social risks (coastal communities)

Risk: magnitude of a variable/event and the probability of that magnitude occurring

Sensitivity analyses across:

1. Which components to include

2. Structural forms of relationships

3. Parameter values

4. Management objectives

5. Environmental conditions

6. Management options

• Focus: - Which parts most affect management decisions? - Which parts are highest priority for more data?
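To make the factorial nature of these sensitivity analyses concrete, here is a minimal sketch in Python; the two structural forms, parameter grids, harvest-rate options, and average-catch indicator are illustrative stand-ins, not values from the talk.

```python
# Minimal sketch of a factorial sensitivity analysis. The structural
# forms, parameter grids, and catch indicator are illustrative only.
import math
from itertools import product

def ricker(s, a, b):
    """Structural form 1: Ricker recruitment."""
    return s * math.exp(a - b * s)

def beverton_holt(s, a, b):
    """Structural form 2: Beverton-Holt recruitment."""
    return a * s / (1.0 + b * s)

structural_forms = {"Ricker": ricker, "Beverton-Holt": beverton_holt}
a_values = [1.0, 1.5, 2.0]        # alternative productivity parameters
harvest_rates = [0.2, 0.4, 0.6]   # alternative management options

results = {}
for (name, form), a, u in product(structural_forms.items(),
                                  a_values, harvest_rates):
    s, catch = 100.0, 0.0
    for _ in range(50):           # 50-year deterministic projection
        recruits = form(s, a, 0.01)
        catch += u * recruits
        s = (1.0 - u) * recruits
    results[(name, a, u)] = catch / 50.0   # average-catch indicator

# Scan for which factor most changes the indicator (and hence advice).
for key, value in sorted(results.items()):
    print(key, round(value, 1))
```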

[Figure: 2008 mutton snapper phase plot, U.S. South Atlantic & Gulf of Mexico; axes SSB/SSB_F30% and F/F30%, with "overfishing" and "overfished" regions marked.]

1. Natural variability

2. Observation error

3. Unclear structure of fishery system

4. Outcome uncertainty

5. Inadequate communication

Sources of uncertainty

Uncertainty | Problems created by not adequately accounting for that uncertainty | Resolution

1. Natural variation in space and time

• Poor estimates of model parameters and variables

• Inappropriate dynamics of model due to nonstationarity: - "Regime shifts" in productivity - "Phase shifts" in system structure

1. Simulate stochastically

2. Make parameters a function of age, size, density, ...

3. Include other components (static or dynamic) - Predators, prey, competitors - Bycatch/discards - Environmental variables...

1. Natural variability

What scientists have done to deal with ...
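As a concrete illustration of option 1 above (simulate stochastically), here is a minimal Ricker spawner-recruit simulation with lognormal process error standing in for natural variability; all parameter values are assumptions for illustration, not from the talk.

```python
# Sketch of "simulate stochastically": a Ricker spawner-recruit model
# with lognormal process error representing natural variability.
# Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
a, b = 1.5, 0.01          # Ricker productivity and density dependence
sigma = 0.6               # SD of natural (process) variation
n_years = 100

spawners = np.empty(n_years)
spawners[0] = 100.0
for t in range(n_years - 1):
    # Lognormal process error: exp(N(0, sigma^2)) multiplies the mean.
    recruits = spawners[t] * np.exp(a - b * spawners[t]
                                    + rng.normal(0.0, sigma))
    spawners[t + 1] = recruits * (1.0 - 0.3)   # fixed 30% harvest rate

print(spawners.round(1))
```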

1. Natural variability

2. Observation error

3. Unclear structure of fishery system

4. Outcome uncertainty

5. Inadequate communication

Sources of uncertainty

Uncertainty | Problems created

2. Observation error (bias and imprecision in input data)

• Biased/imprecise estimates of model parameters - Incorrect functional form of error term

• Biased output indicators

• Wrong probability distributions

1. Assume % of total variance due to observation error

2. Conduct sensitivity analyses

3. Use hierarchical models that "pool" information to help "average out" annual observation error

- Jerome Fiechter et al. using hierarchical Bayesian models on NEMURO (NPZD-based)

2. Observation error

What scientists have done to deal with ...
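A minimal sketch of option 1 in the list above (assume a percentage of total variance is due to observation error); the residuals and the 50% share are invented for illustration.

```python
# Sketch of option 1: assume a fixed share of the total variance in
# log(R/S) residuals is observation error, and split it out from
# natural (process) variation. Numbers are illustrative, not real data.
import numpy as np

resid = np.array([0.4, -0.2, 0.7, -0.5, 0.1, -0.3, 0.6, -0.1])
total_var = resid.var(ddof=1)

obs_share = 0.5                    # assumed fraction from observation error
obs_var = obs_share * total_var    # observation (measurement) error
proc_var = total_var - obs_var     # natural (process) variation

print(f"total={total_var:.3f}, obs SD={obs_var**0.5:.3f}, "
      f"process SD={proc_var**0.5:.3f}")
```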

[Figure: pink salmon; stock-by-stock estimates (about 40 stocks, ordered from Alaska in the north to B.C. and Washington in the south) of the change in salmon productivity, log_e(R/S), per degree C increase in summer sea-surface temperature, ranging roughly -0.5 to 1.0; separate single-stock analyses are compared with a multi-stock, mixed-effects model. Mueter et al. (2002a)]

4. Separately estimate natural variation and observation error
-- Errors-in-variables models
-- State-space models
-- Kalman filter

Example 1: tracking a nonstationary productivity parameter (Ricker a value)

2. Observation error ... (continued)

[Figure: simulation test tracking the productivity parameter (Ricker a) over 100 years under low, high, and decreasing productivity regimes; the Kalman filter tracks the "true" trajectory more closely than the standard method. Peterman et al. (2000)]

• Kalman filter with random-walk system equation was best across all types of nonstationarity
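A minimal sketch of that random-walk Kalman filter, applied to a simulated decline in productivity. All variances and the simulated series are illustrative, and the Ricker b term is assumed known and already subtracted out of the observations, so that y_t = a_t + observation error.

```python
# Sketch of the Kalman-filter approach in Peterman et al. (2000):
# time-varying Ricker productivity a_t follows a random walk,
# a_t = a_{t-1} + w_t, observed through y_t = a_t + v_t
# (here y_t = log(R_t/S_t) + b*S_t with b assumed known).
import numpy as np

def kalman_rw(y, obs_var, sys_var, a0=1.0, p0=1.0):
    """Univariate Kalman filter with a random-walk state."""
    a_hat, p = a0, p0
    estimates = []
    for yt in y:
        # Predict: the random walk keeps the mean, inflates the variance.
        p_pred = p + sys_var
        # Update: blend prediction and observation via the Kalman gain.
        k = p_pred / (p_pred + obs_var)
        a_hat = a_hat + k * (yt - a_hat)
        p = (1.0 - k) * p_pred
        estimates.append(a_hat)
    return np.array(estimates)

# Simulate a declining "true" productivity plus noisy observations.
rng = np.random.default_rng(0)
true_a = np.linspace(2.0, 1.0, 50)            # nonstationary decline
y = true_a + rng.normal(0.0, 0.4, size=50)    # observation error
print(kalman_rw(y, obs_var=0.16, sys_var=0.01).round(2))
```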

Example 2 of observation error and natural variation

Simplest possible model: the spawner-recruit relationship (Su and Peterman 2009, in prep.)

- Used an operating model to determine the statistical properties of various parameter-estimation schemes:
-- Bias
-- Precision
-- Coverage probabilities (accuracy of the estimated width of the probability interval for a parameter)

2. Observation error ... (continued)

Operating model (simulator to test methods)

[Diagram, built up over several slides: test the performance of an estimator. User-specified "true" underlying parameter values ("What if ...?") -> generate "observed data" from natural variation and observation error -> parameters estimated -> compare "true" and estimated values; repeat for 200 trials.]

[Figure: % relative bias in the estimated Ricker parameter (true value = 2) for four estimation methods (extended Kalman filter, errors-in-variables, Bayesian state-space, standard Ricker), across proportions of total variance due to measurement error (0.25, 0.75) and low, variable, and high harvest-rate histories.]

• Results also change with the true parameter value

• Results for 95% coverage probabilities: uncertainty in the estimated Ricker parameter is too narrow (overconfident) for all 4 estimation methods; the estimated probability distribution is narrower than the actual one

- Trade-off between bias and variance (Adkison 2009, Ecol. Applic. 19:198)

Recommendation
• Test parameter estimation methods before applying them (Hilborn and Walters 1992)

• Use results with humility, caution - Parameter estimates for ecosystem models may inadvertently be quite biased!

1. Natural variability

2. Observation error

3. Unclear structure of fishery system

4. Outcome uncertainty

5. Inadequate communication

Sources of uncertainty

Uncertainty | Problems created

3. Unclear structure of fishery system (model uncertainty, structural uncertainty, model misspecification)

• Biased/imprecise parameters

• Wrong system dynamics

• Overconfidence in results if only use one model

1. Choose single "best" model among alternatives1a. Informally1b. Formally using model selection criterion (AICc)

3. Unclear structure of fishery system

What scientists have done to deal with ...

Caution!! - Not appropriate for giving management advice - Asymmetric loss functions

(Walters and Martell 2004, p. 101)

[Figure: asymmetric loss, which case is preferred? SSB/SSBmsy (0.2 to 1.0) for species A and B under Case 1 and Case 2.]

[Figure: Fraser River Early Stuart sockeye salmon; the best "management-adjustment" model (H, T, Q, or T+Q) changes with the O:U preference ratio (0.25 to 4), from asymmetric loss with the spawning objective favoured, through symmetric loss, to asymmetric loss with the harvest objective favoured. Cummings (2009)]

Recommendation
• To develop appropriate indicators, ecosystem scientists should understand asymmetry in managers' objectives, especially given many species.

3. Unclear structure of fishery system

What scientists have done to deal with ...

1c. Adaptive management experiment - Sainsbury et al. in Australia

1. Choose a single "best" model among alternatives ...

More commonly, we have to consider a range of alternative models ...

3. Unclear structure of fishery system ... (cont'd.)

[Figure: Eastern Scotian Shelf cod (fishery closed in the mid-1990s); estimates of SSB (thousands of tonnes) versus F × 1000 from three assessment models (VPA, stock synthesis, delay-difference) across a range of natural mortality (M) values. R. Mohn (2009)]

2. Retain multiple models; conduct sensitivity analyses
2a. Analyze separately
2b. Combine predictions from alternative models
- Unweighted model averaging
- Weighted with AIC weights or posterior probabilities, then calculate expected values of indicators (a sketch of AIC weighting follows below)

• But weighting assumes managers use expected-value objectives
- Many use mini-max objectives (i.e., choose the action with the lowest chance of the worst-case outcome)

3. Unclear structure of fishery system ... (cont'd.)
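As promised above, a minimal sketch of option 2b (AIC weighting and model averaging), contrasted with a mini-max view; the AIC values and SSB predictions are invented for illustration.

```python
# Sketch of option 2b: combine predictions from alternative models
# using Akaike weights, then compute the expected value of an
# indicator. AIC values and SSB predictions are illustrative.
import numpy as np

aic = np.array([210.0, 212.5, 215.0])      # three candidate models
ssb_pred = np.array([45.0, 60.0, 30.0])    # each model's SSB ('000 t)

delta = aic - aic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()                   # Akaike weights

expected_ssb = np.sum(weights * ssb_pred)  # expected-value objective
print(weights.round(3), f"expected SSB = {expected_ssb:.1f}")

# A mini-max manager instead focuses on the worst-case prediction:
print("worst-case SSB =", ssb_pred.min())
```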

[Figure: probability distribution of SSB/SSBtarget under management action A, showing the limit reference point, the expected SSB (the weighted average), and the worst-case outcome (unlikely, but a mini-max manager chooses the action with the lowest probability of it).]

Recommendation• Ecosystem scientists should work iteratively with managers to find the most useful indicators to reflect management objectives.

2. Retain multiple models; conduct sensitivity analyses ...

...2c. Evaluate alternative ecosystem assessment models by using an operating model to determine their statistical properties (e.g., Fulton et al. 2005 re: community indicators)

3. Unclear structure of fishery system ... (cont'd.)

2. Retain multiple models; conduct sensitivity analyses ...
2d. Evaluate alternative ecosystem assessment models within closed-loop simulation (MSE) to determine robust management strategies across a range of operating models

Caution!!!! Elaborated upon later.

3. Unclear structure of fishery system ... (cont'd.)


Recommendation• Ecosystem scientists should compare management advice from multiple models.

• Models are "sketches" of real systems, not mirrors- Only essential features

Appropriate ecosystem model sketches?

ESAM, MRM, GADGET, SEAPODYM, EwE, Atlantis, ...?

• "A model should be as simple as possible, but no simpler than necessary" [and no more complex either!]

- Morgan and Henrion (1990)

• Appropriate model complexity depends on: - Type of questions/advice (Plagányi 2007) - Knowledge and data

[Figure: "adaptive radiation" of ecosystem models; effectiveness and predictive power (low to high) plotted against model complexity (low to high). Fulton et al. (2003), others]

• Build multiple (nested) models of a given system - Which model is best for the questions? - Yodzis (1998) could omit 44% of interactions

Recommendation: How ecosystem scientists can deal with structural uncertainty ... (continued)

• Conduct closed-loop management strategy evaluations (MSEs) across a wide range of hypothesized operating models of the aquatic ecosystem

- "Best practice"
-- Plagányi (2007)
-- Tivoli meeting (FAO 2008)
-- NEMoW I report (Townsend et al. 2008)

1. Natural variability

2. Observation error

3. Unclear structure of fishery system

4. Outcome uncertainty

5. Inadequate communication

Sources of uncertainty

Uncertainty | Problems created

4. Outcome uncertainty, i.e., deviation from target (implementation uncertainty, implementation error, management uncertainty)

Results from:
- Non-compliance
- Inappropriate regulations
- Physical and biological factors affecting catchability (q)

• Overconfidence in:

- Meeting management objectives

- Avoiding undesirable outcomes

• Surprises!

1. Empirically estimate it (historical deviations from targets)

4. Outcome uncertainty

What scientists have done to deal with ...

[Figure: "outcome uncertainty" for Early Stuart sockeye salmon, B.C. (1986-2003); target versus realized harvest rate (0.0 to 0.8) against the forecast of adults (0.0 to 2.0 million); the deviations are both imprecise and biased. Holt and Peterman (2006)]

2. Add outcome uncertainty as a stochastic process

3. Conduct sensitivity analyses on nature of outcome uncertainty
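A minimal sketch of option 2 above (add outcome uncertainty as a stochastic process), with the kind of bias and imprecision Holt and Peterman documented; the numerical values of the bias and SD are illustrative only.

```python
# Sketch of option 2: outcome uncertainty (implementation error)
# between target and realized harvest rates. Bias and SD are
# illustrative numbers, not estimates from the talk.
import numpy as np

rng = np.random.default_rng(7)

def realized_harvest_rate(target, bias=0.1, sd=0.15):
    """Realized rate = target + systematic bias + random deviation,
    truncated to [0, 1]."""
    u = target + bias + rng.normal(0.0, sd)
    return float(np.clip(u, 0.0, 1.0))

targets = [0.3] * 10
print([round(realized_harvest_rate(t), 2) for t in targets])
```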

[Figure: MSE with CLIM2, a 15-population salmon model (Dorner et al. 2009, in press); relative average catch (1.5 to 2.0) for six management strategies (a fixed 6% rule, Ricker, Ricker AR(1), Kalman filter, distance-based HBM, non-spatial HBM) under three outcome-uncertainty scenarios: none, imprecise and unbiased, and imprecise and biased. Imprecise and biased outcome uncertainty produced a 24% decrease in relative average catch.]

1. Natural variability

2. Observation error

3. Unclear structure of fishery system

4. Outcome uncertainty

5. Inadequate communication

Sources of uncertainty

Uncertainty | Problems created

5. Inadequate communication among scientists, decision makers, and stakeholders

• Missing indicators due to unclear operational objectives

• Misinterpretation

• Overconfidence by decision makers if uncertainties not clear

• Decision makers may undervalue scientific advice

What scientists have done to deal with ...

1. Work iteratively with stakeholders and decision makers - Clarify management objectives and indicators

-- Maximize expected value, mini-max, or ...?

2. Conduct sensitivity analyses on mgmt. objectives

5. Inadequate communication

5. Inadequate communication ... (continued)

Recommendation:
3. Show indicators with uncertainties - Use cognitive psychologists' findings about how people think about uncertainties and risks
-- Cumulative probability distributions
-- Frequency format, not decimal probability format (due to six interpretations of "probability", only one of which is "chance")

“Chance" of an outcome for a given set of management regulations:

Probability format"There is a probability of 0.2 that SSB will drop below its limit reference point"

“Chance" of an outcome for a given set of management regulations:

Probability format"There is a probability of 0.2 that SSB will drop below its limit reference point"

Frequency format"In two out of every 10 situations like this, SSB will drop below its limit reference point".

“Chance" of an outcome for a given set of management regulations:

Probability format"There is a probability of 0.2 that SSB will drop below its limit reference point"

Frequency format"In two out of every 10 situations like this, SSB will drop below its limit reference point".

Gerd Gigerenzer et al.
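The rewording itself is trivial to automate when reports are generated; a toy helper, with a wording template of our own devising rather than anything from Gigerenzer:

```python
# Toy sketch: express a decimal probability in frequency format.
def frequency_format(p, outcome, n=10):
    """Express probability p of an outcome as 'k out of every n'."""
    k = round(p * n)
    return f"In {k} out of every {n} situations like this, {outcome}."

print(frequency_format(0.2,
      "SSB will drop below its limit reference point"))
# -> In 2 out of every 10 situations like this, SSB will drop
#    below its limit reference point.
```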

4. Creatively display multiple indicators, and trade-offs among them

5. Inadequate communication ... (continued)

Recommendation

(Fulton, Smith, and Smith 2007)

[Figure: radar plots (kite diagrams) comparing multiple ecosystem indicators (target and bycatch species, habitat, microfauna, sharks, TEP species such as marine mammals and seabirds, pelagic:demersal ratio, piscivore:planktivore ratio, biomass size spectra) across management scenarios 1 and 4.]

[Figure: AMOEBA plots for the North Sea (Collie et al. 2003); biomass of ten stocks (HAD, COD, WHG, PLE, SOL, MAC, NOP, SAN, HER, POK) relative to Bpa (precautionary biomass) under status quo effort versus precautionary effort.]

[Figures: Yukon River fall chum salmon (Collie et al., in prep.); contour plots of average spawners (1000s), average subsistence catch (1000s), average commercial catch (1000s), and the % of years the commercial fishery is closed, each as a function of the target number of spawners (1000s) and the harvest rate on the run exceeding target spawners.]

[Figure: the same chum salmon trade-offs (spawning target, proportion harvested, average subsistence and commercial catch) shown interactively in the Vismon software, in prep. Booshehrian, Moeller, et al.]

Comment on trade-offs

• Remind managers and stakeholders:

[Diagram, built up over several slides: stated versus actual uncertainty (low to high) for ecological indicators and for socio-economic indicators.]

- Apply the same standards to economists/social scientists as to ecologists!!!

Recommendations to deal with inadequate communication

1. Formal training: scientists <-> decision makers and stakeholders

2. "User studies" about the effectiveness of communication methods

3. Develop interactive, hierarchical information systems to show: - Management options - Consequences - Trade-offs - Uncertainties

4. Develop communications strategies like Intergovernmental Panel on Climate Change (IPCC):

Recommendations to deal with inadequate communication ...

IPCC

• Advises decision makers and stakeholders

• Communication challenges:

- Complexity - Uncertainty - Risks - Credibility

How IPCC solves these communication challenges

1. Multi-level information systems: IPCC (2007) reports
a. Aim at multiple audiences
b. Hierarchical
c. Numerous footnotes (~ hypertext links)
d. Diverse graphics

How IPCC solves these communication challenges ...

2. Standardized format for describing uncertainties associated with "essential statements":

- Chance of an outcome- Confidence in that estimated chance of that outcome

- "...very high confidence that there is a high chance of ..."

- "We have medium confidence that ..."

• Similar to recent Marine Stewardship Council guidelines

1. Natural variability

2. Observation error

3. Unclear structure of fishery system

4. Outcome uncertainty

5. Inadequate communication

Sources of uncertainty

What scientists have done to deal with a combination of the first 4 sources of uncertainty

• Simulations of entire fishery systems
- Closed-loop simulations (Walters 1986)
- Management strategy evaluations (MSEs) (Punt and Butterworth, early 1990s)
- Which management procedure is most robust to uncertainties
-- A single management procedure includes:
--- Data collection method
--- Stock or ecosystem assessment model
--- State-dependent harvest rule
(A skeleton of one such closed-loop replicate is sketched below.)
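The skeleton below shows one replicate of such a closed-loop simulation, using a deliberately simple single-stock operating model in place of an ecosystem model like Atlantis; every function and numerical value here is an illustrative stand-in, not the method of any cited study.

```python
# Skeleton of one closed-loop (MSE) replicate: an operating model
# generates the "true" system; a management procedure (data
# collection, assessment, state-dependent harvest rule) acts on it
# each year, with outcome uncertainty at implementation.
import numpy as np

rng = np.random.default_rng(3)

def operating_model(b, u, a=1.5, dens=0.01, proc_sd=0.4):
    """'True' dynamics: Ricker with process error, then harvest."""
    recruits = b * np.exp(a - dens * b + rng.normal(0, proc_sd))
    return (1.0 - u) * recruits

def observe(b, obs_sd=0.3):
    """Survey with lognormal observation error."""
    return b * np.exp(rng.normal(0, obs_sd))

def assess_and_decide(b_obs, b_lim=30.0, u_max=0.4):
    """State-dependent harvest rule on the assessed biomass."""
    return 0.0 if b_obs < b_lim else u_max

def implement(u_target, bias=0.05, sd=0.1):
    """Outcome uncertainty: realized rate deviates from target."""
    return float(np.clip(u_target + bias + rng.normal(0, sd), 0, 1))

b = 100.0
for year in range(50):
    u = implement(assess_and_decide(observe(b)))
    b = operating_model(b, u)
print(f"final biomass: {b:.1f}")
```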

Closed-loop simulation or MSE: ~ flight simulator

[Diagram: a flight simulator builds robust procedures for responding to unexpected events; the uncertainties include unusual weather, random events, and equipment failure.]

[Diagram, built up over several slides (Peterman 2004); the entire diagram = closed-loop simulation (MSE). An operating model such as Atlantis represents the natural aquatic system, with natural variability, structural uncertainty, and "what we know / what we don't know". Sampling and data collection (observation error) feed an ecosystem assessment model (ESAM, MRM, EwE, GADGET, ...). Decision makers (harvest rules) and stakeholders, with management objectives (inadequate communication), set fishing regulations (harvest quotas, closed areas, ...); harvesting acts back on the natural system with outcome uncertainty.]

MSEs include iterating across all major hypotheses about the operating model.

Result of MSE: identifies the relative merits of management procedures for meeting management objectives.

Caution: Substantial challenges ahead!

1. Characterizing the operating model - Range of alternative hypotheses - Reliability of predictions from ecosystem models - Nonstationary environment (what if ...?)

2. Simulating the ecosystem assessment process based on "observed" data using GADGET, an ESAM, ... - Automation of the assessment process

3. Engaging scientists with decision makers, stakeholders

Conducting MSEs of ecosystem models

4. Simulating outcome uncertainty (deviation from target)

- Lack of data

5. Simulating the state-dependent decision-making process - Lack of clear operational ecosystem objectives and indicators - Complex objectives: optimize for one, make trade-offs for others (Smith et al., Mapstone et al., and others in the Dec. 2008 Fisheries Research)

plus ...

Conducting MSEs of ecosystem models

Can indicators of ecosystems from PCAs be used as measures of system state for input to harvest rules? (Link et al. 2002)

[Figure: ecosystem states A, B, and C plotted on principal components PC 1 and PC 2; a rule maps ecosystem status (similarity to PCA category A, B, or C) to a fishing mortality F.]
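A minimal sketch of that idea: project a table of ecosystem indicators onto principal components and let similarity to reference states drive F. The indicator data, reference-state positions, and F levels are all invented for illustration.

```python
# Sketch of the idea behind Link et al. (2002): reduce ecosystem
# indicators to principal components; use distance to reference
# states (A, B, C) in PC space as input to a harvest rule.
# Data, reference states, and F values are invented.
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(30, 6))      # 30 years x 6 ecosystem indicators
Xc = X - X.mean(axis=0)
# PCA via SVD: rows of Vt are the principal axes.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T            # positions on PC 1 and PC 2

# Hypothetical reference states on the PC plane and their F levels.
ref_states = {"A": np.array([2.0, 0.0]),
              "B": np.array([0.0, 0.0]),
              "C": np.array([-2.0, 0.0])}
f_by_state = {"A": 0.3, "B": 0.2, "C": 0.05}

current = scores[-1]              # this year's ecosystem status
nearest = min(ref_states,
              key=lambda k: np.linalg.norm(current - ref_states[k]))
print(f"most similar to state {nearest}; apply F = {f_by_state[nearest]}")
```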

6. Interpreting results - Across multiple indicators and sensitivity analyses

7. Computations - CPU time

Conducting MSEs of ecosystem models

1. Need standards for evaluating reliability of models

2. If fitting ecosystem models to data, use operating models to check adequacy of estimation methods

3. Evaluate how much difference will be made by proposed "improvements" to ecosystem models (more complex not necessarily better)

4. Clarify operational management objectives and indicators that reflect ecosystem concerns

5. Analyze multiple models

Recommendations for next steps for ecosystem models

6. If using the MSE approach (Tivoli, Plagányi, NEMoW I)
- Start simply (ESAMs, MRMs) for assessment models (Butterworth and Plagányi 2004)
- Choose an operating model (e.g., Atlantis)
- Build experience
- Determine the feasibility of MSEs for evaluating more complex assessment models (GADGET, EwE, ...)

7. Add to "Best practices" - Standardized protocol for determining performance of multiple assessment models for a given aquatic

ecosystem. - Training/gaming workshops to improve communication

Recommendations for next steps for ecosystem models

Reminders

• Sensitivity analyses should focus on finding which components cause changes in management advice.

• We probably underestimate the magnitude of uncertainty in estimates of parameters and state variables.

• C.S. Holling: "The domain of our ignorance is larger than the domain of our knowledge."
