Part 3: Robust Bayesian statistics & applications in reliability networks
by Gero Walter

Tuesday 9:00-12:30



Outline

- Robust Bayesian Analysis (9am)
  - Why
  - The Imprecise Dirichlet Model
  - General Framework for Canonical Exponential Families
- Exercises I (9:30am)
- System Reliability Application (10am)
- Break (10:30am)
- Exercises II (11am)


Robust Bayesian Analysis: Why

- choice of prior can severely affect inferences, even if your prior is 'non-informative'
- solution: systematic sensitivity analysis over prior parameters
- models from the canonical exponential family make this easy to do [18]
- close relations to the robust Bayes literature, e.g. [7, 19, 20]
- concerns uncertainty in the prior (uncertainty in the data generating process: imprecise sampling models)
- here: focus on the imprecise Dirichlet model
- if your prior is informative, then prior-data conflict can be an issue [31, 29] (we'll come back to this in the system reliability application)


Robust Bayesian Analysis: Principle of Indifference

How to construct a prior if we do not have a lot of information?

Laplace: Principle of Indifference
Use the uniform distribution.
Obvious issue: this depends on the parametrisation!

Example
An object of 1 kg has uncertain volume V between 1 ℓ and 2 ℓ.
- Uniform distribution over volume V  ⇒  E(V) = 1.5 ℓ.
- Uniform distribution over density ρ = 1/V  ⇒
      E(V) = E(1/ρ) = ∫_{0.5}^{1} (2/ρ) dρ = 2(ln 1 − ln 0.5) ≈ 1.39 ℓ

The uniform distribution does not really model prior ignorance.
(The Jeffreys prior is transformation-invariant, but depends on the sample space and can break decision making!)
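A quick numerical check of the example above (a minimal R sketch; ad hoc code, not tied to any package):

    # Uniform prior on the volume V in [1, 2] litres:
    EV.vol <- integrate(function(v) v * dunif(v, 1, 2), 1, 2)$value            # 1.5
    # Uniform prior on the density rho = 1/V in [0.5, 1]; then V = 1/rho:
    EV.dens <- integrate(function(r) (1 / r) * dunif(r, 0.5, 1), 0.5, 1)$value # 2*log(2) ~ 1.386
    c(EV.vol, EV.dens)

The prior expectation of V changes with the parametrisation, even though both priors are 'uniform'.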


Robust Bayesian Analysis: Prior Ignorance via Sets of Probabilities

How to construct a prior if we do not have a lot of information?

Boole: Probability Bounding
Use the set of all probability distributions (vacuous model).
Results no longer depend on the parametrisation!

Example
An object of 1 kg has uncertain volume V between 1 ℓ and 2 ℓ.
- Set of all distributions over volume V  ⇒  E(V) ∈ [1, 2].
- Set of all distributions over density ρ = 1/V  ⇒  E(V) = E(1/ρ) ∈ [1, 2]


Robust Bayesian Analysis: Prior Ignorance via Sets of Probabilities

Theorem
The set of posterior distributions resulting from a vacuous set of prior distributions is again vacuous, regardless of the likelihood.
We can never learn anything when starting from a vacuous set of priors.

Solution: Near-Vacuous Sets of Priors
Only insist that the prior predictive, or other classes of inferences, are vacuous.
This can be done using sets of conjugate priors [4, 5].



The Imprecise Dirichlet Model: Definition

- introduced by Peter Walley [27, 28]
- for multinomial sampling, k categories 1, 2, ..., k
- Bayesian conjugate analysis
- multinomial likelihood (sample n = (n_1, ..., n_k), Σ_i n_i = n):

      f(n | θ) = n! / (n_1! ··· n_k!) · ∏_{i=1}^k θ_i^{n_i}

- conjugate Dirichlet prior, with mean t = (t_1, ..., t_k) = prior expected proportions and parameter s > 0:

      f(θ) = Γ(s) / (∏_{i=1}^k Γ(s t_i)) · ∏_{i=1}^k θ_i^{s t_i − 1}

Definition (Imprecise Dirichlet Model)
Use the set M^(0) of all Dirichlet priors, for a fixed s > 0, and take the infimum/supremum over t of the posterior to get lower/upper predictive probabilities/expectations.


The Imprecise Dirichlet Model: Properties

- conjugacy: f(θ | n) is again Dirichlet, with parameters

      t*_i = (s t_i + n_i) / (s + n) = s/(s + n) · t_i + n/(s + n) · n_i/n,        s* = s + n

- t*_i = E(θ_i | n) = P(i | n) is a weighted average of t_i and n_i/n, with weights proportional to s and n, respectively
- s can be interpreted as a prior strength or pseudocount
- lower and upper expectations / probabilities by min and max over t ∈ Δ (the unit simplex)

The Imprecise Dirichlet Model: Properties
Posterior predictive probabilities

- for observing a particular category i:

      P̲(i | n) = n_i / (s + n),        P̄(i | n) = (s + n_i) / (s + n)

- for observing a non-trivial event A ⊆ {1, ..., k}:

      P̲(A | n) = n_A / (s + n),        P̄(A | n) = (s + n_A) / (s + n),        with n_A = Σ_{i∈A} n_i

Satisfies prior near-ignorance: vacuous for the prior predictive, P̲(A) = 0, P̄(A) = 1.
Inferences are independent of the categorisation ('Representation Invariance Principle').
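These bounds are easy to evaluate directly (a minimal ad hoc R sketch, not part of any package):

    # Lower/upper posterior predictive probability of an event A in the IDM,
    # given n_A observations in A out of n observations in total, prior strength s.
    idm_bounds <- function(nA, n, s = 2) {
      c(lower = nA / (s + n), upper = (s + nA) / (s + n))
    }
    idm_bounds(nA = 0, n = 0)     # prior predictive: vacuous, [0, 1]
    idm_bounds(nA = 1, n = 1)     # one observation in A: [1/3, 1]
    idm_bounds(nA = 10, n = 100)  # imprecision shrinks like s / (s + n)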

The Imprecise Dirichlet Model: Why A Set of Priors?

- a single prior ⇒ dependence on the categorisation
- for example, a single Dirichlet prior (with t_A = Σ_{i∈A} t_i, s = 2):

      P(A | n) = (2 t_A + n_A) / (n + 2)

- one red marble observed:
  - two categories red (R) and other (O): prior ignorance ⇒ t_R = t_O = 1/2 ⇒ P(R | n) = 2/3
  - three categories red (R), green (G), and blue (B): prior ignorance ⇒ t_R = t_G = t_B = 1/3 ⇒ P(R | n) = 5/9

prior ignorance + representation invariance principle ⇒ must use a set of priors
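The numbers above are quick to verify, and they contrast with the IDM bounds, which do not depend on the categorisation (ad hoc R sketch, reusing idm_bounds() from the previous slide):

    # Single Dirichlet prior with s = 2 and uniform prior means t_i = 1/k,
    # after observing one red marble (n_R = 1, n = 1):
    single_prior <- function(k, nR = 1, n = 1, s = 2) (s * (1 / k) + nR) / (s + n)
    single_prior(k = 2)   # red vs other:          2/3
    single_prior(k = 3)   # red vs green vs blue:  5/9

    # IDM bounds for the same data are the same for any number of categories:
    idm_bounds(nA = 1, n = 1)   # [1/3, 1]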


The Imprecise Dirichlet Model: The s Parameter

- s can be interpreted as a prior strength or pseudocount
- s determines the learning speed:

      P̄(A | n) − P̲(A | n) = s / (s + n)

- no objective way of choosing s, but s = 2 covers most Bayesian and frequentist inferences
- for s = n, the posterior imprecision is half the prior imprecision
- for informative t_i bounds, using a range of s values allows the set of posteriors to reflect prior-data conflict (see the system reliability application)


General Framework for Canonical Exponential Families

Conjugate priors like the Dirichlet can be constructed for sample distributions (likelihoods) from:

Definition (Canonical exponential family)

      f(x | ψ) = h(x) exp{ψᵀ τ(x) − b(ψ)}

- includes the multinomial, normal, Poisson, exponential, ...
- ψ is generally a transformation of the original parameter θ

Definition (Family of conjugate priors)
A family of priors for i.i.d. sampling from the canonical exponential family:

      f(ψ | n^(0), y^(0)) ∝ exp{n^(0) [ψᵀ y^(0) − b(ψ)]}

with hyper-parameters n^(0) (↔ s) and y^(0) (↔ t from the IDM).


General Framework for Canonical Exponential Families

Theorem (Conjugacy)
The posterior is of the same form:

      f(ψ | n^(0), y^(0), x) ∝ exp{n^(n) [ψᵀ y^(n) − b(ψ)]}

where, for x = (x_1, ..., x_n),

      (s* ↔)   n^(n) = n^(0) + n
      (t* ↔)   y^(n) = n^(0)/(n^(0) + n) · y^(0) + n/(n^(0) + n) · τ(x)/n
      (n_i ↔)  τ(x) = Σ_{i=1}^n τ(x_i)

General Framework for Canonical Exponential Families

- y^(0) (↔ t) = prior expectation of τ(x)/n
- n^(0) (↔ s) determines spread and learning speed
- usefulness of this framework for IP / robust Bayes discovered by Quaghebeur & de Cooman [18]
- near-noninformative sets of priors developed by Benavoli & Zaffalon [4, 5]
- for informative sets of priors, Walter & Augustin [29, 31] suggest using parameter sets Π^(0) = [n̲^(0), n̄^(0)] × [y̲^(0), ȳ^(0)]


General Framework: Why vary n^(0) (↔ s)?

What if prior assumptions and data tell different stories?

Prior-Data Conflict
- informative prior beliefs and trusted data (sampling model correct, no outliers, etc.) are in conflict
- "the prior [places] its mass primarily on distributions in the sampling model for which the observed data is surprising" [10]
- there are not enough data to overrule the prior

Example: IDM with k = 2  ⇒  Imprecise Beta Model

Imprecise Beta Model (IBM)

- binomial likelihood (observing x successes in n trials):

      f(x | θ) = (n choose x) θ^x (1 − θ)^(n−x)

- conjugate Beta prior, with mean y^(0) = prior expected probability of success and prior strength parameter n^(0) > 0:

      f(θ) ∝ θ^(n^(0) y^(0) − 1) (1 − θ)^(n^(0) (1 − y^(0)) − 1)

- informative set of priors: use the set M^(0) of Beta priors with y^(0) ∈ [y̲^(0), ȳ^(0)] and
  - n^(0) > 0 fixed, or
  - n^(0) ∈ [n̲^(0), n̄^(0)]
- E(θ | x) = y^(n) is a weighted average of E(θ) = y^(0) and x/n!
- Var(θ | x) = y^(n) (1 − y^(n)) / (n^(n) + 1) decreases with n!
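For a single Beta prior, the update in the (n^(0), y^(0)) parametrisation is a two-liner (a minimal ad hoc R sketch; it corresponds to the usual Beta(α, β) parameters via α = n^(0) y^(0), β = n^(0) (1 − y^(0))):

    # Update a Beta prior, parametrised by strength n0 and mean y0,
    # after observing x successes in n trials.
    ibm_update <- function(n0, y0, x, n) {
      nn <- n0 + n                        # posterior strength n^(n)
      yn <- (n0 * y0 + x) / (n0 + n)      # posterior mean y^(n), weighted average of y0 and x/n
      c(nn = nn, yn = yn, var = yn * (1 - yn) / (nn + 1))
    }
    ibm_update(n0 = 8, y0 = 0.75, x = 12, n = 16)
    # same posterior as Beta(alpha = n0*y0 + x, beta = n0*(1 - y0) + n - x)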


Imprecise Beta Model with n^(0) fixed

[Figure: prior set {n^(0)} × [y̲^(0), ȳ^(0)] and the corresponding posterior set in the (n, y) plane, for two scenarios.]

- no conflict: prior n^(0) = 8, y^(0) ∈ [0.7, 0.8]; data s/n = 12/16 = 0.75
  ⇒ posterior n^(n) = 24, y^(n) ∈ [0.73, 0.77]
- prior-data conflict: prior n^(0) = 8, y^(0) ∈ [0.2, 0.3]; data s/n = 16/16 = 1


Imprecise Beta Model with n^(0) interval

[Figure: prior parameter set Π^(0) = [n̲^(0), n̄^(0)] × [y̲^(0), ȳ^(0)] and the corresponding posterior set in the (n, y) plane, for two scenarios.]

- no conflict: prior n^(0) ∈ [4, 8], y^(0) ∈ [0.7, 0.8]; data s/n = 12/16 = 0.75
  ⇒ y^(n) ∈ [0.73, 0.77], "spotlight" shape
- prior-data conflict: prior n^(0) ∈ [4, 8], y^(0) ∈ [0.2, 0.3]; data s/n = 16/16 = 1
  ⇒ y^(n) ∈ [0.73, 0.86], "banana" shape
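The posterior intervals quoted on this and the previous slide can be reproduced by searching over the corners of the prior parameter set (an ad hoc R sketch; y^(n) is monotone in y^(0), and in n^(0) for fixed y^(0), so the corners suffice). With n^(0) fixed, the posterior mean interval has the same width whether or not the data conflict with the prior; with an interval for n^(0), the posterior interval widens under conflict.

    # Range of posterior means y^(n) over the corners of [n0] x [y0],
    # after observing x successes in n trials.
    yn_range <- function(n0, y0, x, n) {
      corners <- expand.grid(n0 = n0, y0 = y0)
      round(range((corners$n0 * corners$y0 + x) / (corners$n0 + n)), 2)
    }
    yn_range(n0 = 8,       y0 = c(0.7, 0.8), x = 12, n = 16)   # no conflict, n0 fixed:    0.73 0.77
    yn_range(n0 = 8,       y0 = c(0.2, 0.3), x = 16, n = 16)   # conflict,    n0 fixed:    0.73 0.77
    yn_range(n0 = c(4, 8), y0 = c(0.7, 0.8), x = 12, n = 16)   # no conflict, n0 interval: 0.73 0.77
    yn_range(n0 = c(4, 8), y0 = c(0.2, 0.3), x = 16, n = 16)   # conflict,    n0 interval: 0.73 0.86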


Robust Bayesian Analysis: Other Models

- How to define the set of priors M^(0) is a crucial modeling choice.
- Sets M^(0) defined via parameter sets Π^(0) seem to work better than other models discussed in the robust Bayes literature:
  - Neighbourhood models
    - set of distributions 'close to' a central distribution P_0
    - example: ε-contamination class {P : P = (1 − ε) P_0 + ε Q, Q ∈ Q}
    - not necessarily closed under Bayesian updating
  - Density ratio class / interval of measures
    - set of distributions given by bounds on the density function f(θ):

          M^(0)_{l,u} = {f(θ) : ∃ c ∈ R_{>0} : l(θ) ≤ c f(θ) ≤ u(θ)}

    - the posterior set is bounded by the updated l(θ) and u(θ)
    - u(θ)/l(θ) is constant under updating
    - the size of the set does not decrease with n
    - very vague posterior inferences



Exercise: Update step in the IBM

Consider first a single Beta prior with parameters n^(0) and y^(0).

Discuss how E[θ | x] = y^(n) and Var(θ | x) = y^(n) (1 − y^(n)) / (n^(n) + 1) behave when
(i) n^(0) → 0, (ii) n^(0) → ∞, (iii) n → ∞ with x/n constant.

What do the results for E[θ | x] and Var(θ | x) imply for the shape of f(θ | x)?

When considering a set of Beta priors with

      (n^(0), y^(0)) ∈ Π^(0) = [n̲^(0), n̄^(0)] × [y̲^(0), ȳ^(0)],

what does the weighted-average structure of y^(n) tell you about the updated set of parameters? (Remember the prior-data conflict example!)
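If you want to check your discussion numerically, here is a small exploration sketch in R (reusing the ad hoc ibm_update() helper from the IBM slide; the parameter values are arbitrary):

    # (i)  n0 -> 0: the posterior mean approaches x/n, the prior is ignored
    ibm_update(n0 = 1e-6, y0 = 0.3, x = 12, n = 16)
    # (ii) n0 -> infinity: the posterior mean approaches the prior mean y0
    ibm_update(n0 = 1e6,  y0 = 0.3, x = 12, n = 16)
    # (iii) n -> infinity with x/n = 0.75 fixed: the mean tends to 0.75, the variance to 0
    t(sapply(c(16, 160, 1600), function(n) ibm_update(n0 = 8, y0 = 0.3, x = 0.75 * n, n = n)))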

Exercise: Update step for canonically constructed priors

The canonically constructed prior for an i.i.d. sample distribution from the canonical exponential family

      f(x | ψ) = (∏_{i=1}^n h(x_i)) exp{ψᵀ τ(x) − n b(ψ)}

is given by

      f(ψ | n^(0), y^(0)) ∝ exp{n^(0) [ψᵀ y^(0) − b(ψ)]},

and the corresponding posterior by

      f(ψ | n^(0), y^(0), x) ∝ exp{n^(n) [ψᵀ y^(n) − b(ψ)]},

where

      y^(n) = n^(0)/(n^(0) + n) · y^(0) + n/(n^(0) + n) · τ(x)/n        and        n^(n) = n^(0) + n.

Confirm the expressions for y^(n) and n^(n).


System Reliability Application: Reliability Block Diagrams

[Figure: example reliability block diagram with four components of type 1, one of type 2, and one of type 3.]

Reliability block diagrams:
- the system consists of components k (different types k = 1, ..., K)
- each component either works or not
- the system works when there is a path using only working components

We want to learn about the system reliability R_sys(t) = P(T_sys > t) (the system survival function), based on
- component test data: n_k failure times for components of type k, k = 1, ..., K
- cautious assumptions on component reliability: expert information, e.g. from maintenance managers and staff


Nonparametric Component Reliability

Functioning probability p^k_t of a type-k component for each time t ∈ T = {t'_1, t'_2, ...}:
- discrete component reliability function R_k(t) = p^k_t, t ∈ T.

[Figure: four example component reliability functions p^k_t, plotted on a grid of time points t ∈ [0, 10].]

Use the Imprecise Beta Model to estimate the p^k_t's:
- a set of Beta priors for each p^k_t:
  p^k_t ∼ Beta(n^(0)_{k,t}, y^(0)_{k,t}) with (n^(0)_{k,t}, y^(0)_{k,t}) ∈ Π^(0)_{k,t} = [n̲^(0)_{k,t}, n̄^(0)_{k,t}] × [y̲^(0)_{k,t}, ȳ^(0)_{k,t}]
  - M^(0)_{k,t} can be made near-vacuous by setting [y̲^(0)_{k,t}, ȳ^(0)_{k,t}] = [0, 1]
  - M^(n)_{k,t} will reflect prior-data conflict for informative M^(0)_{k,t}
- failure times t^k = (t^k_1, ..., t^k_{n_k}) from component test data:
  the number of type-k components functioning at time t is S^k_t | p^k_t ∼ Binomial(n_k, p^k_t)
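A minimal sketch of how the component test data enter (ad hoc R code, not the package's implementation): for each time point t, count how many of the n_k tested type-k components are still functioning, and update the Beta prior for p^k_t as in the IBM. The failure times below are made up for illustration.

    fail.times <- c(1.2, 2.7, 3.1, 4.8, 6.0)       # hypothetical test data for one component type
    t.grid <- seq(0, 10, length.out = 21)          # time grid T
    n.k <- length(fail.times)
    S.t <- sapply(t.grid, function(t) sum(fail.times > t))   # S^k_t: number still functioning at t

    # Posterior Beta parameters for p^k_t at each t, for one (n0, y0) from the prior set:
    n0 <- 2; y0 <- 0.5
    post.alpha <- n0 * y0 + S.t
    post.beta  <- n0 * (1 - y0) + n.k - S.t
    post.mean  <- post.alpha / (post.alpha + post.beta)   # repeat over the corners of Pi^(0) for bounds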


Component Reliability with Sets of Priors

[Figure: three panels showing prior and posterior sets of component survival probability over time, for
 (a) prior strength interval [n̲^(0), n̄^(0)] = [1, 2],
 (b) [n̲^(0), n̄^(0)] = [1, 2] with near-vacuous bounds [y̲^(0), ȳ^(0)] = (0, 1),
 (c) [n̲^(0), n̄^(0)] = [1, 8].]


System Reliability

- Closed form for the system reliability via the survival signature:

      R_sys(t | ⋃_{k=1}^K {n^(0)_{k,t}, y^(0)_{k,t}, t^k}) = P(T_sys > t | ···)
        = Σ_{l_1=0}^{m_1} ··· Σ_{l_K=0}^{m_K} Φ(l_1, ..., l_K) ∏_{k=1}^K P(C^k_t = l_k | n^(0)_{k,t}, y^(0)_{k,t}, t^k)

Survival signature [8]: Φ(l_1, ..., l_K) = P(system functions | l_k components of type k function, k = 1, ..., K)

Excerpt of the survival signature table for the block diagram above (m_1 = 4, m_2 = m_3 = 1):

      l_1  l_2  l_3  Φ              l_1  l_2  l_3  Φ
       0    0    1   0                0    1    1   0
       1    0    1   0                1    1    1   0
       2    0    1   1/3              2    1    1   2/3
       3    0    1   1                3    1    1   1
       4    0    1   1                4    1    1   1

Posterior predictive probability that, in a new system, l_k of the m_k type-k components function at time t:

      (m_k choose l_k) ∫ (p^k_t)^{l_k} (1 − p^k_t)^{m_k − l_k} f(p^k_t | n^(0)_{k,t}, y^(0)_{k,t}, t^k) dp^k_t

- analytical solution for the integral: C^k_t | n^(0)_{k,t}, y^(0)_{k,t}, t^k ∼ Beta-Binomial
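A compact sketch of this computation in R (ad hoc code, not the ReliabilityTheory implementation): given a survival signature table and, for each component type, posterior Beta parameters at a fixed time t, sum Φ times the product of Beta-Binomial probabilities. Lower/upper system reliability bounds then come from optimising the (n^(0), y^(0)) parameters over Π^(0) (next slide).

    # Beta-Binomial pmf: P(C = l) functioning out of m, when p ~ Beta(a, b).
    dbetabinom <- function(l, m, a, b) choose(m, l) * beta(l + a, m - l + b) / beta(a, b)

    # System reliability at one time point.
    # 'signature': data.frame with columns l1, ..., lK and Phi (one row per combination);
    # 'post': list with one entry per component type, each a list(m = ..., a = ..., b = ...).
    rsys_at_t <- function(signature, post) {
      K <- length(post)
      probs <- apply(signature[, 1:K, drop = FALSE], 1, function(l)
        prod(sapply(seq_len(K), function(k)
          dbetabinom(l[k], post[[k]]$m, post[[k]]$a, post[[k]]$b))))
      sum(signature$Phi * probs)
    }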


System Reliability Bounds

- Bounds for R_sys(t | ⋃_{k=1}^K {n^(0)_{k,t}, y^(0)_{k,t}, t^k}) over Π^(0)_{k,t}, k = 1, ..., K:
  - min R_sys(·) is attained at y^(0)_{k,t} = y̲^(0)_{k,t}, for any n^(0)_{k,t} [30, Theorem 1]
  - min R_sys(·) is attained at n̲^(0)_{k,t} or n̄^(0)_{k,t} according to simple conditions [30, Theorem 2 & Lemma 3]
  - numeric optimization over [n̲^(0)_{k,t}, n̄^(0)_{k,t}] in the very few cases where Theorem 2 & Lemma 3 do not apply
- implemented in the R package ReliabilityTheory [1]

System Reliability Bounds

[Figure: prior and posterior sets of survival functions over time for the component types M, H, C and P and for the system built from components M, H, C1-C4, P1-P4.]


Exercise: Try it yourself!

- Download the development version of the package ReliabilityTheory:

      library("devtools")
      install_github("louisaslett/ReliabilityTheory")

  (The author is present!)
- You can find a How-To in Appendix B (p. 32) of the preprint paper at https://arxiv.org/abs/1602.01650.
- Exercise 1 considers a single component only.
- Exercise 2 considers a system with several types of components.

Exercise: Technical Hints

- Keeping the number of time points low (e.g., |T| = 50) keeps computation time short.
- Setting lower and upper y^(0)_{k,t} bounds to exactly 0 or 1 can lead to errors; use, e.g., 1e-5 and 1 - 1e-5 instead.
- To use nonParBayesSystemInferencePriorSets() for a single component, you can set up a system with one component only:

      onecomp <- graph.formula(s -- 1 -- t)              # source s -- component 1 -- terminal t
      onecomp <- setCompTypes(onecomp, list("A" = 1))    # component 1 is of type "A"
      onecompss <- computeSystemSurvivalSignature(onecomp)

  The component type name is arbitrary, but must be used in the test.data argument (as test.data is a named list).
- You can use nonParBayesSystemInferencePriorSets() to calculate the set of prior system reliability functions by setting

      test.data = list(A = NULL, B = NULL, C = NULL)

Exercise 1: Single Component Reliability Analysis

- Simulate 10 failure times from Gamma(5, 1).
- Calculate and graph the (sets of) prior and posterior reliability functions for ...
  (a) a (discrete) precise prior, where the prior expected reliability values follow 1 - plnorm(t, 1.5, 0.25), and the prior strength is 5 for all t.
  (b) a vacuous set of priors, with prior strength interval [1, 5]. (Try out other prior strength intervals! Why is the lower bound irrelevant here?)
  (c) a set of priors based on an expert's opinion that the functioning probability is between 0.8 and 0.9 at time 2, and between 0.3 and 0.6 at time 5. Use [1, 2] and [10, 20] for the prior strength interval.
  (d) a set of priors, where the lower bound y̲^(0)_{k,t} follows 1 - pgamma(t, 3, 4), the upper bound ȳ^(0)_{k,t} follows 1 - pgamma(t, 3, 2), and the prior strength interval is [1, 10].
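A possible starting point in R (a sketch under the same conventions as the technical hints; the seed and the time grid are arbitrary choices, and the object names are not prescribed by the package):

    set.seed(1)                                               # arbitrary seed, for reproducibility
    test.data <- list(A = rgamma(10, shape = 5, rate = 1))    # 10 failure times from Gamma(5, 1)

    t.grid <- seq(0, 10, length.out = 50)                     # |T| = 50 time points
    y0.precise <- 1 - plnorm(t.grid, 1.5, 0.25)               # prior expected reliability for (a)
    y0.lower <- pmax(1 - pgamma(t.grid, 3, 4), 1e-5)          # lower prior bound for (d), kept away from 0
    y0.upper <- pmin(1 - pgamma(t.grid, 3, 2), 1 - 1e-5)      # upper prior bound for (d), kept away from 1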

Exercise 2: System Reliability Analysis

- Define a system with multiple types of components using graph.formula() and setCompTypes(). You can take the 'bridge' system, the braking system, or invent your own system.
- For the prior sets of component reliability functions, you can use the sets from the previous exercise, or think of your own.
- Simulate test data for the components such that they are, from the viewpoint of the component prior, ...
  - as expected,
  - surprisingly early,
  - surprisingly late.
  What is the effect on the posterior set of system reliability functions?
- Vary the sample size and the prior strength interval. What is the effect on the posterior set of system reliability functions?