Bayesian Prob.


Page 1:

Bayesian Prob.

Page 2:

• Example
• Red box has 2 apples and 6 oranges
• Blue box has 3 apples and 1 orange
• Pick a box, randomly select a fruit, and put it back

• Joint Prob.
• Assume p(B=r) = 0.4 and p(B=b) = 0.6
• p(B=r, F=a) = 0.4*(2/8) = 0.1,  p(B=r, F=o) = 0.4*(6/8) = 0.3
• p(B=b, F=a) = 0.6*(3/4) = 9/20,  p(B=b, F=o) = 0.6*(1/4) = 3/20 (reproduced in the sketch below the table)

Joint Prob.

B\F    a       o
r      0.1     0.3
b      9/20    3/20
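A minimal sketch (Python; structure and names are mine) that reproduces this joint table from the box contents and the priors:

```python
# Joint probability p(B, F) for the two-box fruit example, via the
# product rule p(B, F) = p(F|B) * p(B).
from fractions import Fraction

boxes = {
    "r": {"a": 2, "o": 6},  # red box: 2 apples, 6 oranges
    "b": {"a": 3, "o": 1},  # blue box: 3 apples, 1 orange
}
prior = {"r": Fraction(2, 5), "b": Fraction(3, 5)}  # p(B=r)=0.4, p(B=b)=0.6

joint = {}
for box, fruits in boxes.items():
    total = sum(fruits.values())
    for fruit, count in fruits.items():
        joint[(box, fruit)] = prior[box] * Fraction(count, total)

# (r,a): 1/10, (r,o): 3/10, (b,a): 9/20, (b,o): 3/20
print(joint)
```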

Page 3:

• Conditional Prob.
• Blue box is picked; what is the prob. of selecting an apple?
• p(F=a|B=b) = (9/20)/(9/20 + 3/20) = 3/4
• Or, p(F=a|B=b) = p(F=a, B=b)/p(B=b)

• p(F=a, B=b) = p(F=a|B=b)*p(B=b) = p(B=b|F=a)*p(F=a) => Product Rule

• Similarly, p(F=a|B=r) = 2/8 = 1/4

• p(F=a) = p(F=a|B=r)*p(B=r) + p(F=a|B=b)*p(B=b) = Σ_x p(F=a, B=x) => Sum Rule
  = Σ_x p(F=a|B=x)p(B=x) = (1/4)*0.4 + (3/4)*0.6 = 11/20

Conditional Prob.

B\F    a       o
r      0.1     0.3
b      9/20    3/20

Page 4:

• When an apple is selected, which box is it from?
• p(B=r|F=a) and p(B=b|F=a)?

• Use the product rule
• p(F=a, B=b) = p(F=a|B=b)*p(B=b) = p(B=b|F=a)*p(F=a)
• p(B=r|F=a) = p(B=r, F=a)/p(F=a) = p(F=a|B=r)*p(B=r)/p(F=a) = 0.1/(11/20) = 2/11 (see the sketch below the table)

• In general,
• p(X,Y) = p(X|Y)p(Y) = p(Y|X)p(X)   Product Rule
• p(X) = Σ_Y p(X,Y)   Sum Rule
• p(Y|X) = p(X,Y)/p(X) = p(X|Y)p(Y)/p(X)   Bayes' Theorem

Bayes’ Theorem

B\F    a       o
r      0.1     0.3
b      9/20    3/20
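Using the joint table above, the sum rule and Bayes' theorem give the posteriors in one short sketch (Python; exact fractions to match the slides):

```python
from fractions import Fraction

joint = {("r", "a"): Fraction(1, 10), ("r", "o"): Fraction(3, 10),
         ("b", "a"): Fraction(9, 20), ("b", "o"): Fraction(3, 20)}

# Sum rule: p(F=a) = sum over boxes of p(B=x, F=a)
p_apple = sum(p for (box, fruit), p in joint.items() if fruit == "a")
print(p_apple)  # 11/20

# Bayes' theorem: p(B=x|F=a) = p(B=x, F=a) / p(F=a)
for box in ("r", "b"):
    print(box, joint[(box, "a")] / p_apple)  # r: 2/11, b: 9/11
```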

Page 5:

• Predict helices and loops in a protein
• Known info: helices have a high content of hydrophobic residues

• p_h and p_l: frequencies of an amino acid (AA) being in a helix or a loop

• L_h and L_l: likelihoods that a sequence of N AAs is in a helix or a loop

• L_h = Π_{i=1..N} p_h(a_i),  L_l = Π_{i=1..N} p_l(a_i)

• Rather than the likelihoods themselves, their ratio carries more info

• L_h/L_l: is the sequence more likely to be a helical or a loop region?

• S = ln(L_h/L_l) = Σ_{i=1..N} ln(p_h(a_i)/p_l(a_i)): positive for a helical region (see the sketch below)

• Partition a sequence into N-AA segments (N=300)

Another Example
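A minimal sketch of the windowed log-odds score (Python; the per-residue frequencies p_h and p_l below are invented illustration values, not real helix/loop statistics, and the window length is shortened from the slides' N=300):

```python
import math

# Hypothetical per-residue frequencies for helix (p_h) and loop (p_l);
# real values would be estimated from residues in known structures.
p_h = {"A": 0.10, "L": 0.12, "V": 0.09, "G": 0.04, "S": 0.05}
p_l = {"A": 0.06, "L": 0.05, "V": 0.05, "G": 0.10, "S": 0.09}

def log_odds(segment):
    # S = sum_i ln(p_h(a_i) / p_l(a_i)); positive suggests a helical region.
    return sum(math.log(p_h[a] / p_l[a]) for a in segment)

seq = "ALVALGSSLAVGGS" * 5  # toy sequence
N = 10                      # segment length (slides use N=300)
for start in range(0, len(seq) - N + 1, N):
    print(start, round(log_odds(seq[start:start + N]), 2))
```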

Page 6:

• Two models
• Two prior probs.: P_prior^0 and P_prior^1

• P_post^i = L_i P_prior^i / (L_0 P_prior^0 + L_1 P_prior^1)

• Log-odds score: S' = ln(L_1 P_prior^1 / (L_0 P_prior^0))
  = ln(L_1/L_0) + ln(P_prior^1/P_prior^0)
  = S + ln(P_prior^1/P_prior^0)

• The difference between S' and S is simply an additive constant, so the ranking will be identical whether we use S' or S

• Warning: if P_prior^1 is small, S has to be high to make S' positive

• When P_prior^0 = P_prior^1, S' = S

• P_post^1 = 1/(1 + L_0 P_prior^0 / (L_1 P_prior^1)) = 1/(1 + exp(-S')) (see the check below)

• S' = 0 → P_post^1 = 1/2; S' large and positive → P_post^1 ≈ 1 (large and negative → P_post^1 ≈ 0)

Bayesian Prob.
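The last identity is just the logistic function of the log-odds score; a quick numerical check (Python):

```python
import math

def posterior_from_log_odds(s_prime):
    # P_post^1 = 1 / (1 + exp(-S'))
    return 1.0 / (1.0 + math.exp(-s_prime))

print(posterior_from_log_odds(0.0))   # 0.5: S'=0 gives P_post^1 = 1/2
print(posterior_from_log_odds(5.0))   # ~0.993: large positive S' -> ~1
print(posterior_from_log_odds(-5.0))  # ~0.007: large negative S' -> ~0
```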

Page 7:

• The previous example has two hypotheses (helix or loop)
• The sequence is described by models 0 and 1

• Models 0 and 1 are defined by p_h and p_l

• Generalize to k hypotheses: models M_k (k = 0, 1, 2, ...)
• Given a test dataset D, what is the prob. that D is described by each of the models?

• Known info: prior probs. P_prior(M_k) for each model, from other info sources

• Compute the likelihood of D according to each of the models: L(D|M_k)

• Of interest is not the prob. of D arising from M_k but the prob. of D being described by M_k

• Namely, P_post(M_k|D) ∝ L(D|M_k) P_prior(M_k): posterior prob.

• P_post(M_k|D) = L(D|M_k) P_prior(M_k) / Σ_i L(D|M_i) P_prior(M_i) (see the sketch below)

• => Bayesian prob.
• Basic principles

• We make inferences using posterior probs.
• If the posterior prob. of one model is clearly higher than the others, it can be chosen as the best model with confidence

Prior and Posterior Probs.
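A sketch of the k-model posterior (Python; the model count, log-likelihood values, and priors are invented for illustration, and the computation works in log space for numerical stability):

```python
import math

def posteriors(log_likelihoods, priors):
    # P_post(M_k|D) = L(D|M_k) P_prior(M_k) / sum_i L(D|M_i) P_prior(M_i)
    weighted = [ll + math.log(p) for ll, p in zip(log_likelihoods, priors)]
    m = max(weighted)                      # subtract max to avoid underflow
    unnorm = [math.exp(w - m) for w in weighted]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Three hypothetical models:
print(posteriors([-10.0, -12.0, -11.0], [0.5, 0.25, 0.25]))
```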

Page 8:

Max Likelihood Phylogeny

Page 9:

• Given a model of sequence evolution and a proposed tree structure, compute the likelihood that the known sequences would have evolved on that tree

• ML chooses the tree that maximizes this likelihood

• Three parameters
• Tree topology
• Branch lengths
• Values of the parameters in the rate matrix

Maximum Likelihood (ML) Phylogeny

Page 10:

• Given a model of sequence evolution at a site

• Likelihood of ancestor X: L(X) = P_XA(t1) P_XG(t2)

• L(Y) = P_YG(t4) Σ_X L(X) P_YX(t3)
• L(W) = Σ_Y Σ_Z L(Y) P_WY(t5) L(Z) P_WZ(t6)

• Total likelihood for the site: L = Σ_W π_W L(W)
• π_W: equilibrium prob.
• Is equal to the posterior prob. of different clades (see the sketch below the figure)

What is Likelihood in an ML Tree?

[Figure: example tree for one site. Leaves A and G join at X (branches t1, t2); X and leaf G join at Y (branches t3, t4); Y and Z join at the root W (branches t5, t6); Z subtends the two T leaves.]
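A minimal sketch of this per-site computation (Python, with Jukes-Cantor transition probabilities; the tree shape follows the equations above, but the branch lengths, and the two extra branches t7/t8 giving Z its two T leaves, are assumptions):

```python
import math

BASES = "ACGT"

def jc_prob(a, b, t):
    # Jukes-Cantor transition probability P_ab(t).
    e = math.exp(-4.0 * t / 3.0)
    return 0.25 + 0.75 * e if a == b else 0.25 - 0.25 * e

t1, t2, t3, t4, t5, t6, t7, t8 = 0.1, 0.2, 0.1, 0.3, 0.2, 0.2, 0.1, 0.1

# L(X) = P_XA(t1) * P_XG(t2), for each possible state of X
L_X = {x: jc_prob(x, "A", t1) * jc_prob(x, "G", t2) for x in BASES}
# L(Z): assumed cherry of the two T leaves on branches t7, t8
L_Z = {z: jc_prob(z, "T", t7) * jc_prob(z, "T", t8) for z in BASES}
# L(Y) = P_YG(t4) * sum_X L(X) P_YX(t3)
L_Y = {y: jc_prob(y, "G", t4) * sum(L_X[x] * jc_prob(y, x, t3) for x in BASES)
       for y in BASES}
# L(W) = sum_Y sum_Z L(Y) P_WY(t5) L(Z) P_WZ(t6)
L_W = {w: sum(L_Y[y] * jc_prob(w, y, t5) for y in BASES) *
          sum(L_Z[z] * jc_prob(w, z, t6) for z in BASES)
       for w in BASES}
# Total site likelihood: L = sum_W pi_W L(W), with pi_W = 1/4 at equilibrium
print(sum(0.25 * L_W[w] for w in BASES))
```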

Page 11:

• The ML tree maximizes the total likelihood of the data given the tree, i.e., L(data|tree)

• We want to compute the posterior prob.: P(tree|data)
• From Bayes' theorem,
  P(tree|data) = L(data|tree)*P_prior(tree) / Σ L(data|tree)*P_prior(tree)  (summation over all possible trees)

• Namely, posterior prob. ∝ L(data|tree)*P_prior(tree)
• The problem is the summation over all possible trees
• Moreover, what we really want is, given the data, the posterior prob. that a particular clade of interest is present:
  P_post(clade|data) = Σ_{trees containing the clade} L(data|tree)*P_prior(tree) / Σ_{all trees} L(data|tree)*P_prior(tree)

• In practice, P_post(clade|data) = (# of sampled trees containing the clade) / (total # of trees in the sample) (see the sketch below)

Computing Posterior Prob.
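A sketch of the in-practice estimate (Python; trees are represented simply as sets of clades, each clade a frozenset of taxa, and the sample is made up):

```python
# Estimate P_post(clade|data) as the fraction of sampled trees containing it.
clade = frozenset({"human", "chimp"})

sampled_trees = [  # stand-in for an MCMC sample of trees
    {frozenset({"human", "chimp"}), frozenset({"human", "chimp", "gorilla"})},
    {frozenset({"human", "gorilla"}), frozenset({"human", "chimp", "gorilla"})},
    {frozenset({"human", "chimp"}), frozenset({"human", "chimp", "gorilla"})},
]

support = sum(clade in tree for tree in sampled_trees) / len(sampled_trees)
print(support)  # 2/3 of the sampled trees contain the (human, chimp) clade
```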

Page 12:

Maximum likelihood is an alternative to maximum parsimony. It is computationally intensive. A likelihood is calculated for the probability of each residue in an alignment, based upon some model of the substitution process.

What are the tree topology and branch lengths that have the greatest likelihood of producing the observed data set?

ML is implemented in the TREE-PUZZLE program, as well as PAUP and PHYLIP.

Making trees using maximum likelihood

Page 262

Page 13:

1. Reduce the problem to a series of quartets of sequences 1, 2, 3, 4 -> three possible topologies (shown in the figure below)

 Construct all quartet trees
 For N sequences, there are C(N,4) possible quartets (N=12 gives 495; checked below)
 The three quartet topologies are weighted by posterior probabilities

Maximum likelihood: Tree-Puzzle

[Figure: the three possible quartet topologies for sequences 1-4: (1,2|3,4), (1,3|2,4), and (1,4|2,3).]
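The quartet count is a plain binomial coefficient; a quick check (Python):

```python
from math import comb

# Number of quartets for N sequences: C(N, 4)
print(comb(12, 4))  # 495, as quoted for N=12
```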

Page 14:

2. Puzzling step
 Start with one quartet tree (N-4 sequences remain)
 Add the remaining sequences to the branches systematically, estimating the support for each internal branch

3. Generate a majority consensus tree
 Branch lengths and max. likelihood values are estimated.

Maximum likelihood: Tree-Puzzle

Page 15:

Quartet puzzling
 Likelihood mapping indicates the frequency with which quartets are successfully resolved
 495 points corresponding to all possible quartets
 Only 9.7% of quartets are unresolved

Page 16:

An Example with 13 globins

Page 17:

• We need to generate a large sample of trees, with the prob. of finding a tree in the sample proportional to its likelihood * prior prob.

• Metropolis Algorithm
• Start with a trial tree and compute its likelihood, L1
• Make a slight change (change a branch length, move a vertex, ...)
• Compute L2 for the modified tree
• If L2 > L1, the new tree is accepted
• Otherwise, the new tree is accepted with prob. L2/L1
• If rejected, continue from the tree with likelihood L1

• Hill-climbing, but with downhill moves also allowed
• Generates a sample of trees whose frequencies approximate their posterior probabilities (see the sketch below)

ML Tree Generation Algorithm
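A bare-bones Metropolis loop (Python). The real algorithm proposes topology and branch-length changes; here `likelihood` and `propose` are toy one-parameter stand-ins, so only the acceptance rule is faithful to the slide:

```python
import math
import random

def likelihood(x):
    # Toy stand-in for L(data|tree): a Gaussian bump over one parameter.
    return math.exp(-(x - 2.0) ** 2)

def propose(x):
    # Toy stand-in for a slight change (branch-length tweak, vertex move, ...).
    return x + random.uniform(-0.5, 0.5)

random.seed(0)
x, sample = 0.0, []
for _ in range(10000):
    y = propose(x)
    # Accept if better; otherwise accept with prob. L2/L1 (downhill moves).
    if likelihood(y) >= likelihood(x) or random.random() < likelihood(y) / likelihood(x):
        x = y
    sample.append(x)

# Sample frequencies approximate the posterior over the parameter.
print(sum(1.8 < s < 2.2 for s in sample) / len(sample))
```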

Page 18:

Calculate:

Pr[Tree | Data] = Pr[Data | Tree] × Pr[Tree] / Pr[Data]

Bayesian inference of phylogeny

Pr[Tree | Data] is the posterior probability distribution of trees. Ideally this involves a summation over all possible trees. In practice, Markov chain Monte Carlo (MCMC) runs are used to estimate the posterior probability distribution.

Notably, Bayesian approaches require you to specify prior assumptions about the model of evolution.

Page 19:

Bootstrapping
 A method of assessing the reliability of trees
 Numbers in the rooted tree are called bootstrap percentages
 Distances according to models are not realistic, due to chance fluctuations
 Bootstrapping addresses the question of whether these fluctuations influence the tree configuration

 Bootstrapping deliberately constructs sequence data sets that differ by small random fluctuations from the real sequences

 And checks whether the same tree topology is obtained
 Randomized sequences are constructed by sampling columns (see the sketch below)

Evaluation of Trees: Bootstrapping
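A sketch of bootstrap resampling of alignment columns (Python; the toy alignment and the downstream tree-building step are placeholders):

```python
import random

alignment = ["ACGTACGTAC",  # toy alignment, one sequence per row
             "ACGTACGAAC",
             "ACTTACGTTC"]

def bootstrap_alignment(alignment, rng):
    # Resample columns with replacement to build a pseudo-alignment
    # of the same length as the original.
    ncols = len(alignment[0])
    cols = [rng.randrange(ncols) for _ in range(ncols)]
    return ["".join(seq[c] for c in cols) for seq in alignment]

rng = random.Random(0)
replicates = [bootstrap_alignment(alignment, rng) for _ in range(100)]
# Each replicate would be fed to the tree-building method; the bootstrap
# percentage of a clade is the fraction of replicate trees containing it.
print(replicates[0])
```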

Page 20:

 Generate 100 or 1,000 randomized sequence sets
 And compute in what percentage of the randomized trees the same group appears
 A 77% bootstrap value is considered reliable
 e.g., 24% -- doubtful that they form a clade
 71% for human/chimpanzee/pygmy chimpanzee

 Between two high figures:
 Chimpanzee/pygmy chimpanzee always form a clade
 Gorilla/human/chimpanzee/pygmy chimpanzee always form a clade
 (gorilla,(human,chimpanzees)) appears more frequently than (human,(gorilla,chimpanzees)) or (chimpanzees,(gorilla,human))

 Thus, we can conclude (human, chimpanzees) is more reliable

 Can construct a consensus tree
 The frequency of each possible clade is determined
 Construct a consensus tree by adding clades, starting from the more frequent ones

Bootstrapping

Page 21:
Page 22:

In 61% of the bootstrap resamplings, ssrbp and btrbp (pig and cow RBP) formed a distinct clade. In 39% of the cases, another protein joined the clade (e.g., ecrbp), or one of these two sequences joined another clade.

Page 23:

Evaluate trees according to the least-squared error:
  E = Σ_ij (d_ij - d_ij^tree)² / d_ij²  (computed in the sketch below)
 Fitch and Margoliash, 1967
 Clustering methods such as NJ and UPGMA have a well-defined algorithm and produce one tree, but no criterion
 The optimization approach has a well-defined criterion, but no well-defined algorithm
 One has to construct many alternative trees and test each one against the criterion
 Other optimization approaches:
 Maximum likelihood: choose the tree on which the likelihood of observing the given sequences is highest
 Parsimony: choose the tree for which the fewest substitutions are required in the sequences

Tree Optimization
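The least-squares criterion in code (Python; the observed and tree-implied distances are made-up examples):

```python
# Fitch-Margoliash least-squared error:
#   E = sum_{i<j} (d_ij - d_ij^tree)^2 / d_ij^2
d_obs = {(0, 1): 0.30, (0, 2): 0.50, (1, 2): 0.45}   # observed distances
d_tree = {(0, 1): 0.28, (0, 2): 0.52, (1, 2): 0.47}  # distances on a candidate tree

E = sum((d_obs[p] - d_tree[p]) ** 2 / d_obs[p] ** 2 for p in d_obs)
print(E)  # lower E means a better fit; compare candidate trees by this value
```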

Page 24:

 The number of distinct trees grows as a double factorial n!! (product of odd numbers)
 For N species: (2N-5)!! unrooted trees, (2N-3)!! rooted trees (see the counting sketch below)
 N=7: 9*7*5*3*1 = 945; N=10: ~2.0*10^6

 Consider a 'tree space' as the set of all possible tree topologies
 Two trees are neighbors if they differ by a topological change known as a nearest-neighbor interchange (NNI)
 With NNI, an internal branch of a tree is selected

Tree Space

 A subtree is swapped with another subtree at the other end of the internal branch

 Tree 4 is not a neighbor of tree 1 (it is reached instead by subtree pruning and regrafting (SPR))
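Counting trees with the double factorial (Python):

```python
def double_factorial(n):
    # n!! = n * (n-2) * (n-4) * ... * 1
    result = 1
    while n > 1:
        result *= n
        n -= 2
    return result

def unrooted_trees(N):
    return double_factorial(2 * N - 5)

print(unrooted_trees(7))   # 945 = 9*7*5*3*1
print(unrooted_trees(10))  # 2027025, ~2.0e6
```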

Page 25:

Hill climbing algorithm
 Given an initial tree (from a distance matrix, for example)
 Find a neighboring tree that is better
 If found, move to this new tree, and search its neighbors
 Repeat until a local optimum is reached (no better neighbors are found)
 Cannot guarantee a global optimum

Heuristic search
 Start with three random species, and construct an unrooted tree
 Add one species at a time, connecting it in the optimal way
 Continue with different initial random triples of species, each time producing a local optimum
 Repeat this enough, and one may claim a global optimum

Optimization in Tree Space

Page 26:

 Parsimony is fast; ML requires each tree topology to be optimized

 ML is model-based; parsimony's implicit model is equal substitution rates
 Parsimony can incorporate models, but it is not clear what the weights should be
 Parsimony tries to minimize the number of substitutions, irrespective of the branch lengths
 ML allows changes to be more likely on longer branches
 On a long branch, there is no reason to try to minimize the number of substitutions
 Parsimony is strong for evaluating trees based on qualitative characters

ML vs. Parsimony

Page 27:

• www.cs.uml.edu/~kim/580/review_baysian.pdf
• Example

• Experiment
• Stimulant (ST): present / not present (a priori prob. x)
• Extracellular signal (SI): high / medium / low (conditional prob. y)

  ST \ SI       High   Med   Low
  present       0.6    0.3   0.1
  not present   0.1    0.2   0.7

• Inference
• What is the prob. of ST being present when SI is high?
• Need the a priori prob. (present = 0.4)

• P(ST=p|SI=high) = p(SI=h|ST=p)p(ST=p) / Σ_x p(SI=h|ST=x)p(ST=x) = 0.8 (checked in the sketch below)

Bayesian Network
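Checking the 0.8 in code (Python; values straight from the table and prior above):

```python
# p(SI|ST) from the table and the prior p(ST=present) = 0.4
p_si_given_st = {
    "present":     {"high": 0.6, "med": 0.3, "low": 0.1},
    "not present": {"high": 0.1, "med": 0.2, "low": 0.7},
}
prior = {"present": 0.4, "not present": 0.6}

# Bayes: P(ST=present | SI=high)
num = p_si_given_st["present"]["high"] * prior["present"]
den = sum(p_si_given_st[st]["high"] * prior[st] for st in prior)
print(num / den)  # 0.24 / 0.30 = 0.8
```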

Page 28:

• Discrete data

• θ_SI = p(SI|ST)

  ST \ SI       High   Med   Low
  present       0.6    0.3   0.1
  not present   0.1    0.2   0.7

• Continuous data -- e.g., Gaussian

Model Parameter Set

Page 29:

• Multiple Variables
• Stimulant (ST), signal (SI), inhibitor (IN) of the signal, a G protein-coupled receptor binding event (RE), a G protein (GP), and the cellular response (CR)

• Express relationships
• ST may or may not generate a signal
• The concentration of the signal may affect the level of the inhibitor
• Whether the signal binds the receptor depends on the concentrations of both the signal and the inhibitor
• GP should become active if the receptor binds
• An active GP initiates a cascade of reactions that causes the cellular response

Bayesian Network

Page 30:
Page 31:
Page 32:

• Conditional independence (CI): p(a,b|c) = p(a|c)p(b|c)
• Three cases, with an example of the regulation of three genes x, y, and z
• Serial (x → y → z):

• If the expression level of y is unknown, the level of x affects that of z

• If that of y is known, that of z is conditionally independent of x

Conditional Independence

Page 33:

• Three cases, with an example of the regulation of three genes x, y, and z
• Diverging (x ← y → z):

• If the expression level of y is unknown, the level of x affects that of z (they are co-regulated, and if x is highly expressed, then the likely level of y may be inferred, which in turn would influence the expression level of z)

• If that of y is known, that of z is conditionally independent of x

Conditional Independence

Page 34:

• Converging (x → y ← z)
• If the expression level of y is unknown, the level of x does not help to infer that of z (x and z are independent)

• If that of y is known, the level of x does help to infer that of z

• y is dependent on both x and z:
• p(x,z|y) ≠ p(x|y)p(z|y)

• If y and x are known, this helps to infer the value of z, and x and z are no longer independent:
• p(z|x,y) = p(z)p(y|x,z) / Σ_z p(z)p(y|x,z) ≠ p(z)

Conditional Independence

Page 35:

• A BN with n variables (nodes) x = {x1, x2, ..., xn} and model parameters θ = {θ1, θ2, ..., θn} (θi: the set of parameters describing the distribution of xi)

• Each node xi has parents pa(xi)

• p(x1, x2, ..., xn|θ) = Π_i p(xi|pa(xi), θi), from conditional independence
• P(ST,SI,IN,RE,GP,CR) = p(CR|GP) p(GP|RE) p(RE|SI,IN) p(IN|SI) p(SI|ST) p(ST): a simpler expression for the joint prob. obtained by using conditional independence

Joint Prob. Distribution

Page 36:

• Given the state of one variable, the states of other variables can be inferred

• P(GP=active|ST=present)
  = Σ_x Σ_y Σ_z P(GP=active|RE=x) P(RE=x|IN=y, SI=z) P(IN=y|SI=z) P(SI=z|ST=present)
  = 0.5048  (structure sketched in code below)

Inference of GP active, knowing ST is present
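A sketch of the marginalization (Python). Only θ_SI = p(SI|ST) is given in the slides; the other conditional tables below are invented placeholders, so the printed value will not reproduce the slides' 0.5048, only the structure of the sum:

```python
# P(GP=active|ST=present) =
#   sum_{x,y,z} P(GP=a|RE=x) P(RE=x|IN=y,SI=z) P(IN=y|SI=z) P(SI=z|ST=present)
p_si = {"high": 0.6, "med": 0.3, "low": 0.1}           # p(SI|ST=present), from the slides
p_in = {si: {"high": 0.5, "low": 0.5} for si in p_si}  # p(IN|SI): placeholder
p_re = {(inh, si): {"bound": 0.7, "unbound": 0.3}      # p(RE|IN,SI): placeholder
        for inh in ("high", "low") for si in p_si}
p_gp = {"bound": 0.9, "unbound": 0.1}                  # p(GP=active|RE): placeholder

total = sum(p_gp[x] * p_re[(y, z)][x] * p_in[z][y] * p_si[z]
            for x in ("bound", "unbound")
            for y in ("high", "low")
            for z in p_si)
print(total)
```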

Page 37:

• Posterior prob.

• P(ST=present|SI=high) = P(SI=h|ST=p) P(ST=p) / [P(SI=h|ST=p) P(ST=p) + P(SI=h|ST=n) P(ST=n)]
  = 0.6*0.4 / [0.6*0.4 + 0.1*0.6] = 0.8

Prob. of ST present, given signal is high