
Gaussian Process Networks


Page 1: Gaussian Process Networks

Gaussian Process Networks

Nir Friedman and Iftach Nachman

UAI-2K

Page 2: Gaussian Process Networks

Abstract

Learning structures of Bayesian networks requires evaluating the marginal likelihood of the data given a candidate structure.

For continuous networks, Gaussians and Gaussian mixtures have been used as priors for parameters.

In this paper, a new prior, the Gaussian process, is presented.

$P(D \mid G)$

Page 3: Gaussian Process Networks

Introduction

Bayesian networks are particularly effective in domains where the interactions between variables are fairly local.

Motivation: molecular biology problems, such as understanding the transcription of genes. Continuous variables are necessary.

Gaussian process prior: a Bayesian method whose semi-parametric nature allows learning the complicated functional relationships between variables.

Page 4: Gaussian Process Networks

Learning Continuous Networks

The posterior probability is decomposed under three assumptions:

Structure modularity

Parameter independence

Parameter modularity

The posterior probability can then be represented as follows.

$$P(G \mid D) \propto P(G)\, P(D \mid G)$$

Structure modularity:

$$P(G) \propto \prod_i \rho\big(X_i, \mathrm{Pa}_G(X_i)\big)$$

Parameter independence:

$$P(\Theta_G \mid G) = \prod_i P\big(\theta_{X_i \mid \mathrm{Pa}_G(X_i)} \mid G\big)$$

Parameter modularity: if $\mathrm{Pa}_G(X_i) = \mathrm{Pa}_{G'}(X_i) = \mathbf{U}$, then

$$P\big(\theta_{X_i \mid \mathbf{U}} \mid G\big) = P\big(\theta_{X_i \mid \mathbf{U}} \mid G'\big)$$

Under these assumptions the marginal likelihood decomposes into per-family terms,

$$P(D \mid G) = \prod_i P\big(x_i[1], \ldots, x_i[M] \mid \mathbf{pa}_i[1], \ldots, \mathbf{pa}_i[M]\big) = \prod_i \mathrm{score}\big(X_i, \mathrm{Pa}_G(X_i) \mid D\big)$$

So,

$$P(G \mid D) \propto \prod_i \rho\big(X_i, \mathrm{Pa}_G(X_i)\big)\, \mathrm{score}\big(X_i, \mathrm{Pa}_G(X_i) \mid D\big)$$
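Because both the structure prior and the marginal likelihood decompose over families, a candidate structure can be scored by summing per-family terms. Below is a minimal sketch of that decomposition, assuming a hypothetical `family_score` callable that returns $\log \mathrm{score}(X_i, \mathrm{Pa}_G(X_i) \mid D)$; the function names and data layout are illustrative, not taken from the paper.

```python
import numpy as np

def log_network_score(structure, data, family_score, log_structure_prior=None):
    """Decomposable log-score of a candidate network structure G.

    structure: dict mapping each variable name to a tuple of parent names (Pa_G(X_i)).
    data: dict mapping each variable name to a 1-D numpy array of M observations.
    family_score: callable(x_values, parent_matrix) -> log score(X_i, Pa_G(X_i) | D).
    log_structure_prior: optional callable(structure) -> log P(G); zero if omitted.
    """
    total = 0.0 if log_structure_prior is None else log_structure_prior(structure)
    for child, parents in structure.items():
        x = data[child]                                         # x_i[1..M]
        if parents:
            u = np.column_stack([data[p] for p in parents])     # pa_i[1..M]
        else:
            u = np.empty((len(x), 0))                           # no parents
        total += family_score(x, u)                             # log score(X_i, Pa_G(X_i) | D)
    return total
```

A structure search (greedy hill climbing, for instance) only ever needs to recompute the terms of the families whose parent sets change.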

Page 5: Gaussian Process Networks

Priors for Continuous Variables

Linear Gaussian

Simple, but can only represent linear dependencies:

$$P(X \mid u_1, \ldots, u_k) \sim N\!\Big(a_0 + \sum_i a_i u_i,\ \sigma^2\Big)$$

Gaussian mixtures

Approximations are required for learning:

$$P(X \mid \mathbf{U}) = \sum_j w_j f_j(X \mid \mathbf{U})$$

Kernel method

Controlled by a smoothness parameter $\alpha$:

$$P(x) = \frac{1}{M} \sum_{m=1}^{M} g_{\mathrm{kernel}}\!\left( \frac{1}{\alpha}\, \lVert x - x[m] \rVert^2 \right)$$
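As a concrete illustration of the kernel density formula above, here is a minimal sketch using a Gaussian kernel with smoothness parameter $\alpha$; the specific kernel and its normalization are assumptions chosen for illustration, not necessarily the estimator used in the paper.

```python
import numpy as np

def kernel_density(x_query, x_train, alpha):
    """Gaussian-kernel density estimate with bandwidth^2 = alpha.

    A concrete instance of P(x) = (1/M) sum_m g_kernel(||x - x[m]||^2 / alpha),
    using g(t) = exp(-t/2) with the Gaussian normalizer (an assumption).
    x_query: (d,) query point; x_train: (M, d) training samples.
    """
    x_train = np.atleast_2d(x_train)
    M, d = x_train.shape
    sq_dists = np.sum((x_train - x_query) ** 2, axis=1)    # ||x - x[m]||^2
    norm = (2.0 * np.pi * alpha) ** (d / 2.0)              # Gaussian normalizer
    return np.mean(np.exp(-0.5 * sq_dists / alpha)) / norm

# Example: density of standard-normal samples near the origin.
rng = np.random.default_rng(0)
samples = rng.standard_normal((500, 1))
print(kernel_density(np.array([0.0]), samples, alpha=0.1))
```

Larger values of `alpha` smooth the estimate more aggressively, which is the trade-off the smoothness parameter controls.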

Page 6: Gaussian Process Networks

Gaussian Process (1/2)

Basics of Gaussian processes: a prior over a variable X as a function of U. The stochastic process over U is a Gaussian process if, for each finite set of values $\mathbf{u}_{1:M} = \{\mathbf{u}[1], \ldots, \mathbf{u}[M]\}$, the distribution over the corresponding random variables $x_{1:M} = \{X[1], \ldots, X[M]\}$ is a multivariate normal distribution.

The joint distribution of $\mathbf{x}_{1:M}$ is

$$P(\mathbf{x}_{1:M} \mid \mathbf{u}_{1:M}) = \frac{1}{Z} \exp\!\left( -\frac{1}{2}\, (\mathbf{x}_{1:M} - \boldsymbol{\mu}_{1:M})^T\, C_{1:M}^{-1}\, (\mathbf{x}_{1:M} - \boldsymbol{\mu}_{1:M}) \right)$$

where $Z$ is a normalizing constant and the covariance matrix $C_{1:M}$ is determined by $\mathbf{u}_{1:M}$.
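The log of this multivariate normal density is exactly what the later score computations need. A minimal sketch, assuming the covariance matrix $C_{1:M}$ has already been built from $\mathbf{u}_{1:M}$ and taking the mean to be zero unless one is supplied:

```python
import numpy as np

def gp_log_joint(x, C, mu=None):
    """log P(x_{1:M} | u_{1:M}): multivariate normal log-density with covariance C.

    x: (M,) observations of X; C: (M, M) covariance built from u_{1:M};
    mu: optional (M,) mean vector (zero if omitted).
    """
    M = len(x)
    r = x if mu is None else x - mu                  # residual x_{1:M} - mu_{1:M}
    L = np.linalg.cholesky(C)                        # C = L L^T, stable inversion
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, r))
    log_det = 2.0 * np.sum(np.log(np.diag(L)))       # log |C|
    return -0.5 * r @ alpha - 0.5 * log_det - 0.5 * M * np.log(2.0 * np.pi)
```

The Cholesky factorization is used here only for numerical stability; mathematically it is the same quadratic form as in the equation above.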

Page 7: Gaussian Process Networks

Gaussian Process (2/2)

Prediction: $P(X_{M+1} \mid \mathbf{x}_{1:M}, \mathbf{u}_{1:M}, \mathbf{u}_{M+1})$ is a univariate Gaussian distribution. Writing the covariance matrix of the extended data set as

$$C_{M+1} = \begin{pmatrix} C_M & \mathbf{k} \\ \mathbf{k}^T & c \end{pmatrix}, \qquad \mathbf{k} = \big(C(\mathbf{u}[M+1], \mathbf{u}[1]), \ldots, C(\mathbf{u}[M+1], \mathbf{u}[M])\big)^T, \qquad c = C(\mathbf{u}[M+1], \mathbf{u}[M+1]),$$

its mean and variance are

$$\mathbf{k}^T C_M^{-1}\, \mathbf{x}_{1:M} \qquad \text{and} \qquad c - \mathbf{k}^T C_M^{-1}\, \mathbf{k}.$$

Covariance functions: Williams and Rasmussen suggest the following function.

$$C(\mathbf{u}, \mathbf{u}' : \Theta) = \theta_0 \exp\!\left\{ -\frac{1}{2} \sum_{k=1}^{d_U} w_k (u_k - u'_k)^2 \right\} + \theta_1 + \theta_2 \sum_{k=1}^{d_U} u_k u'_k + \theta_3\, \delta(\mathbf{u}, \mathbf{u}')$$
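A minimal sketch of this prediction step, together with a covariance function in the Williams and Rasmussen style as reconstructed above; treat the exact parameterization, the function names, and the handling of the $\delta$ term as illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def wr_style_cov(theta0, w, theta1, theta2, theta3):
    """Covariance function in the style reconstructed above (parameterization is an assumption).

    The delta term is approximated by an equality check on the inputs; in practice it is
    usually applied only on the diagonal of the training covariance matrix.
    """
    def cov(u, v):
        same = float(np.allclose(u, v))
        return (theta0 * np.exp(-0.5 * np.sum(w * (u - v) ** 2))
                + theta1 + theta2 * np.dot(u, v) + theta3 * same)
    return cov

def gp_predict(u_train, x_train, u_new, cov):
    """Predictive mean and variance of X[M+1] given x_{1:M}, u_{1:M} and u[M+1]."""
    M = len(x_train)
    C_M = np.array([[cov(u_train[i], u_train[j]) for j in range(M)] for i in range(M)])
    k = np.array([cov(u_new, u_train[m]) for m in range(M)])   # C(u[M+1], u[m])
    c = cov(u_new, u_new)                                      # C(u[M+1], u[M+1])
    solve = np.linalg.solve(C_M, np.column_stack([x_train, k]))
    mean = k @ solve[:, 0]                                     # k^T C_M^{-1} x_{1:M}
    var = c - k @ solve[:, 1]                                  # c  - k^T C_M^{-1} k
    return mean, var
```

Both $C_M$ and $\mathbf{k}$ depend only on the parent values, so the same machinery that scores a family can also predict new observations of the child.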

Page 8: Gaussian Process Networks

Learning Networks with Gaussian Process Priors

The score is defined as follows.

With this Gaussian process prior, the computation of the marginal probability can be done in closed form.

Parameters of the covariance matrix:

MAP approximation

Laplace approximation

$$\mathrm{score}(X_i, \mathbf{U} : D, \Theta) = (2\pi)^{-M/2}\, |C_{1:M}|^{-1/2} \exp\!\left( -\frac{1}{2}\, \mathbf{x}_{1:M}^T\, C_{1:M}^{-1}\, \mathbf{x}_{1:M} \right)$$

$$\mathrm{score}(X_i, \mathbf{U} \mid D) = \int \mathrm{score}(X_i, \mathbf{U} : D, \Theta)\, P(\Theta)\, d\Theta$$
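A minimal sketch of the closed-form score for fixed hyperparameters $\Theta$, with the integral over $\Theta$ replaced by a simple MAP-style maximization over a candidate grid, in the spirit of the MAP approximation mentioned above; the covariance-family interface and the grid search are assumptions for illustration.

```python
import numpy as np

def log_score_given_theta(x, U, cov_family, theta):
    """log score(X_i, U : D, Theta): zero-mean Gaussian log-density of x, with the
    covariance C_{1:M} built from the parent values U by cov_family(theta)."""
    cov = cov_family(theta)
    M = len(x)
    C = np.array([[cov(U[i], U[j]) for j in range(M)] for i in range(M)])
    L = np.linalg.cholesky(C)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, x))
    return -0.5 * x @ alpha - np.sum(np.log(np.diag(L))) - 0.5 * M * np.log(2.0 * np.pi)

def map_family_score(x, U, cov_family, theta_grid, log_prior):
    """MAP-style approximation of log score(X_i, U | D):
    max over candidate Theta of  log score(X_i, U : D, Theta) + log P(Theta)."""
    return max(log_score_given_theta(x, U, cov_family, th) + log_prior(th)
               for th in theta_grid)
```

Here `cov_family(theta)` could return, for example, the `wr_style_cov` sketch from the previous page; a Laplace approximation would additionally account for the curvature around the maximizing $\Theta$.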

Page 9: Gaussian Process Networks

Artificial Experimentation (1/3)

For two variables X, Y

Non-invertible relationship
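For illustration only (the paper's actual data-generating functions are not reproduced here), a hedged sketch of one way to generate a non-invertible dependency between X and Y, whose direction a structure score would then have to recover:

```python
import numpy as np

# Hypothetical example of a non-invertible dependency: several values of X map to the
# same Y, so the functional relation X -> Y cannot be inverted to recover X from Y.
rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, size=500)
y = np.sin(2.0 * x) + 0.1 * rng.standard_normal(500)   # many-to-one mapping plus noise

# A structure score would then be asked to prefer the direction X -> Y over Y -> X,
# e.g. by comparing score(Y, {X} | D) against score(X, {Y} | D).
data = {"X": x, "Y": y}
```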

Page 10: Gaussian Process Networks

Artificial Experimentation (2/3)

The results for learning non-invertible dependencies.

Page 11: Gaussian Process Networks

Artificial Experimentation (3/3)

Comparison of the Gaussian, Gaussian process, and kernel methods.

Page 12: Gaussian Process Networks

Discussion

Reproducing Kernel Hilbert Space (RKHS) and Gaussian Process

Currently, this method is being applied to analyze biological data.