Signal decompositions using trans-dimensional Bayesian methods: Alireza Roodaki's Ph.D. defense


These are the slides I used during the defense of my thesis (see https://sites.google.com/site/alireza4702/publications/phd-thesis for more information).



Signal decompositions using trans-dimensional Bayesian methods

Alireza Roodaki

Ph.D. Thesis Defense

Department of Signal Processing and Electronic Systems

May 14th, 2012

Advisors: Julien Bect and Gilles Fleury


Example 1: Detection and estimation of muons in the Auger project

Figure: A conceptual shower (http://auger.org).

Ultra-high-energy particles coming from space (E ∼ 10^19 eV)

How and where?

What is their composition (proton, iron)?


Example 1: Detection and estimation of muons in the Auger project (Contd.)

Figure: A conceptual shower and detectors (water tanks) (http://auger.org).

Muons are generated when particles cross the atmosphere.

The number k of muons and their arrival times t_µ are indicators of the origin and composition of the particle.


Example 1: Detection and estimation of muons in the Auger project (Contd.)

Figure: Water tank detector.


Example 1: Detection and estimation of muons in the Auger project (Contd.)

Figure: Observed signal (#PE and intensity vs. t [ns]).

Prof. Balázs Kégl from LAL, University of Paris 11.


Example 2: Spectral analysis

Figure: Observed signal vs. time (top) and its periodogram, power vs. radial frequency (bottom).

Applications

RADAR / SONAR

Array signal processing

Vibration analysis

. . .



Example 2: Spectral analysis (Cont.)

Detection and estimation of sinusoids in white noise: model the observed signal y by sinusoidal components.

Observed signal:

$$\mathcal{M}_k : \quad y[i] = \sum_{j=1}^{k} \big( a_j \cos[\omega_j i] + b_j \sin[\omega_j i] \big) + n[i].$$

Joint model selection and parameter estimation problem

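To make the sinusoids-in-white-noise model concrete, here is a minimal Python sketch (not taken from the thesis) that simulates an observation y under M_k; the radial frequencies reuse the true values quoted later in the results, while the helper name `simulate_sinusoids`, the amplitudes, the signal length and the noise level are illustrative assumptions.

```python
# Illustrative simulation of the model M_k (numerical choices are assumptions,
# except the radial frequencies, which reuse the "true" values quoted later).
import numpy as np

def simulate_sinusoids(n, omegas, a, b, noise_std, seed=None):
    """y[i] = sum_j a_j*cos(omega_j*i) + b_j*sin(omega_j*i) + n[i], white Gaussian n."""
    rng = np.random.default_rng(seed)
    i = np.arange(n)                                   # discrete time index
    y = np.zeros(n)
    for a_j, b_j, w_j in zip(a, b, omegas):
        y += a_j * np.cos(w_j * i) + b_j * np.sin(w_j * i)
    return y + rng.normal(0.0, noise_std, n)           # additive white noise

y = simulate_sinusoids(64, omegas=[0.628, 0.677, 0.726],
                       a=[1.0, 0.8, 1.2], b=[0.5, -0.3, 0.7],
                       noise_std=1.0, seed=0)
```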


Trans-dimensional problems

Def. The problems in which the number of things that we don't know is one of the things that we don't know [Green, 2003].

Space X = ⋃_{k∈K} {k} × Θ_k, with points x = (k, θ_k)
➠ k ∈ K denotes the number of components
➠ θ_k ∈ Θ_k is a vector of component-specific parameters

Applications:
➠ Spectral analysis (array signal processing)
➠ (Gaussian) mixture modeling & clustering
➠ Object detection and recognition


Bayesian inference

Posterior distribution:

$$p(x \mid y) = \frac{p(y \mid x)\, p(x)}{\int_{X} p(y \mid x')\, p(x')\, dx'}$$

where p(y | x) is the likelihood and p(x) the prior distribution.

x = (k, θ_k)
➠ both detection and estimation problems
➠ high-dimensional / intractable integrals


Markov Chain Monte Carlo (MCMC) methods

Generate samples from the posterior distribution of interest (target distribution), say, π.

Construct a Markov chain (x^(1), ..., x^(M)) that under some conditions converges to π.

Famous algorithms:
➠ Metropolis–Hastings (MH) sampler [Metropolis et al., 1953; Hastings, 1970]
➠ Gibbs sampler [Geman and Geman, 1984]
➠ RJ-MCMC sampler [Green, 1995]

[Robert and Casella, 2004]

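As a reminder of the basic mechanism behind these samplers, here is a generic random-walk Metropolis–Hastings sketch for a fixed-dimensional target. The thesis itself relies on the trans-dimensional RJ-MCMC sampler of Green (1995), whose dimension-changing moves are not reproduced here; `log_target`, the step size and the Gaussian proposal are illustrative choices.

```python
# Generic random-walk Metropolis-Hastings sketch (fixed-dimensional target only).
import numpy as np

def metropolis_hastings(log_target, x0, n_iter, step=0.5, seed=None):
    """Return a chain (x^(1), ..., x^(M)) whose stationary law is exp(log_target)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    log_p = log_target(x)
    chain = np.empty((n_iter, x.size))
    for m in range(n_iter):
        prop = x + step * rng.normal(size=x.size)        # symmetric proposal
        log_p_prop = log_target(prop)
        if np.log(rng.uniform()) < log_p_prop - log_p:   # MH acceptance test
            x, log_p = prop, log_p_prop
        chain[m] = x
    return chain

# Example: sample a standard bivariate Gaussian.
chain = metropolis_hastings(lambda x: -0.5 * float(x @ x), [0.0, 0.0], 5000, seed=0)
```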


Example 2: spectral analysis (Cont.)

RJ-MCMC sampler ⇒ variable-dimensional samples

Figure: RJ-MCMC output, k and ω_k vs. iteration number (iterations 160–200).


Outline

1 Relabeling and summarizing posterior distributions
    Label-switching issue
    Variable-dimensional summarization

2 Proposed approach
    An original variable-dimensional parametric model
    Estimating the model parameters (SEM-type algorithms)
    Robustifying strategies

3 Results
    Detection and estimation of sinusoids in white noise
    Detection and estimation of muons in the Auger project

4 Conclusion


Summarizing posterior distributions

Posterior = all the information
➠ It is a complex mathematical object (not easy to manipulate)

(RJ)-MCMC sampler
➠ What to do with the generated samples?

Summarization
➠ human-readable summaries
➠ interpretable figures (e.g., histograms)
➠ statistical measures (e.g., mean and variance)


Fixed-dimensional problems

uni-modal uni-variate case

Figure: histogram of posterior samples.

Report location (mean and median) and dispersion (variance and confidence intervals) parameters:

µ = 2.0, σ² = 0.25

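A minimal sketch of these fixed-dimensional summaries, computed here from synthetic samples (the numbers are not thesis output); the 95% interval is one common choice of credible interval.

```python
# Location and dispersion summaries of fixed-dimensional posterior samples (synthetic data).
import numpy as np

samples = np.random.default_rng(0).normal(loc=2.0, scale=0.5, size=10_000)

summary = {
    "mean": samples.mean(),
    "median": np.median(samples),
    "variance": samples.var(),
    "95% interval": np.percentile(samples, [2.5, 97.5]),
}
```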


Label-switching issue

Additive mixture: lack of identifiability

the likelihood is invariant under relabeling of components

the posterior distribution is invariant under permutation of components

Figure: marginal posteriors of ω for components #1, #2 and #3.

Marginal posteriors of component-specific parameters are identical! How to summarize the posterior information?


strategies to deal with label-switching

imposing artificial "identifiability constraints"
  Exp: sorting the components [Richardson and Green, 1997]

Figure: components are sorted based on ω.

relabeling algorithms [Celeux et al., 1998; Stephens, 2000; Jasra et al., 2005; Sperrin et al., 2010; Yao, 2011]

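A sketch of the sorting-based identifiability constraint mentioned above: within every MCMC sample, components are simply relabeled in increasing order of ω. The helper name and the toy samples are illustrative.

```python
# Relabeling by an artificial identifiability constraint: sort components by omega
# within each MCMC sample (cf. Richardson and Green, 1997).
import numpy as np

def relabel_by_sorting(samples):
    """samples: list of 1-D arrays of omegas, one array per MCMC iteration."""
    return [np.sort(omegas) for omegas in samples]

# Two k = 3 samples whose component labels have switched.
relabeled = relabel_by_sorting([np.array([0.73, 0.62, 0.68]),
                                np.array([0.62, 0.68, 0.73])])
```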

Example 2: spectral analysis (Cont.) — variable-dimensional posterior distribution

Figure: Posteriors of k and sorted radial frequencies, ω_k, given k.


Classical Bayesian approaches

Bayesian Model Selection (BMS)

One model is selected (estimated) by looking at the MAP, i.e., k̂ = argmax_k p(k | y).

Component-specific parameters are summarized given k = k̂.


Bayesian Model Selection (BMS)

Figure: The model with k = 2 is selected (posterior p(k | y) and sorted radial frequencies ω).


Bayesian Model Averaging (BMA)

Use the information from all possible models:

$$p(\Delta \mid y) = \sum_{k} p(\Delta \mid k, y)\, p(k \mid y)$$

However, Δ cannot be ω_k, as its size changes from model to model.


Bayesian Model Averaging (BMA)

Binned data representation (Δ = N(B_j)):

$$E\big(N(B_j) \mid y\big) = \sum_{k=1}^{k_{\max}} E\big(N(B_j) \mid k, y\big) \cdot p(k \mid y),$$

where j = 1, ..., N_bin and E(N(B_j)) is the expected number of components in bin B_j.

Figure: Expected number of components using BMA (expected number of components vs. ω).

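A sketch of how this binned BMA summary can be estimated directly from the MCMC output: E(N(B_j) | y) is approximated by averaging, over the stored samples, the number of components falling in each bin. The function name, bins and toy samples are illustrative.

```python
# Monte Carlo estimate of E(N(B_j) | y): average per-bin component counts over samples.
import numpy as np

def expected_counts_per_bin(samples, bin_edges):
    """samples: list of 1-D arrays of omegas (one per MCMC iteration, variable length)."""
    counts = np.zeros(len(bin_edges) - 1)
    for omegas in samples:
        counts += np.histogram(omegas, bins=bin_edges)[0]
    return counts / len(samples)

# Toy example: three variable-dimensional samples, ten bins on [0, 1].
samples = [np.array([0.62, 0.73]), np.array([0.61, 0.68, 0.72]), np.array([0.63, 0.74])]
e_counts = expected_counts_per_bin(samples, np.linspace(0.0, 1.0, 11))
```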

Are BMS and BMA approaches satisfactory?

Bayesian Model Selection (BMS)

➠ selects a model ⇒ component-specific parameters

➠ losing information from the discarded models

➠ ignoring the uncertainties about the presence of components.


Are BMS and BMA approaches satisfactory? (Cont.)

Bayesian Model Averaging (BMA)

➠ appropriate for signal reconstruction and prediction

➠ does not provide information about component-specific parameters


A novel approach is needed!

Properties of an “ideal” approach

information from all (plausible) models
➠ interpretable summaries for component-specific parameters

uncertainties about the presence of components


Big picture: relabeling and summarizing posterior distributions

Diagram: the true posterior f = p(· | y), known only through samples, is approximated by a member q_η of a parametric family {q_η, η ∈ N}, selected through a measure of "distance" between f and q_η; samples can in turn be drawn from q_η. [Stephens, 2000]


Variable-dimensional approximate posterior

An original (variable-dimensional) parametric model q_η

Four main requirements:
1 Must be defined on the same space X = ⋃_{k∈K} {k} × Θ_k
2 Must be permutation invariant
3 Must be "simple" (small number of parameters)
4 Must be able to capture the main features of the posterior distributions typically met in practice.


A generative model point of view

Diagram: L Bernoulli–Gaussian components, l = 1, ..., L, each with parameters η_l = {π_l, µ_l, Σ_l}:
➠ ξ_l ∈ {0, 1}, ξ_l ∼ B(π_l)
➠ u_l ∈ Θ, u_l ∼ N(µ_l, Σ_l)

The retained points {u_l | ξ_l = 1} are put in a random arrangement, yielding x = (k, θ_k) ∈ X = ⋃_k {k} × Θ_k.


A generative model point of view

1 for l = 1, ..., L: generate binary indicator ξ_l ∼ B(π_l)
2 set k = Σ_{l=1}^{L} ξ_l
3 for each l such that ξ_l = 1: generate random sample u_l ∼ N(µ_l, Σ_l)
4 random arrangement ⇒ θ_k

Example (L = 3): π = (0.4, 0.9, 0.7), µ = (0.2, 0.5, 1), s² = (0.05, 0.02, 0.1). Successive draws from this model (a code sketch follows the list below):

➠ ξ = (0, 1, 1) ⇒ k = 2 and θ_k = (0.52, 1.05)
➠ ξ = (0, 1, 0) ⇒ k = 1 and θ_k = 0.49
➠ ξ = (1, 0, 1) ⇒ k = 2 and θ_k = (0.27, 1.03)
➠ ξ = (0, 0, 1) ⇒ k = 1 and θ_k = 1.05
➠ ξ = (1, 1, 1) ⇒ k = 3 and θ_k = (0.27, 0.53, 1.15)
➠ ξ = (0, 0, 0) ⇒ k = 0 and θ_k = ()
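The four generation steps above translate directly into code. Below is a minimal sketch for the scalar case with the slide's parameters; treating the "random arrangement" as a uniform random permutation is our reading of the slide, and the function name is illustrative.

```python
# Draw one x = (k, theta_k) from the variable-dimensional parametric model q_eta
# (scalar components; "random arrangement" implemented as a uniform permutation).
import numpy as np

def sample_q_eta(pi, mu, s2, seed=None):
    rng = np.random.default_rng(seed)
    pi, mu, s2 = map(np.asarray, (pi, mu, s2))
    xi = rng.uniform(size=pi.size) < pi                  # step 1: xi_l ~ B(pi_l)
    k = int(xi.sum())                                    # step 2: k = sum_l xi_l
    u = rng.normal(mu[xi], np.sqrt(s2[xi]))              # step 3: u_l ~ N(mu_l, s2_l)
    theta_k = rng.permutation(u)                         # step 4: random arrangement
    return k, theta_k

k, theta_k = sample_q_eta(pi=(0.4, 0.9, 0.7), mu=(0.2, 0.5, 1.0),
                          s2=(0.05, 0.02, 0.1), seed=1)
```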


fitting the parametric model qη to the posterior f

minimizing the Kullback-Leibler divergence

$$J(\eta) \triangleq D_{\mathrm{KL}}\big(f(x)\,\|\,q_\eta(x)\big) = \int f(x)\, \log \frac{f(x)}{q_\eta(x)}\, dx$$

A key point: the samples x^(i), i = 1, 2, ..., M, are generated from f, so

$$J(\eta) \simeq -\frac{1}{M} \sum_{i=1}^{M} \log q_\eta\big(x^{(i)}\big) + \mathrm{Const.}$$

Objective: η̂ = argmax_η Σ_{i=1}^{M} log q_η(x^(i)).

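Up to the additive constant, the criterion reduces to a Monte Carlo average of log q_η over the stored samples, as in the sketch below; `log_q_eta` stands for a user-supplied evaluation of log q_η(x), which for the actual variable-dimensional model involves a sum over allocations and is not shown.

```python
# Monte Carlo approximation of the fitting objective (to be maximized over eta):
# (1/M) * sum_i log q_eta(x^(i)), where the x^(i) are the stored posterior samples.
import numpy as np

def mc_objective(log_q_eta, samples):
    return np.mean([log_q_eta(x) for x in samples])

# Toy usage with a scalar Gaussian q_eta (illustration only).
log_q = lambda x: -0.5 * ((x - 2.0) ** 2 / 0.25 + np.log(2 * np.pi * 0.25))
value = mc_objective(log_q, np.random.default_rng(0).normal(2.0, 0.5, 1000))
```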


Expectation Maximization (EM)

Latent (hidden) variables:
➠ binary indicator vector ξ
➠ random permutation
➠ Define z = (z_1, ..., z_k) as an allocation vector for x
➠ z_j = l ⇒ x_j comes from component l

Idea: use EM to maximize the likelihood.

The E-step is computationally expensive! For example, assuming L = 15 and k^(i) = 10, it contains 1.1 × 10^10 terms.



Stochastic EM (SEM)

At iteration (r + 1):

Stochastic (S)-step: for i = 1, ..., M, generate z^(i) from p(· | x^(i), η^(r)).

M-step: η^(r+1) = argmax_η Σ_{i=1}^{M} log p(x^(i), z^(i) | η).

[Broniatowski et al., 1983; Celeux and Diebolt, 1986; Celeux and Diebolt, 1993]

S-step: to draw z^(i) ∼ p(· | x^(i), η^(r)), we developed an I-MH (independent Metropolis–Hastings) sampler.

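The SEM iteration above has the following generic shape; `sample_allocation` and `maximize` are hypothetical callbacks standing in for the thesis's I-MH draw of z^(i) and for the maximization of the complete-data log-likelihood, neither of which is reproduced here.

```python
# Skeleton of the SEM iteration: S-step (stochastic allocation) then M-step (update eta).
def sem(samples, eta0, sample_allocation, maximize, n_iter=100):
    """samples: the M variable-dimensional posterior samples x^(i); eta0: initial eta."""
    eta = eta0
    for _ in range(n_iter):
        # S-step: draw one allocation vector z^(i) ~ p(. | x^(i), eta) per sample
        z = [sample_allocation(x, eta) for x in samples]
        # M-step: eta <- argmax_eta sum_i log p(x^(i), z^(i) | eta)
        eta = maximize(samples, z)
    return eta
```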


Robustifying solutions

Add a Poisson point process component

To capture the "outliers"

λ is the mean parameter

points are uniformly distributed on Θ

Other possibilities
➠ Robust estimates in the M-step: median instead of mean, interquartile range instead of variance
➠ Using another divergence measure

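A sketch of the robust M-step estimates mentioned above: the median replaces the mean and the interquartile range replaces the variance; rescaling the IQR by 1.349 (its value for a standard normal) is one common convention, not necessarily the thesis's exact choice.

```python
# Robust location/scale estimates for one Gaussian component in the M-step.
import numpy as np

def robust_location_scale(u):
    """u: 1-D array of points currently allocated to the component."""
    q25, q75 = np.percentile(u, [25, 75])
    mu = np.median(u)                       # median instead of mean
    sigma = (q75 - q25) / 1.349             # IQR-based scale (Gaussian convention)
    return mu, sigma ** 2                   # (mu_l, s2_l)
```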

Example 2: Spectral analysis

Figure: Observed signal (top) and its periodogram (bottom).

$$\mathcal{M}_k : \quad y[i] = \sum_{j=1}^{k} \big( a_j \cos[\omega_j i] + b_j \sin[\omega_j i] \big) + n[i].$$

[Andrieu and Doucet, 1999]


Example 2: spectral analysis (Cont.) — variable-dimensional posterior distribution

Figure: Posteriors of k and sorted radial frequencies, ω_k, given k.

S-step: randomized allocation procedure

Nine radial-frequency samples, grouped by MCMC draw (k = 2, 4, 3):
0.72 0.62 | 0.73 0.83 0.67 0.64 | 0.64 0.74 0.72

At each S-step the samples are stochastically reallocated among the Gaussian components; at each M-step the model parameters (including λ) are updated. Allocations over the SEM iterations:

Iter 0:    3 1 | 4 2 3 1 | 1 4 2
Iter 1–2:  3 1 | 4 2 3 1 | 1 3 2
Iter 3:    3 1 | 2 4 3 1 | 1 2 3
Iter 4–10: 3 1 | 3 4 2 1 | 1 3 2

λ over iterations 0–10: 0.100, 0.167, 0.226, 0.226, 0.236, 0.246, 0.248, 0.251, 0.248, 0.256, 0.255.

Figure: fitted normalized densities vs. ω at each iteration.


Results: sinusoid detection (convergence)

Figure: evolution of the model parameters µ, s², π, λ and of the criterion J vs. SEM iterations.


Results: sinusoid detection (validation)

Figure: Marginal posteriors of sorted radial frequencies (top) vs. normalized densities of the fitted Gaussian components (bottom).


Results: sinusoid detection (relabeling properties)

Figure: intensity vs. ω for the three fitted Gaussian components (µ1 = 0.62, s1 = 0.019, π1 = 1.00; µ2 = 0.68, s2 = 0.056, π2 = 0.29; µ3 = 0.73, s3 = 0.011, π3 = 0.98) and λ = 0.24.


Results: sinusoid detection (goodness-of-fit)

Figure: Estimated intensity of the radial frequencies (histogram: BMA; curve: parametric model).


Results: sinusoid detection (goodness-of-fit)

Figure: Posterior distribution of k (black) vs. its approximate version (gray).
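One plausible reading of the gray curve, consistent with the parametric model described earlier, is that under the fitted model the number of components k is the sum of independent Bernoulli presence indicators (probabilities π) and a Poisson(λ) count for the clutter term. Under that assumption, the distribution can be obtained by discrete convolution, as in the sketch below; the π and λ values are the ones reported on the relabeling slide, while the construction itself is an assumption rather than something stated on the slides.

```python
import numpy as np
from scipy.stats import poisson

def approx_posterior_k(pi, lam, k_max=10):
    """P(k) when k = sum of Bernoulli(pi_l) variables + a Poisson(lam) count,
    computed by discrete convolution (illustrative assumption)."""
    p = np.array([1.0])                   # P(k = 0) = 1 before adding any term
    for pl in pi:                         # convolve with each Bernoulli component
        p = np.convolve(p, [1.0 - pl, pl])
    p = np.convolve(p, poisson.pmf(np.arange(k_max + 1), lam))[: k_max + 1]
    return p / p.sum()

# Values fitted on the sinusoid example (see the relabeling-properties slide).
print(approx_posterior_k(pi=[1.00, 0.29, 0.98], lam=0.24).round(3))
```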


Results: sinusoid detection (Comparison with BMS)

Comp.   µ       s       π      µ_BMS   s_BMS   ω_true
1       0.620   0.019   1.00   0.617   0.016   0.628
2       0.686   0.056   0.29   —       —       0.677
3       0.727   0.011   0.98   0.727   0.012   0.726

Table: summaries of the variable-dimensional posterior distribution; the proposed method vs. BMS.

The summary obtained by the proposed approach is richer than that of the BMS approach.


Results: Auger (observatory)

Figure: A conceptual shower (http://auger.org).

Prof. Balázs Kégl from LAL, University of Paris 11.


Results: Auger (Observed signal)

Figure: Observed signal (n) made up of five muons; #PE and intensity vs. t [ns].


Results: Auger (RJ-MCMC samples)

Figure: Posterior p(k | n) of k and marginal posteriors of the sorted arrival times tµ given k (k = 4 to 7 shown).
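The bookkeeping behind this figure (the posterior of k and, given k, the marginals of the sorted arrival times) is straightforward to reproduce from raw sampler output. The sketch below assumes the RJ-MCMC output is stored as a list of arrival-time arrays, one per iteration; the names and data layout are hypothetical.

```python
import numpy as np
from collections import Counter, defaultdict

def summarize_rjmcmc(samples):
    """samples: list of 1-D arrays, one per RJ-MCMC iteration,
    holding that iteration's muon arrival times t_mu (length = k)."""
    n_iter = len(samples)

    # Posterior of the number of components, p(k | n).
    counts = Counter(len(t) for t in samples)
    p_k = {k: c / n_iter for k, c in sorted(counts.items())}

    # Within-model marginals: sorted arrival times grouped by k,
    # stacked into an (n_k, k) array for each visited k.
    grouped = defaultdict(list)
    for t in samples:
        grouped[len(t)].append(np.sort(t))
    sorted_times = {k: np.vstack(v) for k, v in grouped.items()}
    return p_k, sorted_times
```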


Results: Auger (choice of L)

Figure: Normalized densities of the fitted Gaussian components for L = 6, 7 and 8.


Results: Auger (relabeling properties)

Figure: fitted components (pdf and intensity vs. t [ns]), with λ = 0.42: µ1 = 99.55, s1 = 3.882, π1 = 1.00; µ2 = 173.28, s2 = 7.715, π2 = 0.18; µ3 = 173.41, s3 = 5.332, π3 = 1.00; µ4 = 237.78, s4 = 8.531, π4 = 0.39; µ5 = 261.18, s5 = 6.992, π5 = 0.93; µ6 = 504.32, s6 = 5.499, π6 = 1.00.


Results: Auger (choice of L)

Figure: Residuals of the fitted model (intensity vs. t [ns]) for different values of L: L = 3 (λ = 1.91), L = 4 (λ = 1.01), L = 6 (λ = 0.43), L = 8 (λ = 0.27).
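The residual plots compare, for each candidate L, the intensity of the fitted model with the intensity obtained by simply histogramming the pooled arrival times. The exact residual definition used in the thesis is not given on the slide; the sketch below is one reasonable version of that computation, with hypothetical function names.

```python
import numpy as np

def fitted_intensity(t, mu, s, pi, lam, window):
    """Intensity of the fitted model: pi-weighted Gaussian components
    plus a uniform clutter term of rate lam on the window."""
    lo, hi = window
    dens = np.full_like(t, lam / (hi - lo), dtype=float)
    for m, sd, p in zip(mu, s, pi):
        dens += p * np.exp(-0.5 * ((t - m) / sd) ** 2) / (np.sqrt(2.0 * np.pi) * sd)
    return dens

def residual_intensity(all_times, n_samples, mu, s, pi, lam, window, bins=100):
    """Histogram intensity of the pooled arrival times minus the fitted
    intensity, evaluated at the bin centres."""
    counts, edges = np.histogram(all_times, bins=bins, range=window)
    centres = 0.5 * (edges[:-1] + edges[1:])
    bma = counts / (n_samples * (edges[1] - edges[0]))  # points per ns per sample
    return centres, bma - fitted_intensity(centres, mu, s, pi, lam, window)
```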


Outline

1 Relabeling and summarizing posterior distributions: label-switching issue; variable-dimensional summarization

2 Proposed approach: an original variable-dimensional parametric model; estimating the model parameters (SEM-type algorithms); robustifying strategies

3 Results: detection and estimation of sinusoids in white noise; detection and estimation of muons in the Auger project

4 Conclusion


Conclusion & Future work

Problem: relabeling and summarizing variable-dimensional posteriors

BMS and BMA have limitations

Label-switching ⇒ marginal posteriors are identical

Contribution: we proposed to approximate the variable-dimensional posterior by an original parametric model

Developed SEM-type algorithms to estimate the parameters

Designed an I-MH sampler to generate allocation vectors (a generic sketch of such a step is given after this list)

Robustness issue: Poisson point process

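The slides only name the I-MH (independent Metropolis-Hastings) sampler used to generate allocation vectors; its specific target and proposal are not shown here. The generic acceptance step such a sampler relies on is sketched below, with the target and proposal left abstract (hypothetical interfaces).

```python
import numpy as np

rng = np.random.default_rng(0)

def imh_step(z, log_target, log_proposal, sample_proposal):
    """One independent Metropolis-Hastings move on an allocation vector z.

    log_target(z)    : log of the (unnormalized) target over allocations.
    log_proposal(z)  : log density of the independent proposal at z.
    sample_proposal(): draws a candidate allocation vector from the proposal.
    """
    z_prop = sample_proposal()
    log_alpha = (log_target(z_prop) - log_target(z)
                 + log_proposal(z) - log_proposal(z_prop))
    if np.log(rng.uniform()) < log_alpha:
        return z_prop, True   # candidate accepted
    return z, False           # candidate rejected, keep current allocation
```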

Conclusion & Future work (Cont.)

The proposed approach:

handles the label-switching issue

provides summaries for the component-specific parameters

yields meaningful probabilities of presence

Perspectives

Choice of L

Theoretical properties of the SEM-type algorithm

Adaptive RJ-MCMC samplers


List of publications

i) Alireza Roodaki, Julien Bect, and Gilles Fleury. Summarizing posterior distributions in signal decomposition problems when the number of components is unknown. In ICASSP’12, Kyoto, Japan, 2012.

ii) Alireza Roodaki, Julien Bect, and Gilles Fleury. Note on the computation of the Metropolis-Hastings ratio for Birth-or-Death moves in trans-dimensional MCMC algorithms for signal decomposition problems. Submitted to IEEE Transactions on Signal Processing.

iii) Alireza Roodaki, Julien Bect, and Gilles Fleury. Comparison of fully Bayesian and empirical Bayes approaches for joint Bayesian model selection and estimation of sinusoids via reversible jump MCMC. In ISBA’10, Benidorm, Spain, 2010.

iv) Alireza Roodaki, Julien Bect, and Gilles Fleury. An Empirical Bayes Approach for Joint Bayesian Model Selection and Estimation of Sinusoids via Reversible Jump MCMC. In EUSIPCO’10, Aalborg, Denmark, 2010.

v) Alireza Roodaki, Julien Bect, and Gilles Fleury. On the joint Bayesian model selection and estimation of sinusoids via reversible jump MCMC in low SNR situations. In ISSPA’10, Kuala Lumpur, Malaysia, 2010.

Thank you for your attention!
