Page 1

Population coding

• Population code formulation

• Methods for decoding: population vector, Bayesian inference, maximum a posteriori, maximum likelihood

• Fisher information

Page 2

Cricket cercal cells coding wind velocity

Page 3

Population vector

Theunissen & Miller, 1991

[Figure: RMS error in the estimate]
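The population-vector estimate evaluated here is not reproduced in the extracted text; presumably it is the standard form

    v_pop = \sum_{a=1}^{4} (r_a / r_max) \, c_a ,

with c_a the preferred wind direction of interneuron a and r_a its firing rate.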

Page 4

Population coding in M1

Cosine tuning:

Pop. vector:

For sufficiently large N, the population vector is parallel to the direction of arm movement.
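The tuning and read-out equations are images in the original slide; they presumably take the standard form

    <r_a> = r_0 + (r_max - r_0) \cos(\theta - \theta_a)          (cosine tuning)
    v_pop = \sum_{a=1}^{N} ((r_a - r_0) / r_max) \, c_a          (population vector)

where c_a is the preferred direction of cell a. For many cells with uniformly distributed preferred directions, the sum converges to a vector parallel to the movement direction.

A minimal numerical sketch of this read-out (the cell count, preferred directions, and rates below are illustrative choices, not values from the slide):

    import numpy as np

    rng = np.random.default_rng(1)
    N = 100                                      # number of cosine-tuned cells
    theta_pref = rng.uniform(0, 2 * np.pi, N)    # preferred directions
    c = np.stack([np.cos(theta_pref), np.sin(theta_pref)], axis=1)

    v = np.array([np.cos(0.7), np.sin(0.7)])     # true movement direction (0.7 rad)
    r0, r_max = 30.0, 50.0                       # baseline and peak rates (Hz)
    rates = r0 + (r_max - r0) * (c @ v)          # cosine tuning

    # Population vector: preferred directions weighted by response above baseline
    v_pop = ((rates - r0)[:, None] * c).sum(axis=0)
    print("estimated direction:", np.arctan2(v_pop[1], v_pop[0]), "rad")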

Page 5

The population vector is neither general nor optimal.

“Optimal”: Bayesian inference and MAP

Page 6

Bayesian inference

By Bayes' law, p[s|r] = p[r|s] p[s] / p[r].

Introduce a cost function L(s, s_Bayes); minimise the mean cost.

For least squares, L(s, s_Bayes) = (s - s_Bayes)^2; the solution is the conditional mean.
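As a one-line check of the least-squares case: setting the derivative of the mean cost with respect to s_Bayes to zero,

    \partial/\partial s_Bayes \int ds \, p[s|r] (s - s_Bayes)^2 = 0   =>   s_Bayes = \int ds \, p[s|r] \, s ,

i.e. the mean of the posterior.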

Page 7

MAP and ML

MAP: s* which maximizes p[s|r]

ML: s* which maximizes p[r|s]

The difference is the role of the prior: the two criteria differ by the factor p[s]/p[r].

For cercal data:

Page 8

Decoding an arbitrary continuous stimulus

E.g. Gaussian tuning curves
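The tuning-curve formula is an image in the original; presumably the usual Gaussian form

    f_a(s) = r_max \exp( -(s - s_a)^2 / (2 \sigma_a^2) ),

with preferred stimulus s_a and width \sigma_a.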

Page 9

Need to know the full P[r|s].

Assume Poisson spiking:

Assume independent neurons:
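Under these two assumptions the likelihood, written here in a standard form (the slide's own equations are images), factorises over neurons, with n_a = r_a T the spike count of cell a in a window of length T:

    P[r|s] = \prod_{a=1}^{N} \frac{(f_a(s) T)^{n_a}}{n_a!} \, e^{-f_a(s) T} .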

Population response of 11 cells with Gaussian tuning curves

Page 10

Apply ML: maximise P[r|s] with respect to s

Set the derivative of ln P[r|s] to zero, using the fact that \sum_a f_a(s) is approximately constant.

From the Gaussianity of the tuning curves, and when all widths \sigma_a are equal, the estimate reduces to a spike-count-weighted mean of the preferred stimuli (see below).
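The omitted algebra, reconstructed in the standard form: with \sum_a f_a(s) \approx const, only the n_a \ln f_a(s) terms contribute, and for Gaussian tuning curves f_a'(s)/f_a(s) = -(s - s_a)/\sigma_a^2, so

    \sum_a n_a (s - s_a) / \sigma_a^2 = 0
    =>  s^* = \frac{\sum_a n_a s_a / \sigma_a^2}{\sum_a n_a / \sigma_a^2} ,

and if all \sigma_a are equal, s^* = \sum_a n_a s_a / \sum_a n_a.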

Page 11

Apply MAP: maximise p[s|r] with respect to s

Set the derivative of ln p[s|r] to zero, again using \sum_a f_a(s) \approx constant.

From the Gaussianity of the tuning curves (and of the prior):
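Assuming a Gaussian prior with mean s_prior and variance \sigma_prior^2, the same calculation (reconstructed here in the standard form) gives

    s^* = \frac{\sum_a n_a s_a / \sigma_a^2 + s_prior / \sigma_prior^2}{\sum_a n_a / \sigma_a^2 + 1 / \sigma_prior^2} ,

so the prior pulls the estimate toward s_prior, with an influence that shrinks as more spikes are counted.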

Page 12

Given this data:

[Figure: MAP estimates for a constant prior and for a prior with mean -2 and variance 1]
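A minimal numerical sketch of the ML and MAP decoders described on the preceding slides, assuming Gaussian tuning curves, independent Poisson spike counts, and a Gaussian prior with mean -2 and variance 1; the tuning-curve parameters (11 cells, their spacing, r_max, sigma) are illustrative choices, not values read off the original figure:

    import numpy as np

    N = 11
    s_pref = np.linspace(-5, 5, N)    # preferred stimuli s_a
    sigma = 1.0                       # common tuning-curve width
    r_max = 20.0                      # peak rate (Hz)
    T = 1.0                           # counting window (s)

    def tuning(s):
        """Gaussian tuning curves f_a(s)."""
        return r_max * np.exp(-(s - s_pref) ** 2 / (2 * sigma ** 2))

    rng = np.random.default_rng(0)
    s_true = -1.0
    counts = rng.poisson(tuning(s_true) * T)       # simulated population response

    # Log likelihood of the counts on a stimulus grid (independent Poisson cells)
    s_grid = np.linspace(-8, 8, 2001)
    log_lik = np.array([np.sum(counts * np.log(tuning(s) * T) - tuning(s) * T)
                        for s in s_grid])
    log_prior = -(s_grid + 2.0) ** 2 / 2.0         # Gaussian prior, mean -2, var 1

    s_ml = s_grid[np.argmax(log_lik)]              # ML: flat prior
    s_map = s_grid[np.argmax(log_lik + log_prior)] # MAP: include the prior
    s_ml_closed = np.sum(counts * s_pref) / np.sum(counts)  # equal-width shortcut
    print(f"true {s_true:.2f}  ML {s_ml:.2f} (closed form {s_ml_closed:.2f})  MAP {s_map:.2f}")

The MAP estimate sits between the ML estimate and the prior mean, moving toward the data as the spike counts grow.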

Page 13

How good is our estimate?

For stimulus s, we have an estimate s_est. Its quality is characterised by (definitions below):

Bias

Variance

Mean square error

Cramér-Rao bound, in terms of the Fisher information
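In the standard notation (the slide's own formulas are images):

    b_est(s) = <s_est> - s                                    (bias)
    \sigma_est^2(s) = <(s_est - <s_est>)^2>                   (variance)
    <(s_est - s)^2> = \sigma_est^2(s) + b_est(s)^2            (mean square error)
    \sigma_est^2(s) \ge (1 + b_est'(s))^2 / I_F(s)            (Cramér-Rao bound)

where I_F(s) is the Fisher information defined on the next slide.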

Page 14

Fisher information

It can be defined as the negative expected curvature of the log likelihood or, alternatively, as the expected squared derivative; for the Gaussian tuning curves it takes a simple closed form (see below).
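The standard expressions presumably intended here:

    I_F(s) = < -\partial^2 \ln p[r|s] / \partial s^2 >  =  < (\partial \ln p[r|s] / \partial s)^2 > ,

and for independent Poisson neurons,

    I_F(s) = T \sum_a f_a'(s)^2 / f_a(s)  =  T \sum_a \frac{(s - s_a)^2}{\sigma_a^4} f_a(s)

for the Gaussian tuning curves.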

Page 15

Fisher information for Gaussian tuning curves

Quantifies local stimulus discriminability

Page 16

Do narrow or broad tuning curves produce better encodings?

Approximate the sum over cells by an integral (sketch below):

Thus, narrow tuning curves are better.

But not in higher dimensions!
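The approximation, sketched in the usual way: for a dense array of Gaussian tuning curves with spacing \Delta s,

    I_F(s) \approx \frac{T}{\Delta s} \int dx \, \frac{x^2}{\sigma^4} r_max e^{-x^2 / 2\sigma^2}
           = \frac{\sqrt{2\pi} \, r_max T}{\sigma \, \Delta s} ,

so in one dimension the Fisher information grows as the width \sigma shrinks. Repeating the argument for a D-dimensional stimulus at fixed cell density gives I_F \propto \sigma^{D-2}: the width is irrelevant in two dimensions, and broad tuning wins for D > 2.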

Page 17

Fisher information and discrimination

Recall d' = (difference in means) / (standard deviation).

We can also decode first and then discriminate using the decoded values.

To discriminate s from s + \Delta s: the difference in the estimates is \Delta s (for an unbiased estimator) and the variance of each estimate is 1/I_F(s), so d' = \Delta s \sqrt{I_F(s)}.

Page 18

Comparison of Fisher information and human discrimination thresholds for orientation tuning

Minimum standard deviation of the estimate of orientation angle, from the Cramér-Rao bound

Data from discrimination thresholds for the orientation of objects, as a function of size and eccentricity