Projects:
1. Predictive coding in balanced spiking networks (Erwan Ledoux).
2. Using Canonical Correlation Analysis (CCA) to analyse neural data (David Schulz).
3. Encoding and Decoding in the Auditory System (Izzett Burak Yildiz).
4. Quadratic programming of tuning curves: a theory for tuning curve shape (Ralph Bourdoukan).
5. The Bayesian synapse: A theory for synaptic short term plasticity (Sophie Deneve).
Projects:
1. Choose a project. Send an email to [email protected]
2. Once a project is assigned, make an appointment with your advisor as soon as possible (before April 17).
3. Plan another meeting with your advisor (mid-May).
4. Prepare the oral presentation (June 5). Pedagogy, context, and clarity matter; results are not so important.
The efficient coding hypothesis
Predicting sensory receptive fields
Schematics of the visual system
The retina
Center-surround RFs
Hubel and Wiesel
V1 orientation-selective cell
Hubel and Wiesel model
How are receptive fields measured?
It is a linear regression problem
Solution: $w = \langle \mathbf{s}\,\mathbf{s}^T \rangle^{-1} \langle \mathbf{s}\, r \rangle$
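To make the regression concrete, here is a minimal sketch in Python (the data, noise level, and dimensions are illustrative, not from the lecture): white-noise stimuli and simulated responses stand in for an experiment, and the receptive field is recovered by solving the normal equations.

```python
# A sketch of receptive-field estimation as linear regression
# (reverse correlation). Stimuli and responses are simulated here;
# with real data they would come from the experiment.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_pixels = 5000, 64

S = rng.standard_normal((n_samples, n_pixels))  # white-noise stimuli

# A hypothetical "true" receptive field generates noisy responses.
w_true = rng.standard_normal(n_pixels)
r = S @ w_true + 0.5 * rng.standard_normal(n_samples)

# w = <s s^T>^{-1} <s r>: solve the normal equations.
w_hat = np.linalg.solve(S.T @ S, S.T @ r)

print("correlation with true RF:", np.corrcoef(w_true, w_hat)[0, 1])
```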
Receptive fields of V1 simple cells
Optimal sensory coding?
The notion of surprise
The entropy of a distribution
Minimal and maximal entropy
Maximizing information transfer
Conditional entropy $H(Y|X)$: surprise about $Y$ when one knows $X$:
$H(Y|X) = \sum_{x \in A} p(x)\, H(Y \mid X = x)$
with
$H(Y \mid X = x) = -\sum_{y \in B} p(y|x) \log p(y|x)$
Or, more compactly:
$H(Y|X) = -\sum_{x \in A,\ y \in B} p(x, y) \log p(y|x)$
Maximizing information transfer
Conditional entropy $H(Y|X) = \sum_{x \in A} p(x)\, H(Y \mid X = x)$: surprise about $Y$ when one knows $X$.
Mutual information between $X$ and $Y$:
$I(X, Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)$
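A small worked example (mine, not from the slides) evaluating these definitions on a hypothetical discrete joint distribution:

```python
# A worked example of the definitions above on a small discrete
# joint distribution p(x, y). The numbers are illustrative.
import numpy as np

p_xy = np.array([[0.3, 0.1],   # p(x=0, y=0), p(x=0, y=1)
                 [0.1, 0.5]])  # p(x=1, y=0), p(x=1, y=1)

p_x = p_xy.sum(axis=1)  # marginal p(x)
p_y = p_xy.sum(axis=0)  # marginal p(y)

def H(p):
    """Shannon entropy in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# H(Y|X) = -sum_{x,y} p(x,y) log p(y|x)
p_y_given_x = p_xy / p_x[:, None]
H_Y_given_X = -np.sum(p_xy * np.log2(p_y_given_x))

# I(X,Y) = H(Y) - H(Y|X); check against H(X) + H(Y) - H(X,Y)
I_XY = H(p_y) - H_Y_given_X
assert np.isclose(I_XY, H(p_x) + H(p_y) - H(p_xy.ravel()))
print(f"H(Y) = {H(p_y):.3f}, H(Y|X) = {H_Y_given_X:.3f}, I = {I_XY:.3f}")
```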
Maximizing information
Mutual information between $X$ and $Y$: $I(X, Y) = H(X) - H(X|Y)$
Maximize $H(X)$ (interesting, not boring responses)… or… minimize $H(X|Y)$ (a precise channel, not an unreliable one).
[Figure: a channel from X to Y, contrasting a broad, unreliable posterior $p(x|y)$ with a narrow, precise one.]
Sensory system as information channel
Maximizing information transfer
Mutual information between stimulus $\mathbf{s}$ and response $\mathbf{r}$:
$I(\mathbf{r}, \mathbf{s}) = H(\mathbf{s}) - H(\mathbf{s}|\mathbf{r}) = H(\mathbf{r}) - H(\mathbf{r}|\mathbf{s})$
With no noise, $H(\mathbf{r}|\mathbf{s})$ is fixed, so maximize $H(\mathbf{r})$.
Generative models
Analysis models
Maximizing information transfer
Distribution of responses
Entropy maximization
Infomax activation function
An example in the fly
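The fly example can be illustrated in a few lines: for a single neuron with a bounded output, response entropy is maximized when the activation function equals the cumulative distribution of the stimulus, so all response levels are used equally often. A minimal sketch (the stimulus distribution and sizes are illustrative):

```python
# Sketch of the infomax activation function: taking the activation to
# be the stimulus CDF makes the response distribution uniform, which
# maximizes response entropy for a bounded output (as in the fly LMC).
import numpy as np

rng = np.random.default_rng(1)

# A skewed, hypothetical stimulus distribution (e.g. contrasts).
stimuli = rng.gamma(shape=2.0, scale=1.0, size=100_000)

sorted_s = np.sort(stimuli)
def activation(s):
    """Empirical CDF of the stimulus, used as activation function."""
    return np.searchsorted(sorted_s, s) / len(sorted_s)

responses = activation(stimuli)

# All response levels are used equally often: densities close to 1.
hist, _ = np.histogram(responses, bins=10, range=(0, 1), density=True)
print(np.round(hist, 2))
```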
But: neurons cannot implement arbitrary activation functions!
Information maximization
Two neurons
Each neuron maximizing its own entropy
Entropy of a 2D distribution
Two neurons
Entropy maximization = Independent component analysis
Entropy maximization, 2 neurons
Independent component analysis, N neurons
Application: visual processing
Transformation of the visual input
Entropy maximization
Entropy maximization
Weights learnt by ICA (image patch)
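In practice such features can be computed with an off-the-shelf ICA implementation. A sketch using scikit-learn's FastICA (an assumption of mine, not the lecture's tool; random data stands in for whitened natural image patches, so the filters learned here will not be Gabor-like):

```python
# Sketch of learning features with ICA. Real experiments use whitened
# patches cut from natural images; random data stands in here so the
# snippet is self-contained (so these filters will NOT be Gabor-like).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
n_patches, side = 10_000, 8
X = rng.standard_normal((n_patches, side * side))  # flattened "patches"

# FastICA extracts maximally independent components; on natural image
# patches the learned filters are localized, oriented edge detectors.
ica = FastICA(n_components=side * side, whiten="unit-variance",
              max_iter=200, random_state=0)
ica.fit(X)

filters = ica.components_.reshape(-1, side, side)  # one filter per row
print(filters.shape)
```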
The distribution of natural images
Geometric interpretation of ICA
First stages of visual processing
The efficient coding hypothesis
Limitations of ICA
Works only once: great for one stage, but then what?
Complete basis: number of features = number of pixels.
Bottleneck: number of optic nerve fibers << number of retinal receptors.
Maximizing information transfer
Mutual information between stimulus $\mathbf{s}$ and response $\mathbf{r}$:
$I(\mathbf{r}, \mathbf{s}) = H(\mathbf{s}) - H(\mathbf{s}|\mathbf{r}) = H(\mathbf{r}) - H(\mathbf{r}|\mathbf{s})$
Analysis models: $H(\mathbf{r}|\mathbf{s})$ fixed (no noise), so maximize $H(\mathbf{r})$.
Generative models: $H(\mathbf{s})$ fixed, so minimize $H(\mathbf{s}|\mathbf{r})$, the reconstruction error.
Maximizing information
Mutual information between $\mathbf{s}$ and $\mathbf{r}$: $I(\mathbf{r}, \mathbf{s}) = H(\mathbf{s}) - H(\mathbf{s}|\mathbf{r})$
$H(\mathbf{s})$ is fixed, so minimize $H(\mathbf{s}|\mathbf{r})$.
[Figure: a channel from s to r, contrasting a broad, unreliable posterior $p(\mathbf{s}|\mathbf{r})$ with a narrow, precise one.]
r must predict the sensory input as well as possible.
Generative model
Hidden causes $h_1, \dots, h_5$ generate sensory inputs $s_1, s_2, s_3$:
$s_i = \sum_j \Phi_{ij} h_j + \text{noise}$
The hidden causes are independent, with prior $p(h)$.
Generative model
$s_i = \sum_j \Phi_{ij} h_j + \text{noise}$, with independent prior $p(h)$.
$I(\mathbf{s}, \mathbf{h}) = H(\mathbf{s}) - H(\mathbf{s}|\mathbf{h})$
Generative model
$s_i = \sum_j \Phi_{ij} h_j + \text{noise}$, with independent prior $p(h)$.
Find the dictionary of features $\Phi$ minimizing $H(\mathbf{s}|\mathbf{h})$ in
$I(\mathbf{s}, \mathbf{h}) = H(\mathbf{s}) - H(\mathbf{s}|\mathbf{h})$
The Gaussian distribution
With Gaussian noise, minimizing $H(\mathbf{s}|\mathbf{h})$ amounts to minimizing the mean squared reconstruction error.
Generative model, recognition model
Generate: $s_i = \sum_j \Phi_{ij} h_j + N(0, \sigma^2)$, with independent prior $p(h)$.
Recognize: responses estimate the hidden causes, $r_j = \hat{h}_j$.
Reconstruction error: $s_i - \hat{s}_i$, where $\hat{s}_i = \sum_j \Phi_{ij} r_j$.
Minimizing the entropy $H(\mathbf{s}|\mathbf{h})$ = minimizing the expected reconstruction error.
Separate the problem in two:
• Start with some random dictionary $\Phi^*$.
• Given the current sensory input $\mathbf{s}$ and dictionary $\Phi^*$, estimate the hidden state $\mathbf{r}$.
• Given the current state estimate $\mathbf{r}$ and sensory input $\mathbf{s}$, update $\Phi^*$ to minimize the reconstruction error.
• Repeat until convergence.
How to estimate $\mathbf{r} = \hat{\mathbf{h}}$?
Maximum a posteriori (MAP):
$\mathbf{r} = \arg\max_{\mathbf{h}}\, p(\mathbf{h} \mid \mathbf{s}, \Phi)$
How to estimate $\mathbf{r} = \hat{\mathbf{h}}$?
Bayes' rule: $p(\mathbf{h} \mid \mathbf{s}, \Phi) = \dfrac{p(\mathbf{s} \mid \mathbf{h}, \Phi)\, p(\mathbf{h})}{p(\mathbf{s})}$
Hence:
$\mathbf{r} = \arg\max_{\mathbf{h}} \left[ \log p(\mathbf{s} \mid \mathbf{h}, \Phi) + \log p(\mathbf{h}) \right]$
Reconstruction error and MAP
Normal likelihood: $p(s_i \mid \mathbf{h}) = N\!\left(\sum_j \Phi_{ij} h_j,\ \sigma^2\right)$, with $\sigma^2$ the variance of the pixel noise.
Prior cost: $C(h_k) = -\log p(h_k)$.
$-\log p(\mathbf{h} \mid \mathbf{s}, \Phi) = \frac{1}{2\sigma^2} \sum_i \left( s_i - \sum_j \Phi_{ij} h_j \right)^2 + \sum_k C(h_k) + \text{const}$
The minus log posterior is equivalent to the reconstruction error plus a prior cost.
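For completeness, a short derivation sketch (standard algebra, not spelled out on the slide) of where this expression comes from:

```latex
% With independent Gaussian pixel noise,
% p(s | h, Phi) = prod_i N(s_i ; sum_j Phi_ij h_j, sigma^2),
% and an independent prior p(h) = prod_k p(h_k):
\begin{align*}
-\log p(\mathbf{h} \mid \mathbf{s}, \Phi)
  &= -\log p(\mathbf{s} \mid \mathbf{h}, \Phi) - \log p(\mathbf{h})
     + \log p(\mathbf{s}) \\
  &= \frac{1}{2\sigma^2} \sum_i \Big( s_i - \sum_j \Phi_{ij} h_j \Big)^2
     + \sum_k C(h_k) + \mathrm{const},
\end{align*}
% where C(h_k) = -\log p(h_k) and the constant absorbs \log p(\mathbf{s})
% and the Gaussian normalization.
```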
Minimize reconstruction error
Reconstructed sensory input: $\hat{s}_i = \sum_j \Phi_{ij} r_j$ (neural responses $r_j$, dictionary of features $\Phi$).
$\mathbf{r} = \arg\min_{\mathbf{r}} \left[ \sum_i \left( s_i - \hat{s}_i \right)^2 + \sum_k C(r_k) \right]$
Reconstruction error plus penalty (cost).
How to estimate $\mathbf{r} = \hat{\mathbf{h}}$? Maximize the log posterior probability by gradient ascent, giving the neural dynamics
$\dot{\mathbf{r}} \propto \Phi^T (\mathbf{s} - \Phi \mathbf{r}) - C'(\mathbf{r})$
or, component-wise,
$\dot{r}_j \propto \sum_i \Phi_{ij} (s_i - \hat{s}_i) - C'(r_j)$
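A minimal sketch of these dynamics (my assumptions: a Laplace cost $C(r) = \lambda |r|$, so $C'(r) = \lambda\,\mathrm{sign}(r)$, Euler integration, and random stand-ins for $\Phi$ and $\mathbf{s}$):

```python
# Sketch of the inference dynamics, assuming a Laplace cost
# C(r) = lam * |r| (so C'(r) = lam * sign(r)) and Euler integration.
# Phi and s are random stand-ins with illustrative sizes.
import numpy as np

rng = np.random.default_rng(3)
n_pixels, n_neurons = 64, 128

Phi = rng.standard_normal((n_pixels, n_neurons)) / np.sqrt(n_pixels)
s = rng.standard_normal(n_pixels)  # stand-in sensory input
r = np.zeros(n_neurons)            # responses start at rest

lam, dt = 0.1, 0.05
for _ in range(500):
    s_hat = Phi @ r                                      # reconstruction
    r += dt * (Phi.T @ (s - s_hat) - lam * np.sign(r))   # dr/dt

print("reconstruction error:", np.sum((s - Phi @ r) ** 2))
```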
How to update the dictionary
Minimize the mean squared error $E = \sum_i (s_i - \hat{s}_i)^2$ by gradient descent:
$\Delta \Phi_{ij} \propto r_j (s_i - \hat{s}_i)$
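And a sketch of the corresponding dictionary step (the column renormalization is a standard sparse-coding trick I am assuming, not read off the slide):

```python
# Sketch of the dictionary step: one gradient-descent step on
# E = sum_i (s_i - s_hat_i)^2, i.e. Delta Phi_ij ∝ r_j (s_i - s_hat_i).
import numpy as np

def update_dictionary(Phi, s, r, lr=0.01):
    s_hat = Phi @ r
    Phi = Phi + lr * np.outer(s - s_hat, r)
    # Renormalizing the feature norms keeps them from growing without
    # bound (a standard sparse-coding trick, assumed here).
    Phi /= np.linalg.norm(Phi, axis=0, keepdims=True)
    return Phi
```

Alternating the inference sketch above with this update over many inputs implements the two-step scheme of the previous slides.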
Generative model, recognition model
1. Find the most probable hidden states $\mathbf{r}$ (recognize).
2. Update $\Phi$ to minimize the MSE (generate).
What prior to use? Sparse coding
Cost = number of neurons with non-zero responses
Good: many cortical neurons are near-silent…
[Figure: a sparse prior $p(h)$, sharply peaked at zero (good), vs. a non-sparse one (bad); sparse responses $r_i$ of an edge detector.]
Sparse prior: $p(h_k) \propto \exp(-|h_k|)$, i.e. $p(h) \propto e^{-|h|}$, giving the cost $C(h_k) = -\log p(h_k) = |h_k| + \text{const}$.
Elementary features found by sparse coding.
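A quick numerical illustration (mine, not from the slides) of why this prior is called sparse: compared with a Gaussian of equal variance, the Laplace density puts far more mass near zero.

```python
# Why this prior is "sparse": compared with a Gaussian of the same
# variance, the Laplace density p(h) ∝ exp(-|h|) puts far more mass
# near zero (near-silent neurons) while keeping heavier tails.
import numpy as np

rng = np.random.default_rng(4)
laplace = rng.laplace(scale=1.0, size=100_000)        # variance 2
gauss = rng.normal(scale=np.sqrt(2.0), size=100_000)  # variance 2

for name, x in [("laplace", laplace), ("gaussian", gauss)]:
    print(name, "fraction with |h| < 0.1:",
          round(float(np.mean(np.abs(x) < 0.1)), 3))
```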
Limitation of the sparse coding approach applied to sensory RFs
[Figure: the generative/recognition schematic again, hidden causes $h_j$ generating inputs $s_i$, responses $r_j$ recognizing them.]
"Receptive fields" ($\hat{\mathbf{r}} = W \mathbf{s}$) and "predictive fields" ($\hat{\mathbf{s}} = \Phi \mathbf{h}$) are different!
Receptive fields depend on stimulus type (Carandini et al., J Neurosci 2005).
[Figure: spectro-temporal receptive field (STRF), frequency vs. time.]
Responses to natural scenes are poorly predicted by the RF (Machens CK, Wehr MS, Zador AM, J Neurosci 2004).