Levels in Computational Neuroscience
Reasonably good understanding (for our purposes!)
Poor understanding
Poorer understanding
Very poor understanding
From neuron to network
The layered structure of the first visual area, and connections to other areas (Fig. 27.10 in Kandel and Schwartz, Principles of Neural Science)
The columnar organization of the monkey visual cortex (Fig. 12.6 in Shepherd, The Synaptic Organization of the Brain)
Definition of the firing rate in terms of a temporal average. (Fig. 1.9, Spiking Neuron Models)
Definition of the firing rate in terms of the peri-stimulus-time-histogram (PSTH) as an average over several runs of an experiment. (Fig. 1.10, Spiking Neuron Models)
Definition of the firing rate as a population density.
Gerstner & Kistler Fig. 1.11
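The three firing-rate definitions above (temporal average, PSTH, population density) can be sketched in a few lines. The homogeneous Poisson spike trains and the 15 Hz rate below are illustrative assumptions, not data from the figures.

```python
import numpy as np

rng = np.random.default_rng(0)
T, dt = 2.0, 0.001                      # 2 s of simulated time, 1 ms bins
n_trials, n_neurons = 20, 50

# Poisson spike trains: True = spike in that bin (assumed 15 Hz rate)
spikes = rng.random((n_trials, n_neurons, int(T / dt))) < 15 * dt

# 1) Temporal average (Fig. 1.9): one neuron, one trial, whole recording
rate_temporal = spikes[0, 0].sum() / T                  # spikes per second

# 2) PSTH (Fig. 1.10): one neuron, averaged over trials, 100 ms bins
bin_w = 100                                             # bins per 100 ms
counts = spikes[:, 0].reshape(n_trials, -1, bin_w).sum(axis=2)
psth = counts.mean(axis=0) / (bin_w * dt)               # Hz per time bin

# 3) Population density (Fig. 1.11): all neurons, one trial, short window
window = spikes[0, :, :bin_w]                           # first 100 ms
rate_population = window.sum() / (n_neurons * bin_w * dt)
```

All three estimates should hover around the underlying 15 Hz rate; they differ in what is averaged over (time, trials, or neurons).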
Feedforward inputs to a single neuron.
Dayan and Abbott Fig. 7.1
Feedforward and recurrent networks
Dayan and Abbott Fig. 7.3
Dayan and Abbott Fig. 7.4
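A minimal rate-model sketch of the feedforward-plus-recurrent architecture in these figures: feedforward weights W drive the network, recurrent weights M feed output back. The sizes, weight values, and rectifying nonlinearity below are illustrative assumptions.

```python
import numpy as np

def simulate_rate_network(W, M, u, tau=0.01, dt=0.001, steps=2000):
    """Euler-integrate tau dv/dt = -v + [W u + M v]_+ ,
    a firing-rate network with feedforward (W) and recurrent (M) weights."""
    h = W @ u                           # feedforward drive, fixed in time
    v = np.zeros(M.shape[0])
    for _ in range(steps):
        v += dt / tau * (-v + np.maximum(0.0, h + M @ v))
    return v

# Hypothetical sizes: 10 output units, 5 inputs, weak uniform recurrence
rng = np.random.default_rng(1)
W = rng.random((10, 5))
M = 0.05 * np.ones((10, 10))            # largest eigenvalue 0.5 < 1: stable
u = np.ones(5)
v_ss = simulate_rate_network(W, M, u)   # steady-state rates
```

Setting M to zero recovers a purely feedforward network; the steady state then is just the rectified feedforward drive.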
Coordinate transformations during a reaching task: target, fixation point
Gaze angle Retinal angle
Body coordinates
Objective: transform from retinal coordinates to body coordinates
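In one dimension the objective reduces to adding the gaze angle to the retinal angle; the sign convention (rightward positive) in this sketch is an illustrative assumption.

```python
# Body-centered target angle = retinal angle + gaze angle (degrees).
# Sign convention (rightward positive) is an illustrative assumption.
def retinal_to_body(retinal_deg: float, gaze_deg: float) -> float:
    return retinal_deg + gaze_deg

# Target 10 deg right of fixation while gazing 20 deg left of the midline:
body_angle = retinal_to_body(10.0, -20.0)   # 10 deg left of the body midline
```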
Tuning curves of a visually responsive neuron
in premotor cortex
Dayan and Abbott Fig. 7.5
Head fixed
Fixate on
• Body coordinates
• Response curve fixed!
• Retinal coordinates
• Curve shifts to compensate!
Head rotates
Fixation fixed
Model tuning curve
g = 0°, g = 10°, g = −20°
Dayan and Abbott Fig. 7.6
The gaze-dependent gain modulation of visual responses of neurons in area 7a
Tuning curve
2 Gaze directions
Gaze independence!
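The gain-field idea can be sketched as a retinal tuning curve whose amplitude, but not peak location, depends on gaze. The Gaussian shape, gain slope, and all parameter values below are illustrative, not fits to area 7a data.

```python
import numpy as np

def tuned_response(s, g, s_pref=0.0, sigma=15.0, g_slope=0.02, g0=0.5):
    """Gaussian tuning in retinal coordinates s, multiplied by a
    gaze-dependent gain (a sketch of area 7a gain fields; parameters
    are illustrative)."""
    tuning = np.exp(-(s - s_pref) ** 2 / (2 * sigma ** 2))
    gain = np.maximum(0.0, g0 + g_slope * g)    # planar gain in gaze g
    return gain * tuning

s = np.linspace(-60, 60, 121)                   # retinal angle, degrees
r_left  = tuned_response(s, g=-20.0)            # gaze 20 deg left
r_right = tuned_response(s, g=+20.0)            # gaze 20 deg right
```

The peak stays at the same retinal location for both gaze directions; only the response amplitude scales, which is the gain modulation of the figure.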
Related to s
2D tuning function
Burst and integrator neurons involved in horizontal eye positioning
Dayan and Abbott Fig. 7.7
Eigenvector expansion
Steady state rates – linear network
Real-valued matrix M with complex eigenvalues: use the real and imaginary parts of the eigenvectors
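For a symmetric weight matrix (real eigenvalues, orthonormal eigenvectors), the steady state of the linear network tau dv/dt = −v + h + Mv can be written directly as the eigenvector expansion. The random matrix, its scaling, and the size 8 below are illustrative assumptions.

```python
import numpy as np

# Steady state of the linear network tau dv/dt = -v + h + M v, via the
# eigenvector expansion v_inf = sum_i (h . e_i) / (1 - lam_i) e_i.
rng = np.random.default_rng(2)
A = rng.standard_normal((8, 8))
M = (A + A.T) / 2                       # symmetric -> real eigenvalues
M /= np.abs(np.linalg.eigvalsh(M)).max() / 0.9   # scale so all |lam_i| < 1
h = rng.standard_normal(8)

lam, E = np.linalg.eigh(M)              # columns of E are orthonormal e_i
v_inf = E @ ((E.T @ h) / (1 - lam))     # eigenvector expansion
```

Modes with lam_i close to 1 are amplified by the large factor 1/(1 − lam_i), which is the basis of the selective amplification shown next.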
Selective amplification by a linear network
Dayan and Abbott Fig. 7.8
Input: cosine with peak at θ = 0°, plus added noise
Fourier amplitude of inputs
Output: steady state
Fourier amplitude of output
All Fourier components present in the input; the θ = 0 component is enhanced in the output
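A sketch of the selective-amplification mechanism: cosine recurrent weights whose matching Fourier mode has eigenvalue lam1 = 0.9 boost that mode of the input by 1/(1 − lam1) = 10, while every other Fourier mode passes through unchanged. Network size and noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100
theta = np.linspace(-np.pi, np.pi, N, endpoint=False)

h = np.cos(theta) + 0.3 * rng.standard_normal(N)    # cosine input + noise
lam1 = 0.9                                          # eigenvalue of cos mode
M = (2 * lam1 / N) * np.outer(np.cos(theta), np.cos(theta))

v = np.linalg.solve(np.eye(N) - M, h)               # linear steady state

# Fundamental (cos theta) Fourier amplitude, before and after the network:
amp_in  = np.abs(np.fft.rfft(h))[1]
amp_out = np.abs(np.fft.rfft(v))[1]                 # ~10x larger
```

Because M is rank one (only the cosine mode), the noise in every other Fourier component is passed through with gain exactly 1.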
Effect of nonlinearity on amplification
Dayan and Abbott Fig. 7.8
Smoother response
Several Fourier components appear
Visual information flow
Dayan and Abbott Fig.2.5
Center-surround response; oriented response
Visual receptive fields
Dayan and Abbott Fig. 2.25
Mathematical fit
Actual response
LGN neuron: center-surround
V1 neuron (simple): orientation selective
Hubel-Wiesel model
Low response
Simple summation
Vertical response; undirected response
High response
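The Hubel-Wiesel construction can be sketched by summing center-surround (difference-of-Gaussians) receptive fields whose centers lie along a vertical line; the resulting unit responds strongly to a vertical bar and weakly to a horizontal one. The DoG widths, center spacing, and bar stimuli below are illustrative assumptions.

```python
import numpy as np

def dog(x, y, cx, cy, sc=1.0, ss=2.0):
    """Center-surround (difference-of-Gaussians) LGN receptive field,
    normalized so the excitatory and inhibitory lobes integrate equally."""
    r2 = (x - cx) ** 2 + (y - cy) ** 2
    return np.exp(-r2 / (2 * sc**2)) / sc**2 - np.exp(-r2 / (2 * ss**2)) / ss**2

# Simple-cell sketch: sum LGN centers along a vertical line at x = 0.
g = np.linspace(-10, 10, 81)
x, y = np.meshgrid(g, g)
rf = sum(dog(x, y, 0.0, cy) for cy in (-4.0, -2.0, 0.0, 2.0, 4.0))

vertical_bar   = (np.abs(x) < 1).astype(float)   # bar along the RF axis
horizontal_bar = (np.abs(y) < 1).astype(float)   # bar across the RF axis
resp_v = max(0.0, (rf * vertical_bar).sum())     # high response
resp_h = max(0.0, (rf * horizontal_bar).sum())   # low response
```

The vertical bar covers all five excitatory centers at once, while the horizontal bar hits one center and mostly inhibitory surround, reproducing the high/low responses of the model.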
Effect of contrast
Dayan and Abbott Fig. 7.10
4 input contrast levels
Real responses vs. network amplification
Note: the response is amplified but not broadened
Nonlinear winner-takes-all selection
Dayan and Abbott Fig. 7.12
Input: centered at ±90°. Output: the higher peak is selected
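The ring network of Fig. 7.12 is more elaborate, but the same selection principle already appears in a two-unit sketch with rectification and mutual inhibition; the inhibitory weight and the two input strengths below (standing in for the bumps at ±90°) are illustrative.

```python
# Minimal winner-takes-all: two rectified units with mutual inhibition.
# The unit with the larger input drives the other to zero.
def wta(h1, h2, w_inh=2.0, dt=0.1, steps=500):
    v1 = v2 = 0.0
    for _ in range(steps):
        v1 += dt * (-v1 + max(0.0, h1 - w_inh * v2))
        v2 += dt * (-v2 + max(0.0, h2 - w_inh * v1))
    return v1, v2

v1, v2 = wta(1.0, 0.7)    # inputs mimic the two bumps; the larger one wins
```

The symmetric coexistence fixed point is a saddle, so any input asymmetry pushes the network to the state where only the stronger unit is active.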
Associative recall
Dayan and Abbott Fig. 7.16
2 representative units
Memory: units 18–31 high, others low
Memory: every 4th unit high
Nv = 50, 4 patterns
Partial inputs; converged outputs
Pattern recall – Hopfield model
Input Output
Time
Dayan and Abbott Fig. 7.17
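A sketch of Hopfield-style recall at the slide's scale (Nv = 50, 4 patterns): Hebbian outer-product weights, asynchronous sign updates, and a partially corrupted cue. The corruption level and the random patterns are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
N, P = 50, 4
patterns = rng.choice([-1, 1], size=(P, N))   # 4 random binary memories

W = (patterns.T @ patterns) / N               # Hebbian outer-product rule
np.fill_diagonal(W, 0.0)                      # no self-connections

s = patterns[0].copy()
s[:10] *= -1                                  # corrupt 10 of the 50 units

for _ in range(5):                            # asynchronous updates
    for i in range(N):
        s[i] = 1 if W[i] @ s >= 0 else -1

overlap = (s @ patterns[0]) / N               # 1.0 means perfect recall
```

With 4 patterns in 50 units the network is well below capacity, so the corrupted cue is driven back to the stored pattern.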
Excitatory-Inhibitory network: nullclines and eigenvalues
Unstable
Stable
Dayan and Abbott Fig. 7.18
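The stability analysis can be sketched by linearizing the two-variable E-I rate model around its fixed point and checking the Jacobian eigenvalues. The weight values below (M_EE = 1.25, M_EI = −1, M_IE = 1, M_II = 0) are taken as in the book's example and should be treated as assumptions to check against the text.

```python
import numpy as np

def ei_eigenvalues(tau_E, tau_I, M_EE=1.25, M_EI=-1.0, M_IE=1.0, M_II=0.0):
    """Jacobian eigenvalues of the linearized E-I rate model
    tau_E dvE/dt = -vE + M_EE vE + M_EI vI - gamma_E,
    tau_I dvI/dt = -vI + M_IE vE + M_II vI - gamma_I."""
    J = np.array([[(M_EE - 1.0) / tau_E, M_EI / tau_E],
                  [M_IE / tau_I, (M_II - 1.0) / tau_I]])
    return np.linalg.eigvals(J)

# With tau_E = 10 ms the fixed point loses stability near tau_I = 40 ms:
stable   = ei_eigenvalues(0.010, 0.030)   # tau_I = 30 ms: Re(lambda) < 0
unstable = ei_eigenvalues(0.010, 0.050)   # tau_I = 50 ms: Re(lambda) > 0
```

In both cases the eigenvalues are complex, so the fixed point is a spiral; the sign of the real part decides between decaying and growing oscillations.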
Excitatory-Inhibitory network: temporal behavior, stable fixed point
τI = 30 ms
Dayan and Abbott Fig. 7.19
Excitatory-Inhibitory network: temporal behavior, unstable fixed point (limit cycle)
τI = 50 ms
Dayan and Abbott Fig. 7.20
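Integrating the E-I model in time shows both regimes: with τI = 30 ms the oscillation decays to the fixed point, while τI = 50 ms produces a sustained limit cycle. The weights M_EE = 1.25, M_EI = −1, M_IE = 1, M_II = 0 and thresholds ±10 Hz follow the book's example and are assumptions here; rectified-linear rates and Euler integration are illustrative choices.

```python
import numpy as np

def simulate_ei(tau_I, tau_E=0.010, T=2.0, dt=1e-4):
    """Euler-integrate the rectified two-variable E-I rate model."""
    vE, vI = 30.0, 15.0                  # start near the fixed point
    vE_trace = []
    for _ in range(int(T / dt)):
        vE += dt * (-vE + max(0.0, 1.25 * vE - vI + 10.0)) / tau_E
        vI += dt * (-vI + max(0.0, vE - 10.0)) / tau_I
        vE_trace.append(vE)
    return np.array(vE_trace)

vE_stable = simulate_ei(tau_I=0.030)     # oscillation decays to steady state
vE_cycle  = simulate_ei(tau_I=0.050)     # sustained limit-cycle oscillation
```

The rectification bounds the growing spiral, which is what turns the linear instability into a finite-amplitude limit cycle rather than a runaway.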
Extracellular field potential in olfactory bulb
Olfactory model I
To cortex
Excitatory
Inhibitory interneurons
Sniffs
Oscillatory neural activity
No fast oscillations
Dayan and Abbott Fig. 7.21
Olfactory model II: activation functions and eigenvalues
Region of instability
Dayan and Abbott Fig. 7.22
Olfactory model III
Behavior during a sniff cycle
Identity of odor determined by:
• Amplitudes and phases of oscillations
• Identity of participating mitral cells