• Is the brain a digital signal processor?
• Digital vs continuous signals
• Digital signals involve streams of binary encoded numbers
• The brain uses digital, all or none, action potentials or spikes for information transmission in its neuronal axons over distances of 1 mm to several metres to avoid the uncertain decay of an analog signal in a long non-uniform cable.
• But the spikes are then converted back into analog signals within a neuron: the many small synaptic currents are summed as a depolarization, and only when this sum reaches a threshold is a binary action potential elicited.
• The brain does not work by logical operations as in digital computers, but instead by the similarity of an input vector with a synaptic weight vector.
• The spikes in the brain have Poisson timing, i.e. are random in time for a given mean firing rate. This leads to stochastic processing in the brain, related to the timing of the digital inputs to a neuron.
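This Poisson property can be illustrated with a short simulation (a sketch with illustrative numbers, not values from the talk): for a constant mean rate, spike times are random, the inter-spike intervals average 1/rate, and their coefficient of variation is close to 1.

```python
import numpy as np

rng = np.random.default_rng(0)
rate, dt, T = 40.0, 1e-3, 100.0          # 40 Hz mean rate, 1 ms bins, 100 s

# In each small bin a spike occurs with probability rate*dt, independently of
# all other bins: a Bernoulli approximation to a Poisson process.
spikes = rng.random(int(T / dt)) < rate * dt
isis = np.diff(np.flatnonzero(spikes)) * dt   # inter-spike intervals (s)

mean_isi = float(isis.mean())            # close to 1/rate = 25 ms
cv = float(isis.std() / isis.mean())     # close to 1: random (Poisson-like) timing
```

A coefficient of variation near 1 is the signature of the random timing that makes processing in the brain stochastic.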
Rolls, E.T. (2008) Memory, Attention, and Decision-Making. Oxford University Press.
www.oxcns.org [email protected]
Digital Signal Processing and the Brain
Oxford University Press 2008
Parts available at www.oxcns.org/publications
Brain computation vs Digital computer computation
1. Dot product similarity followed by a threshold vs logical functions, e.g. AND, NAND, OR
2. 10,000 inputs per neuron vs several inputs to a logic gate
3. Content-addressable memory vs data accessed by address
4. Fault tolerant (dot product) vs no inherent fault tolerance
5. Generalization and completion vs only exact match in the hardware
6. Fast: inherently parallel update of a neuron, and of neurons in a network vs serial processing with fast components
7. Approximate, low precision vs high precision, 64 bit
8. Stochastic spiking noise: probabilistic computation helps originality, creativity vs noise free because of non-linearities
9. Dynamics are parallel: an attractor network retrieves in 1 - 2 time constants of the synapses vs inherently serial
10. Heuristics: e.g. invariant object recognition uses temporo-spatial constraints and analyses only part of a scene vs attempt to understand a whole scene
11. Syntax: none inherent vs syntactical operations on data at an address
12. The architecture adapts to implement different computations vs fixed architecture with different software
13. Sparse distributed representation vs binary encoding in a computer word, e.g. 64 bits
14. Mind vs brain vs software vs hardware
Each neuron has approximately 10,000 inputs through synapses. The
neuron sums the currents produced by each input firing rate x_j
injected through each synapse w_ij to produce an activation h_i.
hi = Σj xj . wij
The activations are converted into spikes when a neuron reaches a threshold for
firing. This is a non-linear operation and produces digital spikes, with a firing rate
that depends on the strength of the activation. The output firing yi is a function of
the activation
yi = f (hi)
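This dot-product-and-threshold computation can be sketched in a few lines (random rates and weights, and an illustrative threshold, purely for demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)
C = 10000                        # approximately 10,000 inputs per neuron
x = rng.random(C)                # input firing rates xj (arbitrary units)
w = rng.random(C)                # synaptic weights wij
h = x @ w                        # activation hi = sum_j xj * wij (a dot product)

def f(h, theta=2400.0):          # binary threshold activation; theta illustrative
    return 1.0 if h > theta else 0.0

y = f(h)                         # output firing yi
```

The similarity of the input vector x to the weight vector w, not any logical operation, determines whether the neuron fires.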
Neurodynamical Modeling:
Neurons, Synapses & Cortical Architecture
Spiking Neuron -> Integrate-and-Fire Model:
Cm dVi(t)/dt = -gm (Vi(t) - VL) + Isyn(t)

When the membrane potential Vi(t) reaches the firing threshold, a spike is emitted and Vi(t) is reset.

Synaptic Dynamics: EPSPs and IPSPs, Isyn(t)

Each presynaptic spike train drives a synaptic gating variable sj(t):

dxj(t)/dt = -xj(t)/τrise + Σk δ(t - tj^k)

dsj(t)/dt = -sj(t)/τdecay + α xj(t) (1 - sj(t))

and the synaptic current into neuron i takes the form

Ii(t) = g (Vi(t) - VE) f(Vi(t)) Σj wij sj(t)

[Figure: dynamical cooperation and competition between an excitatory pool and an inhibitory pool, coupled through AMPA, NMDA and GABA synapses.]

Isyn(t) = IAMPA,ext(t) + IAMPA,rec(t) + INMDA,rec(t) + IGABA,rec(t)
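A minimal sketch of the integrate-and-fire mechanism, using simple Euler integration and illustrative parameter values; a constant input current stands in here for the full synaptic current Isyn(t) with its AMPA, NMDA and GABA components:

```python
# Euler integration of the membrane equation
#   Cm dV/dt = -gm (V - VL) + I
dt, T = 0.1e-3, 0.5                   # 0.1 ms time step, 0.5 s of simulated time
Cm, gm = 0.5e-9, 25e-9                # membrane capacitance (F), leak conductance (S)
VL, Vthr, Vreset = -70e-3, -50e-3, -55e-3   # leak, threshold, reset potentials (V)
I = 0.6e-9                            # constant input current (A)

V, spikes = VL, 0
for _ in range(int(round(T / dt))):
    V += dt * (-gm * (V - VL) + I) / Cm
    if V > Vthr:                      # threshold reached: emit a spike and reset
        spikes += 1
        V = Vreset
```

The sub-threshold integration is analog, while the spikes emitted at threshold are digital, exactly the hybrid scheme described above.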
Pattern Association Memory
Learning Rule:
Where yi approximates ei (which is the unconditioned or external or forcing input)
δwij = k . yi . xj
Recall: The activation,
hi = Σj xj . wij
where the sum is over the C input axons, xj
The output firing yi is a function of the activation
yi = f (hi)
This activation function f may be linear, sigmoid, binary threshold, etc.
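A minimal sketch of learning and recall in a pattern associator (the tiny patterns, k = 1 and the binary thresholds are all illustrative choices):

```python
import numpy as np

# Hypothetical patterns: x is the conditioned input, e the unconditioned or
# forcing input, which drives the output y during learning.
x = np.array([1.0, 0.0, 1.0, 0.0])
e = np.array([0.0, 1.0, 1.0])

k = 1.0
W = k * np.outer(e, x)                  # dwij = k * yi * xj, with yi ~ ei

h = W @ x                               # recall: hi = sum_j xj * wij
y = (h > 1.0).astype(float)             # binary threshold activation f

x_cue = np.array([1.0, 0.0, 0.0, 0.0])  # degraded cue
y_cue = (W @ x_cue > 0.5).astype(float)
```

Both the full cue and the degraded cue retrieve the stored output e, illustrating the generalization that the dot-product operation provides.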
Long Term Potentiation and Long Term Depression of synapses
Auto-association Memory
Learning Rule:
Where yi approximates ei (which is the external input)
δwij = k . yi . xj (Hebb rule)
and yi is the firing of the output neuron (post-synaptic term)
xj is the firing of input axon j (presynaptic term)
wij is the synapse to output neuron i from input axon j.
Recall: The (internal) activation produced by the recurrent collateral effect (after the first
iteration) is
hi = Σj xj . wij where the sum is over the C axons indexed by j.
The output firing yi is a function of the activation produced by the recurrent collateral effect and
by the external input (ei):
yi = f (hi + ei)
The activation function f may be binary threshold, linear threshold, sigmoid, etc.
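A minimal sketch of auto-associative recall (a tiny hypothetical pattern and a binary threshold; the recurrent weight matrix has no self-connections):

```python
import numpy as np

p = np.array([1.0, 1.0, 0.0, 0.0, 1.0, 0.0])   # stored binary pattern

W = np.outer(p, p)                   # Hebb rule dwij = k * yi * xj with k = 1
np.fill_diagonal(W, 0.0)             # no self-connections in the recurrent net

e = np.array([1.0, 1.0, 0.0, 0.0, 0.0, 0.0])   # partial external cue
y = e.copy()
for _ in range(5):                   # iterate the recurrent collateral effect
    h = W @ y                        # hi = sum_j wij * xj
    y = ((h + e) > 1.0).astype(float)   # yi = f(hi + ei), binary threshold
```

From the partial cue the network completes the whole stored pattern, the completion property listed in the brain-vs-computer comparison.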
[Figure: temporal 'what' pathway and hippocampal 'memory' responses to Stimulus 1 - Stimulus 4.]
How is information encoded by the inferior temporal visual cortex?
Rate Code
Local (Grandmother), Distributed and Sparse Distributed Coding
• Local (Grandmother): One cell codes
for one stimulus and fires only in response to
this one.
• Fully Distributed: Almost every cell
participates in encoding all stimuli. (With
binary firing rates half the cells respond
to any stimulus)
• Sparse Distributed: The cell responds
to a certain number of stimuli. Intermediate
between Grandmother and Fully distributed.
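One common way to quantify where a code sits on this spectrum is the Treves-Rolls population sparseness, a = (Σi ri/n)^2 / (Σi ri^2/n), which is 1/n for a local code and 0.5 for a binary fully distributed code. A quick check with illustrative firing rates:

```python
import numpy as np

def sparseness(r):
    """Treves-Rolls population sparseness a = (mean r)^2 / mean(r^2)."""
    r = np.asarray(r, dtype=float)
    return (r.mean() ** 2) / (r ** 2).mean()

n = 100
local = np.zeros(n)
local[0] = 10.0                  # grandmother cell: one cell fires -> a = 1/n
distributed = np.zeros(n)
distributed[: n // 2] = 10.0     # half the cells fire equally -> a = 0.5
```

Sparse distributed codes give intermediate values of a between these two extremes.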
Inferior Temporal Visual Cortex neuron: firing rate response to a set of 68 visual stimuli.
[Figure: firing rate for each stimulus.]
How does the information scale with the number of neurons? Approximately linearly.
Optimal vs Dot Product (DP) decoding. [Figure: information vs number of neurons, for optimal and DP decoding.]
How does the number of stimuli encoded scale with the number of neurons? (Information uses a log scale.)
An attractor network for probabilistic decision-making, with λ1 and λ2 inputs, and with noise from the neuronal spiking influencing which decision attractor, D1 or D2, wins.
This is also a model for short-term memory.
Rolls and Deco 2010 The Noisy Brain: Stochastic
Dynamics as a Principle of Brain Function. Oxford.
Deco, Rolls et al 2013 Progress in Neurobiology 103
[Figure: the decision inputs to the network.]
Integrate-and-fire simulations predict earlier and higher neuronal responses on easy vs difficult trials.
Confidence is reflected in the higher firing on easy trials.
The decision stimuli start at t=2 s
Rolls, Grabenhorst and Deco 2010 Neuroimage
• The theoretical framework, based on statistical mechanics, predicts higher and faster neuronal responses as ΔI, the difference between the two stimuli, increases.
• ΔI is a measure of how easy the decision is, and subjective confidence ratings correlate with it.
Rolls, Grabenhorst and Deco 2010a Neuroimage
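The qualitative effect of ΔI on decision times can be reproduced with a drastically simplified random-walk caricature (not the integrate-and-fire attractor network itself; all parameters are illustrative): two noisy accumulators, standing in for D1 and D2, race until their difference reaches a threshold.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_decision_time(delta_I, n_trials=200, noise=1.0, threshold=30.0):
    """Two noisy accumulators (D1, D2) race; a decision is taken when their
    difference reaches the threshold. Returns the mean number of steps."""
    times = []
    for _ in range(n_trials):
        d1 = d2 = 0.0
        t = 0
        while abs(d1 - d2) < threshold and t < 100000:
            d1 += delta_I / 2 + noise * rng.standard_normal()
            d2 += -delta_I / 2 + noise * rng.standard_normal()
            t += 1
        times.append(t)
    return float(np.mean(times))

easy = mean_decision_time(0.5)   # large delta_I: easy decision
hard = mean_decision_time(0.1)   # small delta_I: difficult decision
```

Larger ΔI gives faster decisions (and, in the full model, higher firing), matching the predictions above.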
On differences between signal processing in the brain and in digital computers:
Rolls, E.T. (2012) Neuroculture: On the Implications of Brain Science. Oxford University Press: Oxford. Section 2.15, pages 88-98.
On the signal processing performed by neuronal networks in the brain:
Rolls, E.T. (2008) Memory, Attention, and Decision-Making. Oxford University Press. Appendix 2:
http://www.oxcns.org/papers/RollsMemoryAttentionAndDecisionMakingContents+Appendices1+2.pdf
On the representation of information in the brain:
Rolls, E.T. and Treves, A. (2011) The neuronal encoding of information in the brain. Progress in Neurobiology 95: 448-490.
http://www.oxcns.org/papers/508 Rolls and Treves 2011 Neuronal encoding of information in the brain.pdf
Papers and contact information:
www.oxcns.org [email protected]
Digital Signal Processing and the Brain: Further reading
DCSP-1: Introduction
• Jianfeng Feng
• Department of Computer Science Warwick Univ., UK
• http://www.dcs.warwick.ac.uk/~feng/dcsp.html
Time
• Monday (L) 14.00 -- 15.00 CS 1.01
• Tuesday (L) 16.00 -- 17.00 CS 1.01
• Thursday (S) 13.00 -- 14.00 CS 1.01
Seminars start this week
My research: Jianfeng Feng (冯建峰)
Using MRI to see your brain
• http://dcs.warwick.ac.uk/~feng
My research: Jianfeng Feng (冯建峰)
• In general, our research is about dealing with big data (brain research)
• http://dcs.warwick.ac.uk/~feng
Are we facing another revolution?
[Figure: the first (1785), second (1831) and third (1970s) industrial revolutions, with a fourth, brain-like AI, by 2020?]
Brain-like AI: a new revolution
[Timeline 2012-2016: Apple Siri, Google Now, DeepMind, DeepFace (Facebook), deep medicine (J Rothberg & T Xu), Google Brain (Andrew Ng), automatic driving on motorways, IBM Watson, Microsoft automatic translation, Eugene Goostman passing the Turing test, deep learning.]
Applications: smart cities, financial markets, medicine, ...
Big Data
• Big Data is 'petroleum to our society'
• This module is all about data, data, data
• This module will equip you with the skills to deal with big data
• 'Most practical module in our two years' -- comments from students in previous years
Big Data
• Deep learning has been very successful with big data
• But it deals only with static data, such as faces
• We will deal with dynamic data (language, video, etc.)
Announcement for Seminars
• DCSP seminars (covering the DCSP tutorial problems) start in Week 2.
• Attend only one seminar per week.
Assignment
• The assignment will be issued in Week 4
• It is worth 20% of the module assessment
• Submission deadline: 12 noon on Thursday of Week 10 -- 17th March 2016
(The winner will be awarded 500 RMB)
References
• Any good book on digital communications and digital signal processing
• Wikipedia, the free encyclopedia, or public lectures
• Lecture notes are available at
http://www.dcs.warwick.ac.uk/~feng/teaching/dcsp.html
Outline of today’s lecture
• Digital vs. analog
• Module outline
• Data transmission (sampling)
Signal: video, audio, etc. -- Data
Information-carrying signals are divided into two broad classes:
Continuous x(t) vs. digital x[n]
Two types of signal
Continuous (Analog) Signals
• Analog signals: continuous (electrical) signals that vary in time
• Most of the time, the variations follow those of the original non-electric signal.
• The two are analogous, hence the name analog.
Example I: Telephone voice signals are analog.
• The intensity of the voice causes electric current variations.
• At the receiving end, the signal is reproduced in the same proportion.
Digitization, communication and processing
Example II
Digital Signals
Non-continuous: they change in individual steps.
They consist of pulses with discrete levels or values.
Each pulse is constant, but there is an abrupt change from one digit to the next.
Example I
Example II: please meet
• Two-dimensional signal x[n1,n2], n1,n2∈Z
• A point on a grid → pixel
• Grid is usually regularly spaced
• Values x[n1,n2] refer to the pixel’s appearance
Example III: Our brain, Neuronal Activities
[Figure: a neuron with soma, dendrites, axon and synapses; each vertical white bar is a spike.]
As in Prof. Rolls's talk on Monday
Advantages I
a. The ability to process a digital signal means that
errors caused by random processes can be detected
and corrected.
b. Digital signals can also be sampled instead of
continuously monitored and multiple signals can be
multiplexed together to form one signal.
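As a toy illustration of the error detection in point (a) (a generic even-parity check, not a scheme from this lecture):

```python
def add_even_parity(bits):
    """Append a parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(bits):
    """True if no error is detected (an even number of 1s survived)."""
    return sum(bits) % 2 == 0

word = add_even_parity([1, 0, 1, 1, 0, 0, 1])   # four 1s -> parity bit 0
corrupted = word.copy()
corrupted[3] ^= 1                               # channel noise flips one bit
```

A single parity bit only detects an odd number of flipped bits; correcting errors requires more redundancy, as in Hamming codes.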
Advantages II
c. Advances in wideband communication channels and
solid-state electronics have allowed engineers to
fully realize (a) and (b).
Because of all these advantages, digital
communications has grown quickly.
d. Digital communications is quickly edging out
analog communication because of the vast demand
to transmit computer data and the ability of digital
communications to do so.
Module Summary I
• One sentence: Deal with digital signals
• Data transmission: Channel characteristics,
signalling methods, interference and noise,
data compression and encryption;
• Information Sources and Coding: Information theory, coding of information for
efficiency and error protection;
Module Summary II
• Data transmission: Channel characteristics, signalling methods, interference and noise, synchronisation, data compression and encryption;
• Information Sources and Coding: Information theory, coding of information for efficiency and error protection;
• Signal Representation: Representation of discrete time signals in time and frequency; z transform and Fourier representations; discrete approximation of continuous signals; sampling and quantisation; stochastic signals and noise processes;
• Filtering: Analysis and synthesis of discrete time filters; finite impulse response and infinite impulse response filters; frequency response of digital filters; poles and zeros; filters for correlation and detection; matched filters;
• Digital Signal Processing applications: Processing of images and sound using digital techniques.
Data Transmission I: General Form
• A modulator that takes the source signal and transforms it so that it is physically suitable for the transmission channel
Data Transmission II:
• A modulator that takes the source signal and transforms it so that it is physically suitable for the transmission channel
• A transmitter that actually introduces the modulated signal into the channel, usually amplifying the signal as it does so
• A transmission channel that is the physical link between the communicating parties
• A receiver that detects the transmitted signal on the channel and usually amplifies it (as it will have been attenuated by its journey through the channel)
• A demodulator that recovers the original source signal from the received signal and passes it to the sink
Data Transmission III:
Digital data is universally represented by strings of 1s and 0s.
Each 1 or 0 is referred to as a bit.
Often, but not always, these bit strings are interpreted as numbers in a binary number system.
Thus 101001 (base 2) = 41 (base 10).
The information content of a digital signal is equal to the number of bits required to represent it.
Thus a signal that may vary between 0 and 7 has an information content of 3 bits.
Data Transmission IV
Written as an equation this relationship is
I= log2(n) bits
where n is the number of levels a signal may take.
It is important to appreciate that information is a measure of the number of different outcomes a value may take.
The information rate is a measure of the speed with which information is transferred. It is measured in bits/second or b/s.
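These definitions can be checked directly (the 8-level signal is from the slide; the 1000 samples/s figure is purely illustrative):

```python
import math

def info_bits(n_levels):
    """I = log2(n) bits for a signal that may take n equally likely levels."""
    return math.log2(n_levels)

i8 = info_bits(8)            # a signal varying between 0 and 7 has 8 levels

value = int("101001", 2)     # the bit string 101001 read as a binary number

rate_bps = i8 * 1000.0       # information rate: bits per sample x samples/s
```

At 1000 samples per second, the 3-bit signal would carry 3000 b/s.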
Bandwidth
A signal is bandlimited if it contains no energy at frequencies higher than some bandlimit, or bandwidth, B.
Examples
Audio signals. An audio signal is an example of an analogue signal.
It occupies a frequency range from about 200 Hz to about 15 kHz.
Speech signals occupy a smaller range of frequencies, and telephone
speech typically occupies the range 300 Hz to 3300 Hz.
The range of frequencies occupied by the signal is called its
bandwidth (B = f2 - f1 ~ f2)
Television. A television signal is an analogue signal created by linearly
scanning a two-dimensional image.
Typically the signal occupies a bandwidth of about 6 MHz.
Teletext is written (or drawn) communication that is interpreted visually.
Reproducing cells, in which the daughter cells' DNA contains information from the parent cells;
A disk drive
Our brain
ADC and DAC
To send analogue signals over a digital communication system, or to process them on a digital computer, we need to convert analogue signals to digital ones.
This is performed by an analogue-to-digital converter (ADC).
The analogue signal is sampled (i.e. measured at regularly spaced instants).
The converse operation to the ADC is performed by a digital-to-analogue converter (DAC).
Example
Example
How can we get the original signal back?
How fast do we have to sample to recover the original signal?
The ADC process is governed by an important law, the Nyquist-Shannon sampling theorem (discussed in Chapter 3):
An analogue signal of bandwidth B can be completely recreated from its sampled form provided it is sampled at a rate equal to at least twice its bandwidth: S > 2B.
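The theorem can be checked numerically with Whittaker-Shannon sinc interpolation (an illustrative 5 Hz sine, so B = 5 Hz and the critical rate is 10 samples/s; evaluation is kept away from the interval edges to limit truncation error):

```python
import numpy as np

f_sig = 5.0                                   # 5 Hz sine, so bandwidth B = 5 Hz
t = np.linspace(0.25, 0.75, 200)              # evaluate away from interval edges
x = np.sin(2 * np.pi * f_sig * t)             # the original analogue signal

def reconstruct(fs):
    """Sample 1 s of the sine at rate fs, then sinc-interpolate it back
    (Whittaker-Shannon reconstruction)."""
    n = np.arange(int(fs))                    # sample indices over 1 s
    samples = np.sin(2 * np.pi * f_sig * n / fs)
    return np.array([np.sum(samples * np.sinc(fs * ti - n)) for ti in t])

err_good = np.max(np.abs(reconstruct(50.0) - x))   # fs = 50 > 2B: faithful
err_bad = np.max(np.abs(reconstruct(8.0) - x))     # fs = 8 < 2B: aliased
```

Sampling above the Nyquist rate recovers the signal almost exactly; sampling below it aliases the 5 Hz sine to a lower frequency, so the reconstruction no longer matches.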