Computer Vision – Compression (1)
Hanyang University
Jong-Il Park
Department of Computer Science and Engineering, Hanyang University
Image Compression
The problem of reducing the amount of data required to represent a digital image
Underlying basis: removal of redundant data
Mathematical viewpoint: transforming a 2-D pixel array into a statistically uncorrelated data set
Topics to be covered
Fundamentals: basic concepts of the source coding theorems
Practical techniques: lossless coding; lossy coding (optimum quantization, predictive coding, transform coding)
Standards: JPEG, MPEG
Recent issues
History of image compression
Theoretical foundation: C. E. Shannon's work in the 1940s
Analog compression: aimed at reducing video transmission bandwidth (bandwidth compression), e.g., subsampling methods, subcarrier modulation, …
Digital compression: driven by the development of ICs and computers
Early 1970s: facsimile transmission (2-D binary image coding)
Academic research through the 1970s and 80s
Rapidly matured around 1990 with standardization such as JPEG, MPEG, H.263, …
Data redundancy
Data vs. information
Data redundancy and relative data redundancy:

$R_D = 1 - \frac{1}{C_R}$, where $C_R = n_1 / n_2$ is the compression ratio.

Three basic redundancies:
1. Coding redundancy
2. Interpixel redundancy
3. Psychovisual redundancy
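As a quick numeric illustration of these two quantities (the helper name is ours, not the slides'):

```python
def relative_redundancy(n1: int, n2: int) -> float:
    """R_D = 1 - 1/C_R, with compression ratio C_R = n1/n2."""
    c_r = n1 / n2
    return 1.0 - 1.0 / c_r

# E.g., a representation needing n1 = 8 bits/pixel recoded into n2 = 2 bits/pixel:
print(relative_redundancy(8, 2))  # C_R = 4, R_D = 0.75 -> 75% of the data is redundant
```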
Coding redundancy
Code: a system of symbols used to represent a body of information or set of events
Code word: a sequence of code symbols
Code length: the number of symbols in each code word
Average number of bits:

$L_{avg} = \sum_{k=0}^{L-1} l(r_k)\, p_r(r_k)$

where $l(r_k)$ is the length of the code word for gray level $r_k$ and $p_r(r_k)$ its probability.
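A direct transcription of this formula (the probabilities and code-length tables below are hypothetical, for illustration only):

```python
import numpy as np

def average_code_length(probs, lengths):
    """L_avg = sum_k l(r_k) * p_r(r_k), in bits/pixel."""
    return float(np.dot(probs, lengths))

# Hypothetical 4-level source: fixed 2-bit code vs. a variable-length code.
p = np.array([0.5, 0.25, 0.125, 0.125])
print(average_code_length(p, [2, 2, 2, 2]))  # 2.0 bits/pixel (fixed-length code)
print(average_code_length(p, [1, 2, 3, 3]))  # 1.75 bits/pixel (variable-length code)
```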
Eg. Coding redundancy
Reduction by variable-length coding
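One standard way to construct such a variable-length code is Huffman's algorithm; a minimal sketch using Python's standard library (the input probabilities are illustrative, not the slide's table):

```python
import heapq

def huffman_lengths(probs):
    """Return Huffman code-word lengths for a list of symbol probabilities."""
    # Heap items: (subtree probability, unique tie-breaker, symbols in subtree).
    heap = [(p, k, [k]) for k, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, t, s2 = heapq.heappop(heap)
        for k in s1 + s2:        # every symbol in the merged subtree grows one bit
            lengths[k] += 1
        heapq.heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

print(huffman_lengths([0.5, 0.25, 0.125, 0.125]))  # [1, 2, 3, 3]
```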
Correlation
Cross-correlation:

$f(x,y) \circ h(x,y) = \frac{1}{MN} \sum_{m=0}^{M-1} \sum_{n=0}^{N-1} f^{*}(m,n)\, h(x+m,\, y+n)$

$f(x,y) \circ h(x,y) \Leftrightarrow F^{*}(u,v)\, H(u,v)$

Autocorrelation:

$f(x,y) \circ f(x,y) \Leftrightarrow F^{*}(u,v)\, F(u,v) = |F(u,v)|^2$
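Following the correlation theorem, autocorrelation can be computed through the FFT; a short NumPy sketch (circular correlation, with the 1/MN normalization from the formula above):

```python
import numpy as np

def autocorrelation(img):
    """Autocorrelation via the correlation theorem: F*(u,v) F(u,v) = |F(u,v)|^2."""
    F = np.fft.fft2(img)
    ac = np.fft.ifft2(np.conj(F) * F).real / img.size  # the 1/MN factor
    return np.fft.fftshift(ac)                         # put zero lag at the center

ac = autocorrelation(np.random.rand(64, 64))
```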
Eg. Correlation
Interpixel redundancy
Also called spatial redundancy, geometric redundancy, or interframe redundancy
Eg. Interpixel redundancy
Eg. Run-length coding
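A minimal sketch of run-length encoding for one scan line (the (value, run length) pair convention here is an assumption, not a particular standard's format):

```python
def run_length_encode(row):
    """Encode a 1-D sequence as (value, run length) pairs."""
    runs = []
    for v in row:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([v, 1])       # start a new run
    return [tuple(r) for r in runs]

print(run_length_encode([0, 0, 0, 1, 1, 0, 0, 0, 0]))
# [(0, 3), (1, 2), (0, 4)]
```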
Psychovisual redundancy
Image compression models
Communication model
Source encoder and decoder
Basic concepts in information theory
Self-information: $I(E) = -\log P(E)$
Source alphabet $A$ with symbols $a_j$; probabilities of the events, $\mathbf{z}$; ensemble $(A, \mathbf{z})$
Entropy (uncertainty):

$H(\mathbf{z}) = -\sum_{j=1}^{J} P(a_j) \log P(a_j)$

Channel alphabet $B$; channel matrix $Q$; output probabilities $\mathbf{v} = Q\mathbf{z}$
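The entropy formula translates directly to code (log base 2 gives bits/symbol; the helper name is ours):

```python
import numpy as np

def entropy(z):
    """H(z) = -sum_j P(a_j) log2 P(a_j), in bits/symbol."""
    z = np.asarray(z, dtype=float)
    z = z[z > 0]                     # convention: 0 * log 0 = 0
    return float(-np.sum(z * np.log2(z)))

print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol
```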
Mutual information and capacity
Equivocation: $H(\mathbf{z}\,|\,\mathbf{v})$
Mutual information: $I(\mathbf{z},\mathbf{v}) = H(\mathbf{z}) - H(\mathbf{z}\,|\,\mathbf{v})$
Minimum possible value: $I(\mathbf{z},\mathbf{v}) = 0$
The maximum possible $I$ over all possible choices of source probabilities $\mathbf{z}$ is the channel capacity:

$C = \max_{\mathbf{z}} \left[\, I(\mathbf{z},\mathbf{v}) \,\right]$
Eg. Binary Symmetric Channel
Entropy, mutual information, and channel capacity of the BSC

[Figure: BSC diagram — inputs 0 and 1 with source probabilities $p_{bs}$ and $1-p_{bs}$; each bit is received correctly with probability $1-p_e$ and flipped with crossover probability $p_e$]
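For the BSC, the maximization over source probabilities has a closed form: capacity is reached with equally likely inputs and equals $1 - H_b(p_e)$, where $H_b$ is the binary entropy function. A quick numeric check:

```python
import numpy as np

def binary_entropy(p):
    """H_b(p) = -p log2(p) - (1-p) log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def bsc_capacity(pe):
    """Capacity of the binary symmetric channel: C = 1 - H_b(pe), in bits/use."""
    return 1.0 - binary_entropy(pe)

print(bsc_capacity(0.0))  # 1.0  (noiseless channel)
print(bsc_capacity(0.1))  # ~0.531
print(bsc_capacity(0.5))  # 0.0  (output independent of input)
```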
Noiseless coding theorem
Shannon's first theorem, for a zero-memory source:

$H(\mathbf{z}) \le \frac{L'_{avg}}{n} \le H(\mathbf{z}) + \frac{1}{n}$

It is possible to make $L'_{avg}/n$ arbitrarily close to $H(\mathbf{z})$ by coding infinitely long extensions of the source.
Efficiency $= n\,H(\mathbf{z}) / L'_{avg}$
Eg. Extension coding gives better efficiency (next slide).
Extension coding
Code A (original source, 1 bit/symbol): efficiency $= 0.918 / 1.0 = 0.918$
Code B (2nd extension of the source): efficiency $= (0.918 \times 2) / 1.89 = 0.97$
Coding the extension gives better efficiency.
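These figures can be reproduced for a binary zero-memory source with $P(0)=2/3$, $P(1)=1/3$ (an assumed textbook source whose entropy matches the slide's 0.918 bits), reusing the entropy and huffman_lengths sketches above:

```python
# Assumed binary source: P(0) = 2/3, P(1) = 1/3, so H(z) ≈ 0.918 bits/symbol.
probs = [2/3, 1/3]
H = entropy(probs)                                   # ≈ 0.918

# Second extension: symbol pairs; probabilities multiply for a zero-memory source.
ext2 = [p * q for p in probs for q in probs]         # [4/9, 2/9, 2/9, 1/9]
lengths = huffman_lengths(ext2)                      # [1, 3, 2, 3]
L_avg = sum(p * l for p, l in zip(ext2, lengths))    # 17/9 ≈ 1.89 bits per pair

print(H / 1.0)        # ≈ 0.918: code A, 1 bit/symbol on the original source
print(2 * H / L_avg)  # ≈ 0.97:  code B, Huffman code on the 2nd extension
```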
Noisy coding theorem
Shannon's second theorem, for a zero-memory channel:
For any $R < C$, there exists an integer $r$ and a code of block length $r$ and rate $R$ such that the probability of a block decoding error is arbitrarily small.
Rate-distortion theory:
The source output can be recovered at the decoder with an arbitrarily small probability of error provided that the channel has capacity $C > R(D) + \varepsilon$.
[Figure: operating points in the rate-distortion plane — points on one side of the bound are feasible, those on the other side are never feasible]
Using mappings to reduce entropy
1st-order estimate of entropy
> 2nd-order estimate of entropy
> 3rd-order estimate of entropy
> …
The (estimated) entropy of a properly mapped image (e.g., a "difference source") is in most cases smaller than that of the original image source.
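A quick empirical check of this claim using first-order estimates only (the test image is synthetic and the helper mirrors the entropy function above, working on a value histogram):

```python
import numpy as np

def first_order_entropy(a):
    """First-order entropy estimate from the value histogram, in bits/pixel."""
    hist = np.bincount(a.ravel()).astype(float)
    p = hist[hist > 0] / hist.sum()
    return float(-np.sum(p * np.log2(p)))

# Smooth synthetic image: neighboring pixels are strongly correlated.
x = np.linspace(0, 4 * np.pi, 256)
img = ((np.sin(x)[None, :] + np.sin(x)[:, None] + 2) * 63).astype(np.uint8)

# Mapping to a "difference source": horizontal differences, shifted non-negative.
diff = (np.diff(img.astype(int), axis=1) + 255).astype(np.uint16)

print(first_order_entropy(img))   # entropy of the original image
print(first_order_entropy(diff))  # typically noticeably smaller
```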
How to implement this?
The topic of the next lecture!