On Fuzzy Image Processing
A lecture by
KARRAR DH. MOHAMMED
History
In the 1970s, digital image processing proliferated, when cheaper computers and dedicated hardware became available. Images could then be processed in real time, for some dedicated problems such as television standards conversion. As general-purpose computers became faster, they started to take over the role of dedicated hardware for all but the most specialized and compute-intensive operations.
History
With the fast computers and signal processors available
in the 2000s, digital image processing has become the
most common form of image processing, and is generally
used because it is not only the most versatile method,
but also the cheapest.
What is an Image?
1. An image f(x, y) is a 2-dimensional light intensity function, where f measures brightness at position (x, y).
2. A digital image is a representation of an image by a 2-D array of discrete samples.
3. The amplitude of each sample is represented by a finite number of bits.
4. Each element of the array is called a pixel.
Terminology
Images: An image is a two-dimensional signal whose intensity
at any point is a function of two spatial variables.
Examples are photographs, still video images, radar and
sonar signals, chest and dental X-rays.
An image sequence, such as that seen on television, is a three-dimensional signal for which the image intensity at any point is a function of three variables: two spatial variables and time.
1. Digital image processing is a term used to describe the manipulation of image data by a computer.
2. The process of transforming an image into a set of numbers that a computer can use is called digitization.
3. Digitization divides an image into several picture elements called pixels. A pixel is the smallest resolvable unit of an image that the computer handles.
4. The value of a pixel is referred to as its gray level and can be thought of as the intensity or brightness (or darkness) of the pixel.
5. The number of different gray-levels a pixel can have varies from system to system, and is determined by the hardware that produces or displays the image.
Why do we process images?
Images (and videos) are everywhere. This includes different imaging modalities such as visual, X-ray, ultrasound, etc. Multimedia information will be the wave of the future. Diverse applications in astronomy, biology, geology, geography, medicine, law enforcement, defense, and industrial inspection require processing of images.
Grayscale and Color Images
1. For a grayscale image, 256 levels (8 bits/pixel) are sufficient for most applications.
2. For a color image, each component (R, G, B) needs 256 levels, i.e., 8 bits/pixel.
3. Storage for typical images
(a) 512 × 512, 8 bits grayscale image: 262,144B
(b) 1024×768, 24 bits true color image: 2,359,296B
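The storage figures above follow directly from width × height × bits per pixel. A minimal sketch of the calculation (the function name `storage_bytes` is illustrative, not from the original):

```python
# Storage needed for an uncompressed image: width * height * bits_per_pixel bits,
# divided by 8 to convert to bytes.
def storage_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

print(storage_bytes(512, 512, 8))     # 8-bit grayscale: 262144 bytes
print(storage_bytes(1024, 768, 24))   # 24-bit true color: 2359296 bytes
```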
Grayscale Image
Color Images
A color image has three components: X_R(n, m), X_G(n, m), X_B(n, m).
After sampling, F(x, y) becomes F(m, n), 0 ≤ m ≤ M − 1, 0 ≤ n ≤ N − 1.
A digital image can be written as a matrix:

        | x(0, 0)      x(0, 1)      ...   x(0, N−1)    |
    F = | x(1, 0)      x(1, 1)      ...   x(1, N−1)    |
        | ...          ...          ...   ...          |
        | x(M−1, 0)    x(M−1, 1)    ...   x(M−1, N−1)  |
Image operations can be classified as linear and non-linear operations:
H is a linear operator if it satisfies the superposition
principle:
H(af +bg) = aH(f)+bH(g)
for all images f and g and all constants a and b.
1. Mean filtering: Linear
2. Median filtering: Non-linear
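The superposition test above can be checked numerically. As a sketch (using 1-D signals in place of image rows, and exact rational arithmetic to avoid float round-off; the helper names are illustrative), mean filtering satisfies H(af + bg) = aH(f) + bH(g) while median filtering does not:

```python
from fractions import Fraction
from statistics import median

# A 3-sample sliding-window filter applied to a 1-D signal.
def filt(signal, op):
    return [op(signal[i:i + 3]) for i in range(len(signal) - 2)]

def mean3(window):                      # exact mean, avoiding float round-off
    return sum(map(Fraction, window)) / 3

f = [1, 5, 2, 8, 3]
g = [4, 0, 7, 1, 6]
a, b = 2, 3
combo = [a * x + b * y for x, y in zip(f, g)]

# Mean filtering obeys superposition: H(af + bg) == a*H(f) + b*H(g)
print(filt(combo, mean3) ==
      [a * x + b * y for x, y in zip(filt(f, mean3), filt(g, mean3))])   # True

# Median filtering violates it, so it is non-linear.
print(filt(combo, median) ==
      [a * x + b * y for x, y in zip(filt(f, median), filt(g, median))])  # False
```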
Simple Operations On Images
Digital Negative: Given an image F, the Digital Negative of F is defined as
F Negative (m, n) = 255 − F(m, n)
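The digital negative formula above can be sketched directly, assuming an 8-bit image represented as a list of rows (the function name `negative` is illustrative):

```python
# Digital negative of an 8-bit image: F_neg(m, n) = 255 - F(m, n).
def negative(image):
    return [[255 - p for p in row] for row in image]

F = [[0, 64, 128],
     [192, 255, 10]]
print(negative(F))  # [[255, 191, 127], [63, 0, 245]]
```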
Feature Enhancement by Subtraction
A Brief History of Lena (Lenna)
Anyone familiar with digital image processing will surely recognize the image of Lena. While going through some old Usenet discussions, I learned that Lena has a history worth all the attention that has been paid to her over the years by countless image processing researchers. Lena Sjööblom (also spelled Lenna by many publications) was the Playboy playmate of November 1972 and rose to fame in the computer world when researchers at the University of Southern California scanned and digitized her image in June 1973. (Lena herself never knew of her fame until she was interviewed by a computer magazine in Sweden, where she lives with her husband and children.)
A Brief History of Lena (Lenna)
According to the IEEE PCS Newsletter of May/June 2001, they were hurriedly searching for a glossy image which they could scan and use for a conference paper when someone walked in with a copy of Playboy. The engineers tore off the top third of the centerfold and scanned it with a Muirhead wirephoto scanner (a distant cry from the flatbed scanners of today) by wrapping it around the drum of the scanner. (Now you know why the image shows only a small part of the entire picture... discounting, of course, the fact that the complete picture would raise quite a few eyebrows.)
Linear Stretching
1. Enhance the dynamic range by stretching the original gray levels to the range of 0 to 255.
2. Example
(a) The original gray levels are [100, 150].
(b) The target gray levels are [0, 255].
(c) The transformation function
g(f) = ((f − 100)/50) × 255, for 100 ≤ f ≤ 150
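The stretching transformation in the example above can be sketched as follows, assuming values outside [100, 150] are clipped to the input range (the function name `stretch` is illustrative):

```python
# Linear stretching of gray levels from [low, high] onto [0, 255]:
# g(f) = (f - low) / (high - low) * 255.
def stretch(f, low=100, high=150):
    f = min(max(f, low), high)          # clip to the original gray-level range
    return round((f - low) / (high - low) * 255)

print(stretch(100))  # 0
print(stretch(150))  # 255
```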
Illustration of Linear Stretching
Image/video Processing Methods
1. Image Enhancement
2. Image Restoration
3. Compression
4. Image reconstruction
5. Morphological image processing
6. Feature extraction and recognition, computer vision
Other Image Operations
Image algebra includes mathematical comparisons, altering values of pixels, thresholding, edge detection and noise reduction.
1. Neighborhood averaging is used to avoid extreme fluctuations in gray level from pixel to pixel. It is also a very effective tool for noise reduction.
2. Image Scaling is a means of reducing or expanding the size of an image using existing image data.
3. Histogram Equalization is an adjustment of grayscale based on the gray-level histogram. This is effective in enhancing the contrast of an image.
4. Edge Detection is an operation of measuring and analyzing the features in an image by detecting and enhancing the edges of the features. The most common edge detection method is gradient detection.
5. Image Restoration: Given a noisy image y(m, n),
y(m, n) = x(m, n) + v(m, n)
where x(m, n) is the original image and v(m, n) is noise. The objective is to recover x(m, n) from y(m, n).
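Neighborhood averaging, the first operation listed above, can be sketched with a 3×3 window (a minimal version that leaves border pixels unchanged; the function name is illustrative):

```python
# 3x3 neighborhood averaging: each interior pixel is replaced by the mean
# of itself and its eight neighbors, smoothing gray-level fluctuations.
def neighborhood_average(img):
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]        # border pixels are left unchanged
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            s = sum(img[i + di][j + dj]
                    for di in (-1, 0, 1) for dj in (-1, 0, 1))
            out[i][j] = s // 9
    return out

img = [[10, 10, 10],
       [10, 100, 10],     # an isolated noisy pixel
       [10, 10, 10]]
print(neighborhood_average(img))  # center becomes (8*10 + 100) // 9 = 20
```

Note how a single noisy pixel is pulled back toward its neighbors, which is why this filter is effective for noise reduction.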
Color Restoration
Photo Restoration
6. Contrast Enhancement: how to enhance the contrast of an image?
1. In a low-contrast image, values are concentrated in a narrow range (mostly dark, mostly bright, or mostly medium values).
2. Contrast enhancement changes the image value distribution to cover a wide range.
3. Contrast of an image can be revealed by its histogram
Histogram
The histogram of an image with L possible gray levels, f = 0, 1, ..., L − 1, is defined as:

p(l) = n_l / n

where
– n_l is the number of pixels with gray level l.
– n is the total number of pixels in the image.
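The definition p(l) = n_l / n can be sketched directly, assuming a small image with L = 4 gray levels (the function name `histogram` is illustrative):

```python
from collections import Counter

# Histogram of an image: p(l) = n_l / n, where n_l counts the pixels
# with gray level l and n is the total number of pixels.
def histogram(img, levels=4):
    pixels = [p for row in img for p in row]
    counts = Counter(pixels)
    n = len(pixels)
    return [counts.get(l, 0) / n for l in range(levels)]

img = [[0, 1, 1, 2],
       [1, 0, 3, 1]]
print(histogram(img))  # [0.25, 0.5, 0.125, 0.125]
```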
Examples of Histograms
Applications
Astronomy: Hubble Space Telescope. This telescope has limitations in resolution due to atmospheric turbulence.
Optical problems in a telescope result in blurred, out-of-focus images. Digital image processing is normally used to recover the desired information from these images.
Applications
Medical Imaging: Most advanced medical imaging tools are based on DSP techniques. X-ray computerized tomography (X-ray CT) is capable of generating a cross-sectional display of the body. This involves X-ray generation, detection, digitization, processing, and computer image reconstruction. Similarly, NMR-CT (nuclear magnetic resonance CT).
MRI
Ultrasound
Fingerprint
In 1684, an English plant morphologist published the first scientific paper reporting his systematic study of the ridge and pore structure in fingerprints.
A fingerprint image may be classified as:
(a) Offline: an inked impression of the fingertip on paper is scanned.
(b) Live-scan: optical sensors, capacitive sensors, ultrasound sensors, ...
At the local level, there are different local ridge characteristics. The two most prominent ridge characteristics, called minutiae, are:
(a) Ridge termination
(b) Ridge bifurcation
At the very fine level, intra-ridge details (sweat pores) can be detected. They are very distinctive; however, very high-resolution images are required.
Face Recognition
Face Recognition Methods
(a) Template matching using minimum-distance classifiers
(b) Linear discriminants
(c) Bayesian approach
Watermarking
The World Wide Web and the progress in multimedia storage and transmission technology have expanded the possibility of illegal copying and reproduction of digital data. Digital watermarking represents a valid solution to this problem, since it makes it possible to identify the source, author, creator, owner, distributor, or authorized consumer of digitized images, video recordings, or audio recordings. A digital watermark is an identification code, permanently embedded into digital data, carrying information pertaining to copyright protection and data authentication.
(a) Copyright protection and authentication
Image Compression Techniques
1. The JPEG 2000 standard is based on wavelets.
2. JPEG (original) is based on the Discrete Cosine Transform (DCT).
An Example of Image Compression
What does Fuzzy Image Processing mean?
Fuzzy image processing is not a unique theory. It is a collection of different fuzzy approaches to image processing. Nevertheless, the following definition can be regarded as an attempt to determine the boundaries:
Fuzzy image processing is the collection of all approaches that understand, represent, and process the images, their segments, and features as fuzzy sets. The representation and processing depend on the selected fuzzy technique and on the problem to be solved. (From: Tizhoosh, Fuzzy Image Processing, Springer, 1997)
Fuzzy image processing has three main stages: image fuzzification, modification of membership values, and, if necessary, image defuzzification; see the figure below.
The general structure of fuzzy image processing .
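The three stages can be sketched for the simple case of fuzzy contrast enhancement. This is a minimal illustration, assuming a linear membership function for fuzzification and the classical intensification (INT) operator for the membership-modification stage; the function names are illustrative, not from the original:

```python
# A minimal sketch of the three stages of fuzzy image processing,
# applied to contrast enhancement of an 8-bit image.
GMAX = 255

def fuzzify(g):                     # stage 1: gray level -> membership in [0, 1]
    return g / GMAX

def intensify(mu):                  # stage 2: modify membership values (INT operator)
    return 2 * mu * mu if mu <= 0.5 else 1 - 2 * (1 - mu) ** 2

def defuzzify(mu):                  # stage 3: membership -> gray level
    return round(mu * GMAX)

def fuzzy_enhance(image):
    return [[defuzzify(intensify(fuzzify(g))) for g in row] for row in image]

img = [[64, 128, 192]]
print(fuzzy_enhance(img))           # dark pixels get darker, bright pixels brighter
```

The INT operator pushes memberships below 0.5 toward 0 and those above 0.5 toward 1, which after defuzzification increases the contrast of the image.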