

Parametric Blur Estimation for Blind Restoration of Natural Images Having Uniform Linear Motion

and Out-of-Focus Blur FINAL PROJECT REPORT

Ajay Vasudevan, Soumyanil Banerjee {ajayvasu,soumbane}@umich.edu

Class – EECS 556 Winter 2015 Date – 25 April 2015

Abstract — This project proposes to study and replicate the

work in the paper titled “Parametric Blur Estimation for Blind Restoration of Natural Images: Linear Motion and Out-of-Focus” [1]. The paper estimates the parameters of two types of blur for blind restoration of natural images: linear motion blur (produced when the camera moves with uniform linear motion and characterized by a line of length L at angle θ) and out-of-focus blur (modeled as a uniform disk whose radius R specifies how far the camera is out of focus). We therefore estimate the PSF parameters L and θ (for uniform linear motion) and the radius R (for out-of-focus blur). While estimating these parameters we assume the images are natural, i.e., they have an approximately isotropic power spectrum with a power-law decay in spatial frequency. The Radon-d and Radon-c transforms, two modifications of the Radon transform, are used to identify the blur pattern, and an appropriate function is then fitted to identify the blur parameters.

0. WHAT IS DECONVOLUTION

For simplicity, first let us consider a 1D signal f given by

f = ( …, f(-1), f(0), f(1), … )

and a discrete point spread function h of finite support given by

h = ( …, 0, h(-k), …, h(0), …, h(k), 0, … )

Therefore, the blurred signal vector, which is the convolution of f and h, can be written as

g(n) = Σ_k h(k) f(n - k),   i.e.,   g = f * h.

For real signals we truncate the signal values so that the truncated signal still closely resembles the original. For example, a sound signal will typically have finite extent, but if it is very long we can truncate some of the small values so that the resulting signal stays very close to the original.

Deconvolution is the method to retrieve original signal f from blurred signal g.

Now extending this to the 2D domain (for images), the 1D convolution * is replaced by the 2D convolution **. In most image processing applications the original image f(x,y) is blurred by a blur Point Spread Function (PSF) h(x,y) and corrupted by additive noise n, producing a blurred and noisy image g(x,y). Generally this noise is a set of independent samples of zero-mean Gaussian noise of variance σ². So the general equation is given by

g(x,y) = f(x,y) ** h(x,y) + n(x,y)

In image restoration problems the same equation above can be written in matrix-vector notation as

g = Af + n

where A is the system matrix having the blur values as its elements. f is the image we need to estimate from g.

Now, deconvolution is the method to retrieve original image f from blurred image g. The deconvolution is called blind deconvolution when the blur PSF is partially known or completely unknown. Blind deconvolution is particularly difficult and the blur PSF is generally estimated using various algorithms depending on the situation. We have learnt inverse filtering and Wiener filtering in class which will be useful for restoration once the blur kernel has been estimated.
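To make the degradation model above concrete, the following MATLAB fragment is a minimal sketch (the test image, blur parameters and noise level are arbitrary illustrative choices, not values from [1]) that blurs an image with a known PSF and then restores it with Wiener filtering:

    % Sketch of g = f ** h + n followed by non-blind Wiener restoration.
    % Assumes the Image Processing Toolbox; 'cameraman.tif' ships with MATLAB.
    f = im2double(imread('cameraman.tif'));                   % original image f(x,y)
    h = fspecial('motion', 10, 45);                           % known blur PSF h(x,y): L = 10, theta = 45 deg
    sigma = 0.01;                                             % noise standard deviation (illustrative)
    g = imfilter(f, h, 'circular') + sigma*randn(size(f));    % blurred and noisy image g(x,y)
    nsr = sigma^2 / var(f(:));                                % noise-to-signal power ratio
    f_hat = deconvwnr(g, h, nsr);                             % Wiener restoration with the known PSF
    figure, imshowpair(g, f_hat, 'montage');                  % blurred vs. restored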

I. INTRODUCTION

Deconvolution/deblurring is an image processing technique to estimate an original image from an observed image. In standard deconvolution (also called non-blind deconvolution) the blur point spread function (PSF) is known, so the process of deblurring is much simpler. But usually this is not the case: the blur PSF is often totally or partially unknown, and the process of deconvolving the image while estimating this unknown PSF is called Blind Image Deconvolution (BID), which is much more difficult than its non-blind counterpart. In this paper [1], the authors propose a blur estimation technique in which they obtain a blur estimate from the observed image and then use it in a non-blind deblurring algorithm. By blur estimate, we mean the length and angle for linear uniform motion blur and the radius for out-of-focus blur.

Page 2: Parametric Blur Estimation for Blind Restoration of Natural ......Parametric Blur Estimation for Blind Restoration of Natural Images Having Uniform Linear Motion and Out-of-Focus Blur

 

    

The deblurring methods proposed before [1] assume some form of prior knowledge about the image, which can be obtained manually or from similar images. This prior knowledge is usually expressed by modeling the statistics of some feature(s), such as first-order differences, the Laplacian, etc. The method proposed in this paper, in contrast, does not require any such knowledge in advance, so it is truly blind for the class of blur filters considered. The only weak assumptions are that the original image is natural (has an approximately isotropic power spectrum) and that the blur results from either uniform linear motion or wrong focusing.

The above classes of blurs have well-defined patterns of zeros in the spectral domain, and the method proposed in this paper works on the spectrum of the blurred image under the above assumptions. The authors introduce two modifications of the Radon transform to identify the patterns of linear motion blur and out-of-focus blur, termed Radon-d (for linear motion blur) and Radon-c (for out-of-focus blur). The former performs integration over the same area of the image spectrum regardless of angle, namely the maximal square inscribed in the image spectrum (as shown in Figure 2), so the area of integration never exceeds this square and is bounded for all angles; the latter performs integration along circles. The blur parameters are then identified by fitting appropriate functions that account separately for the natural image spectrum and the blur spectrum. The restored images are compared with those produced by state-of-the-art methods for blind image deconvolution.

From the EECS 556 course we know how to calculate the frequency response of an image using the Fourier transform. Convolution and filtering from the course are also useful for image restoration/deconvolution/deblurring, and we use the Radon transform from the course, modified according to our requirements.

II. RESEARCH HYPOTHESIS

In this project, we state the following hypothesis: “The Parametric Blur Estimation for Blind Restoration of Natural Images will be able to estimate the parameters, namely length (for motion blur) and radius (for out-of-focus blur), to less than an RMSE (Root Mean Squared Error, in pixel units) of 0.25 for a blurred signal-to-noise ratio BSNR (Blurred SNR) of more than 20 dB.” The following images will be used to determine the average normalized mean squared error:
1. Cameraman
2. Lena
3. Barbara
4. Boats
5. Peppers
6. Goldhill (256 x 256)
7. Fingerprint (512 x 512)
8. A natural image (as shown in Fig 3 (a))

We will see that the RMSE decreases as the BSNR increases. Also, the parameter estimation quality (of blur length and angle) decreases as the image size decreases, i.e., we will see that below a certain size (600 x 600 pixels) the performance of the algorithm degrades.

III. THEORY AND METHODS

The paper proposes a method to estimate the blur parameters for linear uniform motion blur and out-of-focus blur. Unique points of the proposed method:

This method is truly blind; the only assumptions are that the original image is natural, i.e., it has an approximately isotropic power spectrum.

This method can be used to estimate the length and direction of a linear motion blur and the radius of an out-of-focus blur.

This method introduces two modified radon transforms, the Radon-d (for linear motion blur) and the Radon-c (for out-of-focus blur).

The proposed method is claimed to be faster and more accurate than existing methods.

The basic steps of the proposed algorithm are given below; a MATLAB-style sketch of the overall pipeline follows this list.

Compute one of the modified Radon Transforms (RT) on the logarithm of the power spectrum of the image; for radon-d transform the authors [1] propose to take a min(M,N)/√2 x min(M,N)/√2 square patch of the whole spectrum image for a M x N original image.

Fit a third-order polynomial function to the result of the Radon-d transform, and a two-region power-law function to the result of the Radon-c transform.

Estimate the blur parameters (motion length, angle or disk radius).

Angle estimate of the blur is normally (in previous methods) that angle at which RT is maximum [11].

But in this project, to be robust to short blurs, the motion angle is taken to be the direction for which the mean squared error of the fitting function (the cubic polynomial) is maximum, and the motion length is the one for which the mean squared error of the combined two-term fitting function (defined later as equation 6) is minimum.

For the radius estimate in the out-of-focus case, the estimated radius is the one for which the mean squared error of the fitting function (different from the previous one, since the blur spectrum is now Bessel-like) is minimum.
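As referenced above, a compact MATLAB-style sketch of this pipeline (for the motion-blur branch) is given below; radon_d is a placeholder name for the modified transform described later, and the angle grid and use of the log power spectrum follow the steps just listed:

    % Skeleton of the estimation pipeline (sketch only).  g is the blurred image;
    % radon_d is the modified Radon transform described later (hypothetical helper here).
    G = log(abs(fftshift(fft2(g))).^2 + 1);   % log power spectrum of the blurred image (+1 avoids log 0)
    thetas = 0:179;                           % candidate blur angles in degrees
    mseAngle = zeros(size(thetas));
    for k = 1:numel(thetas)
        r = radon_d(G, thetas(k));  r = r(:).';          % Radon-d profile at this angle
        x = 1:numel(r);
        p = polyfit(x, r, 3);                            % third-order polynomial fit (see step above)
        mseAngle(k) = mean((r - polyval(p, x)).^2);      % fitting error for this angle
    end
    [~, idx] = max(mseAngle);                 % angle of the WORST cubic fit
    thetaHat = thetas(idx);
    % The length (or, with Radon-c, the radius) is then found by fitting the two-term
    % function described in the parameter-estimation subsections over a grid of
    % candidate lengths and keeping the one with MINIMUM fitting error.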


A. Nomenclature

We start by defining a list of commonly used symbols in this report. The blurred image is generally given by

g = h * f + n     (1)

where * refers to 2D convolution.

The authors – the authors of [1]

f(x, y) - Image in the spatial domain that we want to recover.

F(ξ, η) - 2D Fourier Transform of image f(x, y)

(ξ, η) – Frequency domain plane

g(x,y) – Blurred Image in the spatial domain.

G(ξ, η) - 2D Fourier Transform of blurred image g(x, y)

h(x, y) – Blur PSF in the spatial domain

H(ξ, η) - 2D Fourier Transform of blur PSF h(x, y)

N(ξ, η) - 2D Fourier Transform of Noise n, i.e., the noise spectrum

ρ - angle of the family of lines passing through the origin in the (ξ, η) plane

a , b - proportionality parameters.

R(f, ρ, θ) – It is the Radon Transform which is the integral of f along a line forming an angle θ with the x-axis, at a distance ρ from the origin.

B. Natural Image Model

A natural image is one whose power spectrum is approximately isotropic, i.e., the spectral power is roughly uniform across all orientations. Consider a family of lines passing through the origin at different angles to the horizontal axis; along these lines, the power spectrum falls off with distance from the origin independently of the angle. A standard model for this behavior is

log |F(ξ, ξ tan ρ)| ≈ a |ξ|^b,  where a > 0 [1]

C. Linear Uniform Motion Blur

In a continuous domain, a linear uniform motion blur PSF is a normalized delta function supported on a line segment of length L at an angle θ with respect to the horizontal, as shown in Fig. 1 (a). The angle θ depends on the motion direction, and the length L is proportional to the motion speed and the duration of the exposure. This model corresponds to a bright spot moving along a straight line segment centered at the origin.

[1] Fig 1. (a) This represents a bright spot of light travelling along this line of length L and angle θ. (b) This represents the intensity values of the pixels which are proportional to the length of the line in that pixel. Here black represents high intensity and white represents low intensity. Here L is the blur length and θ is the blur angle. These are the parameters we need to estimate.

A discrete version is obtained by considering this bright spot moving over the image pixels; as the point traverses the different sensors with constant velocity, and assuming that each sensor is linear and cumulative, the response is proportional to the time spent over that sensor. Thus, we obtain the intensity of each pixel of the blur kernel by computing the length of the intersection of the line segment with that pixel of the grid (see Fig. 1 (b)). Here black represents high intensity and white represents low intensity.

D. Out-of-Focus Blur

Here the focal plane is away from the sensor plane, resulting in a blurred image. In this case, a single bright spot spreads among its neighboring pixels, yielding a uniform disk; the more unfocused the image is, the larger the radius of this disk. We assume that the focal distance is at infinity, which works reasonably well for the majority of natural images, where the scene is far from the camera. The out-of-focus blur PSF h is a normalized disk given by [1]

h(x, y) = 1/(πR²),   if x² + y² ≤ R²
h(x, y) = 0,          otherwise

where R is the blur radius and the parameter we need to estimate. In the discrete domain, each PSF value will be proportional to the intersection area between the continuous blur and the corresponding pixel. So the intensity value will be high when a large portion of the circle intersects the pixel.
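Both PSF models can be generated directly with MATLAB's fspecial (the same calls are used in the experiments of Section IV); the sizes below are only examples, and f denotes a sharp input image:

    % Discrete linear motion blur kernel: length L = 10 pixels, angle 45 degrees
    h_motion = fspecial('motion', 10, 45);
    % Discrete out-of-focus (uniform disk) blur kernel: radius R = 10 pixels
    h_disk = fspecial('disk', 10);
    % Both kernels sum to one, matching the normalized line and disk models above.
    g_motion = imfilter(f, h_motion, 'circular');
    g_disk   = imfilter(f, h_disk,   'circular');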

E. Blurred Image Spectra

Since convolution in the space domain is multiplication in the frequency domain, taking the 2D Fourier transform of equation (1) gives [1]

G(ξ, η) = F(ξ, η) H(ξ, η) + N(ξ, η)     (2)

where the symbols are defined in the nomenclature. Assuming the noise is weak (so that taking the logarithm reduces it further), we can take the logarithm of the magnitude of equation (2) and write [1]

log |G(ξ, η)| ≈ log |F(ξ, η) H(ξ, η)| = log |F(ξ, η)| + log |H(ξ, η)|     (3)

i.e., the coarse behavior of log |G(ξ, η)| depends essentially on log |F(ξ, η)| + log |H(ξ, η)|. Since the original image is natural, with an approximately isotropic power spectrum, i.e., log |F(ξ, η)| along lines η = ξ tan ρ in the (ξ, η) plane is approximately independent of the angle ρ, the zeros of the blur spectrum are preserved in the blurred image spectrum: zeros in log |H(ξ, η)| are preserved in log |G(ξ, η)|. However, the presence of noise prevents the spectrum from having exact zeros, so we look at the local minima (values close to zero) in the log |G(ξ, η)| spectrum to get information about the zero pattern of the blur spectrum. Linear uniform motion blur is modeled by a line segment, and hence the corresponding spectrum is a sinc-like function in the direction of the blur. In this case, the spectrum exhibits zeros along lines perpendicular to the motion direction, separated from each other by a distance that depends on the blur length. But due to the


presence of noise, the zeros become local minima. We can recognize the motion blur pattern by carefully locating these minima (since the zeros are distorted by noise) in the log |G(ξ, η)| spectrum. To identify the motion angle, the authors of [1] propose a modified Radon transform (RT). The ordinary RT of an image f(x, y) (at angle θ and distance ρ from the origin), which is the collection of all line-integral projections through that image, is given by [1]

R(f, ρ, θ) = ∫ f(ρ cos θ − s sin θ, ρ sin θ + s cos θ) ds

The authors use the modified Radon transform, i.e., the Radon-d transform, for the angle in linear motion blur and the Radon-c transform for the radius in out-of-focus blur. The proposed modified transforms are explained below.

F. Radon-d Transform

This transform is used in the linear uniform motion blur case. The proposed Radon-d modification of the RT [1] performs integration over the same area independently of the direction of integration. By integrating over the same area for every angle, instead of computing the RT of the whole image, the authors exploit the behavior of natural images along lines that pass through the origin; they also show that this modified RT of the logarithm of the power spectrum has approximately the same energy irrespective of the angle. For the Radon-d transform the authors [1] propose to take a d x d square patch of the whole spectrum image of an M x N original image. This is done by setting the limits of integration from −d/2 to +d/2, where d = m/√2 and m = min{M, N}, giving [1]

R_d(f, ρ, θ) = ∫_{−d/2}^{+d/2} f(ρ cos θ − s sin θ, ρ sin θ + s cos θ) ds,   for |ρ| ≤ d/2
R_d(f, ρ, θ) = 0,   otherwise

where the modified Radon transform, i.e., Radon-d, is applied to f equal to log |G|.

Fig 2: The gray square over which the Radon-d transform integration is performed for angle θ [1].

In their previous work [12] the authors show that the Radon-d transform of the logarithm of the power spectrum should have an approximately linear behavior; however, due to spectral irregularities and artifacts caused by the FFT, they propose fitting a cubic polynomial to the Radon-d curve [1]:

R_d(log |F|, ρ, θ) ≈ a ρ³ + b ρ² + c ρ + d     (4)

where R_d(log |F|, ρ, θ) are the values of the Radon-d transform applied on the blurred image spectrum log |G| as a function of ρ for a particular value of θ. From equation 3, the coarse behavior of log |G| depends essentially on log |F| + log |H|; therefore, although we take the Radon-d transform of the blurred image spectrum log |G|, we can write log |F| as the argument of Radon-d instead of log |G| (as they are dependent), as in equation 4. So, equation 4 can also be written as R_d(log |G|, ρ, θ) ≈ a ρ³ + b ρ² + c ρ + d, where the coefficients a, b, c, d are estimated later. The complete fitting function has two terms, as shown in equation 6: equation 4 gives the part of the fit due to the image, and the part due to the blur alone is given by [1]

γ_H(ρ) ≅ α log(1 + β log |H(ρ)|)

where α and β are parameters defined later.

G. Radon-c Transform

This transform is used in the out-of-focus blur case. The authors of [1] modify the Radon transform to introduce the Radon-c transform, which integrates along circles of radius ρ in the logarithm of the power spectrum of the blurred image g(x, y). The Radon-c transform is given by [1]

R_c(f, ρ) = ∫_0^{2π} f(ρ cos θ, ρ sin θ) dθ

where the authors [1] integrate along the circle for a particular value of the radius (using polar coordinates) and calculate the value of the transform for that radius. The value of the transform is then plotted against the radius to form a curve. To approximate this curve the authors [1] propose a two-region power-law function given by [1]

R_c(log |F|, ρ) ≅ a |ρ|^b,   ρ ≤ ρ₀
R_c(log |F|, ρ) ≅ d |ρ|^c,   ρ > ρ₀      (5)

where a, b, c and ρ₀ are parameters for which we make an initial guess, and d is fixed by requiring the function to be continuous at ρ = ρ₀. Here, R_c(log |F|, ρ) are the values of the Radon-c transform applied on the blurred image spectrum log |G| as a function of ρ. From equation 3, the coarse behavior of log |G| depends essentially on log |F| + log |H|; therefore, although we take the Radon-c transform of the blurred image spectrum log |G|, we can write log |F| as the argument of Radon-c instead of log |G| (as they are dependent), as in equation 5. So equation 5 can also be written with R_c(log |G|, ρ) on the left-hand side, where the coefficients a, b, c, ρ₀ are estimated later. As before, the complete fitting function has two terms, as shown in equation 6: equation 5 gives the part of the fit due to the image, and the part due to the blur alone is again [1]

γ_H(ρ) ≅ α log(1 + β log |H(ρ)|)

where α and β are parameters defined later. We will notice that as we increase the radius (move away from the origin), the value of the Radon-c transform of log |G| (or of log |F|, as they are dependent, as seen in equation 3) decreases, since the spectrum of an out-of-focus blur is Bessel-like and decreases away from the origin.

H. Parameter Estimation for Motion Blur

The authors [1] use the Mean Squared Error (MSE) to estimate the


blur angle. When using the fitting function given by equation 4 to fit the curve obtained from the Radon-d transform, a maximum value of MSE is obtained at a particular angle. This is the estimate of the blur angle given by [1]

θ̂ = argmax_θ Σ_ρ ( R_G(ρ, θ) − R̂(ρ, θ) )²

where R_G(ρ, θ) denotes the integral of log |G(ξ, η)| along the direction perpendicular to θ, i.e., R_G(ρ, θ) = R_d(log |G(ξ, η)|, ρ, θ), and R̂(ρ, θ) is the function obtained by fitting the third-order polynomial of equation 4 to the Radon-d transform R_G(ρ, θ). The fitting here is a least-squares curve fit to the result of the Radon-d transform. We estimated the coefficients a, b, c, d of the third-order polynomial using MATLAB's polyfit function, which constructs a Vandermonde matrix and solves it via QR factorization [14]. We then used these coefficients to construct the fitting function R̂(ρ, θ) using MATLAB's polyval function. We also tried implementing the ordinary least-squares solution ourselves, but it was much slower than MATLAB's highly optimized functions. Subsequently we obtained the Mean Squared Error (MSE) by taking the difference between the result of Radon-d and the fitting function (as given by equation 4), squaring this difference and taking the mean. We plot the MSE as a function of θ, and the angle estimate is the angle for which the MSE is maximum, i.e., the angle at which the Radon-d transform and the fitting function match worst. We note the index of the angle for which the MSE is maximum, and this is the angle estimate. After calculating the blur angle estimate, we use it to estimate the blur length L in the subsequent steps. The proposed fitting function has two terms, given by [1]

γ(ρ) = R_d(log |F|, ρ, θ) + R_d(log |H|, ρ, θ)     (6)

where again log |F| is used in place of log |G|, as they are dependent. After estimating the blur angle we fit γ(ρ) to R_G(ρ, θ̂), where θ̂ is the angle estimate. In this case, γ_F(ρ) = R_d(log |F|, ρ, θ) is approximated by (4), and γ_H(ρ) = R_d(log |H|, ρ, θ) is approximated by γ_H(ρ) ≅ α log(1 + β log |H(ρ)|). Here, H must be proportional to a sinc function, since we know the blur spectrum for motion blur is sinc-like, i.e., [1] |H(ω)| ∝ |sinc(λω)|, where λ is the blur length. So we now have to jointly estimate the parameters a, b, c, d, α, β, λ in a least-squares sense to get the final fitting function γ(ρ). As this process is highly non-convex, we cannot jointly estimate all the above parameters with a single iterative least-squares run. So, first we fix λ and, for this particular value of λ, do the following: we take the previously estimated parameters a, b, c, d obtained during the angle estimation as initial values for equation 4 and initialize α, β to small positive values (< 1). We then use

MATLAB's nlinfit function, which uses the Levenberg-Marquardt nonlinear least-squares algorithm [15], to estimate the final values of the above parameters. To use nlinfit we create an anonymous model function given by equation 6 and feed it the initial parameter values given above; the estimated final values of the parameters are then fed back into the model function to obtain the fitting function of equation 6. Again, we tried to implement the nonlinear least-squares algorithm ourselves, but it was much slower than the highly optimized nlinfit function. Subsequently we obtained the Mean Squared Error (MSE) by taking the difference between the result of Radon-d and the fitting function (as given by equation 6), squaring this difference and taking the mean. We plot the MSE as a function of λ, and the length estimate is the λ for which the MSE is minimum, i.e., the value at which the result of the Radon-d transform and the fitting function match well. We note the index of the length for which the MSE is minimum, and the length estimate is given by

L̂ = λ/N, where N is the number of different angular frequencies (the number of points in the FFT) of the N x N image (we resized the given image to a square, since otherwise there were some problems with the transforms).

I. Parameter Estimation for Out-of-Focus Blur

In this case, to calculate the radius of the out-of-focus blur, we proceed as in the motion blur case, simply replacing the Radon-d transform with Radon-c. The fitting function is similar to equation (6) with R_d replaced by R_c; γ_F(ρ) is given by equation (5) and γ_H(ρ) ≅ α log(1 + β log |H(ρ)|). We know that the blur spectrum for out-of-focus blur is a Bessel-like function, so the |H| in equation 6 is given by [1] |H(ω)| ∝ |J_δ(λω)/(λω)|, where J_δ is the Bessel function of the first kind with parameter δ. Proceeding exactly as in the motion blur case, i.e., passing a, b, c, ρ₀ obtained from equation 5 and small positive α, β (< 1) to the nlinfit function to get the final parameters, and using an anonymous model function as in equation 6 to get the final fitting function, we obtained the Mean Squared Error (MSE) by taking the difference between the result of Radon-c and the fitting function (as given by equation 6), squaring this difference and taking the mean. We plot the MSE as a function of λ, and the radius estimate is the λ for which the MSE is minimum, i.e., the value at which the Radon-c transform and the fitting function match well. We note the index for which the MSE is minimum, and the radius estimate is given by

R̂ = 2λ/N, where N is the number of different angular frequencies (number of points in the FFT) of the N x N image (again, we resized the given image to a square, since otherwise there were some problems with the transforms).


IV. EXPERIMENTAL RESULTS

Approximately isotropic power spectrum of original natural image.

Fig 3. (a) Original natural image (b) logarithm of the power spectrum of (a) (c) contour plot of the logarithm of the power spectrum (d) surf plot showing the profile of the logarithm of the power spectrum.

In figure 3, we took a natural image in (a) and calculated the logarithm of its power spectrum in (b). The logarithm of the power spectrum is seen to be approximately isotropic, since its value is about the same in all directions at a given distance from the origin, and it follows the power-law decay, as seen in (c) and (d). In (c) we can see that at a fixed distance from the origin the value of the power spectrum is roughly the same in all directions, and from (c) and (d) we can see that the power decays with spatial frequency. Hence we can apply the estimation algorithm of the paper to this image, as it has an approximately isotropic power spectrum.
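The plots in figure 3 (b)-(d) can be reproduced with a few lines of MATLAB; this is only a sketch, and the +1 inside the logarithm is a guard against log of zero rather than something prescribed by [1]:

    % Centered log power spectrum of an image f
    F = fftshift(fft2(im2double(f)));            % 2D DFT with the zero frequency at the center
    S = log(abs(F).^2 + 1);                      % logarithm of the power spectrum
    figure;
    subplot(1,3,1), imagesc(S), axis image, title('log power spectrum');
    subplot(1,3,2), contour(S), axis image, title('contour plot');
    subplot(1,3,3), surf(S, 'EdgeColor', 'none'), title('surface profile');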

Given below is an example of an image having a blur and hence it does not have an isotropic power spectrum as seen below in the contour plot to the right of the image.

Power spectrum of image having linear uniform motion blur.

Fig 4. (a) Blur kernel of length L = 10 and angle θ = 45 deg (b) Linear motion blurred image (c) logarithm of the power spectrum of the blurred image (d) surf plot showing the profile of the logarithm of the power spectrum of the blurred image.

In figure 4, we blur the given natural image using a motion blur (fspecial('motion') in MATLAB) of length L = 10 and blur angle θ = 45 degrees. The blur kernel is shown in (a) and the motion blurred image in (b). A good example of this kind of blur is shown in [13]. As per the paper, the logarithm of the power spectrum of a motion-blurred image must be "sinc like" in the direction of the blur. As seen in (c), the value of the logarithm of the power spectrum is high at the origin (white) and drops (black) along a line at an angle of 45 degrees to the horizontal. The "sinc like" structure is clearer in (d), where we show a surf plot of the power spectrum of the blurred image.

Power spectrum of image having out-of-focus blur.

Fig 5. (a) Blur kernel of radius r = 10 (b) Out-of-focus blurred image (c) logarithm of the power spectrum of the blurred image.

In figure 5, we blur the given natural image using an out-of-focus blur (fspecial('disk') in MATLAB) of radius r = 10. The blur kernel is shown in (a) and the out-of-focus blurred image in (b). A good example of this kind of blur is shown in [13]. As per the paper, the logarithm of the power spectrum of an out-of-focus blurred image must be "Bessel like". As seen in (c), the value of the logarithm of the power spectrum is high at the origin (white color) and drops (black color)


at intervals away from the origin which corresponds to the “Bessel” function.

Radon-d Transform

Fig 6. (a) A square patch created (b) Radon-d transform applied on (a) at θ = 0 deg (c) Radon-d transform applied on (a) at θ = 45 deg.

In figure 6, we create a square patch in MATLAB, as shown in (a), and then apply our modified Radon transform, i.e., the Radon-d transform, to this patch. (b) shows the value of the Radon-d transform applied on (a) at θ = 0 deg, which is consistent with the definition of Radon-d: as we integrate from left to right, the value of the Radon-d transform increases abruptly when we reach the white pixels, stays constant as long as we are inside the square patch, and then decreases abruptly after we leave the white patch. Here we integrate along lines perpendicular to the horizontal. In (c) we apply the same Radon-d transform to the square patch at an angle of θ = 45 deg, integrating along lines perpendicular to a line at 45 degrees to the horizontal. (c) is also consistent with the definition: when we integrate along lines at an angle of 135 deg (45 + 90), we initially get a small value, the value then increases linearly and reaches its maximum at the diagonal of the square patch, and then decreases gradually back to zero. The MATLAB code for Radon-d is given in Appendix A.
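Since Appendix A is not reproduced here, the fragment below is a minimal sketch of one possible Radon-d implementation, using the rotate-and-sum idea mentioned later (imrotate aligns the chosen direction with the image columns and the sum is restricted to the inscribed d x d square); details such as the interpolation method and the sign of the rotation are our own choices:

    function r = radon_d(S, theta)
    % RADON_D  Sketch of the Radon-d transform of the log-spectrum image S at
    % angle theta (degrees): rotate so that the integration direction becomes
    % vertical, keep the maximal inscribed square, and sum along the columns.
        Srot = imrotate(S, -theta, 'bilinear', 'crop');    % rotate about the image center
        [M, N] = size(Srot);
        d  = floor(min(M, N) / sqrt(2));                   % side of the inscribed square
        r0 = floor((M - d)/2) + 1;
        c0 = floor((N - d)/2) + 1;
        patch = Srot(r0:r0+d-1, c0:c0+d-1);                % central d x d patch
        r = sum(patch, 1);                                 % column sums = line integrals
    end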

Radon-c Transform


Fig 7. (a) A circular patch created (b) Radon-c transform applied on (a) for various radii (c) Radon-c transform applied on (a) for various radii but with a different implementation (d) Radon-c transform applied on the out-of-focus blurred image of figure 5(c).

In figure 7, we create a circular patch in MATLAB, as shown in (a), and then apply our modified Radon transform, i.e., the Radon-c transform, to this patch. (b) shows the value of the Radon-c transform applied on (a) for various values of the radius on the x-axis, which is consistent with the definition of Radon-c: as the radius r increases, the value of the Radon-c transform stays constant as long as we are inside the circular patch and then decreases abruptly after we leave the white circular patch. (c) shows another implementation of Radon-c which follows the paper exactly, but we feel (b) gives better results, so we implemented both (b) and (c) and checked which gives the better estimate. Finally, (d) shows the Radon-c transform applied on the out-of-focus blurred image of figure 5 (b) using the Fig 7 (b) implementation; Fig 7 (d) is consistent with the paper. The MATLAB code for both the (b) and (c) implementations is provided in Appendix B.
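Similarly, Appendix B is not included here; a minimal sketch of the circular integration behind Radon-c, based on bilinear interpolation with interp2 and one-pixel radius steps (both our own implementation choices), is:

    function rc = radon_c(S)
    % RADON_C  Sketch of the Radon-c transform: average of the log-spectrum
    % image S along circles of increasing radius centered at the spectrum center.
        [M, N] = size(S);
        cx = (N + 1)/2;  cy = (M + 1)/2;             % center of the (fftshifted) spectrum
        rmax  = floor(min(M, N)/2) - 1;
        theta = linspace(0, 2*pi, 360);              % angular samples on each circle
        rc = zeros(1, rmax);
        for rho = 1:rmax
            x = cx + rho*cos(theta);                 % points on the circle of radius rho
            y = cy + rho*sin(theta);
            rc(rho) = mean(interp2(S, x, y, 'linear', 0));   % circular integral (up to a constant)
        end
    end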

Curve fitting to estimate motion blur parameters.  (a) Angle Estimate 

 Fig 8. (a) Radon-d Transform applied on Fig 4(c) at θ = 0 deg (b) Radon-d Transform applied on Fig 4(c) at θ = 45 deg. Figure 8 (a) shows the application of Radon-d Transform on the power spectrum of linear motion blurred image of Fig 4(c) at θ = 0 degrees. The blue colored curve shows the value of Radon-d applied on Fig 4(c) for increasing distance from the origin whereas the red colored curve shows the fitting 3rd order polynomial to identify the motion blur angle.


We used the polyfit function in MATLAB to fit a 3rd-order polynomial to the Radon-d transform values in the least-squares sense. Since we blurred the image at an angle of 45 degrees in Fig 4(b), the error between the fitting polynomial and the Radon-d values in Fig 8 (a), for θ = 0 deg, is small. Figure 8 (b) shows the Radon-d transform applied on the power spectrum of the linear motion blurred image of Fig 4(c) at θ = 45 degrees. The blue curve shows the value of Radon-d applied on Fig 4(c) for increasing distance from the origin, whereas the red curve shows the fitted 3rd-order polynomial used to identify the motion blur angle. Since we blurred the image at an angle of 45 degrees in Fig 4(b), the error between the fitting polynomial and the Radon-d values is large here, and as per the paper the blur angle is the angle for which this error is maximum. Hence 45 degrees is the estimated blur angle, since it has the maximum error, which matches our initial blur angle. Hence the Radon-d transform we have implemented is correct.
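The cubic fit of figure 8 reduces to two library calls; the sketch below assumes rd holds the Radon-d profile (as a row vector) at the angle of interest:

    % Least-squares cubic fit to one Radon-d profile (as in figure 8)
    x  = 1:numel(rd);                        % distance index along the profile
    p  = polyfit(x, rd, 3);                  % coefficients of the third-order polynomial (equation 4)
    rd_fit = polyval(p, x);                  % fitted curve
    mse = mean((rd - rd_fit).^2);            % error used later for the angle estimate
    plot(x, rd, 'b', x, rd_fit, 'r');
    legend('Radon-d', '3rd order polynomial fit');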

Fig 9. Motion blur angle estimation.

Figure 9 shows the mean squared error versus the blur angle. The estimated blur angle is the angle for which the mean squared error between the fitted polynomial and the Radon-d transform values is maximum. As we can see in figure 9, the mean squared error is maximum at 45 degrees, so our estimated blur angle is 45 degrees, which matches the blur angle applied in Figure 4 (b) and (c). Hence we conclude that our motion blur angle estimation from the Radon-d transform is correct. We also found a much faster way to implement the Radon-d transform, in which we rotate the image at the blur angle, calculate its power spectrum and sum along every column. As this uses MATLAB's highly optimized imrotate function, this process is much faster than the one implemented in the paper [1].

(b) Length Estimate

Fig 10. (a) Radon-d Transform applied on Fig 4(c) at θ = 45 deg, L = 15, along with the fitting function. (b) Radon-d Transform applied on Fig 4(c) at θ = 45 deg, L = 21. (c) MSE vs Length for calculating the length estimate.


In figure 10 (a) and (b), the red curve shows the Radon-d transform applied on the power spectrum of the linear motion blurred image of Fig 4(c) at θ = 45 degrees, which is the estimated motion blur angle. The blue curve shows the fitting function given by equation 6, for blur length 15 in Fig 10 (a) and blur length 21 in Fig 10 (b). We used the nlinfit function in MATLAB to fit equation 6 to the Radon-d transform values in the least-squares sense. Since we estimated the blur angle as 45 degrees in the previous part, we use this angle estimate to calculate the blur length L. First, we purposely blur the image of Fig 3 (a) with a blur length of L = 21 and a blur angle of 45 degrees. We then apply the Radon-d transform on the power spectrum of the image, estimate the blur angle as 45 degrees, and compute Radon-d at this angle (the red curve). We then fit the function given by equation 6 to this Radon-d result using nlinfit; the blue curve shows the fitting function. In Fig 10 (a) the fitting function does not fit the red curve, because there we use a length of L = 15 whereas the actual length is 21. In Fig 10 (b) the fitting function fits the result of Radon-d well, since the blur length is now 21, so the error is minimum. Hence the Radon-d transform we have implemented is correct. Figure 10 (c) shows the mean squared error versus the blur length. The estimated blur length is the length for which the mean squared error between the fitted function and the Radon-d transform values is minimum. As we can see in figure 10 (c), the mean squared error is minimum at L = 19 and L = 21, and hence our estimated blur length is 19, which is very close to the blur length L = 21 at which we blurred the image. Hence we conclude that our motion blur length estimation from the Radon-d transform is correct.
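In code, the length search amounts to one nlinfit call per candidate length; the following sketch uses a simplified blur term, alpha*log(|sinc| + small constant), as a stand-in for the blur part of equation 6 (the exact functional form, the floor constant and the candidate grid are our own choices here), with rd the Radon-d profile at the estimated angle:

    % Brute-force length search (sketch).  rd: Radon-d profile at the estimated angle.
    x = (1:numel(rd))';  rd = rd(:);
    sincf = @(t) sin(pi*t) ./ (pi*t);             % sinc; safe here because t > 0 for all samples
    lengths = 2:40;                               % candidate blur lengths (illustrative grid)
    mseLen = zeros(size(lengths));
    for k = 1:numel(lengths)
        lam = lengths(k) / numel(rd);             % candidate length mapped to a frequency scale
        model = @(b, x) b(1)*x.^3 + b(2)*x.^2 + b(3)*x + b(4) ...
                      + b(5)*log(abs(sincf(lam*x)) + 1e-2);   % cubic + simplified blur term
        b0 = [polyfit(x, rd, 3), 0.5];            % initial guess: cubic from the angle step
        b  = nlinfit(x, rd, model, b0);           % nonlinear least-squares fit
        mseLen(k) = mean((rd - model(b, x)).^2);
    end
    [~, idx] = min(mseLen);
    L_hat = lengths(idx);                         % estimated blur length (pixels)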

Curve fitting to estimate out-of-focus blur parameter.


Fig 11. (a) Radon-c Transform applied on Fig 5(c) for R = 10, along with the fitting function for R = 5. (b) Radon-c Transform applied on Fig 5(c) for R = 10, with the fitting function for R = 10. (c) MSE vs Radius for calculating the radius estimate.

In figure 11 (a) and (b), the red curve shows the Radon-c transform applied on the power spectrum of the out-of-focus blurred image of Fig 5(c), blurred with R = 10. The blue curve shows the fitting function given by equation 6 (with Rd replaced by Rc) for blur radius 5 in Fig 11 (a) and blur radius 10 in Fig 11 (b). We used the nlinfit function in MATLAB to fit equation 6 to the Radon-c transform values in the least-squares sense. First, we purposely blur the image of Fig 3 (a) with a blur radius of R = 10, as shown in Fig 5 (b). We then calculate the Radon-c transform (the red curve) and fit the function given by equation 6 to this Radon-c result using nlinfit; the blue curve shows the fitting function. In Fig 11 (a) the fitting function does not fit the red curve, because there we use a radius of R = 5 whereas the actual radius is 10. In Fig 11 (b) the fitting function fits the result of Radon-c well, since the blur radius is now 10, so the error is minimum. Hence the


Radon-c transform we have implemented is correct. Figure 11 (c) shows the mean squared error versus the blur radius. The estimated blur radius is the radius for which the mean squared error between the fitted function and the Radon-c transform values is minimum. As we can see in figure 11 (c), the mean squared error is minimum at R = 10, and hence our estimated blur radius is 10, which equals the radius at which we blurred the image. Hence we conclude that our out-of-focus blur radius estimation from the Radon-c transform is correct.

Deblurring of the image from the estimated parameters.


Fig 12. (a) Blurred input image (b) Deblurred output after parameter estimation.

In Figure 12 (a), we purposely apply motion blur to the natural image of Fig 3 (a). The motion blur kernel length is L = 10 and the blur angle is θ = 45 deg. We then apply our estimation algorithm for motion blur to estimate the blur parameters. After obtaining the blur parameters, we use the Richardson-Lucy deblurring algorithm to restore the original image from the blurred image. Figure 12 (b) shows the restored image; clearly the blur has been removed. The lines visible near the borders of the restored image come from the Richardson-Lucy method (most likely boundary artifacts of its FFT-based implementation).
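For reference, the restoration step itself is a single toolbox call once the parameters are estimated; in this sketch L_hat and theta_hat denote the estimates and deconvlucy's default number of Richardson-Lucy iterations is used:

    % Rebuild the PSF from the estimated parameters and deblur with Richardson-Lucy
    h_hat = fspecial('motion', L_hat, theta_hat);   % estimated motion-blur kernel
    f_hat = deconvlucy(g, h_hat);                   % Richardson-Lucy restoration of the blurred image g
    % For out-of-focus blur the kernel would be fspecial('disk', R_hat) instead.
    figure, imshowpair(g, f_hat, 'montage');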


Fig 13. (a) Blurred input Lena image (b) Deblurred output after parameter estimation.

In Figure 13 (a), we purposely apply motion blur to the Lena image. The motion blur kernel length is L = 10 and the blur angle is θ = 45 deg. We then apply our estimation algorithm for motion blur to estimate the blur parameters. After obtaining the blur parameters, we use the Richardson-Lucy deblurring algorithm to restore the original image from the blurred image. Figure 13 (b) shows the restored image; the blur has largely been removed. The lines visible near the borders of the restored image again come from the Richardson-Lucy method (most likely boundary artifacts of its FFT-based implementation). Compared to the restoration in Figure 12, this restoration is not as good, because the Lena image is not a natural image and hence the algorithm we implemented does not estimate the parameters as accurately.


We also tested our algorithm on all of the images mentioned in our research hypothesis, namely Cameraman, Barbara, Boats, Peppers, Goldhill and Fingerprint. None of these gave as good a restoration result as the natural image, since the estimation algorithm works best on natural images; in fact, some of the above images gave considerably worse results.

Effect of Image size on the estimation algorithm


Fig 14. (a) Blurred input image of reduced size (b) Deblurred output after parameter estimation.

In Figure 14 (a), we purposely apply motion blur to the natural image of Fig 3 (a), with kernel length L = 10 and blur angle θ = 45 deg, but we first reduce the size of the image to one-tenth of the original. We then apply our estimation algorithm for motion blur to estimate the blur parameters and, after obtaining them, use the Richardson-Lucy deblurring algorithm to restore the image. Figure 14 (b) shows the restored image. As we can see, the restored image is worse than the blurred one, so we conclude that the estimation algorithm presented in this project does not work for very small images.

Effect of Noise on the estimation algorithm


Fig 15. (a) Angle estimation for a blurred image with Gaussian noise added to it, SNR = 10 dB. (b) Angle estimation with SNR = 1 dB.

In Figure 15, although the peak of the MSE used to estimate the blur angle is at 45 degrees in both cases, the presence of stronger noise in (b), which has SNR = 1 dB, makes the parameters harder to estimate, since there are now multiple peaks at angles other than the actual blur angle. Hence we conclude that as the noise increases, the estimation algorithm implemented in this project starts to fail and gives incorrect results; this is why we assumed low-noise conditions in the theory section.
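For these noise experiments the Gaussian noise was generated at a prescribed BSNR; a minimal sketch, assuming the usual definition BSNR = 10 log10 (variance of the blurred image / noise variance), is:

    % Add zero-mean Gaussian noise to the blurred image g at a target BSNR (dB)
    BSNR = 10;                                     % target blurred SNR in dB
    sigma2 = var(g(:)) / 10^(BSNR/10);             % noise variance implied by the BSNR definition
    g_noisy = g + sqrt(sigma2) * randn(size(g));   % blurred and noisy observation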

V. CONCLUSIONS

In this project we studied the method described in the paper [1]. We were able to successfully implement the methods and recreate the results, thus establishing that the blur parameters can be estimated from the spectrum of an image. We understood the concepts of the Radon transform and least-squares estimation, among others, and we were able to see the concepts we learnt in EECS 556 applied to a concrete problem.

VI. EXTENSION

As an extension to this project, we tried to make the estimation algorithm work for non-natural images. Non-natural images, for example an image of tables and chairs, have strong high-frequency components due to the presence of edges (non-natural images tend to have some sort of regular structure). So we first filtered the non-natural image with a low-pass filter (such as a Gaussian filter) to remove the high-frequency components, after which the image had an approximately isotropic power spectrum. Applying the estimation algorithm then produced blur parameters, but with very large errors. We are also trying to work out how, if we can estimate the blur parameters this way, we can restore the non-natural image, since the high-frequency components have now been removed.
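The prefiltering step of this extension is a plain Gaussian low-pass filter; the kernel size and sigma below are illustrative values, not tuned constants:

    % Low-pass a non-natural image before running the blur-parameter estimation
    h_lp = fspecial('gaussian', [9 9], 2);              % 9x9 Gaussian kernel, sigma = 2 (illustrative)
    f_lp = imfilter(f_nonnatural, h_lp, 'replicate');   % f_nonnatural: the non-natural test image
    % The estimation algorithm above is then applied to f_lp instead of f_nonnatural.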

BIBLIOGRAPHY

[1] João P. Oliveira, Mário A. T. Figueiredo and José M. Bioucas-Dias, “Parametric Blur Estimation for Blind Restoration of Natural Images: Linear Motion and Out-of-Focus,” IEEE Trans. Image Process., vol. 23, no. 1, pp. 466–477, Jan. 2014.

[2] M. Almeida and L. Almeida, “Blind and semi-blind deblurring of natural images,” IEEE Trans. Image Process., vol. 19, no. 1, pp. 36–52, Jan. 2010.

[3] L. Bar, N. Sochen, and N. Kiryati, “Variational pairing of image segmentation and blind restoration,” in Proc. 8th ECCV, 2004, pp. 166–177.

[4] M. Bertero and P. Boccacci, Introduction to Inverse Problems in Imaging. Bristol, U.K.: IOP Publishing, 1998.

[5] B. P. Bogert, M. J. R. Healy, and J. W. Tukey, “The quefrency alanysis of time series for echoes: Cepstrum, Pseudo Autocovariance, Cross-Cepstrum and saphe cracking,” in Proc. Symp. Time Ser. Anal., vol. 15, 1963, pp. 209–243.

[6] R. Bracewell, Two-Dimensional Imaging. Upper Saddle River, NJ, USA: Prentice-Hall, 1995.

[7] P. Campisi and K. Egiazarian, Blind Image Deconvolution: Theory and Applications. Boca Raton, FL, USA: CRC Press, 2007.

[8] A. S. Carasso, “Direct blind deconvolution,” SIAM J. Appl. Math., vol. 61, no. 6, pp. 1980–2007, 2001.

[9] T. F. Chan and J. Shen, Image Processing and Analysis - Variational, PDE, Wavelet, Stochastic Methods. Philadelphia, PA, USA: SIAM, 2005.

[10] T. F. Chan and C. K. Wong, “Total variation blind deconvolution,” IEEE


Trans. Image Process., vol. 7, no. 3, pp. 370–375, Mar. 1998.

[11] M. Moghaddam and M. Jamzad, “Motion blur identification in noisy images using fuzzy sets,” in Proc. 5th IEEE Int. Symp. Signal Process. Inf. Technol., Dec. 2005, pp. 862–866.

[12] J. P. Oliveira, “Advances in total variation image restoration: Blur estimation, parameter estimation and efficient optimization,” Ph.D. dissertation, Instituto Superior Técnico, Technical Univ. of Lisbon, Lisbon, Portugal, Jul. 2010.

[13] http://www.abtosoftware.com/blog/image-restoration

[14] http://en.wikipedia.org/wiki/Vandermonde_matrix and http://en.wikipedia.org/wiki/QR_decomposition

[15] http://en.wikipedia.org/wiki/Levenberg%E2%80%93Marquardt_algorithm

GROUP EFFORT TABLE

Group member names - 1: Banerjee Soumyanil  2: Vasudevan Ajay

Task               Person 1   Person 2
Leadership            50         50
Planning/design       50         50
Programming           40         60
Testing               55         45
Analysis              45         55
Proposal              50         50
Presentation          50         50
Report                60         40