
Research Article

Gradient-guided color image contrast and saturation enhancement

Haiyan Shi1, Ngaiming Kwok2, Gu Fang3, Stephen Ching-Feng Lin2, Ann Lee4, Huaizhong Li5 and Ying-Hao Yu6

Abstract
Digital color images are capable of presenting hue, saturation, and brightness perceptions. Therefore, quality improvement of color images should take all three stimuli into account. An effective method is proposed that aims at enriching the colorfulness, vividness, and contrast of color images simultaneously. In this method, color correction based on magnitude stretching is carried out first; image enhancement is then derived from an intensity-guided operation that concurrently improves the contrast and saturation qualities. Furthermore, the proposed methodology mitigates the heavy computational burden arising from the need, in conventional approaches, to transform the source color space into an alternative color space. Experiments have been conducted using a collection of real-world images captured under various environmental conditions. Image quality improvements were observed both from subjective viewing and from quantitative evaluation metrics in colorfulness, saturation, and contrast.

Keywords
Contrast enhancement, saturation enhancement, conversion free, RGB manipulation

Date received: 23 March 2016; accepted: 31 March 2017

Topic: Vision Systems
Topic Editor: Antonio Fernández-Caballero
Associate Editor: Mohan Sridharan

Introduction

Visual sensing has been widely used in industrial applications due to advancements in computer and imagery technologies. For instance, computer vision can be employed in object tracking as a means for human–computer interaction.1,2 Other examples include the use of robots in manufacturing tasks such as welding.3 There are applications involving imageries in remote sensing for road damage detection4 and weather monitoring.5,6 Moreover, there are many innovative uses of vision in consumer products, and digital color images have become the major medium for capturing, transmitting, and storing scene information.7,8

The essential requirement for a color image is to provide a perception of the scene to a human viewer or for a computer to carry out tasks such as object recognition.

1 School of Mechanical and Electrical Engineering, Shaoxing University, Shaoxing, Zhejiang, China
2 School of Mechanical and Manufacturing Engineering, The University of New South Wales, Sydney, NSW, Australia
3 School of Computing, Engineering and Mathematics, Western Sydney University, Penrith, NSW, Australia
4 Department of Engineering, Macquarie University, Sydney, NSW, Australia
5 Griffith School of Engineering, Griffith University, Gold Coast, QLD, Australia
6 Department of Electrical Engineering, National Chung Cheng University, Min-Hsiung Township, Chiayi, Taiwan

Corresponding author:
Ngaiming Kwok, School of Mechanical and Manufacturing Engineering, The University of New South Wales, Sydney, NSW 2052, Australia.
Email: [email protected]

International Journal of Advanced Robotic Systems
May-June 2017: 1–11. © The Author(s) 2017
DOI: 10.1177/1729881417711683
journals.sagepub.com/home/arx

Creative Commons CC BY: This article is distributed under the terms of the Creative Commons Attribution 4.0 License (http://www.creativecommons.org/licenses/by/4.0/) which permits any use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/open-access-at-sage).


A high-quality image that can truly represent the captured object and the scene is therefore crucial to the success of these tasks. In practice, images are generally coded in terms of three primary color channels, that is, in the red–green–blue (RGB) color space.9 However, the human visual system is more sensitive to the hue (H), saturation (S), and intensity (I) attributes. Therefore, when images are processed,10 many common approaches convert the RGB color space into some convenient working signal space that is close to human perception, for example, the hue–saturation–intensity (HSI) space.

On the other hand, most display and printing devices require inputs in the RGB format. Hence, processed images often need to be converted back. The conversion operations in both the forward and reverse directions unavoidably increase the computational load of the image processing task. To reduce this computational complexity, the relationships between color channels that can be beneficially used to produce desirable enhancement effects are investigated here. Furthermore, it is desirable to design simple operations that are able to simultaneously enhance image quality with regard to colorfulness, vividness, and contrast.

In this article, the development of an effective transformation-free approach is reported. In this method, the input image color signals are first fed to a magnitude stretching process to mitigate color bias. The processed signals are then averaged to produce a temporary intensity image. The salient features of the intensity image are further extracted using a 4-connected Laplacian kernel. These features are used as guidance indicators to modify the color channels and produce the enhanced output image. In particular, to provide simultaneous saturation and contrast enhancement and to reduce the computational complexity, only two color components of a pixel are changed directly in the RGB channels instead of adjusting all three. The method not only enhances the image contrast but also improves the saturation and provides color correction.

The rest of the article is organized as follows. In the “Color space conversion and induced complexities” section, definitions of the HSI color space are reviewed and the computational complexity is examined. The proposed transformation-free color image enhancement is detailed in the “Gradient-guided contrast and saturation enhancement” section. Test results from a group of images are evaluated and presented in the “Experiments and results” section. The fifth section contains the conclusion.

Color space conversion and induced complexities

Digital color images are often captured, stored, and transmitted based on an aggregation of signals in a color space representing the primary red, green, and blue stimuli. There are some commonly used color spaces in color image processing, for example, the HSI space.11 The conversion from RGB to HSI can be obtained as

$$I_{uv} = \frac{R_{uv} + G_{uv} + B_{uv}}{3} \quad (1)$$

$$S_{uv} = 1 - \frac{3 \times \min\{R_{uv}, G_{uv}, B_{uv}\}}{R_{uv} + G_{uv} + B_{uv}} \quad (2)$$

$$H_{uv} = \begin{cases} \theta_{uv}, & B_{uv} \le G_{uv} \\ 2\pi - \theta_{uv}, & B_{uv} > G_{uv} \end{cases} \quad (3)$$

$$\theta_{uv} = \cos^{-1}\left[\frac{0.5\,(\Delta_{RG} + \Delta_{RB})}{\sqrt{\Delta_{RG}^2 + \Delta_{RB}\Delta_{GB}}}\right] \quad (4)$$

where $\Delta_{RG} = R_{uv} - G_{uv}$, $\Delta_{RB} = R_{uv} - B_{uv}$, and $\Delta_{GB} = G_{uv} - B_{uv}$.

The subscript uv denotes the pixel coordinate. For an image of width U and height V, $u \in [1, U]$ and $v \in [1, V]$. The intensity $I_{uv}$ is the average of the three primary signals from the camera. The saturation $S_{uv}$, representing the richness of the color, is a function of the ratio of the minimum of the primary stimuli to their sum. The hue $H_{uv}$ is a nonlinear function of the primary color differences $R_{uv} - G_{uv}$, $R_{uv} - B_{uv}$, and $G_{uv} - B_{uv}$.
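For reference, the conversion in equations (1) to (4) can be written compactly as below. This sketch is only illustrative (the proposed method deliberately avoids this step), and the vectorized NumPy form with a small epsilon and the clipping of the arccosine argument are implementation assumptions.

```python
import numpy as np

def rgb_to_hsi(img):
    """Convert an RGB image with values in [0, 1] into H, S, I planes
    following equations (1) to (4)."""
    eps = 1e-12
    R, G, B = img[..., 0], img[..., 1], img[..., 2]
    I = (R + G + B) / 3.0                                     # equation (1)
    S = 1.0 - 3.0 * np.min(img, axis=-1) / (R + G + B + eps)  # equation (2)
    d_rg, d_rb, d_gb = R - G, R - B, G - B
    arg = 0.5 * (d_rg + d_rb) / (np.sqrt(d_rg**2 + d_rb * d_gb) + eps)
    theta = np.arccos(np.clip(arg, -1.0, 1.0))                # equation (4)
    H = np.where(B <= G, theta, 2.0 * np.pi - theta)          # equation (3)
    return H, S, I
```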

If color image enhancement is to be conducted in some transformed working space, say the HSI space, the input image in the RGB space needs to be converted to HSI and back to the RGB space for display as required.7,12,13 Obviously, these operations incur a severe computational cost when the image size is large.

If the image contains N pixels and needs the conversion between the RGB and HSI spaces, obtaining the I-channel requires two additions and one division, so its complexity is O(N). For the saturation S-channel, a sorting operation is required for each pixel as well as one multiplication and one division; hence, it requires O(2N) operations. The highest computational cost arises from calculating the hue component, which involves time-consuming squaring, square rooting, and cosine inversion calculations. To obtain the hue, two multiplications, a squaring, a square root, a division, and a cosine inversion are needed. Thus, the complexity is O(6N) and the overall complexity becomes O(9N). If the same complexity is assumed in the reverse conversion process, the conversion-based approach requires O(18N). In fact, the inversion from HSI to RGB and vice versa would make the convert/reconvert approach inefficient.11

Note that the enhancement process relies largely on the algorithm adopted. Hence, irrespective of the enhancement process, conversion and reverse conversion between color spaces introduce extra computational burdens and should be avoided as much as possible.

Gradient-guided contrast and saturation enhancement

In order to simultaneously enhance image contrast and saturation, a streamlined algorithm is proposed that operates directly on the RGB color channels instead of transforming to a different space and reconverting to the original RGB space. The procedure contains three stages, namely color channel stretching, salience extraction, and simultaneous saturation and contrast enhancement, which are detailed in the sequel.

System description

The block diagram of the gradient-guided contrast and saturation enhancement (GGCSE) algorithm for color images is illustrated in Figure 1. The input image is first passed through the min–max alignment block, where the global pixel magnitudes of each RGB channel are shifted and scaled into the normalized range [0, 1]. The three channels are then averaged to produce the intensity image, from which gradients are extracted by convolution with a Laplacian kernel. The extracted gradient, whose magnitude is to be optimized, is selectively added to the sorted pixels to produce a temporary image. The contrast and saturation qualities of the temporary image are assessed and used in a search for an optimal scaled gradient level. When the optimization or search iteration stops, the fictitious switch SW is closed. The sorted and adjusted pixels are then remapped to the original RGB domain. The temporary image is further processed for over-range correction and finally produces the enhanced output image. Details of the blocks in the overall contrast and saturation enhancement process are described below.

Min–max alignment and magnitude sorting

Let the input image J be in the RGB color space and indexed by the pixel coordinate uv, that is, $J = \{J_{uv}\}$ with $J_{uv} = \{R_{uv}, G_{uv}, B_{uv}\}$. To enhance its quality in the context of hue, saturation, and contrast, the input image is first processed using color channel stretching for color bias removal.

As a dominating illumination color is one of the sources that degrade image quality, a correction is needed.14 In nonreferencing corrections, the white-point assumption is frequently employed.15 Here, we propose an alternative assumption: there is at least one pixel with one color element whose magnitude is zero, while there is at least one pixel whose maximum color magnitude is unity. This assumption leads to stretching the magnitude of each color element of a pixel to span the complete magnitude range. For example, in the red channel, we have

$$R_{uv} \leftarrow \frac{R_{uv} - \min\{R\}}{\max\{R\} - \min\{R\}} \quad (5)$$

The green and blue channels are processed in the same manner to complete the min–max alignment.

After stretching the pixel magnitudes in the RGB color space, the color channel magnitudes in each pixel are sorted in ascending order. A set of three single-dimension arrays, namely Imin, Imed, and Imax, is then formed for subsequent processing, in which no color space conversion is needed. Note that each pixel still carries its own positional coordinate, but the index for RGB is replaced by minimum (min), median (med), and maximum (max), respectively.
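A minimal sketch of the min–max alignment of equation (5) and the per-pixel ordering into Imin, Imed, and Imax is given below; using argsort to retain the original channel positions for the later remapping is an assumption about the implementation.

```python
import numpy as np

def min_max_align(img):
    """Stretch each color channel to span [0, 1] as in equation (5)."""
    out = np.empty_like(img, dtype=np.float64)
    for c in range(3):
        ch = img[..., c].astype(np.float64)
        span = ch.max() - ch.min()
        out[..., c] = (ch - ch.min()) / (span if span > 0 else 1.0)
    return out

def sort_channels(img):
    """Sort the three channel values of every pixel in ascending order.
    Returns the (Imin, Imed, Imax) planes stacked along the last axis and
    the sort indices, kept so pixels can later be remapped back to RGB."""
    order = np.argsort(img, axis=-1)              # per-pixel channel ranking
    sorted_img = np.take_along_axis(img, order, axis=-1)
    return sorted_img, order
```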

Gradient extraction and clipping

As observed from the HSI space definitions, the intensity (I) is the average of all color channel values. Hence, when the intensity needs to be adjusted for contrast enhancement, the three color values have to be altered simultaneously to decouple the influence on the other two attributes. Here, intensity contrast improvement is obtained using an unsharp masking filter (UMF).16

The principle of the UMF is to add amplified salience along the edges of objects captured in the image. The resultant increase in sharpness, as perceived by the human visual system, comes from the noticeable variations in intensity magnitudes. From the HSI space definition, a change in intensity can be realized by changing all the RGB signals by the same amount. Here, a 3 × 3 window is used to extract the pixel gradients as a guide for enhancement. We have

$$\nabla_{uv} = I_{uv \in W} \otimes K \quad (6)$$

where $I_{uv}$ is given by equation (1), the local patch W is the 3 × 3 neighboring window, $\otimes$ is the convolution operator, and K is a 4-connected Laplacian kernel.

Figure 1. System block diagram for gradient-guided color image contrast and saturation enhancement. [The diagram shows the input image passing through min–max alignment, channel averaging, gradient extraction, exponential clipping, and gain scaling within the search loop; the scaled gradient is added to the sorted channels (Imin, Imed, Imax) to form the temporary image, and after the switch SW closes the pixels are remapped from the sorted domain to RGB, followed by over-range restoration to give the output image.]

The complexity is O(N), where N is the number of pixels. Multiplication with zero is not necessary, while multiplication with −1 is implemented as a sign inversion. The output from the filter is given by

$$\nabla_{uv} = 4I_{uv} - I_{u(v+1)} - I_{u(v-1)} - I_{(u+1)v} - I_{(u-1)v} \quad (7)$$
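A short sketch of the gradient extraction in equations (6) and (7) follows; the replicated image borders are an assumption, as boundary handling is not specified in the text.

```python
import numpy as np

def laplacian_gradient(intensity):
    """Apply the 4-connected Laplacian of equation (7):
    4*I(u,v) - I(u,v+1) - I(u,v-1) - I(u+1,v) - I(u-1,v)."""
    I = np.pad(intensity, 1, mode='edge')      # replicate borders
    grad = (4.0 * I[1:-1, 1:-1]
            - I[1:-1, 2:] - I[1:-1, :-2]       # horizontal neighbors
            - I[2:, 1:-1] - I[:-2, 1:-1])      # vertical neighbors
    return grad
```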

Gradient-guided selective adjustment

Pixel magnitude adjustment is carried out based on the extracted object boundaries in the form of local gradients. The three color elements in each pixel are adjusted by the same amount, determined by $\nabla_{uv}$. Because $\nabla_{uv}$ could be positive or negative, pixel magnitudes are increased or decreased locally, thus producing the enhancement in intensity contrast.

For saturation enhancement, the intensity enhancement process has to be modified. The proposed strategy is that if the adjustment is positive, the minimum RGB channel is left unaltered; on the other hand, if the adjustment is negative, the maximum RGB channel is not affected. The pixel is updated according to

$$\varphi(RGB)_{uv} + \nabla_{uv} \rightarrow J_{uv} \quad (8)$$

where $\varphi(\cdot)$ is an operator that outputs two color elements depending on the polarity of the gradient $\nabla_{uv}$. For example, when the gradient $\nabla_{uv}$ is positive and the color magnitudes are such that R > G > B, with the blue channel magnitude being the lowest, then only R and G, of larger magnitudes, are returned from the $\varphi(\cdot)$ operation. That is, $\varphi(RGB) \rightarrow \{R, G\}$.

It can be seen from equation (1) that the intensity is altered if two of the color channels are modified. Furthermore, when the minimum of the color channels is changed, the saturation is also changed according to equation (2). On the other hand, when one of the color channel differences is not changed by the $\varphi(\cdot)$ operator, the change in color can be kept small to maintain the color integrity.
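The selective update of equation (8) can be sketched on the per-pixel sorted planes as below: for a positive gradient the minimum channel is left untouched, and for a negative gradient the maximum channel is left untouched. Operating on the sorted array rather than on named R, G, and B planes is an assumption consistent with the system description.

```python
import numpy as np

def selective_adjust(sorted_img, grad):
    """Add the gradient to two of the three sorted channels per pixel.
    sorted_img has shape (H, W, 3), ordered (min, med, max) per pixel."""
    out = sorted_img.copy()
    pos = grad > 0
    # positive gradient: raise the median and maximum, keep the minimum
    out[..., 1] += np.where(pos, grad, 0.0)
    out[..., 2] += np.where(pos, grad, 0.0)
    # negative gradient: lower the minimum and median, keep the maximum
    out[..., 0] += np.where(~pos, grad, 0.0)
    out[..., 1] += np.where(~pos, grad, 0.0)
    return out
```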

Parameter optimization and over-range restoration

Because individual images contain different contents, the adjustment based on the gradient $\nabla_{uv}$ has to be adapted to the image such that the maximum enhancement can be obtained. Additionally, higher contrast regions in the image with greater gradients need not be adjusted; otherwise, the over-range problem will occur, where pixel magnitudes are driven outside the permitted levels. First, the extracted gradients are clipped at the high and low end by an exponential function. That is,

$$\nabla_{cp(uv)} = \nabla_{uv} \times \exp\left(-|\nabla_{uv}| \times |I_{uv} - 0.5|\right) \quad (9)$$

where $\nabla_{cp(uv)}$ denotes the clipped gradient and $I_{uv}$ is the average intensity obtained using equation (1) before the gradient extraction process. The exponent depends on the product of the absolute extracted gradient $|\nabla_{uv}|$ and the absolute value of the intensity $I_{uv}$ shifted by half of the range. As large gradients together with high- and low-intensity values are causes of the over-range problem, they should be reduced. The clipping of adjustment strengths is introduced to mitigate this drawback. The clipping is modulated with a gain factor k, such that

$$\varphi(RGB)_{uv} + k\nabla_{cp(uv)} \rightarrow J_{uv} \quad (10)$$
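A sketch of the clipping in equation (9) and its use in equation (10) is shown below, reusing the selective_adjust helper from the earlier sketch; this composition of the steps is assumed.

```python
import numpy as np

def clip_gradient(grad, intensity):
    """Equation (9): attenuate large gradients at very dark or very
    bright pixels to avoid driving values out of range."""
    return grad * np.exp(-np.abs(grad) * np.abs(intensity - 0.5))

# Equation (10), for a trial gain k (selective_adjust as sketched earlier):
#   temp = selective_adjust(sorted_img, k * clip_gradient(grad, intensity))
```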

Furthermore, the efficient golden section search algorithm17 is employed to obtain an optimal gain factor $k^*$ so that the image can be enhanced optimally. Temporary gain factors are generated within the golden section search iteration and a temporary image is obtained for each. An objective function $F_k$, which relates to a trial gain factor k and accounts for image entropy, contrast, and saturation, is defined as

$$F_k = H/8 + \sigma_J + \bar{S} \quad (11)$$

where H/8 is the normalized entropy obtained from dividing the raw entropy by its maximum value. That is,

$$H = -\sum_{i=0}^{L-1} p(i)\,\log_2 p(i)\ \text{bits} \quad (12)$$

where p(i) is the probability that a pixel has the i-th intensity.18 For an 8-bit digital representation of each pixel magnitude, the maximum entropy is eight.

Another measure of image contrast is the overall standard deviation $\sigma_J$, denoting the spread of pixel magnitudes in the image. A larger $\sigma_J$ represents a higher contrast. Furthermore, the average saturation $\bar{S}$, measuring the color vividness, is obtained from

$$\bar{S} = \frac{1}{N}\sum_{uv} S_{uv} \quad (13)$$

The small portion of over-range pixels caused by the enhancement process constitutes a penalty function

$$P = 1 - N_{ov}/N \quad (14)$$

where $N_{ov}$ is the number of over-range pixels and N is the total number of pixels in the image. The overall objective function then becomes

$$F_k = (H/8 + \sigma_J + \bar{S}) \times P \quad (15)$$

The optimum gain factor is thus

$$k^* = \arg\max_k \{F_k\} \quad (16)$$
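One possible realization of the objective function in equations (11) to (15) and of the golden section search for k* in equation (16) is sketched below. The 256-bin entropy estimate, the use of the intensity standard deviation for σJ, and the bracket [0.5, 3] with tolerance 0.1 follow the values quoted later in the text or are labeled assumptions.

```python
import numpy as np

def objective(temp):
    """Equation (15): (H/8 + sigma_J + mean S), scaled by the over-range
    penalty of equation (14). temp holds per-pixel sorted channels."""
    eps = 1e-12
    inten = temp.mean(axis=-1)
    hist, _ = np.histogram(np.clip(inten, 0, 1), bins=256, range=(0, 1))
    p = hist / max(hist.sum(), 1)
    H = -np.sum(p[p > 0] * np.log2(p[p > 0]))                   # equation (12)
    sat = 1.0 - 3.0 * temp[..., 0] / (temp.sum(axis=-1) + eps)  # equation (2)
    over = np.count_nonzero((temp < 0) | (temp > 1)) / temp.size
    return (H / 8.0 + inten.std() + sat.mean()) * (1.0 - over)  # (14), (15)

def golden_section_max(f, lo=0.5, hi=3.0, tol=0.1):
    """Golden section search for the gain k* that maximizes f(k)."""
    r = 0.618
    a, b = lo + (1 - r) * (hi - lo), lo + r * (hi - lo)
    fa, fb = f(a), f(b)
    while hi - lo > tol:
        if fa < fb:                       # maximum lies in [a, hi]
            lo, a, fa = a, b, fb
            b = lo + r * (hi - lo)
            fb = f(b)
        else:                             # maximum lies in [lo, b]
            hi, b, fb = b, a, fa
            a = lo + (1 - r) * (hi - lo)
            fa = f(a)
    return 0.5 * (lo + hi)

# Assumed usage with the earlier sketches:
#   f = lambda k: objective(selective_adjust(sorted_img, k * grad_cp))
#   k_opt = golden_section_max(f)
```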

Final processing

So far, pixels have been manipulated in the sorted domain, which is not ready for direct display. A remapping to the RGB space is carried out by recalling the pixel min–max sorting indices and matching them to the original color indices. Finally, a color image of enhanced colorfulness, contrast, and saturation is obtained.
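A minimal sketch of this remapping, assuming the per-pixel sort indices recorded during the min–max sorting step, is given below; the final clip stands in for the over-range restoration, whose exact form is not detailed here.

```python
import numpy as np

def sorted_to_rgb(sorted_img, order):
    """Scatter the (min, med, max) planes back to their original
    R, G, B positions using the per-pixel sort indices."""
    rgb = np.empty_like(sorted_img)
    np.put_along_axis(rgb, order, sorted_img, axis=-1)
    return np.clip(rgb, 0.0, 1.0)   # assumed over-range restoration
```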


The result for an example image is shown in Figure 2, together with plots of the distributions corresponding to the input and the outputs of the color-aligned and enhanced stages. It can be observed that the quality of the output image, Figure 2(b), is enhanced over the input image in Figure 2(a). The distributions representing hue, saturation, and intensity, Figure 2(c) to (e), illustrate that the permissible ranges are more fully covered and an increased amount of information is carried from the scene to the viewer.

Experiments and results

The effectiveness of the proposed method, GGCSE, was verified using 300 images of size 360 × 480 captured in natural environments with various illumination and color characteristics. A consumer-grade digital camera was used and the images were stored in the 24-bit JPG color format. The tests were carried out on the Matlab 2015b platform running on a general purpose personal computer with an i5 CPU at 1.70 GHz and 4 GB of memory under the Windows 7 operating system. The proposed approach was also compared to several currently available methods in the context of color image enhancement. These include adaptive histogram equalization (ADPHEQ), smoothing-based histogram equalization (SMHEQ),8 the UMF, and saturation feedback–based enhancement (SFBEN).19 The ADPHEQ and UMF tests were adopted from Matlab built-in functions. The enhancements found in the processed images are assessed both qualitatively and quantitatively.

Qualitative analysis

Four sample images and the outputs from the proposed enhancement method are depicted in Figures 3 to 6. Figures 3(a) to 6(a) show the input images. These images were captured under imperfect conditions, including effects due to haze and backlighting. They appear with low contrast, low saturation, and low information content. Outputs from the ADPHEQ method are given in Figures 3(b) to 6(b). Since the underlying enhancement is provided by uniform histogram equalization, regions of over-enhancement appear where objects are either too dark or too bright and some details are lost. Figures 3(c) to 6(c) are results processed by the SMHEQ algorithm, based on the specification of a smoothed histogram. The over-enhancements are reduced, while a slight improvement in saturation can be noticed. Results from the UMF process are shown in Figures 3(d) to 6(d). It can be observed that object boundaries are sharpened but the saturation is not improved. This is because the design of the UMF addresses only contrast enhancement and not saturation improvement.

From the SFBEN approach, results are given in Figures 3(e) to 6(e). Although saturation is involved in the feedback for contrast enhancement, the saturation itself has not been boosted by design; hence, its increment is not noticeable. Results from the proposed GGCSE approach are shown in Figures 3(f) to 6(f). From the output images, it can be seen clearly that the contrast and, particularly, the saturation are both enhanced. From these results, it is evident that the proposed method is effective in providing simultaneous color image contrast and saturation enhancements.

Figure 2. Example image: (a) original image, (b) enhanced image, (c) hue distribution, (d) saturation distribution, and (e) intensity distribution. [Each distribution panel plots the input, color-equalized, and enhanced histograms over the normalized range 0 to 1.]


Quantitative analysis

In addition to the above qualitative comparison, the performance of the proposed method is also evaluated using four commonly used metrics. These metrics are chosen to assess the effectiveness of the proposed approach in enhancing image contrast, saturation, information content, and colorfulness.

Figure 3. Test image 1: (a) original, (b) ADPHEQ, (c) SMHEQ, (d) UMF, (e) SFBEN, and (f) GGCSE. ADPHEQ: adaptive histogram equalization; SMHEQ: smoothing-based histogram equalization; UMF: unsharp masking filter; SFBEN: saturation feedback–based enhancement; GGCSE: gradient-guided contrast and saturation enhancement.

Figure 4. Test image 2: (a) original, (b) ADPHEQ, (c) SMHEQ, (d) UMF, (e) SFBEN, and (f) GGCSE. ADPHEQ: adaptive histogram equalization; SMHEQ: smoothing-based histogram equalization; UMF: unsharp masking filter; SFBEN: saturation feedback–based enhancement; GGCSE: gradient-guided contrast and saturation enhancement.

The contrast of an image, as a performance indicator, takes into account the average intensities and their dispersions around a center pixel. This criterion is formulated on the basis that the human visual system perceives contrast from the differences between an object and its neighboring region. A higher contrast value indicates that objects captured in the image are more distinctive. This metric is given by20

$$T = \frac{1}{N}\sum_{uv} I_{uv}^2 - \left(\frac{1}{N}\sum_{uv} I_{uv}\right)^2 \quad (17)$$

Figure 5. Test image 3: (a) original, (b) ADPHEQ, (c) SMHEQ, (d) UMF, (e) SFBEN, and (f) GGCSE. ADPHEQ: adaptive histogram equalization; SMHEQ: smoothing-based histogram equalization; UMF: unsharp masking filter; SFBEN: saturation feedback–based enhancement; GGCSE: gradient-guided contrast and saturation enhancement.

Figure 6. Test image 4: (a) original, (b) ADPHEQ, (c) SMHEQ, (d) UMF, (e) SFBEN, and (f) GGCSE. ADPHEQ: adaptive histogram equalization; SMHEQ: smoothing-based histogram equalization; UMF: unsharp masking filter; SFBEN: saturation feedback–based enhancement; GGCSE: gradient-guided contrast and saturation enhancement.

In addition to the notion of contrast, human perception is also concerned with saturation, a measure of color vividness, as an attribute of image quality. This metric is adopted from the S-component of the HSI color space.9 The average saturation of all pixels is obtained from equation (13). Higher saturation denotes a more vivid image. Note that the conversion from RGB to HSI color space is only conducted off-line for performance evaluation purposes and is not required in the enhancement process.

One of the functions demanded from an image is to convey the scene information to the viewer. Therefore, a logical and popular measure is the information content, or entropy, given by equation (12). A higher entropy value represents the desirable higher information content carried in the image.

The colorfulness metric quantifies the information content conveyed as color to the viewer. This measure is defined as21

$$C = \sigma_{rgyb} + 0.3 \times \mu_{rgyb} \quad (18)$$

where

$$\sigma_{rgyb} = \sqrt{\sigma_{rg}^2 + \sigma_{yb}^2}, \qquad \mu_{rgyb} = \sqrt{\mu_{rg}^2 + \mu_{yb}^2} \quad (19)$$

It depends on the standard deviations $\sigma_{rg}$ and $\sigma_{yb}$ of the color differences of the enhanced color image, that is, $\Delta_{rg} = R - G$ and $\Delta_{yb} = 0.5(R + G) - B$. Furthermore, $\mu_{rg}$ and $\mu_{yb}$ are the averages of $\Delta_{rg}$ and $\Delta_{yb}$. This measure will be small when an outdoor object is captured under poor imaging conditions. A larger C value indicates an image with better color information.
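For completeness, the four metrics can be computed as sketched below; the RGB-ratio form of saturation and the 256-bin entropy estimate are assumptions consistent with equations (2), (12), (13), and (17) to (19).

```python
import numpy as np

def metrics(img):
    """Contrast (17), mean saturation (13), entropy (12), and
    colorfulness (18)-(19) of an RGB image scaled to [0, 1]."""
    eps = 1e-12
    R, G, B = img[..., 0], img[..., 1], img[..., 2]
    I = (R + G + B) / 3.0
    contrast = np.mean(I**2) - np.mean(I)**2                     # equation (17)
    sat = 1.0 - 3.0 * img.min(axis=-1) / (img.sum(axis=-1) + eps)
    hist, _ = np.histogram(I, bins=256, range=(0, 1))
    p = hist / max(hist.sum(), 1)
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))              # equation (12)
    d_rg, d_yb = R - G, 0.5 * (R + G) - B
    colorfulness = (np.hypot(d_rg.std(), d_yb.std())
                    + 0.3 * np.hypot(d_rg.mean(), d_yb.mean()))  # (18), (19)
    return contrast, sat.mean(), entropy, colorfulness
```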

The test results obtained using 300 images for contrast, saturation, entropy, and colorfulness are depicted as box plots in Figure 7. Figure 7(a) shows the plots for contrast. With this assessment metric, all methods produce a higher contrast than the input images, whose mean value is 0.065. The contrast statistics indicate that the GGCSE output, at 0.075, is the second highest, while the UMF output is 0.076. However, it is noted in the qualitative analysis that UMF is not able to provide enhancements in saturation. Figure 7(b) shows the statistical results of the compared methods with respect to the saturation metric. While the input images carry an average saturation of 0.190, histogram-based methods are not able to increase saturation due to their design focus. On the other hand, the UMF and SFBEN methods produce a slight improvement in saturation. The proposed method, GGCSE, provides the highest gain in saturation, up to 0.283 averaged over the test images.

Figure 7. Result statistics in box plots: (a) contrast, (b) saturation, (c) entropy, and (d) colorfulness.

The box plot of entropy is given in Figure 7(c). The input image entropy is 7.278 and the highest metric is obtained from the ADPHEQ method. Similar to the contrast measurements, the ADPHEQ method is vulnerable to the over-enhancement problem. On the other hand, the GGCSE method produces a high entropy of 7.456 without the over-enhancement problem. Statistics of colorfulness are plotted in Figure 7(d). Colorfulness is strongly related to saturation, while the former is more concerned with color harmony. Where the original image colorfulness mean value is 0.097, the GGCSE method produces a colorfulness of 0.141, which is the highest among the tested enhancement methods.

Hypothesis test analysis

Hypothesis tests on the obtained results are also conducted to examine the statistical significance of the improvement, using the two-sample t-test at the 0.05 significance level for the hypotheses:

H0: The distributions of the input image characteristic are equal to those of the enhanced images.
H1: The distributions of the input image characteristic are not equal to those of the enhanced images.

Test results are given as annotations above the plots in Figure 8: the first row shows the hypothesis decisions and the second row contains the corresponding p values; see also Table 1.

Table 1. Hypothesis test results.

Metric        Quantity    Input   ADPHEQ  SMHEQ   UMF     SFBEN   GGCSE
Contrast      Hypothesis  N/A     0       0       1       1       1
              p value     N/A     0.712   0.147   0.000   0.001   0.000
              Mean        0.065   0.066   0.068   0.076   0.073   0.075
Saturation    Hypothesis  N/A     0       0       0       0       1
              p value     N/A     0.924   0.944   0.193   0.124   0.000
              Mean        0.190   0.190   0.190   0.201   0.204   0.283
Entropy       Hypothesis  N/A     1       1       1       1       1
              p value     N/A     0.000   0.000   0.001   0.007   0.000
              Mean        7.278   7.612   7.108   7.379   7.356   7.456
Colorfulness  Hypothesis  N/A     1       1       0       0       1
              p value     N/A     0.000   0.043   0.945   0.813   0.000
              Mean        0.097   0.121   0.107   0.097   0.095   0.141

ADPHEQ: adaptive histogram equalization; SMHEQ: smoothing-based histogram equalization; UMF: unsharp masking filter; SFBEN: saturation feedback–based enhancement; GGCSE: gradient-guided contrast and saturation enhancement.

In the test on contrast, the null hypothesis was rejected for the UMF, SFBEN, and GGCSE methods, indicating that the resulting metric distribution is not equal to the distribution of the input images. The GGCSE method had a p value approaching zero, which supports the rejection. In the test for saturation, only the GGCSE method rejected the null hypothesis, with a p value tending to zero. The null hypothesis was rejected by all methods in the test for entropy; based on the p values, all methods had equivalent performance there. In the test for colorfulness, the ADPHEQ, SMHEQ, and GGCSE methods rejected the null hypothesis. The ADPHEQ and GGCSE methods also had near-zero p values; however, the GGCSE method was free from over-enhancements, as observed in the qualitative analysis. From this analysis, it can be seen that the proposed GGCSE method is the only method that rejected the null hypothesis in all four measures, meaning that it changed the distribution characteristics of the images in every measured attribute.
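A hedged sketch of such a test, applying SciPy's two-sample t-test to the 300 per-image metric values, is given below; the variable names and the default equal-variance setting are assumptions rather than details from the article.

```python
from scipy import stats

def compare(metric_input, metric_enhanced, alpha=0.05):
    """Two-sample t-test: reject H0 (equal metric distributions)
    when the p value falls below alpha."""
    t_stat, p_value = stats.ttest_ind(metric_input, metric_enhanced)
    return p_value < alpha, p_value

# Example: reject, p = compare(contrast_input, contrast_ggcse)
```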

Figure 8. Statistics of results in distributions and hypothesis tests: (a) contrast, (b) saturation, (c) entropy, and (d) colorfulness. [Each panel plots the metric distributions over the 300 images for the input and the five methods, annotated with the hypothesis decisions (H) and p values (P) listed in Table 1.]


Complexity

The complexity in terms of floating point operations, namely multiplications, divisions, exponentiations, and trigonometric operations, is considered for the approaches compared with the proposed GGCSE. The complexity of the RGB–HSI transformation, O(18N), is given in the “Color space conversion and induced complexities” section.

In the ADPHEQ method, the intensity image is divided into tiles. The tiles are enhanced using histogram equalization and then interpolated to prevent artifacts. For each pixel, one multiplication is required in the equalization process, so its complexity is O(N). If the image is divided into M × M tiles, the interpolation complexity is O(M²N). Together with the RGB–HSI conversion, the total complexity is O((18 + 1 + M²)N). For instance, if M = 8, the complexity becomes O(83N).

For SMHEQ, the smoothing is carried out over the intensity levels and is independent of the number of pixels. The equalization process requires O(N). The overall complexity is O((18 + 1)N) = O(19N).

In the UMF algorithm, the kernel adopted contains non-integer elements, hence the complexity to obtain the salience pixels is O(9N). Together with the color space transformations, the total complexity is O((18 + 9)N) = O(27N).

For the SFBEN method, the salience extraction kernel is an average filter, so its complexity is O(N). The enhancement in contrast and saturation requires two multiplications, thus the complexity is O(2N). The overall complexity is O((18 + 3)N) = O(21N).

The proposed GGCSE method contains a color channel stretching operation with complexity O(3N). To obtain the intensity image, an operation of O(N) is needed. With the purpose-specific gradient extraction kernel, the complexity is O(N). The exponential clipping process demands a complexity of O(N). With regard to the determination of the optimal gain factor, the complexity depends on how the objective function is evaluated and on the number of loops required. The former involves the calculation of the saturation, O(N), and the application of the gain, O(N). From the test of 300 images, the gain factor range is [0.5, 3]. With a tolerance of 0.1 adopted in the golden section search, the number of loops required is L = log(0.1/2.5)/log(0.618) ≈ 7. Hence, the search complexity is O((7 × 2)N). The overall complexity is O((7 × 2 + 6)N) = O(20N).

All algorithms have a linear complexity with respect to the number of pixels. The complexity of GGCSE is less than that of ADPHEQ, UMF, and SFBEN, but is slightly higher than that of SMHEQ. However, it should be noted that the compared algorithms are not designed with an optimization routine, and their performances are hence suboptimal.

Conclusion

A transformation-free approach has been proposed that achieves color image enhancement by improving contrast and color vividness simultaneously. The method manipulates pixel values directly in the source RGB color space. Unlike conventional transformation-based approaches, no conversion between color spaces is involved, which reduces the implementation complexity. Furthermore, simple magnitude stretching and feature gradient-guided magnitude adjustments on each color channel are found effective in providing enhanced images in terms of improved color harmony, saturation, and contrast. The complexity analysis has shown that the proposed GGCSE method has lower complexity than most of the compared algorithms and is comparable to the algorithm with the lowest complexity. Promising improvements in image quality, evaluated both qualitatively and quantitatively, were obtained on a large set of images captured in natural scenes.

Declaration of conflicting interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) received no financial support for the research, authorship, and/or publication of this article.

References

1. Yu YH, Ha Q and Kwok N. Chip-based design for real-time moving object detection using a digital camera module. In: 2nd international congress on image and signal processing, 2009, CISP'09, Tianjin, China, 17–19 October 2009, pp. 1–5. IEEE.
2. Su F and Fang G. Moving object tracking using an adaptive colour filter. In: 2012 12th international conference on control automation robotics and vision (ICARCV), Guangzhou, China, 5–7 December 2012, pp. 1048–1052. IEEE.
3. Xu Y, Fang G, Lv N, et al. Computer vision technology for seam tracking in robotic GTAW and GMAW. Robot Comput Integr Manuf 2015; 32: 25–36.
4. Ma H, Lu N, Ge L, et al. Automatic road damage detection using high-resolution satellite images and road maps. In: Geoscience and remote sensing symposium (IGARSS), 2013 IEEE international, Melbourne, VIC, Australia, 21–26 July 2013, pp. 3718–3721. IEEE.
5. Wang K, Li X and Ge L. Locating tropical cyclones with integrated SAR and optical satellite imagery. In: Geoscience and remote sensing symposium (IGARSS), 2013 IEEE international, Melbourne, VIC, Australia, 21–26 July 2013, pp. 1626–1629. IEEE.
6. Xu M, Jia X and Pickering M. Automatic cloud removal for Landsat 8 OLI images using cirrus band. In: Geoscience and remote sensing symposium (IGARSS), 2014 IEEE international, Quebec City, QC, Canada, 13–18 July 2014, pp. 2511–2514. IEEE.
7. Tsai CM and Yeh ZM. Contrast enhancement by automatic and parameter-free piecewise linear transformation for color images. IEEE Trans Consum Elect 2008; 54(2): 213–219.
8. Kwok N, Jia X, Wang D, et al. Visual impact enhancement via image histogram smoothing and continuous intensity relocation. Comput Elect Eng 2011; 37(5): 681–694.
9. Gonzalez RC and Woods RE. Digital image processing. 3rd ed. Upper Saddle River: Prentice-Hall Inc, 2006.
10. Ghimire D and Lee J. Color image enhancement in HSV space using nonlinear transfer function and neighborhood dependent approach with preserving details. In: 2010 fourth Pacific-rim symposium on image and video technology (PSIVT), Singapore, 14–17 November 2010, pp. 422–426.
11. Russ JC. The image processing handbook. Florida: CRC Press, 2015.
12. Song G and Qiao XL. Color image enhancement based on luminance and saturation components. In: Congress on image and signal processing, 2008, CISP'08, Sanya, Hainan, China, 27–30 May 2008, vol. 3, pp. 307–310. IEEE.
13. Hanumantharaju M, Ravishankar M, Rameshbabu D, et al. Adaptive color image enhancement based geometric mean filter. In: Proceedings of the 2011 international conference on communication, computing and security, Rourkela, Odisha, India, 12–14 February 2011, pp. 403–408. ACM.
14. Kwok N, Shi H, Ha QP, et al. Simultaneous image color correction and enhancement using particle swarm optimization. Eng Appl Artif Intell 2013; 26(10): 2356–2371.
15. Kwok N, Wang D, Jia X, et al. Gray world based color correction and intensity preservation for image enhancement. In: 2011 4th international congress on image and signal processing (CISP), Shanghai, China, 15–17 October 2011, vol. 2, pp. 994–998. IEEE.
16. Kim YH and Cho YJ. Feature and noise adaptive unsharp masking based on statistical hypotheses test. IEEE Trans Consum Elect 2008; 54(2): 823–830.
17. Chong EK and Zak SH. An introduction to optimization, vol. 76. Hoboken: John Wiley & Sons, 2013.
18. Yao Y, Abidi B, Doggaz N, et al. Evaluation of sharpness measures and search algorithms for the auto focusing of high-magnification images. In: International society for optics and photonics defense and security symposium, Orlando (Kissimmee), FL, USA, 17 April 2006, pp. 62460G–62460G.
19. Thomas BA, Strickland RN and Rodriguez JJ. Color image enhancement using spatially adaptive saturation feedback. In: IEEE international conference on image processing, Santa Barbara, CA, USA, 26–29 October 1997, p. 30. IEEE.
20. Tang JR and Isa NAM. Adaptive image enhancement based on bi-histogram equalization with a clipping limit. Comput Elect Eng 2014; 40(8): 86–103.
21. Hasler D and Suesstrunk SE. Measuring colorfulness in natural images. In: Electronic imaging 2003. International society for optics and photonics, Santa Clara, CA, USA, 20 January 2003, pp. 87–95.