On Extended Depth of Field to Improve the Quality of Automated Thermographic Measurements in Unknown Environments

by Samuel Soldan*

*University of Kassel, Faculty of Mechanical Engineering, Measurement and Control Department, Mönchebergstraße 7, 34125 Kassel, Germany, [email protected]

Abstract

Focusing is essential for the quality of thermal imaging. But due to physical constraints, only a small distance range around the focal distance, called the depth of field (DOF), appears acceptably sharp in a single thermogram. For scenes containing multiple objects at different distances from the camera, or a single object extended along the optical axis, it is hard to have all parts of the image sharp within one measurement. This is impossible if the distance between the closest and the farthest region is larger than the depth of field. This work describes a solution to obtain an all-in-focus measurement by taking a measurement series with changed focal settings and combining the sharp parts using digital image processing. Different possibilities for this process are discussed and examples are given.

1. Introduction

Motivation

Thermography has become a widely used tool for industrial inspections. Application areas include electrical (e.g. high resistance, short or open circuits) and mechanical inspection (e.g. friction, wear, valve/pipe blockage) as well as building and infrastructure inspection (e.g. insulation, moisture, leaks) [1]. Inspection work is usually carried out by a human operator on-site. Especially in hazardous (e.g. radiation, explosive or toxic exposure) or “unpleasant” environments (e.g. high ambient temperature), on dangerous objects (e.g. in-service work on high-voltage power lines or high-temperature objects), or in tight or remote places (e.g. underground pipelines, high-up equipment such as pipe bridges or wind turbine towers/blades), it can be desirable to replace the human operator with an unmanned mobile industrial inspection robot [2]. Using thermal imaging, this robot could autonomously inspect an area and report issues to a control station. The automated operation of a thermographic camera, however, poses challenges not only in the autonomous processing of the acquired data (e.g. object detection/classification, fault detection) but also in the correct operation of the camera. Here, correct focusing is a fundamental challenge, as an out-of-focus measurement cannot be corrected by software.

Problem statement

Many thermographic cameras nowadays are equipped with autofocus systems that determine the correct focusing for a given region of interest (ROI) in the field of view (FOV) of the camera. These systems, however, do not work well if the object of interest (OOI) is not in the ROI, if the OOI does not provide enough contrast (i.e. temperature/surface variations in thermal imaging), or if different OOI at various distances to the camera have to be considered. Because only a certain range (the DOF) appears sharp for one focus setting, no focus setting will provide a satisfactory result for the latter problem if the objects are farther apart than the DOF, or if outstretched objects with dimensions larger than the DOF along the optical axis have to be considered (see Fig. 1). This article addresses the question of how these scenarios can be captured in a way that the resulting measurement appears sharp in all regions of the scene.

Fig. 1. Left: The object (b) within the depth of field appears sharp, while the objects in front of (a) or beyond (c) the depth of field appear blurry. Right: Only a small part of the tilted ruler (d) is sharp in the thermogram (the temperature variation is due to the different emissivity of the bare metal ruler and the printed numbers).


11th International Conference on Quantitative InfraRed Thermography, 11-14 June 2012, Naples, Italy


State of the art

Systems to automatically determine and adjust the correct focus for cameras have been around since the 1960s. Different sensors/techniques have been proposed, and very precise and quick solutions are common in modern single-lens reflex (SLR) visual cameras. Infrared optical systems are more challenging, but simple autofocus (AF) systems can be found in many modern thermal cameras. Using a search algorithm (e.g. the hill-climbing method [3]) and focus measure functions (FMF), the correct setting can be found and adjusted automatically by the camera's processing unit [4]. An FMF analyses an image and computes a sharpness value [5], [6].

The combination/fusion of image series that vary in illumination, polarization, spectral sensitivity or similar can result in high-quality images that are not achievable with a single image [7]. Advances in image fusion have been made in many fields. In thermography, two image fusion techniques are often encountered: fusing normal images (or depth images) with thermograms (e.g. picture-in-picture) [8] and time-series fusion in non-destructive examination/testing (NDE/NDT) [9].

The combination of a focus series can be used to estimate the distance of certain objects (depth from focus/defocus [10], [11]) or to improve the overall sharpness and obtain an all-in-focus image [12], [13]. While the technique of extending the DOF is fairly common in photography and microscopy, there is no known application in thermal imaging. Also, most articles on extended DOF for visual images only describe the use of a small number of images to fuse (e.g. n ≤ 5) and do not present an automated image acquisition process.

2. Background

Quantitative effects of incorrect focus in thermal imaging

Like any optical system, a thermographic camera has to be focused before measurements are taken. In thermography, if the camera is focused incorrectly, not only do the measured objects become blurred, but the measured temperatures may also change drastically. Fig. 2 shows a simple example of a soldering iron placed in front of a thermographic camera. The images vary in the focus setting of the camera; Fig. 2c is correctly focused. The tip of the soldering iron has the same temperature in all measurements (T_tip ≈ 140 °C), but the measured maximum temperature differs significantly: T_max,a = 77 °C in Fig. 2a, T_max,b = 118 °C in Fig. 2b and T_max,c = 141 °C in Fig. 2c. This divergence is caused by the defocus and shows why the correct focus setting is not only important but essential in quantitative thermography.

Fig. 2. Thermograms of a soldering iron with different focal settings. The left corner of the red labels shows the location of the maximum temperature. The color scale on the right-hand side ranges from 25 °C to 140 °C. (a): wrong focus (max. temperature T_max,a = 76.78 °C), (b): better focus (T_max,b = 118.31 °C), (c): correct focus (T_max,c = 141.43 °C).

Focus and depth of field

The distance between the principal plane of a thin convex lens and the point where collimated electromagnetic radiation (e.g. thermal radiation from an object at a large, near-infinite distance to the camera lens) converges is called the focal length f, while the point where the radiation converges is called the focal point (or focus), which lies on the focal plane (see Fig. 3a). For an object at a given distance d_u to the lens, the sharp image is located at the image plane (plane of focus) at a distance d_v to the lens, as shown in Fig. 3b. The two distances are related by the ‘thin lens equation’ [14]

\[ \frac{1}{f} = \frac{1}{d_u} + \frac{1}{d_v}. \tag{1} \]
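The thin-lens relation can be checked numerically. The following sketch (illustrative values only, not from the paper) solves eq. (1) for the image distance d_v:

```python
# Solve the thin-lens equation (1), 1/f = 1/d_u + 1/d_v, for d_v.
def image_distance(f, d_u):
    """Image-plane distance d_v for focal length f and object distance d_u (meters)."""
    return 1.0 / (1.0 / f - 1.0 / d_u)

f = 0.030   # 30 mm focal length, as in the camera used later in the paper
d_u = 1.0   # object 1 m away
print(f"d_v = {image_distance(f, d_u) * 1e3:.3f} mm")  # slightly larger than f
```

For distant objects (d_u → ∞) the image distance approaches the focal length, consistent with the definition above.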

As the image plane can only be adjusted to one distance d_v at a time, only objects at the distance d_u appear sharp; all other objects are out of focus. A point-source object then produces a so-called circle of confusion on the plane of focus (Fig. 3c and Fig. 4). Due to the limited spatial resolution of the human eye or of a matrix detector, the circle of confusion can have a certain diameter and still be perceived as sharp. The circle of confusion diameter limit c is defined by the desired acceptable sharpness. As one pixel of a common matrix detector has a specific size (called detector pitch or pixel size), the circle of confusion can have the same or a smaller size without a perceptible change of sharpness. A detector with a size of h_x = 16 mm by h_y = 12 mm and a resolution of 640 × 480 pixels, for example, has an acceptable circle of confusion diameter c = 16 mm/640 = 12 mm/480 = 25 μm (the pixels are typically square). For a given object distance d_u, the near distance d_N that is still acceptably sharp and the far distance d_F that is still perceived as sharp can be calculated with the following simplified formulas [14]

\[ d_N(d_u) = \frac{d_u \cdot f^2}{f^2 + N \cdot c \cdot d_u} \quad \text{and} \tag{2} \]

\[ d_F(d_u) = \frac{d_u \cdot f^2}{f^2 - N \cdot c \cdot d_u} \tag{3} \]

with N being the f-number of the lens (the ratio between the focal length and the effective aperture diameter of the lens) [14]. The absolute distance between the nearest and the farthest objects that appear sharp in a measurement is called the depth of field (DOF):

\[ d_{DOF}(d_u) = d_F(d_u) - d_N(d_u). \tag{4} \]

For most thermal cameras the DOF is usually quite small due to the low f-number: for a standard camera with f = 30 mm, N = 1.0 and c = 25 μm, the depth of field for certain distances d_u is d_DOF(d_u = 1 m) = 0.0556 m, d_DOF(d_u = 2 m) = 0.223 m and d_DOF(d_u = 5 m) = 1.42 m. Beyond a certain distance d_H all objects are acceptably sharp (d_F(d_u ≥ d_H) → ∞, d_DOF(d_u ≥ d_H) → ∞). This is the hyperfocal distance

\[ d_H = \frac{f^2}{N \cdot c}. \tag{5} \]

For the abovementioned example, the hyperfocal distance is d_H = 36 m.
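The worked numbers above can be reproduced with a short script implementing eqs. (2)-(5) (a sketch; all lengths in meters):

```python
# Depth-of-field helpers implementing eqs. (2)-(5).
def d_near(d_u, f, N, c):
    return d_u * f ** 2 / (f ** 2 + N * c * d_u)        # eq. (2)

def d_far(d_u, f, N, c):
    return d_u * f ** 2 / (f ** 2 - N * c * d_u)        # eq. (3)

def dof(d_u, f, N, c):
    return d_far(d_u, f, N, c) - d_near(d_u, f, N, c)   # eq. (4)

def hyperfocal(f, N, c):
    return f ** 2 / (N * c)                             # eq. (5)

f, N, c = 0.030, 1.0, 25e-6   # 30 mm lens, f-number 1.0, 25 um circle of confusion
for d_u in (1.0, 2.0, 5.0):
    print(f"d_DOF({d_u:.0f} m) = {dof(d_u, f, N, c):.4f} m")
print(f"d_H = {hyperfocal(f, N, c):.0f} m")
# -> 0.0556 m, 0.2229 m, 1.4162 m and d_H = 36 m, matching the text.
```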

Fig. 3. Geometry of a thin lens with different parameters: focal length f (a); distance, focal point and plane of focus (POF) of objects (b); and an example of objects causing a blur on the POF (dotted and dashed lines in c).

Fig. 4. Geometry of object and lens. The squares show the image of the object and display the circle of confusion for objects closer (a) and farther (c) than the focused distance d_u as well as for the correct focus setting (b).

From equations (2) to (4) one can see that the DOF is larger if
• the f-number N is larger,
• the focal length f of the lens is smaller,
• the pixel size (acceptable circle of confusion diameter) c is larger and
• the object is farther away (d_u is larger).

Optics for thermography cameras typically have low f-numbers between N = 1 and N = 2, since the irradiation intensity is proportional to the square of the inverse f-number and larger f-numbers would therefore increase the noise level of the detector [15]. [14] also states that large-aperture lenses are necessary to reduce diffraction effects, which are some 20 times more significant in the infrared than in the visible spectrum. Stopping down the aperture to change the f-number is not intended in thermography, since it would decrease the incoming radiation, reduce sensitivity and require a recalibration of the camera.

The focal length and the object distance are related, for a measurement object width w_u in x-direction, by

\[ f = \frac{d_u \cdot h_x}{w_u + h_x} \quad \text{(compare with eq. (1))}. \tag{6} \]


Therefore, decreasing the focal length would require the camera to be closer to the object (decreasing d_u) to keep the same resolution of the object (keeping w_u constant), which would in turn reduce the DOF. Advances in sensor array technology will result in higher resolutions with even smaller pixel sizes (e.g. 25 μm for 640 × 480 pixels and 17 μm for 1024 × 768 pixels), which are also not favorable for a large DOF [16]. If increasing the distance to an object, and thereby lowering the object's resolution, is not an option, it is difficult to get a quantitative measurement for all areas of interest, particularly if objects with dimensions larger than the DOF, or multiple objects at distances farther apart than the DOF, have to be considered. A solution to this problem is to take different measurements varying in focus distance d_u (such that each part of the image is sharp in at least one sample), to select the sharp parts within the measurements and to stitch them together, thereby obtaining an extended depth of field, as described in the next section.

Extended depth of field

Measurements with different focus distances can be combined to artificially extend the depth of field. In digital image processing for images in the visual spectrum this technique is sometimes referred to as ‘focus stacking’ and is often used for macro photography or microscopy [17]. Among the first to propose this method were Pieper and Korpel in 1983 [12]. The general process for extended DOF is schematically depicted in Fig. 5 and illustrated in Fig. 6 for a simple example of four soldering irons placed at different distances to a thermographic camera. In the example, the soldering iron on the left side of the images is farthest from the camera, and the distance decreases with each object toward the right side. It can be seen that as the focal point changes the measured temperature changes as well, and no single image provides correct information for all soldering irons. If the distances to all objects are known, individual measurements can be taken for each focus distance d_u,k. If only the distances to the closest (d_min) and the farthest object (d_max) are known (or fixed values are assumed if nothing is known about the environment), the necessary focus points in between can be calculated and the measurements taken. The image series can be filtered in the next step to reduce noise or enhance certain features. Afterwards a focus measure function, e.g. an edge-detection filter, grayscale variance, max/min or the entropy of the color histogram, is applied to all images to determine sharp areas [12], [18]. This information is used to build a decision map D_M (Fig. 6g), and after consistency checks this map is used to put together the final image. The resulting image in the example (Fig. 6h) is better than any source image by itself.

Fig. 5. Possible process steps to extend the depth of field. The raw data is used both for determining the correct combination and to form the final image. The two processes require different preprocessing, hence the two paths of the data.

Fig. 6. Four soldering irons have been placed at different distances to the thermographic camera. In the top row the focal point is set to each iron individually. In the bottom row a picture of the setup (e), a bird's-eye view (f; 1: soldering irons, 2: thermographic camera), the decision map (g) and the combined image (h) are shown.

[Fig. 5 block diagram: measurement data acquisition → data series I_k(x, y) → preprocessing → data series I_k^pre → focus measure function (FMF) → focus info F_k → max/min identification → decision map D_M → consistency check → revised decision map D_M → image composition → extended DOF image → post-processing → final image; a second preprocessing path produces the data series I_k^pre2 used for the composition.]

[Fig. 6 panel labels: (a)-(d) top row, (e)-(h) bottom row; object distances d_u = 620 mm, 514 mm, 407 mm and 300 mm.]


3. Processing steps

In this chapter, different options for the steps of the process to generate an extended depth of field (Fig. 5) are discussed. For the measurements, the intensity/temperature at a pixel coordinate (x, y) is I(x, y) with

x = 1 to M, where M is the horizontal array size in pixels, and

y = 1 to N, where N is the vertical array size in pixels.

The number of images in one series is K, with k = 1 to K and I_k(x, y) representing the intensity of the k-th image at the pixel coordinates (x, y), as shown in Fig. 7. For some algorithms the image is subdivided into M_B × N_B image blocks I_B with their own local pixel coordinate systems x = 1 to M_B and y = 1 to N_B (Fig. 7).

Fig. 7. Relation between image coordinates (x, y) and slice number k (a and b). The image can be subdivided into image blocks I_B (c).

Acquiring measurements

The objective is to take measurements in such a way that each part of a scene is sharp in at least one measurement. If the distances to the objects are known (e.g. if data from an additional depth sensor is available), then these values can be used. If this is not the case, measurements have to be taken for all necessary distances d_u,k between a minimum (d_min) and a maximum (d_max), or alternatively the hyperfocal distance d_H. Rearranging equation (2) yields

\[ d_u(d_N) = \frac{f^2 \cdot d_N}{f^2 - N \cdot c \cdot d_N}. \tag{7} \]

Starting at d_N,k=1 = d_min, the focus distance d_u,k(d_N,k) and the next distance limit d_N,k+1 = d_F,k can be calculated. Iterating, all necessary steps can be calculated this way. If overlapping of the DOF areas is desired,

\[ d_{N,k+1} = d_{F,k} - o \cdot (d_{F,k} - d_{N,k}) \tag{8} \]

with an overlap factor 0 ≤ o < 1 can be used, where 0 means no overlap and 0.5 means 50% overlap of the DOF.
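The iteration just described can be sketched in a few lines: starting from d_min, each new near limit is the previous far limit (reduced by the overlap term of eq. (8)) until d_max is covered. The function below is an illustrative implementation, not the author's code:

```python
def focus_steps(d_min, d_max, f, N, c, o=0.0):
    """Focus distances d_u,k covering [d_min, d_max] via eqs. (7), (3) and (8)."""
    steps = []
    d_n = d_min
    while True:
        d_u = f ** 2 * d_n / (f ** 2 - N * c * d_n)   # eq. (7): focus for near limit
        d_f = d_u * f ** 2 / (f ** 2 - N * c * d_u)   # eq. (3): resulting far limit
        steps.append(d_u)
        if d_f >= d_max:
            break
        d_n = d_f - o * (d_f - d_n)                   # eq. (8): next near limit
    return steps

# The case study's example: 1 m to 10 m, no overlap, f = 30 mm, N = 1, c = 25 um.
K = len(focus_steps(1.0, 10.0, f=0.030, N=1.0, c=25e-6))
print(K)  # 17 steps, as reported in section 4
```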

Preprocessing

The preprocessing step is optional and can be used for various purposes such as:
• noise reduction,
• trend estimation and detrending,
• enhancement of certain features or contrast,
• scaling/spreading/normalization,
• segmentation,
• discretization or
• reduction of resolution or bit depth.

For extended depth of field, noise reduction, scaling/discretization (e.g. to 8-bit values so that algorithms from ‘normal’ image processing can be used more easily) and segmentation (to compensate lens/focus breathing) are of particular interest. For noise reduction, common noise filters or averaging several measurements with the same focus setting can be considered. If required by the fusion software, the data can be exported as images (e.g. *.jpg or *.png).
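A minimal preprocessing sketch (an assumed pipeline, not the author's exact one) combining frame averaging for noise reduction with rescaling to an 8-bit range:

```python
import numpy as np

def preprocess(repeats):
    """Average repeated frames of one focus setting, then spread to 0..255.

    repeats: array of shape (n_frames, height, width) with temperature data.
    """
    averaged = repeats.mean(axis=0)                 # noise reduction by averaging
    lo, hi = averaged.min(), averaged.max()
    scaled = (averaged - lo) / (hi - lo) * 255.0    # normalize to the 8-bit range
    return scaled.astype(np.uint8)

rng = np.random.default_rng(0)
frames = 20.0 + rng.normal(0.0, 0.5, size=(8, 4, 6))   # synthetic noisy frames
img = preprocess(frames)
print(img.dtype, img.min(), img.max())                 # uint8 0 255
```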

Focus Detection

In this step the sharpness or clarity of each pixel or image block is determined. Different options have been proposed, which can be classified into filters that work in image space or in frequency space. The focus measures can be calculated pixel-wise or for an image block. Most focus measure functions (FMF) work similarly to edge detection in image processing but provide a graded sharpness value instead of a Boolean one.

Because at focus the intensity/temperature is at a maximum/minimum (see Fig. 2 and Fig. 6), Pieper and Korpel [12] proposed to search for the maximum/minimum value through all measurements for each pixel coordinate. Based on the distance to the intensity average, it is decided whether to take the maximum or the minimum value. Simple statistical FMF, e.g. the histogram entropy or the variance of a region, can also be used. In [13] the Tenengrad focus measure, which takes the intensity of the neighboring pixels into consideration, is applied to every pixel of each measurement:

\[ F_{T,k}(x,y) = \sqrt{G_{x,k}^2 + G_{y,k}^2} \tag{9} \]

where G_k = I_k ∗ S is the result of the convolution (written as an asterisk), often with the Sobel masks S_3,x and S_3,y for the x and y direction:

\[ S_{3,x} = \begin{pmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{pmatrix} \quad \text{and} \quad S_{3,y} = \begin{pmatrix} 1 & 2 & 1 \\ 0 & 0 & 0 \\ -1 & -2 & -1 \end{pmatrix}. \]

Huan and Jing [18] provide a comparison of different FMF to be used in focus series combination. The paper concludes that the Sum-modified-Laplacian (SML) and the Energy of Laplacian (EOL) provide better results than Tenengrad for their experimental setup and their tested parameters. The EOL is computed by

convolving the measurements with the symmetric kernel

\[ S_{EOL} = \begin{pmatrix} 1 & 4 & 1 \\ 4 & -20 & 4 \\ 1 & 4 & 1 \end{pmatrix}. \]

Sometimes the opposite signs in the kernel tend to cancel each other out; the SML compensates for this by using absolute values for the measure in the x and y direction. The SML is computed as a sum over a block/window of the image, not pixel-wise. The spatial frequency (SF) [19], [20] is likewise calculated for a block I_B with the dimensions M_B and N_B:

\[ F_{SF,B,k} = \sqrt{(RF_{B,k})^2 + (CF_{B,k})^2} \tag{10} \]

with row and column frequencies RF and CF:

\[ RF_B = \sqrt{\frac{1}{M_B \cdot N_B} \sum_{x=1}^{M_B} \sum_{y=2}^{N_B} \left[ I_B(x,y) - I_B(x,y-1) \right]^2}, \tag{11} \]

\[ CF_B = \sqrt{\frac{1}{M_B \cdot N_B} \sum_{x=2}^{M_B} \sum_{y=1}^{N_B} \left[ I_B(x,y) - I_B(x-1,y) \right]^2}. \tag{12} \]

Antunes et al. [21] describe an algorithm with an adaptive block size. Other recent publications have proposed multiscale transforms (e.g. wavelets) to analyze the sharpness of images [17], [22]. These methods seem to work well but are complex and introduce a large number of design parameters that have to be fine-tuned to achieve good performance.

Generation and revision of the decision map

Given the extracted features, a map D_M(x, y) can be created that denotes the composition of the final image. The entries refer to the number k of the image used for each pixel. Depending on the selected focus measure, either the minimum or the maximum over the feature-filtered image stack F is used to determine this number for each pixel. If the order of the images is descending or ascending with the focus distance, then the decision map also gives an idea of the depth of the objects (this is basically how depth from focus works). To account for misallocated pixels in the decision map, mathematical morphology or a majority filter can be used, which replaces presumably erroneous pixels based on the majority of their contiguous neighbors [20].
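The decision-map generation and the majority-filter cleanup can be sketched as follows (illustrative; a real implementation would also handle image borders and use an optimized mode filter):

```python
import numpy as np

def decision_map(fmf_stack):
    """Pick, per pixel, the slice k with the largest focus-measure response.

    fmf_stack: focus-measure values of shape (K, height, width).
    """
    return np.argmax(fmf_stack, axis=0)

def majority_filter(dm):
    """Replace each inner pixel by the most frequent value in its 3x3 window."""
    out = dm.copy()
    h, w = dm.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            values, counts = np.unique(dm[y - 1:y + 2, x - 1:x + 2],
                                       return_counts=True)
            out[y, x] = values[np.argmax(counts)]
    return out

fmf = np.zeros((3, 5, 5))
fmf[1] = 1.0              # slice 1 is sharpest everywhere ...
fmf[2, 2, 2] = 5.0        # ... except for one spurious vote for slice 2
cleaned = majority_filter(decision_map(fmf))
print(np.all(cleaned == 1))   # True: the outlier pixel was corrected
```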

Image composition

The composite image is simply constructed by selecting the intensity for each pixel from the image stated in D_M:

\[ I_{EDOF}(x,y) = \sum_{k=1}^{K} I_k^{pre2}(x,y) \cdot w_k(x,y) \tag{13} \]

where

\[ w_k(x,y) = \begin{cases} 1 & \text{for } k = D_M(x,y), \\ 0 & \text{for } k \neq D_M(x,y). \end{cases} \tag{14} \]

For the composition, the original data I is used and specifically processed (I^pre2), because the already preprocessed data (I^pre) was filtered for a different purpose (i.e. to improve the result of the FMF).
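The selection of eqs. (13) and (14) reduces to a per-pixel gather; a compact sketch using NumPy:

```python
import numpy as np

def compose(stack, dm):
    """Compose the extended-DOF image per eqs. (13)/(14).

    stack: measurements of shape (K, height, width); dm: decision map D_M.
    """
    return np.take_along_axis(stack, dm[np.newaxis, ...], axis=0)[0]

stack = np.stack([np.full((2, 2), 10.0),    # slice 0
                  np.full((2, 2), 20.0)])   # slice 1
dm = np.array([[0, 1],
               [1, 0]])
print(compose(stack, dm))   # [[10. 20.] [20. 10.]]
```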

On the composed image optional post-processing can be applied if necessary e.g. to smoothen transitions.

4. Case Study

This case study examines the feasibility and open problems of an automated, unassisted process to extend the DOF of a thermographic camera. The camera used for this article is a state-of-the-art Infratec VarioCAM® hr head with a spectral range of λ_min = 7.5 μm to λ_max = 14 μm, a resolution of 640 × 480 pixels (detector size 16 mm × 12 mm) at 50 Hz and f = 30 mm optics with N = 1.0 and FOV = 30° × 23°. The camera has a motorized focus and can be fully computer controlled via either a Gigabit Ethernet or a FireWire (IEEE 1394) connection.


The distance between the lens and the image plane, d_v, can be controlled by a function that accepts a discrete natural number d_v,MOT ∈ {0, 1, …, 1174, 1175} with an affine relationship to d_v:

\[ d_v = m \cdot d_{v,MOT} + b. \tag{15} \]

The parameters m and b have been determined empirically, as no information was provided by the manufacturer. Given a minimum and a maximum object distance (d_min and d_max), the necessary focus settings d_v,MOT,k were

calculated using equations (1), (8) and (15). A simple routine then adjusts the focus and acquires one measurement for each step k. For d_min = 1 m and d_max = 10 m with no overlap (o = 0), K = 17 steps are necessary. Due to the decreasing DOF for objects closer to the camera, K = 20 steps are necessary to capture a scene from d_min = 0.1 m to d_max = 1 m. Adjusting the focus and acquiring the data takes the camera less than a second per step. The acquisition will therefore only produce meaningful data if the scene is quasi-static. A continuous movement of the image plane with parallel data acquisition would be desirable, but unfortunately the camera does not allow setting the focus motor speed (parallel focusing and capturing is possible, though).
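The mapping from a desired focus distance to a motor value can be sketched by combining eqs. (1) and (15). The calibration constants m and b below are invented placeholders; the real values are unpublished and were determined empirically by the author:

```python
def motor_value(d_u, f=0.030, m=1.0e-6, b=0.030):
    """Discrete motor setting d_v,MOT for object distance d_u.

    m and b are placeholder calibration constants for eq. (15),
    NOT the camera's real (unpublished) values.
    """
    d_v = 1.0 / (1.0 / f - 1.0 / d_u)   # thin-lens eq. (1)
    return round((d_v - b) / m)         # inverse of eq. (15)

print(motor_value(1.0))   # -> 928 with these placeholder constants
```

A real driver would additionally clamp the result to the valid range {0, …, 1175}.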

To investigate the behavior of the different image-processing steps, an artificial scene with a variable number of soldering irons (chosen for their pointy hot spots, which make them ideal for focusing) at variable distances from the camera was considered (see Fig. 6). The tips were painted black to provide a high emissivity and suppress reflections. For preprocessing, each measurement was (based on the adjusted focus) automatically cropped and rescaled to compensate for focus breathing (the change of the angle of view due to focusing).

In Fig. 8 one soldering iron was placed at a distance of d = 0.7 m from the camera, and measurements with the focus adjusted from d_min = 0.5 m to d_max = 1 m were taken. For two points, (x, y) = (80, 90) and (x, y) = (296, 209), the temperature and different FMFs are shown in Figs. 8b and 8c. The algorithms detect sharp areas equally well (Fig. 8c) and show good performance in finding the sharpest image at d = 0.7 m in the series. But in areas with few features and insufficient temperature variation, like the wall approximately 6 cm behind the soldering iron, the algorithms fail and do not indicate a maximum at d = 0.76 m (Fig. 8b). The wrongly selected frames are marked with a big red dot in Fig. 8b and lie at d = 0.85 m when looking at the maximum temperature, d = 0.53 m using Tenengrad (eq. (9)) and d = 0.58 m with the SF method (eqs. (10)-(12)). Because thermal images have more low-contrast areas than normal photographs, the correct frame for the wall behind the soldering iron is not detected correctly. This is often unproblematic, as with little temperature variation even wrongly focused measurements produce good results. However, if the scene does not provide enough variation, a wrongly focused object can spread and cause variations in the data that are wrongly identified as the slice that is in focus. To illustrate this problem, only a region around the tip of the soldering iron is shown in Fig. 9. The fused image (Tenengrad method) exhibits a sharp representation of the soldering iron but fails to identify the background correctly and has a big glow around the tip. Looking at the decision map and a point in the vicinity of the tip, it is obvious that the algorithm has falsely interpreted the intensity variation due to the increased blur of the hot object as a sign of image sharpness.

Fig. 8. (a): Single thermogram of a focused soldering iron at d = 0.7 m (light colors = cold, dark colors = hot); 6 cm behind the iron is a wall; text labels indicate the position of different points. The temperature (the blue line is the mean) and different focus measures are shown over the focus distance for a random point in the background (b) and for the tip of the soldering iron (c).

In a parameter study, a threshold value for the sharpness measure as well as different block and filter sizes were examined. Using a threshold, it is possible to suppress the glow around sharp regions in the composed image. Using blocks and mosaicking the final image has the same effect and provides a more intuitive result and faster processing. However, values for the threshold and the block/filter sizes that are suitable for all scenes could not be determined. In addition, using blocks causes problems if the steps between image slices are not axis-parallel.

A similar problem arises with the occlusion of objects. Fig. 10 shows different buildings in the background (d > d_H) and one soldering iron in the foreground (d = 0.3 m). The blur of the foreground object has spread and covers quite a big area. Even if all slices are identified correctly, the blur from the unfocused object in the foreground affects objects in the background. This scenario, however, is not very common.

[Fig. 8 plot residue: marked points (80, 90), (296, 209) and (300, 209); axes: temperature in °C, Tenengrad and spatial frequency over distance in m.]

Fig. 9. Left: Region around the tip of the soldering iron with different focus settings (a) as well as the fused image (b) and an unrevised decision map (c), where light colors indicate measurements focused close to d_max = 1 m and dark colors indicate a focus close to d_min = 0.5 m. Right: Temperature profile and focus-measure results for a point next to the tip of the soldering iron (indicated by an arrow in the left images).

Fig. 10. Thermogram of a soldering iron in front of buildings. (a): Focus close, (b): focus middle, (c): focus far. The arrows are at the same position in all images; they indicate the actual size of the tip and illustrate how large the disturbed area in the background is. Note the different temperature scaling ((a): 5 - 100 °C, (b): 5 - 30 °C, (c): 4 - 25 °C).

Several more common scenarios have been investigated; Fig. 11 shows two slices of an inspection scenario with different pipes together with the fused final image. Here the potential of the approach becomes visible, as all parts of the fused image are in focus. The improvement cannot be quantified exactly because no ground-truth data is available; however, comparing the sum of the FMF values over all pixels of the best single image with that of the fused image indicates improvements between 6% and 220%.
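The comparison behind these percentages can be sketched as follows (an illustrative helper, assuming per-pixel focus-measure maps for each slice and for the fused image; the function name is an assumption, not part of this work):

```python
import numpy as np

def fmf_gain(stack_measures, fused_measure):
    """Relative sharpness gain: the summed focus measure of the fused image
    compared against the sharpest single slice (0.06 would mean +6%)."""
    best_single = max(m.sum() for m in stack_measures)
    return fused_measure.sum() / best_single - 1.0
```

Because the focus measure itself serves as the yardstick, this ratio indicates improvement without ground-truth data but cannot replace a true quantitative evaluation.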

Fig. 11. Thermograms of a long (approx. 35 m) utility shaft in a basement corridor with different supply pipes. (a): Focus close, (b): focus far, (c): combined image from N = 40 measurements.

5. Summary and outlook

In this article focusing and the depth of field were discussed, as they are very important issues in quantitative thermography in unknown environments. A process to extend the DOF, with several options, was presented. Different experiments highlighted the large potential of this method as well as general problems and problems with existing algorithms.

One major constraint is that the time the camera needs to adjust the focus and acquire a measurement does not allow capturing moving objects or scenes with rapid thermal change. Also, objects close to the camera affect an area larger than the one they occlude, due to the size of their induced blur circle; the fusion algorithm can only combine what is in the source data and cannot see behind objects.

The fusion algorithms found in the literature provide a good starting point, and decent results have been achieved with the standard Tenengrad focus measure. However, the characteristics of thermographic measurements differ from those of normal visual images; most notably, very little intensity variation is measured in most parts of a measurement,




making it hard for the FMF to determine the correct image slice. Therefore, special extended-DOF algorithms for thermography will not only have to consider the focus measure for one image slice but will also have to assess the behavior of the measured intensity at one pixel over different distances. Also, in the fusion step the FMF results could be used as weights, so that each final pixel is composed as a weighted sum over all slices, yielding smoother-looking images. Another very reliable approach could be the use of an additional depth sensor (e.g. depth camera, laser scanner, stereo camera or normal camera using depth from focus) and, after registration, simply fusing the thermographic measurements based on this information.
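The proposed weighted-sum fusion could be sketched as follows (an illustrative variant of the idea, not an implemented part of this work; the function name and the normalization are assumptions):

```python
import numpy as np

def weighted_fusion(stack, measures, eps=1e-9):
    """Per-pixel weighted sum over all slices, using the focus-measure
    responses (normalized per pixel) as weights. Compared with a hard
    per-pixel argmax selection, this yields smoother transitions."""
    stack = np.asarray(stack, dtype=float)       # shape (n_slices, H, W)
    weights = np.asarray(measures, dtype=float)
    weights = weights / (weights.sum(axis=0, keepdims=True) + eps)
    return (weights * stack).sum(axis=0)
```

Where all slices respond weakly, the weights approach a uniform average, which avoids the hard, possibly wrong selection of a single slice in low-contrast regions.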

Acknowledgements

This work is part of the RoboGasInspector project, which is funded by the Federal Ministry of Economics and Technology based on a resolution of the German Bundestag. The author would like to thank Dipl.-Ing. Ludwig Harsch for his help with the basics of this work.

REFERENCES

[1] Kaplan H., “Practical Applications of Infrared Thermal Sensing and Imaging Equipment”. 3rd edition, SPIE, Bellingham, Washington, 2007. ISBN: 978-0819467232.

[2] Kroll A., “A survey on mobile robots for industrial inspection”. Proc. of Int. Conf. on Intelligent Autonomous Systems (IAS10), Baden-Baden, Germany, pp. 406-414, 2008.

[3] Zhang Rui-zhi, Wang Tao, Zhao Ling, Fan Wei, Zhao Tong, “Thermography autofocus applied in air leak location”. Int. Conf. on Fluid Power and Mechatronics (FPM), pp. 196-201, 2011.

[4] Hellstrand M., “Device and a Method for an Infrared Image Analyzing Autofocus”. US Patent 7 110 035, 2006.

[5] Krotkov E., “Focusing”. International Journal of Computer Vision, vol. 1, no. 3, pp. 223-237, 1988.

[6] Ng Kuang Chern N., Poo Aun Neow, Ang Jr. M.H., “Practical issues in pixel-based autofocusing for machine vision”. Proc. of IEEE Int. Conf. on Robotics and Automation (ICRA), vol. 3, pp. 2791-2796, 2001.

[7] Beyerer J., Léon F.P., “Bildoptimierung durch kontrolliertes Aktives Sehen und Bildfusion”. at – Automatisierungstechnik, vol. 53, pp. 493-502, 2005.

[8] Schaefer G., Tait R., Zhu S.Y., “Overlay of thermal and visual medical images using skin detection and image registration”. Int. Conf. of Engineering in Medicine and Biology Society (EMBS '06), pp. 965-967, 2006.

[9] Maldague X. P. V., “Theory and practice of infrared technology for nondestructive testing”. John Wiley & Sons, Inc., New York, USA, 2001. ISBN: 978-0471181903.

[10] Nayar S. K., Nakagawa Y., “Shape from Focus”. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 16, pp. 824-831, 1994.

[11] Subbarao M., Tyan J.-K., “Selecting the optimal focus measure for autofocusing and depth-from-focus”. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 8, pp. 864-870, 1998.

[12] Pieper R.J., Korpel A., “Image processing for extended depth of field”. Applied Optics, vol. 22, no. 10, pp. 1449-1453, 1983.

[13] Seales W.B., Dutta S., “Everywhere-in-focus image fusion using controllable cameras”. Proc. SPIE, vol. 2905, pp. 227-234, 1996.

[14] Ray S. F., “Applied Photographic Optics: Lenses and optical systems for photography, film, video, electronic and digital imaging”. Focal Press, Oxford, 2002. ISBN: 978-0240515403.

[15] Breitenstein O., Warta W., Langenkamp M., “Lock-in Thermography”. 2nd edition, Springer, Heidelberg, 2010. ISBN: 978-3-642-02416-0.

[16] Li C., Han C. J., Skidmore G. D., Hess C., “DRS uncooled VOx infrared detector development and production status”. Proc. SPIE, vol. 7660, 2010.

[17] Forster B., Van De Ville D., Berent J., Sage D., Unser M., “Complex Wavelets for Extended Depth-of-Field: A New Method for the Fusion of Multichannel Microscopy Images”. Microscopy Research and Technique, vol. 65, no. 1-2, pp. 33-42, 2004.

[18] Huang W., Jing Z., “Evaluation of focus measures in multi-focus image fusion”. Pattern Recognition Letters, vol. 28, no. 4, pp. 493-500, 2007.

[19] Eskicioglu, A.M., Fisher, P.S., “Image quality measures and their performance”. IEEE Transactions on Communications, vol. 43, no. 12, pp. 2959-2965, 1995.

[20] Li S., Kwok J.T., Wang Y., “Combination of images with diverse focuses using the spatial frequency”. Information Fusion, vol. 2, no. 3, pp. 169-176, 2001.

[21] Antunes M., Trachtenberg M., Thomas G., Shoa T., “All-in-Focus Imaging Using a Series of Images on Different Focal Planes”. Lecture Notes in Computer Science: Image Analysis and Recognition, Springer, Heidelberg, vol. 3656, pp. 174-181, 2005. ISBN: 978-3-540-29069-8.

[22] Valdecasas A.G., Marshall D., Becerra J.M., Terrero J.J., “On the extended depth of focus algorithms for bright field microscopy”. Micron, vol. 32, no. 6, pp. 559-569, 2001.
