
Fusion of IR and Visual Images Based on Gaussian and Laplacian Decomposition Using Histogram Distributions and Edge Selection.

1. Introduction

Multisensor fusion is widely used for signal, image, feature, and symbol combination [1]. To fuse images, various image types, such as visual, infrared (IR), millimeter wave (MMW), X-ray, and depth images, are used for concealed weapon detection [2, 3], remote sensing [4, 5], multifocus imaging [6], and so forth [1, 7, 8].

Forward looking infrared (FLIR) cameras sense IR radiation (i.e., thermal radiation). Therefore, IR images can contain useful information that is not apparent in visual images. Conversely, detailed information within the visual band is not included in IR images. Therefore, the fusion of IR and visual images can provide the advantages of both types of images. Moreover, the fusion of IR and visual images is applicable to various research fields, such as night vision [9, 10], face recognition [11], human detection [12], and concealed object detection [13].

Visual and IR images can be easily fused by averaging; however, in this scheme, the advantages of the two sources are diluted and, in the worst case, the details are lost.

To preserve the dominant advantages of the source images in the fused image, two classes of methods can be utilized. The first class uses multiscale decomposition, such as the Laplacian pyramid [14, 15], the discrete wavelet transform (DWT) [16, 17], and the contrast pyramid [18]. One stage of Laplacian pyramid decomposition is shown in Figure 1. The second class uses region segmentation [19-21].

The multiscale decomposition-based methods require little computation; however, selecting the correct and distinct values of the low frequency information is not straightforward. Therefore, many stages of the pyramid or DWT are used to select significant values, and distinct values are detected only from the Laplacian images or high frequency bands [22]. Although distinct values could be selected from both the low and high frequency images, most methods use only strong intensities and predetermined weights.

On the other hand, the segmentation-based methods can easily select the distinct values of the low frequency information; however, accurate segmentation is not guaranteed, and segmentation has a higher computational complexity. In addition, the seam boundary regions must be blended from the two image sources to prevent discontinuities.

In this paper, we aim to develop a fusion method that combines the advantages of the methods mentioned above. To do this, we use a simple Gaussian and Laplacian decomposition and utilize histogram distributions and edge selection to determine the distinct values of the low and high frequency information, respectively. Because we use histogram distributions, significant low frequency information can be selected, such as locally hot or cold regions of IR images and locally bright or dark regions of visual images. In addition, only one decomposition stage and fast averaging filters are used, so the processing speed is fast enough for real-time applications.

2. Fusion of IR and Visual Images

The proposed fusion method is similar to methods that use the Laplacian pyramid or DWT. However, we use Gaussian smoothing instead of Gaussian or wavelet scaling, so our method corresponds to a single-scale decomposition. A Gaussian image $\bar{X}$ is the result of filtering an original image $X$ with a Gaussian convolution, and a Laplacian image $\nabla^2 X$ is calculated by $X - \bar{X}$ (so that $X = \bar{X} + \nabla^2 X$), as shown in Figure 2.
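As an illustration, this decomposition can be sketched in Python with NumPy/SciPy (not part of the original implementation); the 31 x 31 mask size is taken from the optimal parameters in Table 1, and a mean filter stands in for the Gaussian smoothing, since a fast mean filter is used in all filtering steps later in this section.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def decompose(x, size=31):
    """Split an image X into a Gaussian (smoothed) image X-bar and a
    Laplacian (detail) image nabla^2 X, so that X = X-bar + nabla^2 X."""
    x = x.astype(np.float32)
    x_bar = uniform_filter(x, size=size)  # smoothed image (mean filter in place of Gaussian)
    lap = x - x_bar                       # detail image
    return x_bar, lap
```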

To obtain a fused image $F = \bar{F} + \nabla^2 F$, the distinct information of $\nabla^2 F$ is related to the magnitude; a larger $|\nabla^2 F|$ indicates a strong edge. To determine the distinct value of $\bar{F}$, we use the histogram distributions of the visual and IR Gaussian images.

To fuse a visual image $V$ and an IR image $I$, we first compute the Gaussian images $\bar{V}$ and $\bar{I}$ and decompose the Laplacian images $\nabla^2 V = V - \bar{V}$ and $\nabla^2 I = I - \bar{I}$; the Gaussian and Laplacian images of the fused image are then generated by selecting distinct values using histogram distributions and edge selection, as shown in Figure 3.

The Laplacian component of the fused image, $\nabla^2 F$, is easily determined by comparing edge strength, that is, the absolute Laplacian values at each pixel. However, if the pixels with the larger absolute Laplacian values are used directly, boundary discontinuities degrade the visual quality, and the fused images can contain discontinuous artefacts. Accordingly, we use a weight map derived from the absolute Laplacian values. To compute this weight map $\bar{L}_w$, we first compute the binary weight map $L_w$ given by

$$ L_w(x, y) = \begin{cases} 255, & \text{if } |\nabla^2 I(x, y)| \ge |\nabla^2 V(x, y)|, \\ 0, & \text{otherwise}. \end{cases} \qquad (1) $$
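A minimal sketch of this step is shown below, assuming the selection rule in (1) (the original expression is not reproduced in the source, so the 255/0 assignment by dominant IR detail is an interpretation); the 3 x 3 smoothing mask follows Table 1.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def laplacian_weight_maps(lap_i, lap_v, size=3):
    """Binary weight map L_w (255 where |lap_i| >= |lap_v|, else 0) and its
    smoothed version L_w-bar used to suppress boundary discontinuities."""
    l_w = np.where(np.abs(lap_i) >= np.abs(lap_v), 255.0, 0.0)
    l_w_bar = uniform_filter(l_w, size=size)
    return l_w, l_w_bar
```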

Finally, we determine $\nabla^2 F$ using $\bar{L}_w$ or $L_w$ as follows:

$$ \nabla^2 F(x, y) = \frac{\nabla^2 I(x, y)\,\bar{L}_w(x, y)}{255} + \frac{\nabla^2 V(x, y)\,\bigl(255 - \bar{L}_w(x, y)\bigr)}{255}, \qquad (2a) $$

$$ \nabla^2 F(x, y) = \begin{cases} \nabla^2 I(x, y), & \text{if } L_w(x, y) = 255, \\ \nabla^2 V(x, y), & \text{otherwise}, \end{cases} \qquad (2b) $$

where (2a) is used when the fused image should have smooth object boundaries, and (2b) is used when the fused image should have distinct object boundaries.
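Under the same assumptions, the two selection rules could look as follows; (2b) is taken here to be a hard per-pixel selection by the binary map, which is an interpretation rather than a reproduction of the original expression.

```python
import numpy as np

def fuse_laplacian(lap_i, lap_v, l_w, l_w_bar, smooth=True):
    """Fused detail image: (2a) blends by the smoothed weights (smooth object
    boundaries); (2b) selects by the binary weights (distinct boundaries)."""
    if smooth:  # (2a)
        return (lap_i * l_w_bar + lap_v * (255.0 - l_w_bar)) / 255.0
    # (2b)
    return np.where(l_w == 255.0, lap_i, lap_v)
```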

The distinct intensity values of most images have a low population. In particular, intensities with a low population in IR images correspond to the highest or the lowest temperatures, as shown in Figure 4. Therefore, we use the histogram distribution to select $\bar{F}$.

To select the Gaussian component of the fused image, we use the low population map $M_h(x, y)$ given by

$$ M_h(x, y) = \begin{cases} \bar{I}(x, y), & \text{if } \mathrm{His}_{\bar{I}}\bigl(\bar{I}(x, y)\bigr) \le \mathrm{His}_{\bar{V}}\bigl(\bar{V}(x, y)\bigr), \\ \bar{V}(x, y), & \text{otherwise}, \end{cases} \qquad (3) $$

where $\mathrm{His}_{\bar{V}}(\cdot)$ and $\mathrm{His}_{\bar{I}}(\cdot)$ denote the histogram distribution functions of $\bar{V}$ and $\bar{I}$, respectively. However, $M_h$ has extreme discontinuities, so we substitute its smoothed version $\bar{M}_h$ for $\bar{F}$.
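A sketch of this selection is shown below, assuming 8-bit Gaussian images and that (3) picks, at each pixel, the source whose intensity has the lower histogram population (the exact rule is not reproduced in the source text); the 31 x 31 smoothing mask follows Table 1.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def gaussian_component(v_bar, i_bar, size=31):
    """Low population map M_h and its smoothed version M_h-bar, used as F-bar."""
    hist_v, _ = np.histogram(v_bar, bins=256, range=(0, 256))
    hist_i, _ = np.histogram(i_bar, bins=256, range=(0, 256))
    pop_v = hist_v[v_bar.astype(np.uint8)]   # His_V-bar(V-bar(x, y))
    pop_i = hist_i[i_bar.astype(np.uint8)]   # His_I-bar(I-bar(x, y))
    m_h = np.where(pop_i <= pop_v, i_bar, v_bar)               # low population map
    return uniform_filter(m_h.astype(np.float32), size=size)   # M_h-bar, used as F-bar
```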

In all Gaussian filtering steps, we use a fast mean filter with spatial buffers to reduce the computational complexity.
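The spatial-buffer implementation is not detailed here; one common way to obtain a per-pixel cost that is independent of the mask size is a summed-area table (integral image), sketched below as an assumption rather than the authors' exact implementation.

```python
import numpy as np

def box_mean(x, size):
    """Mean filter of odd mask width `size`, computed from an integral image
    so the cost per pixel does not grow with the mask size."""
    pad = size // 2
    p = np.pad(x.astype(np.float64), pad, mode='edge')
    ii = np.zeros((p.shape[0] + 1, p.shape[1] + 1))
    ii[1:, 1:] = p.cumsum(axis=0).cumsum(axis=1)       # integral image
    s = (ii[size:, size:] - ii[:-size, size:]
         - ii[size:, :-size] + ii[:-size, :-size])      # window sums
    return s / float(size * size)
```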

3. Experimental Result

The proposed fusion method was tested with three image sequences of the TNO image fusion dataset [23]: "UN Camp," "Dune," and "Trees" (360 x 270, grayscale). Because these test images have small intensity ranges, we tested images modified by linear histogram normalization instead of the original images. In this experiment, we used the two parameter sets shown in Table 1, where the first three parameters are mask sizes. The optimal parameters were determined experimentally so as to produce visually natural fusion images without artificial discontinuities.
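The exact normalization is not specified; a plain min-max stretch to [0, 255] is one simple reading of "linear histogram normalization," sketched below.

```python
import numpy as np

def normalize_linear(x):
    """Stretch the intensity range of an image linearly to [0, 255]."""
    x = x.astype(np.float32)
    lo, hi = float(x.min()), float(x.max())
    if hi == lo:
        return np.zeros_like(x)
    return (x - lo) * 255.0 / (hi - lo)
```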

Examples of our results using the optimal parameters are shown in Figures 5, 6, and 7. Edge discontinuities are observed in $L_w$ and $M_h$, while they are blurred in $\bar{L}_w$ and $\bar{F}$. The distinct information of $V$ and $I$ is well fused in $F$. In the $\nabla^2 F$ images, strong high frequency components are well selected. In the $\bar{F}$ images, the blurred distinct values are well revealed regardless of the $\nabla^2 F$ images. In particular, isolated dark regions with low frequency content are distinctly visible in the $F$ images.

More results for the three image sequences using the optimal parameters are shown in Figure 8; the red fused regions are more influenced by the visual images, while the orange fused regions are more influenced by the IR images.

To objectively evaluate the proposed method, we consider two evaluation metrics: entropy ($E$) and the Xydeas and Petrovic index ($Q^{VI/F}_P$) [24]. The performance comparison with the averaging method is shown in Table 2. To compare our results with those reported in [12], we compared the improvement ratio over the averaging method, as shown in Table 3, because the existing methods used different image enhancement methods that are not described in the literature.
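For reference, the entropy metric is the Shannon entropy of the fused image's intensity histogram; a minimal version for 8-bit images is shown below (the Xydeas and Petrovic index is defined in [24] and is not reproduced here).

```python
import numpy as np

def entropy(img):
    """Shannon entropy (bits) of an 8-bit image's intensity histogram."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist.astype(np.float64) / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```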

Although our method with the optimal parameters does not yield the best objective performance, these metrics do not completely agree with human subjective evaluation, as shown in Figure 9. The scores of Figure 9(b) are higher than those of Figure 9(a); however, Figure 9(b) has many discontinuities, as indicated by the circles. In addition, in Figure 9(c), stitching the two source images (after clipping and mean filtering) yields the highest $E$ of all the results. Therefore, $Q^{VI/F}_P$ and $E$ for segmentation-based fusion methods may be overestimated relative to human subjective evaluation.

A visual comparison of the results obtained with the two parameter sets is shown in Figure 10. As shown in Table 3, $Q^{VI/F}_P$ and $E$ of the results using the large parameters are higher than those using the optimal parameters; however, the results using the optimal parameters show better visual quality.

Using single-thread processing on a 3.60 GHz CPU, the average computation time over all of the sequences is 5.237 msec. Such a processing time is unattainable for segmentation-based methods. Therefore, the proposed method can be used for real-time fusion applications.

4. Conclusion

In this paper, we have proposed a novel fusion method for IR and visual images based on Gaussian and Laplacian decomposition using histogram distributions and edge selection. This method can easily determine the distinct values of the Gaussian and Laplacian images: the distinct values of the Laplacian images are selected by edge strength, and the distinct values of the Gaussian images are selected by using histogram distributions. Thus, the fused images contain the dominant characteristics of the two source images and can be obtained with relatively simple computation. In addition, by presenting results obtained with two different parameter sets, we showed that the objective evaluation metrics, entropy and the Xydeas and Petrovic index, do not completely agree with human visual evaluation. Therefore, the proposed method can be used for image fusion and blending instead of other existing methods.

http://dx.doi.org/10.1155/2016/3130681

Conflict of Interests

The authors declare that they have no competing interests.

References

[1] R. S. Blum, Z. Xue, and Z. Zhang, "An overview of image fusion," in Multi-Sensor Image Fusion and Its Applications, pp. 1-36, CRC Press, Boca Raton, Fla, USA, 2005.

[2] Z. Xue and R. S. Blum, "Concealed weapon detection using color image fusion," in Proceedings of the 6th International Conference on Information Fusion (FUSION '03), pp. 622-627, Cairns, Australia, July 2003.

[3] J. Yang and R. S. Blum, "A statistical signal processing approach to image fusion for concealed weapon detection," in Proceedings of the International Conference on Image Processing, vol. 1, pp. 513-516, 2002.

[4] T.-M. Tu, S.-C. Su, H.-C. Shyu, and P. S. Huang, "Efficient intensity-hue-saturation-based image fusion with saturation compensation," Optical Engineering, vol. 40, no. 5, pp. 720-728, 2001.

[5] G. Simone, A. Farina, F. C. Morabito, S. B. Serpico, and L. Bruzzone, "Image fusion techniques for remote sensing applications," Information Fusion, vol. 3, no. 1, pp. 3-15, 2002.

[6] X. Zhang, X. Li, Z. Liu, and Y. Feng, "Multi-focus image fusion using image-partition-based focus detection," Signal Processing, vol. 102, pp. 64-76, 2014.

[7] F. Laliberte, L. Gagnon, and Y. Sheng, "Registration and fusion of retinal images-an evaluation study," IEEE Transactions on Medical Imaging, vol. 22, no. 5, pp. 661-673, 2003.

[8] C. E. Reese and E. J. Bender, "Multispectral image-fused head-tracked vision system (HTVS) for driving applications," in Helmet- and Head-Mounted Displays VI, vol. 4361 of Proceedings of SPIE, pp. 1-11, August 2001.

[9] A. Toet, "Natural colour mapping for multiband nightvision imagery," Information Fusion, vol. 4, no. 3, pp. 155-166, 2003.

[10] A. M. Waxman, A. N. Gove, D. A. Fay et al., "Color night vision: opponent processing in the fusion of visible and IR imagery," Neural Networks, vol. 10, no. 1, pp. 1-6, 1997.

[11] J. Heo, S. G. Kong, B. R. Abidi, and M. A. Abidi, "Fusion of visual and thermal signatures with eyeglass removal for robust face recognition," in Proceedings of the Conference on Computer Vision and Pattern Recognition Workshop (CVPRW '04), p. 122, Washington, DC, USA, June 2004.

[12] L. Jiang, F. Tian, L. E. Shen et al., "Perceptual-based fusion of IR and visual images for human detection," in Proceedings of the International Symposium on Intelligent Multimedia, Video and Speech Processing (ISIMP '04), pp. 514-517, October 2004.

[13] Z. Xue, R. S. Blum, and Y. Li, "Fusion of visual and IR images for concealed weapon detection," in Proceedings of the 5th International Conference on Information Fusion, pp. 1198-1205, IEEE, Annapolis, Md, USA, July 2002.

[14] P. J. Burt and E. H. Adelson, "The Laplacian pyramid as a compact image code," IEEE Transactions on Communications, vol. 31, no. 4, pp. 532-540, 1983.

[15] P. J. Burt and R. J. Kolczynski, "Enhanced image capture through fusion," in Proceedings of the 4th International Conference on Computer Vision (ICCV '93), pp. 173-182, IEEE, Berlin, Germany, May 1993.

[16] H. Li, B. S. Manjunath, and S. K. Mitra, "Multisensor image fusion using the wavelet transform," Graphical Models and Image Processing, vol. 57, no. 3, pp. 235-245, 1995.

[17] G. Pajares and J. M. de la Cruz, "A wavelet-based image fusion tutorial," Pattern Recognition, vol. 37, no. 9, pp. 1855-1872, 2004.

[18] A. Toet, L. J. van Ruyven, and J. M. Valeton, "Merging thermal and visual images by a contrast pyramid," Optical Engineering, vol. 28, no. 7, pp. 789-792, 1989.

[19] J. J. Lewis, R. J. O'Callaghan, S. G. Nikolov, D. R. Bull, and N. Canagarajah, "Pixel- and region-based image fusion with complex wavelets," Information Fusion, vol. 8, no. 2, pp. 119-130, 2007.

[20] N. Cvejic, D. Bull, and N. Canagarajah, "Region-based multimodal image fusion using ICA bases," IEEE Sensors Journal, vol. 7, no. 5, pp. 743-751, 2007.

[21] J. Saeedi and K. Faez, "Infrared and visible image fusion using fuzzy logic and population-based optimization," Applied Soft Computing Journal, vol. 12, no. 3, pp. 1041-1054, 2012.

[22] M. I. Smith and J. P. Heather, "A review of image fusion technology in 2005," in Thermosense XXVII, vol. 5782 of Proceedings of SPIE, pp. 29-45, April 2005.

[23] A. Toet, TNO Image Fusion Dataset, Figshare, 2014.

[24] C. S. Xydeas and V. S. Petrovic, "Objective pixel-level image fusion performance measure," in Sensor Fusion: Architectures, Algorithms, and Applications IV, vol. 4051 of Proceedings of SPIE, pp. 89-98, Orlando, Fla, USA, April 2000.

Seohyung Lee (1) and Daeho Lee (2)

(1) Department of Electronics and Radio Engineering, Kyung Hee University, Yongin 17104, Republic of Korea

(2) Humanitas College, Kyung Hee University, Yongin 17104, Republic of Korea

Correspondence should be addressed to Daeho Lee; nize@khu.ac.kr

Received 3 December 2015; Revised 25 January 2016; Accepted 26 January 2016

Academic Editor: Zhike Peng

Caption: Figure 1: One stage of Laplacian pyramid decomposition.

Caption: Figure 2: Gaussian and Laplacian decomposition.

Caption: Figure 3: The proposed fusion method of IR and visual images.

Caption: Figure 4: Intensities with a low population: (a) IR image and its histogram, (b) the binary image $I(x, y) > \tau_1$ of the IR image in (a), and (c) the binary image $I(x, y) < \tau_2$ of the IR image in (a).

Caption: Figure 5: An example of the fusion result ("UN Camp"; the intensity range of $\nabla^2 F$ is [-128, 128]).

Caption: Figure 6: An example of the fusion result ("Dune"; the intensity range of $\nabla^2 F$ is [-128, 128]).

Caption: Figure 7: An example of the fusion result ("Trees"; the intensity range of $\nabla^2 F$ is [-128, 128]).

Caption: Figure 8: Examples of the proposed fusion method (from top to bottom: "UN Camp," "Trees," and "Dune"): (a) visual images, (b) IR images, and (c) fused images using the proposed method.

Caption: Figure 9: Limitations of the objective evaluation metrics: (a) result of the proposed method (optimal parameters): $Q^{VI/F}_P = 0.479$, $E = 7.100$; (b) result of the proposed method (large parameters): $Q^{VI/F}_P = 0.556$, $E = 7.161$; and (c) stitching result: $Q^{VI/F}_P = 0.550$, $E = 7.214$.

Caption: Figure 10: Comparison of the proposed method with averaging: (a) results of the proposed method (optimal parameters), (b) results of the proposed method (large parameters), and (c) results of the averaging method.
Table 1: Parameter sets (mask sizes and the $\nabla^2 F$ condition).

Parameter sets       $\bar{V}$ ($\bar{I}$)   $\bar{L}_w$   $\bar{M}_h$   $\nabla^2 F$ condition

Optimal parameters   31 x 31                 3 x 3         31 x 31       (2a)
Large parameters     131 x 131               5 x 5         51 x 51       (2b)
Table 2: Comparison with the averaging method.

Metric          Methods                         UN Camp   Dune    Trees

$Q^{VI/F}_P$    Averaging                       0.312     0.315   0.297
                Proposed (large parameters)     0.487     0.497   0.550
                Proposed (optimal parameters)   0.437     0.430   0.476

$E$             Averaging                       6.275     6.769   6.413
                Proposed (large parameters)     7.191     7.453   7.086
                Proposed (optimal parameters)   7.093

Table 3: Comparison with the existing methods (the improvement ratio over the averaging method; the performance ranking is shown in parentheses).

Metric          Methods                         UN Camp     Dune        Trees

$Q^{VI/F}_P$    Li's [16]                       1.277 (5)   1.154 (5)   1.402 (5)
                Lewis's [19]                    1.515 (3)   1.435 (3)   1.575 (4)
                Saeedi's [21]                   1.663 (1)   1.568 (2)   1.795 (2)
                Proposed (large parameters)     1.562 (2)   1.578 (1)   1.851 (1)
                Proposed (optimal parameters)   1.400 (4)   1.365 (4)   1.603 (3)

$E$             Li's [16]                       1.075 (4)   1.033 (4)   1.015 (4)
                Lewis's [19]                    1.059 (5)   1.019 (5)   1.011 (5)
                Saeedi's [21]                   1.178 (1)   1.120 (1)   1.099 (2)
                Proposed (large parameters)     1.146 (2)   1.101 (2)   1.105 (1)
                Proposed (optimal parameters)   1.130 (3)   1.093 (3)   1.087 (3)