Segmentation of 2D and 3D textures from estimates of the local orientation
ABSTRACT
We use a method to estimate local orientations in the n-dimensional space from the covariance matrix of the gradient, which can be implemented either in the image space or in the Fourier space. In a second step, two methods allow us to detect sudden changes of orientation in images. The first one uses an index of confidence of the estimated orientation, and the second one the detection of minima of scalar products in a neighbourhood. This is illustrated on 2D Transmission Electron Microscope images of cellulose cryofracture (to display the organisation of cellulose whiskers and the points of germination), and on 3D images of a TA6V alloy (lamellar microstructure) obtained by microtomography.

Keywords: 3D image analysis, covariance matrix, Fast Fourier Transform, gradient, oriented texture, watershed segmentation.

INTRODUCTION

Some microstructures in materials or in biological specimens show a pronounced local orientation. For instance, cellulose whiskers form a patchwork of small cells, each with a given orientation (cf. Fig. 1). It is of interest to extract from such textures germination points, crystalline fluxes, and other characteristic criteria. To achieve this task, knowledge of the local orientation of the cells is a prerequisite: germination points are given by zones where almost all orientations are observed, while crystalline fluxes are obtained as zones with fast changes of orientation. Other materials, like TA6V alloys, show a 3D lamellar texture (titanium alloy with 6% of aluminium and 4% of vanadium, Fig. 2). This kind of texture shows a local orientation in 3D, and this information can be useful to perform a 3D segmentation before a 3D characterization. This requires the detection and quantification of local orientations. Other textures of this kind, like lamellar eutectics and martensite textures, were studied in 2D (Kurdy and Jeulin, 1989; Jeulin and Kurdy, 1992). In Germain et al.
(2003), an instructive multiscale method to study the local orientation is proposed, but it is limited to 2D textures. In this paper, we introduce algorithms to estimate a local orientation, which can be implemented in the Fourier space or in the image space. Then, methods are proposed to locate fast changes of orientation and to segment textures showing multiple orientations. This approach differs from the common use of local information on anisotropy to extract contours or oriented objects. Our approach is tested and illustrated in 2D on cellulose whiskers observed in TEM, and on a 3D image of TA6V obtained by microtomography.

[FIGURE 1 OMITTED] [FIGURE 2 OMITTED] [FIGURE 3 OMITTED]

PRINCIPLE OF ESTIMATION OF THE LOCAL ORIENTATION IN IMAGES

Orientations in images can be perceived when some contours are present and show alignments. A local orientation is described by a vector. Starting from a scalar image I, it is common to generate a vector field v(x) by computing the gradient of the image. The modulus of the gradient is sensitive to the local contrast and allows us to detect boundaries in the image; the gradient is oriented in a direction orthogonal to the boundary. Local orientation can also be estimated by means of directional mathematical morphology (Soille and Talbot, 1998; 2001), or in the Fourier domain by examination of the energy of the power spectrum weighted by an appropriate orientation-selective linear filter (Kass and Witkin, 1987). The estimation of second-order derivatives (Danielsson et al., 2001) can be used to enhance and filter oriented structures (Perona and Malik, 1990; Frangi et al., 1998; Sato et al., 1998). However, in what follows we only make use of first-order derivatives, which are less sensitive to noise. Information on the gradient is commonly used in image analysis to account for orientation.
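As a concrete starting point, the gradient vector field of an image can be generated with centred differences. The following is a minimal sketch of ours (not the paper's implementation; periodic borders via `np.roll` are a simplification), illustrating that the gradient is non-zero at boundaries and orthogonal to them:

```python
import numpy as np

def gradient_field(I):
    """Vector field v(x) = grad I of a 2D image, by centred differences
    (periodic borders for simplicity).  Returns an (H, W, 2) array with
    the horizontal and vertical components."""
    I = np.asarray(I, dtype=float)
    gx = np.roll(I, 1, axis=1) - np.roll(I, -1, axis=1)  # I(x-1,y) - I(x+1,y)
    gy = np.roll(I, 1, axis=0) - np.roll(I, -1, axis=0)  # I(x,y-1) - I(x,y+1)
    return np.stack([gx, gy], axis=-1)
```

For a vertical step edge, the field is non-zero only near the edge and points along the horizontal axis, i.e., orthogonally to the boundary.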
In Jeulin and Kurdy (1992), this type of information was used to perform oriented morphological transformations and filters, or to inject orientation information into a watershed transformation. However, the orientation at the pixel level is usually unreliable, as a result of noise in images, enhanced by the estimation of partial derivatives. Therefore some filtering of the gradient must be performed, involving non-local information. In the following subsections, we explain how to extract the information on local orientation contained in images, and how to detect local changes of orientation.

Local orientation in a vector field

We consider a vector field (e.g., obtained as the gradient of an image), defined by a vector v(x) (with components $v_i$, $i = 1, \ldots, n$) for each point x in the n-dimensional space $\mathbb{R}^n$. A domain W(x) (e.g., a square in $\mathbb{R}^2$ or a cube in $\mathbb{R}^3$) is located around every point x. When working on a grid of points, a cloud of points $M_j$, connected to the origin by the vectors $v(x_j)$ for $x_j \in W(x)$, is generated. A convenient way to study the average orientation inside W(x) is to use the matrix of inertia of the cloud of points and to extract its main axis of inertia by eigenvector decomposition. This tensorial approach of orientation is commonly followed in fluid mechanics for velocity fields (Hand, 1962). It is also used by Vliet and Verbeek (1995) for the estimation of orientation in 2D images. In what follows, the matrix of inertia J is normalised (every moment being divided by the number m of points of the grid inside W(x)). Therefore the term $J_{ij}$ is an estimate of the statistical second-order moment $E(v_i(x)\, v_j(x))$. The eigenvector corresponding to the largest eigenvalue $\lambda_1$ gives the main orientation of the cloud of points, namely the direction of its first axis of inertia.
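The construction of the normalised matrix of inertia and of its first axis of inertia can be made concrete in a few lines. This is a minimal sketch in Python/NumPy; the function names are ours, not part of the original implementation:

```python
import numpy as np

def inertia_matrix(vectors):
    """Normalised matrix of inertia J of a cloud of vectors.

    vectors: array of shape (m, n), one n-dimensional vector v(x_j) per row.
    J[i, j] estimates the second-order moment E(v_i v_j)."""
    v = np.asarray(vectors, dtype=float)
    return v.T @ v / len(v)

def main_orientation(vectors):
    """Eigenvalues of J in descending order, and the unit eigenvector of
    the largest one, i.e. the first axis of inertia of the cloud."""
    J = inertia_matrix(vectors)
    w, U = np.linalg.eigh(J)      # symmetric J: eigenvalues in ascending order
    return w[::-1], U[:, -1]
```

For a cloud of vectors all collinear to a vector v, the sketch recovers a single non-zero eigenvalue equal to $\|v\|^2$, the two others being zero.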
To illustrate this way of estimating a local orientation, we consider two extreme cases in $\mathbb{R}^3$ (similar results being obtained in $\mathbb{R}^n$).

* For a constant vector v (with components $v_1$, $v_2$, $v_3$) inside W(x), the matrix J is given by:

$$ J = \begin{pmatrix} v_1^2 & v_1 v_2 & v_1 v_3 \\ v_1 v_2 & v_2^2 & v_2 v_3 \\ v_1 v_3 & v_2 v_3 & v_3^2 \end{pmatrix} . $$

This matrix has a single non-zero eigenvalue $\lambda_1 = \| v \|^2 = v_1^2 + v_2^2 + v_3^2$, for an eigenvector collinear to v, the two other eigenvalues being equal to zero. Therefore the main (and single) orientation of the vectors $v(x_j)$ for $x_j \in W(x)$ is recovered.

* For unit vectors $v(x_j)$ with a uniform distribution of orientations, we obtain on average:

$$ J = \frac{1}{3} \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix} . $$

The matrix is diagonal with three equal eigenvalues (1/3) and no preferential orientation (the texture is therefore invariant by rotation, as a result of isotropy).

Calculation of the local orientation from the gradient

Starting from the gradient of an image (with components $\partial I / \partial x_i$), we can estimate the matrix of inertia, with components $E(\partial I/\partial x_i \; \partial I/\partial x_j)$, E being the local mathematical expectation, estimated by averaging in W(x). It turns out that each component is given by partial second derivatives of the covariance C(x, x+h) of the image. We have C(x, x+h) = E(I(x) I(x+h)), and:

$$ E\left( \frac{\partial I}{\partial x_i}\, \frac{\partial I}{\partial x_j} \right) = - \left. \frac{\partial^2 C(x, x+h)}{\partial h_i \, \partial h_j} \right|_{h=0} . \qquad (1) $$

A convenient way to estimate the local matrix of inertia in an image is to use a Fast Fourier Transform (FFT). The power spectrum $f(k_1, \ldots, k_n)$ (in the frequency space) of the local covariance inside the domain W(x) is obtained as the square of the modulus of the Fourier transform of the image I. Using standard rules of derivation in the Fourier space and Eq.
1, we get:

$$ \Sigma_{ij} = E\left( \frac{\partial I}{\partial x_i}\, \frac{\partial I}{\partial x_j} \right) = 4\pi^2 \int k_i\, k_j\, f(k_1, \ldots, k_n)\; dk_1 \cdots dk_n , \qquad (2) $$

the frequencies $k_i$ being expressed in cycles per unit length. From Eq. 2, the expectation of the matrix of inertia of the gradient of the image is estimated in the Fourier space by the matrix of inertia of the power spectrum of its covariance. This is the way followed by Bigun et al. (1991) and Bergonnier (2005) to characterize the local orientation in 2D images. A comparison of the results obtained by the two approaches (image space and Fourier space) will be given and illustrated below for the cellulose image (Fig. 1).

Detection of changes of orientation in a vector field

To compare the orientations of two unit vectors, it is convenient to compute their scalar product: when the two vectors u and v are almost collinear (resp. orthogonal), the absolute value of their scalar product $u \cdot v$ is close to 1 (resp. to 0). To detect changes of local orientation in a vector field, we consider a neighbourhood V(x) of every point x. We compute the minimum m of the absolute values of the scalar products between u(x) and $v(x_j)$ for $x_j \in V(x)$. If m is close to 1, the points in V(x) have similar orientations. If m is close to 0, we can say that x belongs to the boundary between zones with different orientations. The minimum can be replaced by the maximum or the average value in V(x), but it turns out that in the present applications the minimum produced the best results for the segmentation.

IMPLEMENTATION AND EXAMPLE OF APPLICATION TO 2D IMAGES

2D local orientation studied by local Fourier transform

To characterize cellulose whiskers as shown in Fig. 1, we consider local orientations of small whiskers of the structure. From these orientations it is possible to detect, for instance, germination points or crystalline fluxes.
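The detection of orientation changes by minima of scalar products described above can be sketched as follows, for a 2D field of unit vectors; the periodic handling of borders (`np.roll`) is a simplification of ours:

```python
import numpy as np

def scalar_product_min(field, radius=2):
    """Minimum over a (2*radius+1)^2 neighbourhood V(x) of the absolute
    scalar products |u(x) . v(x_j)|, for a field of 2D unit vectors of
    shape (H, W, 2).  Values close to 1 indicate a uniform local
    orientation, values close to 0 an orientation boundary."""
    out = np.ones(field.shape[:2])
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            # vector v(x_j) at offset (dy, dx), with periodic borders
            shifted = np.roll(field, shift=(dy, dx), axis=(0, 1))
            out = np.minimum(out, np.abs((field * shifted).sum(axis=2)))
    return out
```

On a field made of two half-planes with orthogonal orientations, the map equals 1 well inside each half and drops to 0 along the separating line.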
The modulus of the Fourier transform of a small area containing a small number of whiskers shows a cloud of points whose main orientation is related to the orientation of the whiskers, as previously discussed (cf. Fig. 3). The main axis of inertia of the cloud of points is then obtained from its eigenvectors. In practice, we work on the image I defined in the domain D. For every point $x \in D$, we consider a square neighbourhood Iv centred in x, with size equal to a power of 2, to which a Fast Fourier Transform (FFT) is applied. The squared modulus of this transform gives the image $Iv_M$:

$$ Iv_M = \left| F[\, Iv \,] \right|^2 . $$

From the image $Iv_M$ in the Fourier space $(k_1, k_2)$, the matrix of inertia $\Sigma$ corresponding to Eq. 2 is computed, $(k_{1c}, k_{2c})$ being the coordinates of the centre of $Iv_M$:

$$ \Sigma = \sum_{k_1, k_2} Iv_M(k_1, k_2) \begin{pmatrix} (k_1 - k_{1c})^2 & (k_1 - k_{1c})(k_2 - k_{2c}) \\ (k_1 - k_{1c})(k_2 - k_{2c}) & (k_2 - k_{2c})^2 \end{pmatrix} . $$

The two eigenvalues $\lambda_1$ and $\lambda_2$ ($\lambda_1 < \lambda_2$) of the matrix $\Sigma$ are obtained from $\det(\Sigma - \lambda I) = 0$. The local orientation is given by the angle between the eigenvector with eigenvalue $\lambda_1$ and a reference vector. In practice, it can be useful to display the obtained orientations by means of a colour code. This code has to be cyclic, in order to represent without any discontinuity angles in the range $[-\pi/2, \pi/2]$. For instance, we use the RGB colour palette given below (cf. Fig. 4) for the illustrations given later.

[FIGURE 4 OMITTED]

The local orientations detected by this method depend on the size of the analysed neighbourhood Iv. This is illustrated in Fig. 5a by results obtained when changing this parameter. It appears that a 32 x 32 size for Iv gives a smooth result. This size directly depends on the average size of particles in the image.

Local 2D orientation from the gradient

As explained before, a similar analysis of orientation can be made directly in the image space.
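A minimal sketch of this per-window estimation follows, using the centred power spectrum and the eigenvector of the smaller eigenvalue as above; the subtraction of the window mean (which suppresses the central, zero-frequency peak of the spectrum) and the folding of the angle into $[-\pi/2, \pi/2)$ are practical choices of ours:

```python
import numpy as np

def local_fft_orientation(window):
    """Local orientation (radians, folded into [-pi/2, pi/2)) of an
    oriented texture in a square window, from the matrix of inertia of
    its centred power spectrum.  The eigenvector of the SMALLER
    eigenvalue is used, the spectrum being elongated orthogonally to
    the oriented structures."""
    w = np.asarray(window, dtype=float)
    w = w - w.mean()                            # suppress the DC peak
    P = np.abs(np.fft.fftshift(np.fft.fft2(w))) ** 2
    n = w.shape[0]
    k = np.arange(n) - n // 2                   # centred frequency coordinates
    k1, k2 = np.meshgrid(k, k, indexing='ij')
    S = np.array([[(P * k1 * k1).sum(), (P * k1 * k2).sum()],
                  [(P * k1 * k2).sum(), (P * k2 * k2).sum()]])
    _, U = np.linalg.eigh(S)                    # ascending eigenvalues
    v = U[:, 0]                                 # eigenvector of lambda_1
    return (np.arctan2(v[1], v[0]) + np.pi / 2) % np.pi - np.pi / 2
```

For a sinusoidal stripe pattern varying along the second axis only, the spectrum concentrates on that frequency axis and the returned angle is 0 (the axis of the stripes, in this convention).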
A neighbourhood V is centred at each point (x,y) of the image I in the domain D. For each point, a local orientation matrix $M_V$ is calculated according to Eq. 1 (with $1 \le i \le 2$ and $1 \le j \le 2$), using the average over V. The local orientation is given by the angle between the eigenvector corresponding to the largest eigenvalue $\lambda_1$ and a reference vector. This method requires the computation of the two components of the gradient over the full image, and of local averages on each neighbourhood, while the previous method requires an FFT in each neighbourhood. Therefore, the approach based on the gradient is much faster. A comparison of the results obtained by the two approaches, for neighbourhoods with the same size, is given in Fig. 7. We obtain very similar results with quite different computation times: 2 s for the gradient-based method and 50 s for the FFT-based method, for a test image of 256 x 256 pixels and a neighbourhood of size 32 x 32, on a PC with a Pentium IV processor (2.6 GHz). To compute the gradient images, it is possible to use a directional gradient (of Prewitt, Sobel or morphological type). At point (x,y) of image I, we use the horizontal gradient defined by

$$ \frac{\partial I}{\partial x}(x,y) = I(x-1,y) - I(x+1,y) , $$

and the vertical gradient defined by

$$ \frac{\partial I}{\partial y}(x,y) = I(x,y-1) - I(x,y+1) . $$

Index of confidence of the 2D local orientation

An index of confidence of the estimated orientation is given by the ratio $\lambda_1 / (\lambda_1 + \lambda_2)$. This ratio increases towards 1 for a more significant measured local orientation. It enables us to detect germination points, which show the lowest index of confidence (cf. Fig. 6).
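A sketch of the gradient-based estimator together with the index of confidence, using the centred differences given above; the periodic border handling and the folding of the angle into $[-\pi/2, \pi/2)$ are implementation choices of ours:

```python
import numpy as np

def orientation_2d(I, y, x, half=16):
    """Local orientation (radians, folded into [-pi/2, pi/2)) and index
    of confidence lambda1 / (lambda1 + lambda2) in the (2*half)^2
    neighbourhood centred at (y, x) of a 2D image I.
    Assumes a non-uniform neighbourhood (otherwise the index is 0/0)."""
    I = np.asarray(I, dtype=float)
    # centred differences: dI/dx = I(x-1,y) - I(x+1,y), etc. (periodic)
    gx = np.roll(I, 1, axis=1) - np.roll(I, -1, axis=1)
    gy = np.roll(I, 1, axis=0) - np.roll(I, -1, axis=0)
    win = (slice(y - half, y + half), slice(x - half, x + half))
    a, b = gx[win], gy[win]
    M = np.array([[(a * a).mean(), (a * b).mean()],
                  [(a * b).mean(), (b * b).mean()]])
    w, U = np.linalg.eigh(M)        # ascending eigenvalues
    v = U[:, -1]                    # eigenvector of the largest eigenvalue
    ang = (np.arctan2(v[1], v[0]) + np.pi / 2) % np.pi - np.pi / 2
    return ang, w[-1] / w.sum()
```

On a strongly oriented test pattern (stripes), the index of confidence is close to 1; on an isotropic neighbourhood it tends towards 1/2.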
Local scalar product

As explained before, the minima of the scalar products in a neighbourhood are used to detect sudden changes of local orientation, characterizing for the cellulose case the boundaries of crystalline fluxes (cf. Fig. 6).

[FIGURE 5 OMITTED]

IMPLEMENTATION AND EXAMPLE OF APPLICATION FOR 3D IMAGES

Local 3D orientation from the gradient

The method followed in 2D to estimate the local orientation from the gradient carries over to 3D. The calculation of the eigenvalues $\lambda_1$, $\lambda_2$ and $\lambda_3$ of the 3 x 3 local orientation matrix is given in the appendix. This calculation is made for every voxel and must be fast enough for the method to be efficiently implemented in 3D. The components of the gradient of image I at point (x,y,z) are estimated by (other types of directional gradients can be used):

$$ \frac{\partial I}{\partial x}(x,y,z) = I(x-1,y,z) - I(x+1,y,z) , $$
$$ \frac{\partial I}{\partial y}(x,y,z) = I(x,y-1,z) - I(x,y+1,z) , $$
$$ \frac{\partial I}{\partial z}(x,y,z) = I(x,y,z-1) - I(x,y,z+1) . $$

This is illustrated in Fig. 7 by the local 3D orientation of X-ray images obtained by microtomography on a TA6V alloy (cf. Fig. 2).

[FIGURE 6 OMITTED] [FIGURE 7 OMITTED]

Index of confidence of the 3D local orientation

An index of confidence of the 3D orientation is obtained from the ratio $\lambda_1 / (\lambda_1 + \lambda_2 + \lambda_3)$. It reaches the value 1 for a single orientation over the points in the neighbourhood. This can be useful to detect boundaries between zones with a homogeneous orientation. Indeed, these boundaries show fuzzy orientations (some particles in the neighbourhood being oriented in a given direction, while other particles show a different orientation). They are obtained for a low index of confidence (typically a ratio lower than 0.5). In Fig. 8, the local 3D orientation of the TA6V alloy (cf. Fig.
2) and the index of confidence were calculated by means of different sizes of neighbourhood ($10^3$, $20^3$ and $30^3$). We can notice that a correct (but fuzzy) detection of boundaries is obtained for a large enough neighbourhood. On the TA6V images, useful information on boundaries is shown by light thin zones. However, a correct segmentation of zones with a homogeneous orientation seems difficult with this index of confidence. Therefore, another method is used in the next section.

[FIGURE 8 OMITTED]

Local 3D scalar product

As in 2D, we propose to use scalar products in neighbourhoods to detect sudden changes of local orientation, and therefore boundaries between domains with a homogeneous orientation. Applied to the TA6V image (cf. Fig. 2), this method allows us to detect boundaries. The best results are obtained with $30^3$ neighbourhoods to estimate the local orientation, and $5^3$ neighbourhoods for the minimum of scalar products (cf. Fig. 9), giving the image Ips. In order to segment homogeneous zones, we operate on the image Ips. First, a rough detection of boundaries between zones with a homogeneous orientation is obtained by a binary threshold of the image Ips, keeping voxels with value less than 0.01 (cf. Fig. 10). Then this image is inverted and eroded by a cube of size $5^3$. This new image is used as a marker to generate a constrained segmentation by 3D watershed on the inverse of the image Ips (Beucher and Meyer, 1993). Fig. 11 illustrates the obtained result. In the case of over-segmentation, it is then possible to merge adjacent zones by comparing their vectors of average orientation. Further examples of the use of the full algorithmic sequence to extract and study colonies in Widmanstatten microstructures of a titanium alloy are published in Vanderesse et al. (2008).

CONCLUSION

We proposed a fast method to estimate local orientations from the covariance matrix of the gradient. This method was applied to 2D and 3D images.
Two methods were also worked out to detect sudden changes of orientation. The first one uses an index of confidence of the estimated orientation (the ratio between the largest eigenvalue of the orientation matrix and its trace). The second one uses the detection of minima of scalar products in a neighbourhood. We applied these methods to 2D Transmission Electron Microscope images of cellulose cryofracture (to display the organisation of cellulose whiskers and the points of germination), and to 3D images of a TA6V alloy (lamellar microstructure) obtained by microtomography. The estimation of the local orientation requires defining a size of neighbourhood around every studied pixel or voxel: this neighbourhood must contain several oriented objects for the method to be efficient.

[FIGURE 9 OMITTED]

APPENDIX: LOCAL 3D ORIENTATION FROM THE GRADIENT

The calculation of the eigenvalues $\lambda_1$, $\lambda_2$ and $\lambda_3$ of the 3 x 3 local orientation matrix is made for every voxel and must be fast enough for the method to be efficiently implemented in 3D. At each point $(x,y,z) \in D$, the size of a centred neighbourhood V is given. We calculate, as in the 2D case, a matrix of local orientation $M_V$, derived by averaging as defined by Eq.
1 (with $1 \le i \le 3$ and $1 \le j \le 3$). The three eigenvalues $\lambda_1$, $\lambda_2$ and $\lambda_3$ ($\lambda_1 > \lambda_2 > \lambda_3$) of this matrix are the roots of the characteristic equation $\det(M_V - \lambda I) = 0$. Writing

$$ M_V(x,y,z) = \begin{pmatrix} m_1 & m_4 & m_5 \\ m_4 & m_2 & m_6 \\ m_5 & m_6 & m_3 \end{pmatrix} , $$

the characteristic equation becomes:

$$ \lambda^3 - (m_1 + m_2 + m_3)\,\lambda^2 + (m_1 m_2 + m_1 m_3 + m_2 m_3 - m_4^2 - m_5^2 - m_6^2)\,\lambda + m_3 m_4^2 + m_2 m_5^2 + m_1 m_6^2 - m_1 m_2 m_3 - 2 m_4 m_5 m_6 = 0 . $$

The matrix $M_V$ being real and symmetric, this is a cubic equation with real coefficients and real roots, solved in Abramowitz and Stegun (1970): $\lambda^3 + a_2 \lambda^2 + a_1 \lambda + a_0 = 0$. By identification:

$$ a_0 = m_3 m_4^2 + m_2 m_5^2 + m_1 m_6^2 - m_1 m_2 m_3 - 2 m_4 m_5 m_6 , $$
$$ a_1 = m_1 m_2 + m_1 m_3 + m_2 m_3 - m_4^2 - m_5^2 - m_6^2 , $$
$$ a_2 = -(m_1 + m_2 + m_3) . $$

We define:

$$ q = \frac{a_1}{3} - \frac{a_2^2}{9} \quad \text{and} \quad r = \frac{a_1 a_2 - 3 a_0}{6} - \frac{a_2^3}{27} . $$

Since the roots are real, $q^3 + r^2 \le 0$, and with $\theta = \frac{1}{3} \arccos\left( r / \sqrt{-q^3} \right)$ the eigenvalues are given by:

$$ \lambda_k = 2\sqrt{-q}\, \cos\left( \theta - \frac{2\pi (k-1)}{3} \right) - \frac{a_2}{3} , \qquad k = 1, 2, 3 $$

(the degenerate case $q = 0$ corresponds to three equal eigenvalues $-a_2/3$). For each $\lambda_i$, a normalized eigenvector $u = (u_x, u_y, u_z)$ is obtained by solving the linear system $(M_V - \lambda_i I)\, u = 0$ and normalising the solution to unit length.

ACKNOWLEDGEMENTS

The authors are grateful to J.L. Putaux (CERMAV-CNRS UPR 5301, Grenoble, France) for micrographs of cryofracture of cellulose material. The authors would also like to thank N. Vanderesse and M.
Darrieulat (ENSMSE, Saint-Etienne, France) for the X-ray microtomography images made at the ESRF synchrotron (Grenoble) on a TA6V alloy.

[FIGURE 10 OMITTED] [FIGURE 11 OMITTED]

(Accepted June 30, 2008)

REFERENCES

Abramowitz M, Stegun IA (1970). Handbook of Mathematical Functions, 9th printing. New York: Dover.
Bergonnier S (2005). Relations entre microstructure et proprietes mecaniques de materiaux enchevetres. Thesis, Universite Pierre et Marie Curie.
Beucher S, Meyer F (1993). The morphological approach to image segmentation: the watershed transformation. In: Dougherty ER, ed. Mathematical Morphology in Image Processing. New York: Marcel Dekker, 433-81.
Bigun J, Granlund G, Wiklund J (1991). Multidimensional orientation estimation with applications to texture analysis and optical flow. IEEE PAMI 13:775-90.
Danielsson PE, Lin Q, Ye QZ (2001). Efficient detection of second-degree variations in 2D and 3D images. JVCIR 12:255-305.
Frangi AF, Niessen WJ, Vincken KL, Viergever MA (1998). Multiscale vessel enhancement filtering. Proceedings of MICCAI'98, LNCS 1496:130-7.
Germain G, Da Costa JP, Lavialle O, Baylou P (2003). Multiscale estimation of vector field anisotropy: application to texture characterization. Signal Process 83:1487-503.
Hand GL (1962). A theory of anisotropic fluids. J Fluid Mech 13:33-46.
Jeulin D, Kurdy MB (1992). Directional mathematical morphology for oriented image restoration and segmentation. Proc. 8th ISS Congress, Irvine, CA, 1991. Acta Stereol 11(suppl I):545-50.
Kass M, Witkin A (1987). Analyzing oriented patterns. Comput Vis Graph Image Process 37(3):362-85.
Kurdy MB, Jeulin D (1989). Directional mathematical morphology operations. Proc. ECS 5, Freiburg, Germany, 3-8 September 1989. Acta Stereol 8(2):473-80.
Perona P, Malik J (1990). Scale-space and edge detection using anisotropic diffusion. IEEE PAMI 12(7):629-39.
Sato Y, Nakajima S, Shiraga N, Atsumi H, Yoshida S, Koller T, et al. (1998).
3D multi-scale line filter for segmentation and visualization of curvilinear structures in medical images. Med Image Anal 2(2):143-68.
Soille P, Talbot H (1998). Image structure orientation using mathematical morphology. Proc. 14th International Conference on Pattern Recognition 2:1467769.
Soille P, Talbot H (2001). Directional morphological filtering. IEEE PAMI 23(11):1313-29.
Vanderesse N, Maire E, Darrieulat M, Montheillet F, Moreaud M, Jeulin D (2008). 3D microtomographic study of Widmanstatten microstructures in an alpha/beta titanium alloy. Scripta Mater 58:512-5.
Vliet LJ, Verbeek PW (1995). Estimators for orientation and anisotropy in digitized images. ASCI'95, Proceedings of the First Conference of the Advanced School for Computing and Imaging, Heijen, The Netherlands, 442-50.

Dominique Jeulin and Maxime Moreaud
Centre de Morphologie Mathematique, Ecole des Mines de Paris, 35, rue Saint-Honore, F-77305 Fontainebleau, France
e-mail: dominique.jeulin@ensmp.fr, maxime.moreaud@minesparis.org
