# Augmented reality to support on-field post-impact maintenance operations on thin structures

1. Introduction

Augmented reality (AR) is a live, direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated input such as sound, graphics, images, or video data. AR was first used for military, industrial, and medical applications, but it was soon applied to numerous commercial and entertainment areas [1]. Numerous studies, developments, and applications of AR have been proposed, as reported in the surveys by Azuma [2], Azuma et al. [3], van Krevelen and Poelman [4], and Wang et al. [5], and modern trends in AR can be found in some very recent papers [6-8]. However, to the best of the authors' knowledge, AR has scarcely been used in nondestructive testing and structural health monitoring (NDT/SHM) applications, probably because of the multidisciplinary expertise required, including but not limited to solid mechanics, numerical simulation, signal processing, and data visualization. The idea of harnessing AR to develop, support, and improve NDT/SHM is an innovative topic that deserves more attention in the literature.

The use of AR could, in fact, boost the usability of several NDT/SHM applications in both a technical and an economic sense. For instance, AR can be used in conjunction with vision-based techniques to gain insight into the structural health status from the visual appearance of cracks [9] or to provide information on blind areas such as back walls or partitions [10]. Alternatively, in ultrasonic approaches, AR can be exploited to provide the inspector with an immediate visual representation of the target of the ultrasonic inspection, generally a flaw/hole/damage, superimposed on the structure under test. This would support the inspector in the inspection/diagnosis and maintenance process by

(i) facilitating the understanding of the results of the inspection;

(ii) providing an immediate, real-size indication of the damage dimensions compared to those of the structure;

(iii) avoiding delay and possible mistakes while transferring the inspection results to the structure.

In this paper, an AR approach is proposed to visualize the outcomes of a nondestructive, Lamb wave based impact detection methodology directly on plate-like structures. To validate the proposed approach, the plate is impacted at known positions through the stroke of an instrumented hammer, and the AR visualization is performed in real time. In a realistic industrial scenario, such an AR approach could be exploited during the maintenance phase, after the impact has taken place, guiding the operator to the impact position to check whether the component has been damaged and supporting the final decision on whether maintenance actions are required.

Impact detection in plate-like structures via guided waves has been the focus of much research over the last years [11-20]. Generally, a network of piezoelectric sensors is used to detect, in passive mode, the Lamb waves produced in the plate by the impact. The information gathered by the sensors is sent to a central unit where the acquired responses are processed to estimate the position of the impact and, possibly, its energy.

Thanks to the low weight and low power consumption of the technology, such approaches can be proficiently used in the SHM of plate-like structures. In practice, whenever an impact occurs, the SHM system activates an alert. Once the alert is detected, an operator should retrieve the acquired information and take a decision about the maintenance strategy. In particular, the operator has to verify carefully whether or not the impact has damaged the monitored structure. This is particularly important for composite plate-like structures, since an impact may generate damage that is invisible or barely visible to the human eye, thus complicating the maintenance operator's task. Therefore, the effectiveness of methods and tools, such as AR, aimed at improving damage detectability should be investigated.

In this direction, this paper proposes a first attempt at using AR for post-impact data visualization directly on the structure. In particular, from the acquired waveforms, the implemented tool provides the inspector with the estimated impact position, a measure of the uncertainty in the localization, and an estimate of the impact energy. In the authors' opinion, this information could help avoid false alarms of the SHM system and prevent false assumptions during the inspection phase. A case study is proposed to show the methodology and its final outcome.

The work is organized as follows. Basic information on AR is provided in Section 2. The proposed algorithm for impact localization including uncertainty and impact energy estimation is presented in Section 3. In Section 4, the AR environment and tools are presented and an experimental validation is proposed through a case study in Section 5. The conclusions end the paper.

2. Augmented Reality

Augmented reality (AR) is a real-time technique [21, 22] allowing the experimenter to see virtual objects or scenarios interactively superimposed on real-world images [2, 3, 23]. Image Acquisition, Calibration, Tracking, Registration, and Display are the main process steps required for AR [24]. A brief description of each step is provided in the following.

Image Acquisition. The Image acquisition is usually obtained through a camera embedded in a device generally carried on the experimenter's head, as the one represented in Figure 1(a).

Calibration. The Calibration step is required to measure precisely the internal camera parameters and to evaluate and correct image distortion. This operation must be performed only once since it depends on the camera features only.

Tracking. Tracking is required to evaluate the pose (i.e., orientation in space in terms of pitch, roll, and yaw angles) and position of the camera with respect to an external reference system. Several techniques can be applied depending on the needs of the final application, but all of them can be grouped into sensor-based and vision-based tracking techniques. Sensor-based tracking techniques rely on GPS receivers, accelerometers, magnetometers, or acoustical, optical, and mechanical devices to detect the pose and position of the experimenter [25]. Vision-based tracking techniques are based on the evaluation of the size and optical deformation of a geometrical marker to estimate the relative position between the marker and the camera reference system [26].

Registration. Registration is the procedure applied to synchronize the virtual image or scenario with the external view (real-world image) in accordance with the user's head movements [27]. It uses the information gathered by the Calibration and Tracking phases and exploits the spatial coordinate transformation from the 3D scene to a 2D image.

For this purpose, first the simple pinhole camera model (see Figure 2 and [28]), based on a perspective projection, is used to describe the relationship between the coordinates of a generic point $Q \equiv \{x_c, y_c, z_c\}$ in the 3D space and its projection $q \equiv \{x_s, y_s\}$ onto the 2D screen plane. Simple geometry shows that, if the distance of the screen plane from the centre of projection $O_c$ (i.e., the ideal pinhole of the camera) is denoted by $f$, the screen coordinates are related to the object coordinates as

$$x_s = f\,\frac{x_c}{z_c}, \qquad y_s = f\,\frac{y_c}{z_c}. \qquad (1)$$

Next, each point $q$ on the screen plane can be related to a pixel of an image if the dimensions of the screen plane are known. Denoting by $u$ and $v$ the row and column of the pixel, by $u_0$ and $v_0$ the row and column of the pixel at the origin $O_s$ of the screen plane coordinate system, and by $S_x$ and $S_y$ the scale factors (pixel/meter) along the $x_s$ and $y_s$ axes, respectively, such a relation is

$$u = u_0 + S_x x_s, \qquad v = v_0 + S_y y_s. \qquad (2)$$

Combining (1) and (2) and exploiting homogeneous coordinates (4-dimensional), a transformation between pixels and coordinates in the 3D space can be obtained as

$$z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \underbrace{\begin{bmatrix} f S_x & \gamma & u_0 & 0 \\ 0 & f S_y & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}}_{P} \begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} \qquad (3)$$

in which the parameter $\gamma$, related to the camera features, is introduced to account for the distortion effects between the $u$ and $v$ axes [29]. The parameters of the matrix $P$, known as the camera's perspective matrix, are obtained by means of the Calibration process.

Subsequently, the relative position in the 3D space between the marker and the camera coordinate systems is introduced:

$$\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = \underbrace{\begin{bmatrix} R & T \\ \mathbf{0}^{T} & 1 \end{bmatrix}}_{M} \begin{bmatrix} x_m \\ y_m \\ z_m \\ 1 \end{bmatrix}, \qquad (4)$$

where the $3 \times 3$ rotation matrix $R$ and the $3 \times 1$ translation vector $T$, computed in real time by the Tracking phase, describe the pose and position of the camera with respect to the marker reference system. The matrix $M$ is also known as the external parameters matrix. Finally, substituting (4) into (3) yields a dynamic relation between each point described in the 3D marker coordinate system and its representation on a 2D virtual image:

$$z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = P\,M \begin{bmatrix} x_m \\ y_m \\ z_m \\ 1 \end{bmatrix}. \qquad (5)$$

Since the marker and the plate reference systems are parallel and shifted by known quantities $\Delta_x$ and $\Delta_y$ along the $x$ and $y$ axes (see Figure 2), the coordinates of a point $p \equiv \{x_p, y_p\}$ with respect to the plate coordinate system can be directly related to $u$ and $v$ by means of (5), taking $x_m = x_p - \Delta_x$ and $y_m = y_p - \Delta_y$. In this application, the point $p$ is the impact point.
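The full mapping from plate coordinates to pixels defined by (1)-(5) can be sketched as follows; the function names and the numerical camera parameters are illustrative assumptions, not those of the actual system:

```python
import numpy as np

def perspective_matrix(f, sx, sy, u0, v0, gamma=0.0):
    """Camera perspective matrix P of (3); gamma accounts for u-v axis distortion."""
    return np.array([[f * sx, gamma,  u0, 0.0],
                     [0.0,    f * sy, v0, 0.0],
                     [0.0,    0.0,   1.0, 0.0]])

def external_matrix(R, T):
    """External parameters matrix M of (4) from a 3x3 rotation R and a translation T."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = T
    return M

def plate_point_to_pixel(xp, yp, dx, dy, P, M):
    """Project a plate point (xp, yp) to pixel (u, v) via the marker frame, as in (5)."""
    xm, ym = xp - dx, yp - dy                 # shift into the marker reference system
    q = P @ M @ np.array([xm, ym, 0.0, 1.0])  # the point lies on the plate plane (zm = 0)
    return q[0] / q[2], q[1] / q[2]           # divide by zc (homogeneous coordinates)

# Hypothetical pose: camera facing the marker head-on from 2 m away.
P = perspective_matrix(f=1.0, sx=1.0, sy=1.0, u0=0.0, v0=0.0)
M = external_matrix(np.eye(3), np.array([0.0, 0.0, 2.0]))
u, v = plate_point_to_pixel(0.1, 0.2, 0.0, 0.0, P, M)
```

In the AR system, $R$ and $T$ come from the Tracking phase at every frame, while $P$ is fixed after Calibration.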

Display. Finally, the Display phase shows the AR scene to the experimenter. The available devices can be divided into two main groups, depending on whether they show a computer-generated image of the external world merged with the synthetic environment (Head Mounted Displays (HMDs)) or a synchronized combination of real-world eyesight and computer-generated imagery (Optical See-Through devices). HMDs consist of a dark head helmet equipped with a pair of projectors displaying external-world images acquired by a camera mounted on the HMD itself: the device is thus able to combine a background image stream coming from the camera (external view) with a foreground synthetic image synchronized with the localized optical markers. On the other hand, see-through displays are usually equipped with a camera, projectors, and semitransparent lenses: the experimenter sees the real world through the lenses, while the virtual scene, obtained by processing the image acquired by the camera, is projected onto the lenses.

According to [30], mobile devices (e.g., Android smartphones and tablets) can also be used for AR; in this case, the augmented image is displayed on the screen of the mobile phone or tablet, and the device camera is used to acquire the external image.

3. Impact Detection Methodology

In this section, the guided wave based methodology to estimate the impact position $\{x_p, y_p\}$, the uncertainty $\delta_{cr}$ in such an estimation, and the impact energy $E_p$ is presented.

3.1. Hyperbolic Positioning. Hyperbolic positioning, also called multilateration, is a powerful method to locate the impact position $(x_p, y_p)$ in a plate. Given the positions $(x_i, y_i)$ of at least three sensors on the plate, with $i = 1, 2, 3$, the method exploits the differences in distance of propagation (DDOP) traveled by the waves from the impact point to the sensors, $\Delta_{ij}$:

$$\Delta_{ij} = \sqrt{(x_p - x_i)^2 + (y_p - y_i)^2} - \sqrt{(x_p - x_j)^2 + (y_p - y_j)^2}, \qquad (6)$$

in order to determine hyperbolas on which the impact point must lie. The intersection of the three different hyperbolas associated with $\Delta_{12}$, $\Delta_{13}$, and $\Delta_{23}$, obtained by solving the system of equations with the Levenberg-Marquardt algorithm [31], is taken to be the impact position.
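As a sketch of this step, the system built from (6) can be solved in a least-squares sense with SciPy's Levenberg-Marquardt implementation; the sensor layout is taken from Table 1, while the impact point and the noiseless DDOPs are simulated here purely for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

sensors = np.array([[250.0, 250.0], [250.0, 750.0], [750.0, 250.0]])  # Table 1 [mm]

def ddop_residuals(p, sensors, ddop_meas):
    """Mismatch between measured DDOPs and those implied by candidate point p, via (6)."""
    d = np.linalg.norm(sensors - p, axis=1)         # distances from p to the sensors
    pred = [d[0] - d[1], d[0] - d[2], d[1] - d[2]]  # Delta_12, Delta_13, Delta_23
    return np.asarray(pred) - ddop_meas

# Simulate noiseless DDOP measurements for a known impact point, then invert them.
true_p = np.array([395.0, 554.0])
d = np.linalg.norm(sensors - true_p, axis=1)
ddop = np.array([d[0] - d[1], d[0] - d[2], d[1] - d[2]])

# Levenberg-Marquardt ("lm") starting from the plate centre.
sol = least_squares(ddop_residuals, x0=[500.0, 500.0], args=(sensors, ddop), method="lm")
```

With exact DDOPs the solver recovers the impact point; with measured DDOPs, the final residual norm gives a rough consistency check on the localization.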

Generally, the difference in distance of propagation (DDOP) is obtained by multiplying the difference in time of arrival (DTOA), measured from the acquired signals as the first arrival over a certain threshold, by a nominal wave propagation speed. Unfortunately, the potential of such approach is limited in plates where several dispersive modes, that is, with a frequency-dependent velocity and attenuation, appear simultaneously in the received signals and the selection of a wave speed and thus the transformation from DTOA to DDOP is not trivial.

To overcome this detrimental effect, it was shown in [19] that processing the acquired signals with a suitable transform, namely, the Warped Frequency Transform (WFT), yields a robust estimation of the DDOP without the need to measure the DTOA (some details are given in the next subsection).

3.2. DDOP Estimation via Warped Frequency Transform. Let us consider a guided wave signal $s(t, D)$ detected passively in an isotropic plate, where $t$ denotes time and $D$ is the unknown traveled distance, and assume that the $A_0$ mode is present in the signal, as generally happens when a plate undergoes an impact. The group velocity curve $c_g^{M}(f)$ of the $A_0$ mode can be used to compute a warping map $w(f)$ that uniquely defines a Frequency Warping operator $W_w$ [32]. Such an operator, applied to $s(t, D)$, yields a so-called warped signal $s_w(t, D) = W_w\{s(t, D)\}$ whose frequency transform is defined as

$$S_w(f, D) = \sqrt{\dot{w}(f)}\; S\big(w(f), 0\big)\, e^{-j 2 \pi f K D}, \qquad (7)$$

where $\dot{w}(f)$ is the first derivative of the warping map, $S(w(f), 0)$ is the warped frequency spectrum of the exciting pulse at the point of impact (zero traveled distance), and $K$ is a warping map normalization parameter. The dispersive signal $s(t, D)$ is thus transformed as in (7), where the dispersive effect of the distance is converted into a simple warped time delay $K D$ proportional to the distance itself.

Consequently, the frequency-domain cross-correlation of two warped signals acquired at sensors $i = 1$ and $i = 2$ is

$$X_{12}(f) = S_w(f, D_1)\, S_w^{*}(f, D_2) = \dot{w}(f)\, \big|S\big(w(f), 0\big)\big|^{2}\, e^{-j 2 \pi f K \Delta_{12}}, \qquad (8)$$

where the DDOP $\Delta_{12} = D_1 - D_2$ appears in the phase term. Thus, the abscissa at which the envelope of the cross-correlation of the two warped signals peaks can be directly related to the DDOP of the dispersive signals acquired passively at the two sensors.
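The delay estimation from the cross-correlation envelope can be sketched as follows. For brevity, a nondispersive toy pulse stands in for the warped signals (for which, after warping, the peak lag is exactly $K \Delta_{12}$); the signal parameters are illustrative assumptions:

```python
import numpy as np
from scipy.signal import correlate, hilbert

fs = 300e3  # sampling frequency [Hz], as used in the experiments

def delay_from_envelope(s1, s2, fs):
    """Delay of s1 relative to s2, taken at the peak of the cross-correlation
    envelope (envelope computed as the modulus of the Hilbert transform)."""
    xc = correlate(s1, s2, mode="full")
    env = np.abs(hilbert(xc))
    lags = np.arange(-len(s2) + 1, len(s1)) / fs   # lag axis in seconds
    return lags[np.argmax(env)]

# Toy example: the same pulse arriving 0.5 ms later at sensor 2 than at sensor 1.
t = np.arange(0.0, 5e-3, 1.0 / fs)
pulse = lambda t0: np.exp(-((t - t0) / 2e-4) ** 2) * np.cos(2 * np.pi * 20e3 * (t - t0))
s1, s2 = pulse(1.0e-3), pulse(1.5e-3)
tau = delay_from_envelope(s1, s2, fs)  # negative lag: s1 arrives first
```

In the warped domain, the lag at the envelope peak divided by $K$ yields the DDOP directly.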

3.3. Theoretical Estimation Error of the Impact Position. The proposed impact localization strategy has an intrinsic uncertainty due to the imperfect knowledge of the material properties of the plate, the limited sampling frequency, and the unavoidable presence of noise in the measurements. Such uncertainty is strictly related to the sensor topology, and its lower bound, that is, the minimum theoretical estimation error of the source position, can be estimated by using the Cramér-Rao bound.

Considering $m = 3$ DDOP measurements, the Cramér-Rao value for a point $p \equiv \{x_p, y_p\}$ on the plate is calculated as

$$\delta_{cr}(x_p, y_p) = \left[ C^{T} A^{T} \left( A R A^{T} \right)^{-1} A C \right]^{-1}. \qquad (9)$$

$C$ is the $m \times 2$ matrix whose elements are

$$C_{i1} = \frac{x_p - x_i}{D_i}, \qquad C_{i2} = \frac{y_p - y_i}{D_i}, \qquad i = 1, \ldots, m, \qquad (10)$$

Here, $c_w = K f_s$ is the equivalent wave speed obtained through the time-distance mapping described in the warping procedure, $K$ is the warping map normalization parameter, $f_s$ is the considered sampling frequency, and $D_i$ is the distance between $p$ and the $i$th sensor.

$A$ is the full-rank $(m - 1) \times m$ matrix:

$$A = \begin{bmatrix} 1 & -1 & 0 \\ 1 & 0 & -1 \end{bmatrix}. \qquad (11)$$

$R$ is an $m \times m$ diagonal matrix which represents the error in the measurements. The measurement errors are considered independent, identically distributed Gaussian random variables with zero mean and standard deviation $\sigma_d = 2 c_w / f_s$, so that the nonzero elements of $R$ are equal to $\sigma_d^2$.
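A minimal sketch of the bound computation follows. Since (9)-(11) leave the reduction of the resulting $2 \times 2$ matrix to a single radius implicit, taking the square root of the trace of its inverse is our assumption here, as is the numerical value of $c_w$:

```python
import numpy as np

sensors = np.array([[250.0, 250.0], [250.0, 750.0], [750.0, 250.0]])  # Table 1 [mm]

def cramer_rao(p, sensors, cw, fs):
    """Lower bound on the localization error at point p, following (9)-(11)."""
    m = len(sensors)
    d = np.linalg.norm(sensors - p, axis=1)
    C = (p - sensors) / d[:, None]           # eq. (10): derivatives of D_i w.r.t. (x_p, y_p)
    A = np.array([[1.0, -1.0, 0.0],          # eq. (11): maps distances to independent DDOPs
                  [1.0,  0.0, -1.0]])
    sigma_d = 2.0 * cw / fs                  # measurement standard deviation (Section 3.3)
    R = sigma_d ** 2 * np.eye(m)
    F = C.T @ A.T @ np.linalg.inv(A @ R @ A.T) @ A @ C
    return float(np.sqrt(np.trace(np.linalg.inv(F))))  # scalar radius [mm]; our reduction

# Assumed equivalent wave speed c_w = 2.0e6 mm/s, with f_s = 300 kHz.
bound = cramer_rao(np.array([395.0, 554.0]), sensors, cw=2.0e6, fs=300e3)
```

Evaluating `cramer_rao` on a grid of plate points reproduces a contour map of the bound such as the one discussed in Section 5.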

3.4. Impact Energy Estimation. Because of the acoustic energy confinement in the waveguide, Lamb waves experience low attenuation with distance; in particular, the acquired signal amplitude is inversely proportional to the square root of the propagation distance. The estimation of the impact energy deserves attention since it can be related to the level of damage in the component. Since the WFT is a unitary transform which preserves the energy of the signal [19], the impact energy $E_p$ can be estimated from the energy of the warped signals as

$$E_p = k_E \sum_{i=1}^{3} \int_{0}^{t} s_{wi}(t, D_i)^2 \, dt, \qquad (12)$$

where $s_{wi}(t, D_i)$, with $i \in \{1, 2, 3\}$, are the warped versions of the signals acquired by the three sensors, and $k_E$ is a calibration constant.
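Numerically, (12) reduces to a calibrated sum of discrete signal energies; this sketch assumes sampled warped signals and uses an arbitrary placeholder for the calibration constant $k_E$:

```python
import numpy as np

def impact_energy(warped_signals, fs, kE=1.0):
    """Estimate the impact energy as in (12): the calibrated sum of the energies
    of the warped signals, with the integral approximated by a Riemann sum."""
    return kE * sum(np.sum(np.asarray(s) ** 2) / fs for s in warped_signals)
```

In practice, $k_E$ would be fixed once, for example against the instrumented-hammer energies of a few calibration impacts.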

4. Impact Data Visualization in AR

The hardware, software, settings, and testing procedures used in this work to visualize the outcomes of the impact detection methodology in AR are described in the following.

An aluminum 1050A square plate, 1000 mm x 1000 mm and 3 mm thick, was considered (see Figure 3). The nominal properties considered for the plate are Young's modulus $E = 69$ GPa, Poisson's coefficient $\nu = 0.3$, and material density $\rho = 2700$ kg/m³. Three piezoelectric sensors (PZT discs PIC181, diameter 10 mm, thickness 1 mm) were bonded to the plate surface at the positions reported in Table 1.

In this study, a vision-based tracking technique is adopted. Such an approach is usually preferred to sensor-based techniques in indoor applications due to its higher precision and easier hardware implementation. To this purpose, the marker of Figure 1(b) is fixed on the plate at the $\{x, y\} = \{0, 0\}$ mm coordinates. The marker was simply sketched with graphical software and printed on a paper sheet. Different marker shapes could be used, provided they are asymmetric, black and white, and easily recognizable by the AR software. The tracking is performed by means of ARToolKit, an augmented reality library shared by the University of Washington [33]. ARToolKit is used to detect the marker, to measure the image distortion, and to relate in time the marker reference system to the image.

The experimenter wears a pair of Vuzix STAR 1200 glasses (see Figure 1(a)), a see-through device specifically designed for AR applications [34]. These glasses include a camera in the front and two miniaturized projectors displaying images on the transparent lenses; two USB plugs connect the Vuzix glasses to a notebook or a PC. The resolution of the camera of these see-through glasses can be set up to 1920 x 1080 pixels, and the two projectors support video display resolutions up to 1280 x 720. At first, an image in which the marker is framed with a preset distance and pose (known position) of the Vuzix glasses camera is taken; the ARToolKit macros automatically detect the marker in the image and perform the Calibration, computing $\gamma$ (the other camera features are generally known).

The experimenter strikes the plate with the instrumented hammer (PCB Piezotronics), used to simulate impacts, and the guided wave signals are gathered by the sensors through an oscilloscope (LeCroy LC534 series) at a sampling frequency of $f_s = 300$ kHz. Acquisitions are triggered when the signal received from one of the sensors reaches a threshold level of 140 mV; pretrigger recording is enabled to obtain the previous history of each signal. The detected waveforms are directly processed in a PC to find the position $\{x_p, y_p\}$, the position uncertainty $\delta_{cr}$, and the energy $E_p$ of the impact. Such data are passed to the AR algorithm for the final visualization.

During tracking, each frame acquired by the glasses' camera is processed by the ARToolKit software through the following steps: thresholding, detection of connected objects, detection of object contours, detection of marker edges, computation of the marker distance and pose with respect to the camera reference system from the evaluation of the marker distortion, and projection of the impact point coordinates $\{x_p, y_p\}$ onto the image in terms of $u$ and $v$.

Finally, a designed virtual symbol or image, centered at the right position ($u$ and $v$), can be projected onto the real-world image in real time (see Figure 4) exploiting the glasses' projectors.

The marker detection procedure does not require the experimenter's point of view to be perpendicular to the surface on which the marker lies, so the experimenter can view the impact point from several points of view, rotating the head in a natural way.

5. Results of the Case Study

5.1. Impact Location Estimation. To process the acquired signals with the WFT, first a guided mode must be selected and its warping map $w(f)$ designed. Here, the fundamental mode $A_0$ was selected since its contribution to the signal is expected to be much more relevant than that of $S_0$ for two main reasons: (i) the wavelength tuning effect imposed by the sensor/plate properties filters out most of the $S_0$ wave; (ii) due to the out-of-plane excitation, the energy in the $A_0$ mode is considerably greater than that retained by the $S_0$ mode. Therefore, the group velocity curve of the $A_0$ mode was predicted [35] and used to compute the warping map $w(f)$ and the normalization parameter $K$, as well as the proper Frequency Warping operator $W_w$. At this point, for the given coordinates of the transducers $\{x_i, y_i\}$, the selected sampling frequency $f_s$, and the parameter $K$, the Cramér-Rao lower bound was computed for each point of the plate and represented in Figure 5 as a contour plot.

Then, the plate was impacted, the signals were acquired, and the cross-correlation (Xcorr) of the warped signals was computed. For instance, the Xcorr between $s_{w1}(t, D_1)$ and $s_{w2}(t, D_2)$ for an impact at $x_p = 395$ mm and $y_p = 554$ mm is depicted in Figure 6(a) together with its envelope, computed as the modulus of the Hilbert Transform. Similarly, Figure 6(b) shows the Xcorr and its envelope for $s_{w1}(t, D_1)$ and $s_{w3}(t, D_3)$.

As can be seen, the abscissas of the envelope maxima correspond to the actual differences in distance of propagation $\Delta_{12}$ and $\Delta_{13}$, respectively. Similar accuracy was obtained for $\Delta_{23}$, not shown in Figure 6. Such DDOPs define the hyperbolas used to estimate the impact position.

This approach was next used to locate 18 impacts on the plate surface. The results of the proposed procedure can be seen in Figure 7, where the crosses denote the positions where the plate was impacted, the centres of the circles denote the impact positions estimated by the proposed procedure, and the circles' radii were set equal to the value of $\delta_{cr}(x_p, y_p)$ at the estimated impact position. It is worth noting that the circles always surround the true impact positions.

5.2. Impact Energy Estimation. The impact energy $E_p$ for the 18 impacts considered was estimated as detailed in Section 3.4 and is represented in Figure 8 as small circles. The estimated impact energies were compared with those measured by the instrumented hammer, $E_h$. In particular, $E_h$ was computed as the squared value of the velocity time history, obtained by integrating the acceleration time history recorded by the instrumented hammer. As can be seen, a very good agreement between the two quantities was found, thus verifying that an efficient impact energy estimation can be performed from the warped waveforms $s_{wi}$. The discrepancy in the energy estimation that appears in cases 10 and 14 may be due to several factors, such as a suboptimal selection of the calibration constant $k_E$ or the jamming effect of edge reflections. Most likely, the discrepancy is due to the difference in frequency response between the impact hammer (which acts as a low-pass filter in the 0-4 kHz range) and the piezoelectric sensors (which have a much broader frequency response).

5.3. AR Visualization. Following the described AR approach, the experimenter wears a pair of AR glasses and frames both the marker and the zone in which the impact is applied. The aluminum panel is hit with the hammer, the impact detection procedure starts, and the estimated coordinates of the impact point with respect to the marker system, $x_m = x_p - \Delta_x$ and $y_m = y_p - \Delta_y$, are passed to the visualization algorithm along with $\delta_{cr}$ and $E_p$.

As shown in Figure 9, a symbol formed by three squares was designed to represent the data obtained from the impact methodology: the centre of the red square marks the estimated impact position $\{x_p, y_p\}$; the side length of the green square denotes the uncertainty $\delta_{cr}(x_p, y_p)$ in the estimated position; and the yellow square denotes the strength of the impact, with a yellow square close to the red square meaning low energy ($E_p \approx 0$) and a yellow square close to the green square meaning high energy ($E_p \approx 1$). The experimenter can thus evaluate the impact energy visually, in an intuitive way. Figure 9 shows the view through the glasses' lenses after the augmentation.

6. Conclusions

In this work, an augmented reality (AR) strategy is proposed to visualize on plate-like structures the outcomes of an impact detection methodology; in particular, the impact position and energy are considered. To this aim, previous contributions of the authors devoted to signal processing strategies for impact localization are first exploited and extended to the estimation of the impact energy. The experimental results on an isotropic plate show the accuracy of the methodology in both impact location and energy estimation. Next, as an original contribution to the research, the visualization in AR of the impact position, its uncertainty, and the impact energy is provided. The experimental results show how such data can be viewed by the experimenter in real time, directly superimposed on the structure.

The proposed strategy shows good potential for use in damage detection and maintenance operations. In fact, besides the accuracy and low computational cost of the impact detection (position and energy), it can ease maintenance practice, since data are presented directly on the monitored component in an effective and intuitive way. AR is preferred to virtual reality since the augmented scene is obtained by adding virtual objects to the real-world eyesight, thus mitigating the unphysical feeling that the end user experiences with a fully virtual approach while still supporting the operator in the real-world environment.

Further studies are needed to assess the reliability of the proposed technology and the applicability to more complex structures (e.g., stiffened plates) and to anisotropic materials.

http://dx.doi.org/10.1155/2013/619570

Conflict of Interests

The authors did not receive any financial support from the commercial identities mentioned in the work. The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] https://en.wikipedia.org/wiki/Augmented_reality.

[2] R. T. Azuma, "A survey of augmented reality," Presence, vol. 6, no. 4, pp. 355-385, 1997.

[3] R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S. Julier, and B. MacIntyre, "Recent advances in augmented reality," IEEE Computer Graphics and Applications, vol. 21, no. 6, pp. 34-47, 2001.

[4] D. W. F. van Krevelen and R. Poelman, "A survey of augmented reality technologies, applications and limitations," The International Journal of Virtual Reality, vol. 9, no. 2, pp. 1-20, 2010.

[5] X. Wang, M. J. Kim, P. E. Love, and S. C. Kang, "Augmented reality in built environment: classification and implications for future research," Automation in Construction, vol. 32, pp. 1-13, 2013.

[6] H. L. Chi, S. C. Kang, and X. Wang, "Research trends and opportunities of augmented reality applications in architecture, engineering, and construction," Automation in Construction, vol. 33, pp. 116-122, 2013.

[7] S. Benbelkacem, M. Belhocine, A. Bellarbi, N. Zenati-Henda, and M. Tadjine, "Augmented reality for photovoltaic pumping systems maintenance tasks," Renewable Energy, vol. 55, pp. 428-437, 2013.

[8] J. M. Antonio, S. R. J. Luis, and S. P. Faustino, "Augmented and virtual reality techniques for footwear," Computers in Industry, 2013.

[9] A. Farhidzadeh, E. Dehghan-Niri, A. Moustafa, S. Salamone, and A. Whittaker, "Damage assessment of reinforced concrete structures using fractal analysis of residual crack patterns," Experimental Mechanics, pp. 1-13, 2012.

[10] B. Koo, H. Choi, and T. Shon, "Wiva: Wsn monitoring framework based on 3D visualization and augmented reality in mobile devices," in Ambient Assistive Health and Wellness Management in the Heart of the City, M. Mokhtari, I. Khalil, J. Bauchet, D. Zhang, and C. Nugent, Eds., vol. 5597 of Lecture Notes in Computer Science, pp. 158-165, Springer, Berlin, Germany, 2009.

[11] J. L. Rose, Ultrasonic Waves in Solid Media, Cambridge University Press, 1999.

[12] R. Seydel and F.-K. Chang, "Impact identification of stiffened composite panels: I. System development," Smart Materials and Structures, vol. 10, no. 2, pp. 354-369, 2001.

[13] B. Wang, J. Takatsubo, Y. Akimune, and H. Tsuda, "Development of a remote impact damage identification system," Structural Control and Health Monitoring, vol. 12, no. 3-4, pp. 301-314, 2005.

[14] Z. Su, L. Ye, and Y. Lu, "Guided Lamb waves for identification of damage in composite structures: a review," Journal of Sound and Vibration, vol. 295, no. 3-5, pp. 753-780, 2006.

[15] T. Kundu, S. Das, and K. V. Jata, "Point of impact prediction in isotropic and anisotropic plates from the acoustic emission data," Journal of the Acoustical Society of America, vol. 122, no. 4, pp. 2057-2066, 2007.

[16] T. Kundu, S. Das, and K. V. Jata, "Detection of the point of impact on a stiffened plate by the acoustic emission technique," Smart Materials and Structures, vol. 18, no. 3, Article ID 035006, 2009.

[17] F. Ciampa and M. Meo, "Acoustic emission source localization and velocity determination of the fundamental mode A0 using wavelet analysis and a Newton-based optimization technique," Smart Materials and Structures, vol. 19, no. 4, Article ID 045027, 2010.

[18] S. Salamone, I. Bartoli, P. di Leo et al., "High-velocity impact location on aircraft panels using macro-fiber composite piezoelectric rosettes," Journal of Intelligent Material Systems and Structures, vol. 21, no. 9, pp. 887-896, 2010.

[19] A. Perelli, L. de Marchi, A. Marzani, and N. Speciale, "Acoustic emission localization in plates with dispersion and reverberations using sparse PZT sensors in passive mode," Smart Materials and Structures, vol. 21, no. 2, Article ID 025010, 2012.

[20] E. D. Niri, A. Farhidzadeh, and S. Salamone, "Nonlinear Kalman filtering for acoustic emission source localization in anisotropic panels," Ultrasonics, 2013.

[21] A. Liverani and A. Ceruti, "Interactive GT code management for mechanical part similarity search and cost prediction," Computer-Aided Design and Applications, vol. 7, no. 1, pp. 1-15, 2010.

[22] A. Liverani, A. Ceruti, and G. Caligiana, "Tablet-based 3D sketching and curve reverse modelling," International Journal of Computer Aided Engineering and Technology, vol. 5, no. 2-3, 2013.

[23] S. Debernardis, M. Fiorentino, M. Gattullo, G. Monno, and A. E. Uva, "Text readability in head-worn displays: color and style optimization in video vs. optical see-through devices," IEEE Transactions on Visualization and Computer Graphics, vol. PP, no. 99, p. 1, 2013.

[24] F. Zhou, H. B.-L. Dun, and M. Billinghurst, "Trends in augmented reality tracking, interaction and display: a review of ten years of ISMAR," in Proceedings of the 7th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 08), pp. 193-202, Cambridge, UK, September 2008.

[25] G. Welch and E. Foxlin, "Motion tracking: no silver bullet, but a respectable arsenal," IEEE Computer Graphics and Applications, vol. 22, no. 6, pp. 24-38, 2002.

[26] M. Pressigout and E. Marchand, "Hybrid tracking algorithms for planar and non-planar structures subject to illumination changes," in Proceedings of the 5th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR '06), pp. 52-55, IEEE Computer Society, Santa Barbara, Calif, USA, October 2006.

[27] P. P. Valentini and E. Pezzuti, "Interactive multibody simulation in augmented reality," Journal of Theoretical and Applied Mechanics, vol. 48, no. 3, pp. 733-750, 2010.

[28] D. A. Forsyth and J. Ponce, Computer Vision: A Modern Approach, Prentice Hall, 2002.

[29] H. Wu, Z. Cai, and Y. Wang, "Vision-based auxiliary navigation method using augmented reality for unmanned aerial vehicles," in Proceedings of the 10th IEEE International Conference on Industrial Informatics (INDIN '12), pp. 520-525, Beijing, China, July 2012.

[30] S. Yuen, G. Yaoyuneyong, and E. Johnson, "Augmented reality: an overview and five directions for ar in education," Journal of Educational Technology Development and Exchange, vol. 4, no. 1, pp. 2119-2140, 2011.

[31] D. Marquardt, "An algorithm for least-squares estimation of nonlinear parameters," Journal of the Society for Industrial and Applied Mathematics, vol. 11, no. 2, pp. 431-441, 1963.

[32] S. Caporale, L. de Marchi, and N. Speciale, "Frequency warping biorthogonal frames," IEEE Transactions on Signal Processing, vol. 59, no. 6, pp. 2575-2584, 2011.

[33] http://www.hitl.washington.edu/artoolkit/.

[34] http://www.vuzix.com/augmented-reality/products_star1200. html.

[35] P. Bocchini, A. Marzani, and E. Viola, "Graphical user interface for guided acoustic waves," Journal of Computing in Civil Engineering, vol. 25, no. 3, pp. 202-210, 2011.

Luca De Marchi, (1) Alessandro Ceruti, (2) Alessandro Marzani, (3) and Alfredo Liverani (2)

(1) Department of Electrical, Electronic and Information Engineering (DEI), University of Bologna, Viale del Risorgimento 2, 40136 Bologna, Italy

(2) Department of Industrial Engineering (DIN), University of Bologna, Viale del Risorgimento 2, 40136 Bologna, Italy

(3) Department of Civil, Chemical, Environmental and Materials Engineering (DICAM), University of Bologna, Viale del Risorgimento 2, 40136 Bologna, Italy

Correspondence should be addressed to Alessandro Marzani; alessandro.marzani@unibo.it

Received 25 July 2013; Accepted 18 September 2013

Academic Editor: Salvatore Salamone

TABLE 1: Sensor topology.

           PZT-1   PZT-2   PZT-3
x_i [mm]     250     250     750
y_i [mm]     250     750     250

Journal of Sensors, vol. 2013. Research Article.