
On the Interpretation of 3D Gyroscope Measurements

1. Introduction

With ongoing developments in gyroscope-manufacturing technology, the use of gyroscopes in various fields is steadily increasing [1-29]. 3D gyroscopes are integral components of inertial navigation units [5-8]; have been shown feasible for motion capture, classification, and analysis [9-24]; and are essential elements of assistive, rehabilitative, and wearable health technologies [21, 22]. However, the utility of 3D gyroscopes can be diminished without a meaningful interpretation of the measured values and a correct computation of the angular orientation.

A 3D gyroscope measures the angular velocity of its rotation in a reference frame along its three sensitivity axes. For an ideal sensor, these outputs are equal to the projections of the rotation angular velocity on the sensitivity axes, that is, the axes of the sensor's intrinsic coordinate system. From the measurement perspective, the sensor is hence simultaneously rotating around its three coordinate system axes. Properly combined, these three rotations are equivalent to the actual rotation of the sensor in the reference frame.

Misleading practices arise from interpreting the three measured angular velocities as sequential, that is, Euler rotations. As the three angular velocities that a 3D gyroscope provides represent simultaneous rotations, the use of Euler angles in general introduces a systematic error in the calculated angular orientation. It is only in the case of infinitesimal rotations, which are shown to be commutative, that sequential and simultaneous rotations around orthogonal axes result in the same angular orientation [30, 31].

A number of literature contributions [23-28] present otherwise promising applications of 3D gyroscopes while adopting the misleading interpretation emphasized above. For this reason, we find it necessary to detail the realistic effects of gyroscope measurement interpretation. Our objective is therefore to investigate the systematic error that arises from erroneously interpreting 3D gyroscope measurements and to quantify its effect on the accuracy of the computed angular orientation.

The paper is organized as follows. In Section 2, we resolve the problem of the correct interpretation of 3D gyroscope measurements by invoking the expressions for the rotation that is equivalent to the three simultaneous orthogonal rotations measured by a 3D gyroscope, as derived in [30], and present proper means for computing the angular orientation.

To isolate and quantify the deleterious effect of the erroneous rotation interpretation on the angular orientation accuracy, we consider, in Section 3, an ideal 3D gyroscope and a simple rotation at a constant angular velocity around a fixed rotation axis. In Section 4, we present angular orientation error reduction that can be achieved when using a real, imperfect 3D gyroscope in real-case measurements in which the described systematic error obfuscates the accuracy of the computed angular orientation along with typical errors attributed to an inaccurate sensor and random measurement noise.

Although the manufacturing technology of low-cost gyroscopes is improving rapidly, these sensors have been shown to suffer from output bias drift and from inaccurate sensitivities and alignments of the sensor sensitivity axes. As our aim is to quantify the deleterious effect of the erroneous rotation interpretation, we here consider a properly calibrated sensor. Improving the accuracy of 3D gyroscopes is the subject of a number of ongoing research works and is presented elsewhere [32-35]. Finally, in Section 5, we summarize our results and draw conclusions.

In all subsequent sections, we obey the following notation rules: bold letters denote matrices and vectors, and uppercase and lowercase italics denote scalars.

2. 3D Gyroscope Measurements Interpretation

2.1. Simultaneous Rotations. Let us consider an ideal 3D gyroscope that rotates at a constant angular velocity $\omega$ around a rotation axis $\mathbf{v}$ in a 3D reference frame. It provides measurements of simultaneous angular velocities $\omega_x$, $\omega_y$, and $\omega_z$ around its three orthogonal sensor intrinsic coordinate system axes x, y, and z, respectively. In [30, 36], we derived a rotation vector $\mathbf{\Phi}$ called the Simultaneous Orthogonal Rotation Angle (SORA), which we can use to interpret the measured values correctly. The components of this vector are equal to the rotation angles of the three simultaneous rotations around the sensor axes:

$$\mathbf{\Phi} = T \begin{bmatrix} \omega_x & \omega_y & \omega_z \end{bmatrix}^T = \begin{bmatrix} \phi_x & \phi_y & \phi_z \end{bmatrix}^T, \quad (1)$$

where $\omega_x$, $\omega_y$, and $\omega_z$ are the three measured angular velocities; $\phi_x$, $\phi_y$, and $\phi_z$ are the rotation angles; and $T$ is the measurement interval.

As long as the rotation axis $\mathbf{v}$ is constant during the measurement interval $T$, the orientation and magnitude of the SORA vector $\mathbf{\Phi}$ (1) are respectively equal to the axis and angle of the actual rotation of the gyroscope. It holds that

$$\mathbf{v} = \frac{\mathbf{\Phi}}{|\mathbf{\Phi}|} = \frac{\begin{bmatrix} \omega_x & \omega_y & \omega_z \end{bmatrix}^T}{\sqrt{\omega_x^2 + \omega_y^2 + \omega_z^2}} = \frac{\begin{bmatrix} \phi_x & \phi_y & \phi_z \end{bmatrix}^T}{\sqrt{\phi_x^2 + \phi_y^2 + \phi_z^2}}, \quad (2)$$

$$\phi = |\mathbf{\Phi}| = T\sqrt{\omega_x^2 + \omega_y^2 + \omega_z^2} = \sqrt{\phi_x^2 + \phi_y^2 + \phi_z^2}. \quad (3)$$

From (1)-(3), we can conclude that the gyroscope measures the projections of its rotation angular velocity vector

$$\boldsymbol{\omega} = \frac{\mathbf{\Phi}}{T} = \begin{bmatrix} \omega_x & \omega_y & \omega_z \end{bmatrix}^T, \quad (4)$$

on its sensitivity axes. Note that, in general, due to rotation noncommutativity, the rotation vector (1) and the angular velocity vector (4) cannot be treated as true vectors. This is possible only in the particular case of a constant rotation axis.
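For illustration, the following is a minimal Python sketch of (1)-(3), assuming an ideal gyroscope; the function and variable names (sora, omega_xyz) are illustrative and not taken from the paper.

```python
import numpy as np

def sora(omega_xyz, T):
    """Return the SORA vector (1), rotation angle (3), and unit axis (2).

    omega_xyz : measured angular velocities (rad/s) around the x, y, z axes
    T         : measurement interval (s)
    """
    phi_vec = T * np.asarray(omega_xyz, dtype=float)  # rotation angles, Eq. (1)
    phi = np.linalg.norm(phi_vec)                     # rotation angle,  Eq. (3)
    # Rotation axis, Eq. (2); an arbitrary axis serves for a zero rotation.
    v = phi_vec / phi if phi > 0.0 else np.array([0.0, 0.0, 1.0])
    return phi_vec, phi, v
```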

Using the rotation axis (2) and angle (3), we can compute the angular orientation of the sensor intrinsic coordinate system axes in the reference frame. We introduce $\mathbf{R}(\phi, \mathbf{v})$ to denote the 3 × 3 rotation matrix associated with the axis (2) and angle (3) of rotation:

$$\mathbf{R}(\phi,\mathbf{v}) = \begin{bmatrix} c + v_x^2(1-c) & v_x v_y(1-c) - v_z s & v_x v_z(1-c) + v_y s \\ v_y v_x(1-c) + v_z s & c + v_y^2(1-c) & v_y v_z(1-c) - v_x s \\ v_z v_x(1-c) - v_y s & v_z v_y(1-c) + v_x s & c + v_z^2(1-c) \end{bmatrix}, \quad (5)$$

where $c$ and $s$ respectively denote $\cos(\phi)$ and $\sin(\phi)$, and $v_x$, $v_y$, and $v_z$ are the components of $\mathbf{v}$.
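A corresponding sketch of the axis-angle rotation matrix (5), written in the standard Rodrigues form (an assumption consistent with the definitions above); `v` must be a unit vector and `phi` is in radians. The orientation update in (6) below then reduces to a single matrix product.

```python
import numpy as np

def rotation_matrix(phi, v):
    """Axis-angle rotation matrix R(phi, v) of Eq. (5), Rodrigues form."""
    c, s = np.cos(phi), np.sin(phi)
    vx, vy, vz = v
    K = np.array([[0.0, -vz,  vy],
                  [ vz, 0.0, -vx],
                  [-vy,  vx, 0.0]])   # cross-product matrix [v]x
    return c * np.eye(3) + (1.0 - c) * np.outer(v, v) + s * K
```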

We further introduce 3 × 3 matrices $\mathbf{S}_{init}$ and $\mathbf{S}^{(sim)}_{fin}$ to represent the sensor's initial and final orientations in the reference frame. The columns of $\mathbf{S}_{init}$ and $\mathbf{S}^{(sim)}_{fin}$ are 3 × 1 unit vectors representing the initial and final orientations, respectively, of the sensor coordinate axes. We can write

$$\mathbf{S}^{(sim)}_{fin} = \mathbf{S}_{init} \cdot \mathbf{R}(\phi, \mathbf{v}). \quad (6)$$

The order of the matrices $\mathbf{S}_{init}$ and $\mathbf{R}(\phi,\mathbf{v})$ in the multiplication in (6) accounts for the fact that the rotation axis $\mathbf{v}$ (2) is given in the 3D gyroscope intrinsic coordinate system.

2.2. Sequential Rotations. As opposed to the three rotations represented with SORA (1), Euler angles represent a sequence of three elemental rotations, that is, rotations about the axes of the (intrinsic or reference) coordinate system. If only rotations around different intrinsic coordinate system axes are considered, six sequences are possible: x-y-z, y-z-x, z-x-y, x-z-y, z-y-x, and y-x-z. Because rotations are in general not commutative, each of these six sequences leads to a different final angular orientation, none of which is correct when all three rotations are simultaneous, as is the case when the 3D gyroscope rotates around an arbitrary axis. Using any of these sequences in calculating the angular orientation of the sensor introduces a systematic error.

As the three coordinate axes of the sensor are equivalent, we can use any of the aforementioned sequences to quantify the error in the calculated angular orientation. We chose the z-y-x sequence, also known as the aerospace sequence, whose rotation angles are called yaw, pitch, and roll.

Using the angles $\phi_z$, $\phi_y$, and $\phi_x$ obtained with a 3D gyroscope in place of the yaw, pitch, and roll, respectively, we can calculate the final orientation of the sensor $\mathbf{S}^{(seq)}_{fin}$ according to the sequential rotation interpretation as

$$\mathbf{S}^{(seq)}_{fin} = \mathbf{S}_{init} \cdot \mathbf{R}_E(\phi_z, \phi_y, \phi_x). \quad (7)$$

$\mathbf{R}_E(\phi_z, \phi_y, \phi_x)$ in (7) denotes the composite 3 × 3 Euler rotation matrix:

$$\mathbf{R}_E(\phi_z,\phi_y,\phi_x) = \begin{bmatrix} c_z c_y & c_z s_y s_x - s_z c_x & c_z s_y c_x + s_z s_x \\ s_z c_y & s_z s_y s_x + c_z c_x & s_z s_y c_x - c_z s_x \\ -s_y & c_y s_x & c_y c_x \end{bmatrix}, \quad (8)$$

where $s_k$ and $c_k$ respectively denote $\sin(\phi_k)$ and $\cos(\phi_k)$, with $k$ representing one of the axes x, y, or z.
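A matching sketch of the Euler composite (8); the function name and the radian angle convention are illustrative assumptions.

```python
import numpy as np

def euler_zyx_matrix(phi_z, phi_y, phi_x):
    """Composite R_E of Eq. (8): sequential z-y-x (yaw-pitch-roll) rotations."""
    cz, sz = np.cos(phi_z), np.sin(phi_z)
    cy, sy = np.cos(phi_y), np.sin(phi_y)
    cx, sx = np.cos(phi_x), np.sin(phi_x)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx   # sequential composition
```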

The orientation of an aircraft may indeed be represented in a convenient and intuitive way with the three Euler angles. However, Euler angles are not equal to the angles measured with a 3D gyroscope, and their use in this context is therefore erroneous. Comparing (5) and (8), we can conclude that the rotation matrices for the simultaneous (5) and sequential (8) rotation interpretations are not equal and that relying on the latter will not yield the correct result in the case of 3D gyroscope measurements.

Let us look at a simple illustrative example. Suppose that the sensor rotates with an angular velocity $\omega = 360^\circ/\mathrm{s}$ around the unit axis

$$\mathbf{v} = \frac{\begin{bmatrix} 1 & 1 & 1 \end{bmatrix}^T}{\sqrt{3}}, \quad (9)$$

for $T = 1\,\mathrm{s}$. After this rotation, the orientation of the sensor is obviously equal to its initial orientation; thus, the rotation matrix for the simultaneous interpretation (5) is, in this case, the identity matrix. We can write

$$\mathbf{R}(\phi,\mathbf{v}) = \mathbf{I}. \quad (10)$$

By considering (1)-(4), we can conclude that, in this case, the angles measured with an ideal gyroscope would be

$$\phi_x = \phi_y = \phi_z = \frac{360^\circ}{\sqrt{3}}. \quad (11)$$

Substituting these angles for the yaw, pitch, and roll Euler angles in (8) yields

$$\mathbf{R}_E(\phi_z, \phi_y, \phi_x) \approx \begin{bmatrix} 0.782 & -0.606 & -0.147 \\ 0.413 & 0.680 & -0.606 \\ 0.467 & 0.413 & 0.782 \end{bmatrix}. \quad (12)$$

The obtained rotation matrix $\mathbf{R}_E(\phi_z, \phi_y, \phi_x)$ is not equal to the identity matrix $\mathbf{I}$ and would obviously lead to an incorrect angular orientation:

$$\mathbf{R}_E(\phi_z, \phi_y, \phi_x) \neq \mathbf{R}(\phi,\mathbf{v}). \quad (13)$$
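The example can be checked numerically with the helper functions sketched earlier (rotation_matrix and euler_zyx_matrix); this is a quick illustrative verification, not code from the original paper.

```python
import numpy as np

# One full 360-degree turn about v = (1, 1, 1)/sqrt(3) returns the sensor to
# its initial orientation, so R(phi, v) is the identity, Eq. (10).
phi = np.deg2rad(360.0)
v = np.array([1.0, 1.0, 1.0]) / np.sqrt(3.0)
print(np.allclose(rotation_matrix(phi, v), np.eye(3)))   # True

# The same measured angles fed into the Euler composite, Eqs. (11)-(13).
phi_k = np.deg2rad(360.0 / np.sqrt(3.0))
print(np.allclose(euler_zyx_matrix(phi_k, phi_k, phi_k), np.eye(3)))   # False
```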

In practice, much smaller rotation angles are used in calculations of the final angular orientation. As the rotation angles become smaller, the difference between the simultaneous and sequential rotation results becomes increasingly negligible. However, even for small angles, the error in the computed angular orientation accumulates over time and can become significant, as we present and discuss in the next sections.

3. Angular Orientation Systematic Error

3.1. Angular Orientation Error Measures. We express the angular orientation error $\varepsilon$ in terms of the orientation deviation angle, that is, the angle of the rotation that would correct the calculated orientation of the sensor:

$$\varepsilon = \arccos\left(\frac{\mathrm{tr}\left(\mathbf{S}_{fin} \cdot \left(\mathbf{S}^{(cal)}_{fin}\right)^T\right) - 1}{2}\right), \quad (14)$$

where $\mathbf{S}_{fin}$ is the 3 × 3 matrix of the actual final orientation and $\mathbf{S}^{(cal)}_{fin}$ is the 3 × 3 matrix of the calculated final orientation.

Due to integration, the error in the computed angular orientation accumulates over time. To make the results comparable across different simulations and measurements, we normalize the deviation angle $\varepsilon$ (14) with respect to the total time of rotation $T$:

$$\varepsilon_{norm} = \frac{\varepsilon}{T}. \quad (15)$$
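A direct sketch of the two error measures (14)-(15); the clipping guards against round-off pushing the arccos argument outside [-1, 1] and is an implementation detail, not part of the formulas.

```python
import numpy as np

def deviation_angle(S_fin, S_cal):
    """Orientation deviation angle epsilon of Eq. (14), in radians."""
    arg = (np.trace(S_fin @ S_cal.T) - 1.0) / 2.0
    return np.arccos(np.clip(arg, -1.0, 1.0))

def normalized_deviation(S_fin, S_cal, T):
    """Normalized deviation angle of Eq. (15), in radians per second."""
    return deviation_angle(S_fin, S_cal) / T
```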

3.2. Methodology. To isolate the effect of the erroneous rotation interpretation, we consider an ideal 3D gyroscope that provides time-discrete samples of angular velocities at sample moments $nT_s$, where $T_s$ is the sampling interval:

$$T_s = \frac{1}{f_s}, \quad (16)$$

and $f_s$ is the sampling rate. We thus obtain $N = T f_s$ samples of the angular velocity vector $\boldsymbol{\omega}$ (4).

We further consider the sensor and reference frame coordinate axes to be initially aligned. The initial orientation of the sensor is hence given by the identity matrix:

$$\mathbf{S}_{init} = \mathbf{I}. \quad (17)$$

For this investigation, we finally consider that the sensor rotates for $T = 1\,\mathrm{s}$ with a constant angular velocity $\omega = 360^\circ/\mathrm{s}$ around the unit axis:

$$\mathbf{v} = \frac{\begin{bmatrix} 1 & 1 & 1 \end{bmatrix}^T}{\sqrt{3}}. \quad (18)$$

For this example, the initial and actual final orientations of the sensor are hence identical. We can write

$$\mathbf{S}_{fin} = \mathbf{S}_{init} = \mathbf{I}. \quad (19)$$

For each of the N rotation samples, the sensor rotates for an angle $\Delta\phi = 360^\circ/N$ around the constant axis $\mathbf{v}$. As we are assuming that the gyroscope is accurate and that the rotation axis is constant throughout the rotation, we can conclude that all samples of the measured angular velocities are equal to

$$\omega_{x,n} = \omega_{y,n} = \omega_{z,n} = \frac{360^\circ/\mathrm{s}}{\sqrt{3}}, \quad 1 \le n \le N, \quad (20)$$

and that all angles of rotation around the sensor axes are equal to

$$\Delta\phi_s = \phi_{x,n} = \phi_{y,n} = \phi_{z,n} = \frac{360^\circ}{N\sqrt{3}}, \quad 1 \le n \le N. \quad (21)$$

We can finally conclude that for all samples n, the angle of rotation calculated according to (3) is equal to

$$\phi_n = \Delta\phi = \sqrt{\phi_{x,n}^2 + \phi_{y,n}^2 + \phi_{z,n}^2} = \frac{360^\circ}{N}, \quad 1 \le n \le N. \quad (22)$$

It is obvious that combining all N rotations in a single rotation matrix would yield the identity matrix $\mathbf{I}$. Considering this and (19) when calculating the resulting angular orientation (6), we can hence write

$$\mathbf{S}^{(sim)}_{fin} = \mathbf{S}_{init} \cdot \mathbf{R}(\phi,\mathbf{v}) = \mathbf{S}_{init} \cdot \mathbf{I} = \mathbf{S}_{init} = \mathbf{S}_{fin}. \quad (23)$$

This implies that interpreting the rotations as simultaneous by considering SORA (1)-(4) does not produce an error in the calculated angular orientation.

To calculate the final angular orientation of the sensor according to the sequential rotation interpretation (7), we multiply the Euler rotation matrix $\mathbf{R}_E(\Delta\phi_s, \Delta\phi_s, \Delta\phi_s)$ (8) by itself $N - 1$ times:

$$\mathbf{S}^{(seq)}_{fin} = \mathbf{S}_{init} \cdot \left(\mathbf{R}_E(\Delta\phi_s, \Delta\phi_s, \Delta\phi_s)\right)^N. \quad (24)$$

By substituting $\mathbf{S}^{(seq)}_{fin}$ for $\mathbf{S}^{(cal)}_{fin}$ in (14), we can calculate the angular orientation error for the sequential rotation interpretation. Because both the initial and the actual final angular orientations of the sensor are given with the identity matrix (19), (14) simplifies to

$$\varepsilon = \arccos\left(\frac{\mathrm{tr}\left(\mathbf{S}^{(seq)}_{fin}\right) - 1}{2}\right) = \arccos\left(\frac{\mathrm{tr}\left(\left(\mathbf{R}_E(\Delta\phi_s, \Delta\phi_s, \Delta\phi_s)\right)^N\right) - 1}{2}\right). \quad (25)$$

The total measurement time is set to $T = 1\,\mathrm{s}$, so the normalized deviation angle (15) is equal to

$$\varepsilon_{norm} = \frac{\varepsilon}{1\,\mathrm{s}}. \quad (26)$$

As $\Delta\phi_s$ decreases with an increasing sampling rate and as rotations by small enough angles become nearly commutative, we expect both error measures, $\varepsilon$ (25) and $\varepsilon_{norm}$ (26), to decrease as the sampling rate increases.
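The whole simulation can be condensed into a short sweep over sampling rates, reusing euler_zyx_matrix and deviation_angle from the sketches above; the particular list of frequencies is illustrative.

```python
import numpy as np

T, total_deg = 1.0, 360.0                     # one full turn in one second
for f_s in [4, 16, 64, 256, 1024, 2048, 10000]:
    N = int(T * f_s)                          # number of rotation samples
    dphi_s = np.deg2rad(total_deg / (N * np.sqrt(3.0)))        # Eq. (21)
    R_step = euler_zyx_matrix(dphi_s, dphi_s, dphi_s)
    S_seq = np.linalg.matrix_power(R_step, N)                  # Eq. (24)
    eps = deviation_angle(np.eye(3), S_seq)                    # Eq. (25)
    print(f"f_s = {f_s:5d} Hz   eps_norm = {np.rad2deg(eps) / T:7.3f} deg/s")
```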

3.3. Results and Discussion. The normalized orientation deviation angles $\varepsilon_{norm}$ obtained according to (26) for sampling frequencies decreasing from 10,000 Hz to 4 Hz on a logarithmic scale are presented in Figure 1. The different sampling frequencies correspond to different angles of the individual gyroscope rotations $\Delta\phi$, increasing from 0.036° to 90°.

As expected, the results obtained show that the angular orientation error that is a consequence of erroneously interpreting the three simultaneous orthogonal rotations measured using a 3D gyroscope as sequential increases with the magnitude of the angles of the considered rotation steps.

We can observe that by substantially lowering the sampling frequency, that is, enlarging the angle of an individual rotation $\Delta\phi$, the magnitude of the error can become so significant that the results are completely unreliable, even after only one second of rotation. In the considered measurement scenario, for a sampling frequency of 4 Hz, the individual rotation angle is $\Delta\phi = 90^\circ$. For an angle of that magnitude, the normalized error in the angular orientation calculated according to the sequential rotation interpretation is $\varepsilon_{norm} = 73.16^\circ/\mathrm{s}$.

Due to accumulation over time, the absolute error in the calculated angular orientation cannot be neglected, even for small angles. For example, for a common sampling frequency $f_s = 2048\,\mathrm{Hz}$, the individual rotation angle is 0.176° and the systematic error in the computed angular orientation $\varepsilon$ exceeds 6° after one minute of rotation.

It is also worth noting that as the rotation axis is constant throughout the rotation, by correctly interpreting the measured angular velocities, we can compute the final angular orientation in a single step. If applied when interpreting the measured rotations as sequential, the same approach would result in even larger errors than presented here.

4. Real-Case Measurement

4.1. Methodology. To observe the implications of rotation interpretation in real-case measurements, in which the described systematic error obfuscates the accuracy of the computed angular orientation along with typical errors attributed to an inaccurate sensor and random measurement noise, we rely on real motion tracking.

We used the MEMS 3D gyroscope ITG3200-3, manufactured by InvenSense (Sunnyvale, CA, USA) [37], providing 16-bit outputs in a ±2000°/s range. The sampling rate of the sensor's analog-to-digital converter was set to 1000 Hz. However, to relate more realistically to a real-time measurement environment and to support computational efficiency, the obtained signals were downsampled to $f_s = 50\,\mathrm{Hz}$. Prior to the measurements, we performed a static calibration of the sensor according to the procedure described in [35].

We performed seven measurements. In each of these, we manually rotated the sensor. At the end of each measurement, we positioned the sensor in the exact same orientation as it was initially. This procedure enabled us to estimate the angular orientation accuracy as the deviation of the sensor's final (computed) orientation from its initial orientation and compare both interpretations' results in a simple manner, avoiding the use of additional equipment.

We rotated the sensor in such a way that for each measurement, the dynamics of the rotation was larger than in the previous measurement. As elaborated in Section 2, as long as the axis of the gyroscope rotation is constant during the measurement interval, the orientation and magnitude of the SORA vector (1) are respectively equal to the axis and angle of the actual rotation of the gyroscope. For a general measurement scenario, however, it is necessary to account for possible rotation axis changes during $T_s$. By progressively enlarging the dynamics of the measured rotations, we could quantify the effect that these (immeasurable) rotation axis changes have on the angular orientation accuracy.
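In such a general scenario, the orientation is integrated one sample at a time; the following sketch (reusing rotation_matrix from above) applies the SORA update (1)-(6) per measurement interval. The array name gyro_samples and its layout are assumptions for illustration.

```python
import numpy as np

def integrate_orientation(gyro_samples, f_s, S_init=None):
    """Accumulate orientation from an (N, 3) array of angular velocities (rad/s)."""
    Ts = 1.0 / f_s
    S = np.eye(3) if S_init is None else S_init.copy()
    for omega in gyro_samples:
        phi_vec = Ts * omega                      # per-sample SORA, Eq. (1)
        phi = np.linalg.norm(phi_vec)
        if phi > 0.0:
            S = S @ rotation_matrix(phi, phi_vec / phi)   # update, Eq. (6)
    return S
```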

Apart from this, the motion of the gyroscope, including its rotation, was arbitrary. The obtained angular velocities are presented in Figure 2.

By assuming that, on average, the change in the angular velocity between two consecutive samples reflects the change during the measurement intervals, we can use the following measure of angular velocity dynamics:

$$\overline{\|\Delta\boldsymbol{\omega}\|} = \frac{1}{N-1} \sum_{n=2}^{N} \|\boldsymbol{\omega}_n - \boldsymbol{\omega}_{n-1}\|, \quad (27)$$

to represent the rotation axis changes during $T_s$.
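A one-function sketch of the dynamics measure (27); omega is a hypothetical (N, 3) array of the measured angular velocity samples.

```python
import numpy as np

def mean_angular_velocity_change(omega):
    """Average norm of the change between consecutive samples, Eq. (27)."""
    diffs = np.diff(np.asarray(omega, dtype=float), axis=0)  # omega_n - omega_(n-1)
    return np.linalg.norm(diffs, axis=1).mean()
```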

Different sources of measurement inaccuracies, including imperfect sensor calibration, noise, and immeasurable rotation axis changes, obviously introduce errors in the angular orientation computed according to both the simultaneous and the sequential rotation interpretations. As the initial and actual final angular orientations of the sensor were identical in all seven performed measurements, we could calculate the respective deviation angles (14) as

$$\varepsilon^{(seq)} = \arccos\left(\frac{\mathrm{tr}\left(\mathbf{S}^{(seq)}_{fin}\right) - 1}{2}\right), \quad (28)$$

$$\varepsilon^{(sim)} = \arccos\left(\frac{\mathrm{tr}\left(\mathbf{S}^{(sim)}_{fin}\right) - 1}{2}\right), \quad (29)$$

where $\varepsilon^{(seq)}$ is the orientation deviation angle when sequential and $\varepsilon^{(sim)}$ when simultaneous rotations are considered. We further obtained the normalized deviation angles for the two respective interpretations, $\varepsilon^{(seq)}_{norm}$ and $\varepsilon^{(sim)}_{norm}$, by inserting (28) and (29) into (15). We expect all of these errors to be affected by the angular velocity dynamics (27).

To evaluate the gain of using the correct interpretation of 3D gyroscope measurements, we introduce $F_{err}$ to denote the angular orientation error reduction factor:

$$F_{err} = \frac{\varepsilon^{(seq)} - \varepsilon^{(sim)}}{\varepsilon^{(seq)}}. \quad (30)$$
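Evaluating (30) is a one-liner; the pairing of range extremes below is only illustrative, as the paper reports reductions from 74% to 98% across the seven individual measurements.

```python
def error_reduction(eps_seq, eps_sim):
    """Error reduction factor F_err of Eq. (30)."""
    return (eps_seq - eps_sim) / eps_seq

print(error_reduction(0.132, 0.034))    # ~0.74, least dynamic measurement
print(error_reduction(16.152, 1.250))   # ~0.92, pairing the range maxima
```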

4.2. Results and Discussion. The results are presented in Figure 3. For the simultaneous rotation interpretation, the error in the calculated angular orientation ranges from 0.034° to 1.250° per second of rotation and can become significant over time. However, the error that results when interpreting the rotations as sequential is much larger and ranges from 0.132° to 16.152° per second of rotation. An error reduction of 74% to 98% is achievable if the gyroscope measurements are interpreted correctly.

Considering the applicative value of these results, we can point out the following: even for the largest error obtained by interpreting the measured rotations as simultaneous (1.250°/s), we can rotate the sensor almost 13 times longer before reaching the error of the sequential rotation interpretation approach (16.152°/s).

The results show that the error in the angular orientation obtained according to both interpretations of rotations, in general, increases with the dynamics of the angular velocities, which reflects the changes of the rotation axis during the measurement intervals. This error is, in essence, a measurement error and cannot be computationally accounted for, regardless of the interpretation method employed. However, even for the most extreme conditions considered here, the error reduction factor is significant.

5. Conclusion

Due to different sources of motion sensor inaccuracies, accurate position estimation is a delicate task. Erroneous interpretation of the values measured with a 3D gyroscope further compounds the deleterious effects on the estimated position of the sensor. Estimating the angular orientation by interpreting the three simultaneous rotations as sequential, that is, Euler, rotations may seem straightforward. However, the angles measured with a gyroscope are not, in general, equal to the angles of the elemental Euler rotations. We have shown that interpreting simultaneous rotations as sequential is not only theoretically erroneous but also introduces a significant systematic error in the angular orientation when the rotation angles become large.

For a rotation rate of 360°/s sampled at 2048 Hz, the angular orientation error exceeds 6° after one minute of rotation. For the lowest sampling rate considered (4 Hz) and the same motion, the computed angular orientation is entirely unreliable, as the error exceeds 73° after only one second of rotation. We have shown that significant accuracy improvements can be achieved in the real-case measurement scenario, in which this systematic error only extends the deleterious effects of sensor inaccuracies in a noisy environment.

It is necessary to note that the simultaneous rotation interpretation and the angular orientation computation presented in this article are oriented towards eliminating the systematic error that arises from the erroneous rotation interpretation. Different interpretations cannot compensate for other sources of inaccuracies.

Besides random errors attributed to an inaccurate sensor, an error that cannot be eliminated is the consequence of angular velocity changes between the sampling moments. This error is in its essence a measurement error; rather than computationally, it can be adequately accounted for by suitably reducing the measurement interval, that is, increasing the sampling frequency. Unfortunately, this conflicts with the restricted energy autonomy of devices that utilize gyroscopes for frequent or continuous everyday motion tracking, which demands an ever-lower sampling frequency.

However, even when the angular velocity changes during the measurement intervals, for the illustrative motion considered in this article, an error reduction of 74% to 98% is achievable if the 3D gyroscope measurements are interpreted correctly and the sensor itself is properly calibrated. For more accurate high-end gyroscopes, this factor would be even larger. Consequently, we can rotate the sensor for a significantly longer time before reaching the error of the sequential rotation computation approach.

With these observations, we can confirm our primary assumption: in general, significant error reduction can be achieved with the correct interpretation of 3D gyroscope measurements. We can finally conclude that by adopting the practices presented in this article, more accurate motion tracking and analysis can be performed.

https://doi.org/10.1155/2018/9684326

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] R. Antonello and R. Oboe, "Exploring the potential of MEMS gyroscopes: successfully using sensors in typical industrial motion control applications," IEEE Industrial Electronics Magazine, vol. 6, no. 1, pp. 14-24, 2012.

[2] N. Barbour and G. Schmidt, "Inertial sensor technology trends," IEEE Sensors Journal, vol. 1, no. 4, pp. 332-339, 2001.

[3] N. Yazdi, F. Ayazi, and K. Najafi, "Micromachined inertial sensors," Proceedings of the IEEE, vol. 86, no. 8, pp. 1640-1659, 1998.

[4] J. Bijker and W. Steyn, "Kalman filter configurations for a low-cost loosely integrated inertial navigation system on an airship," Control Engineering Practice, vol. 16, no. 12, pp. 1509-1518, 2008.

[5] J. H. Wang and Y. Gao, "Land vehicle dynamics-aided inertial navigation," IEEE Transactions on Aerospace and Electronic Systems, vol. 46, no. 4, pp. 1638-1653, 2010.

[6] M. Koifman and I. Y. Bar-Itzhack, "Inertial navigation system aided by aircraft dynamics," IEEE Transactions on Control Systems Technology, vol. 7, no. 4, pp. 487-493, 1999.

[7] B. Barshan and H. F. Durrant-Whyte, "Inertial navigation systems for mobile robots," IEEE Transactions on Robotics and Automation, vol. 11, no. 3, pp. 328-342, 1995.

[8] O. J. Woodman, "An introduction to inertial navigation," Technical Report UCAM-CL-TR-696, University of Cambridge Computer Laboratory, Cambridge, UK, 2007, http://www.cl.cam.ac.uk/techreports/UCAM-CL-TR-696.html.

[9] T. Seel, J. Raisch, and T. Schauer, "IMU-based joint angle measurement for gait analysis," Sensors, vol. 14, no. 12, pp. 6891-6909, 2014.

[10] D. McIlwraith, J. Pansiot, and G. Z. Yang, "Wearable and ambient sensor fusion for the characterisation of human motion," in 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 5505-5510, Taipei, Taiwan, 2010.

[11] O. Tunçel, K. Altun, and B. Barshan, "Classifying human leg motions with uniaxial piezoelectric gyroscopes," Sensors, vol. 9, no. 12, pp. 8508-8546, 2009.

[12] B. Ayrulu-Erdem and B. Barshan, "Leg motion classification with artificial neural networks using wavelet-based features of gyroscope signals," Sensors, vol. 11, no. 12, pp. 1721-1743, 2011.

[13] K. Altun, B. Barshan, and O. Tunçel, "Comparative study on classifying human activities with miniature inertial and magnetic sensors," Pattern Recognition, vol. 43, no. 10, pp. 3605-3620, 2010.

[14] S. Stancin and S. Tomazic, "Early improper motion detection in golf swings using wearable motion sensors: the first approach," Sensors, vol. 13, no. 12, pp. 7505-7521, 2013.

[15] A. M. Sabatini, "Estimating three-dimensional orientation of human body parts by inertial/magnetic sensing," Sensors, vol. 11, no. 12, pp. 1489-1525, 2011.

[16] H. J. Luinge and P. H. Veltink, "Measuring orientation of human body segments using miniature gyroscopes and accelerometers," Medical & Biological Engineering & Computing, vol. 43, no. 2, pp. 273-282, 2005.

[17] D. Roetenberg, H. J. Luinge, and P. Slycke, Xsens MVN: Full 6DOF Human Motion Tracking Using Miniature Inertial Sensors, Xsens Tech, Netherlands, 2013, http://www.xsens.com/images/stories/PDF/.

[18] P. Sarcevic, Z. Kincses, S. Pletl, and L. Schaffer, "Distributed movement recognition algorithm based on wrist-mounted wireless sensor motes," in Proceedings of European Wireless 2015; 21st European Wireless Conference, pp. 1-6, Budapest, Hungary, 2015.

[19] D. T. P. Fong and Y. Y. Chan, "The use of wearable inertial motion sensors in human lower limb biomechanics studies: a systematic review," Sensors, vol. 10, no. 12, pp. 11556-11565, 2010.

[20] K. Aminian and B. Najafi, "Capturing human motion using body-fixed sensors: outdoor measurement and clinical applications," Computer Animation & Virtual Worlds, vol. 15, no. 2, pp. 79-94, 2004.

[21] D. Roetenberg, P. J. Slycke, and P. H. Veltink, "Ambulatory position and orientation tracking fusing magnetic and inertial sensing," IEEE Transactions on Biomedical Engineering, vol. 54, no. 5, pp. 883-890, 2007.

[22] H. J. Luinge and P. H. Veltink, "Ambulatory measurement of arm orientation," Journal of Biomechanics, vol. 40, no. 1, pp. 78-85, 2007.

[23] K. Watanabe and M. Hokari, "Kinematical analysis and measurement of sports form," IEEE Transactions on Systems, Man, and Cybernetics--Part A: Systems and Humans, vol. 36, no. 3, pp. 549-557, 2006.

[24] D. Giansanti, G. Maccioni, and V. Macellari, "The development and test of a device for the reconstruction of 3-D position and orientation by means of a kinematic sensor assembly with rate gyroscopes and accelerometers," IEEE Transactions on Biomedical Engineering, vol. 52, no. 7, pp. 1271-1277, 2005.

[25] D. Jurman, M. Jankovec, R. Kamnik, and M. Topic, "Calibration and data fusion solution for the miniature attitude and heading reference system," Sensors and Actuators A: Physical, vol. 138, no. 2, pp. 411-420, 2007.

[26] W. Dong, K. Y. Lim, Y. K. Goh et al., "A low-cost motion tracker and its error analysis," in 2008 IEEE International Conference on Robotics and Automation, pp. 311-316, Pasadena, CA, USA, 2008.

[27] H. Negoro, M. Ueda, K. Watanabe, K. Kobayashi, and Y. Kurihara, "Measurement and analysis of golf swing using 3D acceleration and gyroscopic sensors," in 2011 Proceedings of SICE Annual Conference, pp. 1111-1114, Tokyo, Japan, 2011.

[28] J. C. Lementec and P. Bajcsy, "Recognition of arm gestures using multiple orientation sensors: gesture classification," in Proceedings of the 7th International IEEE Conference on Intelligent Transportation Systems, pp. 965-970, Washington, DC, USA, 2004.

[29] G. Schall, D. Wagner, G. Reitmayr et al., "Global pose estimation using multi-sensor fusion for outdoor augmented reality," in 2009 8th IEEE International Symposium on Mixed and Augmented Reality, pp. 153-162, Orlando, FL, USA, 2009.

[30] S. Stancin and S. Tomazic, "Angle estimation of simultaneous orthogonal rotations from 3D gyroscope measurements," Sensors, vol. 11, no. 12, pp. 8536-8549, 2011.

[31] J. B. Kuipers, Quaternions & Rotation Sequences: A Primer with Applications to Orbits, Aerospace and Virtual Reality, Princeton University Press, Princeton, NJ, USA, 2nd edition, 1999.

[32] D. Giansanti and G. Maccioni, "Guidelines for calibration and drift compensation of a wearable device with rate-gyroscopes and accelerometers," in 2007 29th Annual International Conference on the IEEE Engineering in Medicine and Biology Society, pp. 2342-2345, Lyon, 2007.

[33] J. Fei and J. Zhou, "Robust adaptive control of MEMS triaxial gyroscope using fuzzy compensator," IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 42, no. 6, pp. 1599-1607, 2012.

[34] R. P. Leland, "Adaptive control of a MEMS gyroscope using Lyapunov methods," IEEE Transactions on Control Systems Technology, vol. 14, no. 2, pp. 278-283, 2006.

[35] S. Stancin and S. Tomazic, "Time- and computation-efficient calibration of MEMS 3D accelerometers and gyroscopes," Sensors, vol. 14, no. 8, pp. 14885-14915, 2014.

[36] S. Tomazic and S. Stancin, "Simultaneous orthogonal rotation angle," Journal of Electrical Engineering and Computer Science, vol. 78, pp. 7-11, 2011.

[37] InvenSense ITG3200-3, Integrated Triple-Axis Digital-Output Gyroscope, InvenSense, Sunnyvale, CA, USA, http://www.invensense.com/mems/gyro/itg3200.html.

Sara Stancin and Saso Tomazic

Faculty of Electrical Engineering, University of Ljubljana, Trzaska 25, Ljubljana, Slovenia

Correspondence should be addressed to Sara Stancin; sara.stancin@fe.uni-lj.si

Received 5 September 2017; Accepted 7 February 2018; Published 22 March 2018

Academic Editor: Carmine Granata

Caption: Figure 1: Angular orientation systematic error that arises from erroneously interpreting three simultaneous rotations measured using a 3D gyroscope as sequential rotations. For all examples presented, the sensor rotates for 360° in one second. The examples differ with respect to the sampling frequency. Different sampling frequencies correspond to different magnitudes of the angles of the individual rotation steps. We can observe that the error increases with the magnitude of the individual rotation angles. For large angles, the angular orientation becomes completely unreliable.

Caption: Figure 2: Angular velocities obtained using a calibrated 3D gyroscope in a noisy environment and used for estimating the effect of the erroneous rotation interpretation on angular orientation accuracy in real-case measurements. The angular orientation of the sensor was computed and evaluated after each of the seven measurements. The dynamics of the angular velocities progressively increased from the first to the last measurement. The motion was otherwise arbitrary.

Caption: Figure 3: Angular orientation errors, expressed as normalized deviation angles, and error reduction for different angular velocity dynamics. We can observe that for both the simultaneous and sequential interpretations of 3D gyroscope measurements, in general, the error increases as the average angular velocity change becomes larger. However, for all considered measurements, significant error reduction is achieved if the measured values are correctly interpreted as simultaneous rotations.