Reliability-based rearrangement of ECG automated interpretation chain.

ABSTRACT

Objective: Sequences of electrocardiogram (ECG) interpretation procedures from various manufacturers of ECG-dedicated software were studied with respect to data flow and the reliability of intermediate results. The results motivated us to design a new system architecture that considers error propagation issues at subsequent stages of ECG processing and reduces the data stream at the initial stages.

Methods: The proposed architecture has a network topology and consists of procedures interconnected by data buses. Each node and isolated sub-network was tested against the MIT-BIH and CSE standard databases and described by the incorrect result probability and the data reduction efficiency. The optimized solution also considers the probability of the procedure's use and the probability of a useless outcome. The best performing network was selected and compared to the original sequential interpretation chain.

Results: The optimized architecture moves reduction-effective functions to the front of the processing chain, reduces cumulative error propagation through the parallel use of multiple short processing chains, and reduces the interpretation processing time and the required computational power. Depending on the interpretation domain, the outcome relative inaccuracy was reduced from 2.8% to 1.5% for pacemaker pulse detection and from 6.3% to 3.7% for wave axes determination.

Conclusion: Significant improvements in automated ECG interpretation were achieved by rearrangement of the processing chain alone, without any change in the processing methods. Reduction of the data stream at the early processing stages is particularly advantageous in a wireless interpretation system, since task sharing then involves minimum exploitation costs.

Key words: E-health, home care, pervasive ECG monitoring, distributed systems, ubiquitous computing

Introduction

Automatic electrocardiogram (ECG) interpretation algorithms are currently implemented in various devices, and a fast-growing domain of particular interest is small wearable cardiomonitors. Autonomous recorders that monitor, interpret and report in real time (1-4) raise new prerequisites for the architecture of interpretive software. Although our studies included interpretive algorithms from various manufacturers (5-9), all investigated applications repeat a very similar architecture pattern originating from the upgradeable modules concept (Fig. 1). This architecture pattern reflects functional growth and is very convenient for manufacturers, since diverse user-dedicated software is built from the same modules (10).

[FIGURE 1 OMITTED]

The alternative concept, presented in this paper, aims at the optimization of data reliability and, consequently, at the improvement of the diagnostic outcome reliability. We demonstrate that under limited-resource conditions, effective reduction of the data stream and avoidance of unnecessary operations allow resources to be relocated towards more relevant calculations and thus raise the quality of the outcome.

As a result of our investigations of existing interpretive software for electrocardiographs, in this paper we generalize the rules concerning an optimal architecture for maximum data reliability.

Methods

The proposed rearrangement of the processing chain consists in reducing its length and placing the most reliable procedures at the front. Maintaining the overall computation costs unchanged is a very important advantage with regard to wearable implementations.

In the course of our research on a wearable implementation of the ECG interpretation process, we had to rearrange the interpretive software structure. This opened the opportunity for reliability-based investigations of an optimal data flow and interpretation chain architecture. We consider diagnostic subroutines as black boxes and do not attempt to modify the mathematical foundations of the interpretive algorithms. The probability of each function call was estimated from the occurrence of the particular heart disease in the database.

The theoretical considerations were developed into an experimental implementation allowing selective saving and assessment of meta-data. Our prototype was tested using two references:

--standard ECG databases: MIT-BIH (11) and CSE (12), recommended for testing of software performance,

--standard interpretive software designed to be embedded in a stand-alone ECG machine.

A. Pursuit of the incorrect interpretation probability

In the ECG interpretation chain, each subroutine is based on heuristic assumptions and consequently has an intrinsic non-zero probability of inappropriate processing and incorrect outcome. Because the software manufacturer is not able to foresee all combinations of possible signal recording conditions and possible heart diseases, performance cannot be guaranteed by testing procedures, however thorough. Subsequent procedures in a typical processing chain reuse the results of previous steps, so misinterpretations and inaccuracies propagated in the chain may compensate or cumulate (13). Unfortunately, statistically speaking, the correct and accurate result is a singularity in a cloud of all possible outcomes, making the error-cumulative scenario much more probable.

The first experiment aimed at estimating a set of statistical parameters for each interpretive procedure, representing its functionality and dependence on precedent processing stages (Fig. 2), by the values of:

--[delta]--outcome relative inaccuracy (%),

--[epsilon]--probability of false outcome (%),

--r--data reduction ratio,

--p--probability of use (%) (depending on the frequency of related disease occurrence).

[FIGURE 2 OMITTED]
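As a rough illustration of the error-cumulative scenario described above, the following sketch (written in C++, the language of the tested software) computes the false-outcome probability and the worst-case inaccuracy of a short chain. It assumes independent procedure failures and additive inaccuracies; the chain composition and the formulas are illustrative only, not the paper's exact model.

```cpp
// Illustrative sketch: how the probability of a false outcome and the relative
// inaccuracy may cumulate along a processing chain, assuming independent,
// error-cumulative behaviour of each procedure.
#include <iostream>
#include <numeric>
#include <vector>

// Probability that at least one procedure in the chain fails, assuming
// independent failures (epsilon values given as fractions).
double chainFalseOutcomeProbability(const std::vector<double>& epsilons) {
    double allCorrect = 1.0;
    for (double eps : epsilons) allCorrect *= (1.0 - eps);
    return 1.0 - allCorrect;
}

// Worst-case accumulated relative inaccuracy, assuming inaccuracies add up
// rather than compensate (the more probable scenario described above).
double chainInaccuracy(const std::vector<double>& deltas) {
    return std::accumulate(deltas.begin(), deltas.end(), 0.0);
}

int main() {
    // Example chain: heart beat detection -> heart beat classification ->
    // arrhythmia detection, with epsilon/delta values taken from Table 1.
    std::vector<double> epsilons = {0.025, 0.03, 0.10};
    std::vector<double> deltas   = {0.015, 0.10, 0.0};
    std::cout << "chain false-outcome probability: "
              << chainFalseOutcomeProbability(epsilons) << '\n'
              << "worst-case chain inaccuracy:     "
              << chainInaccuracy(deltas) << '\n';
}
```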

B. Investigations of the data reduction efficiency

Bearing in mind the wearable target implementation, high reliability was the principal, but not the only, optimization factor. From a statistical viewpoint, ECG interpretation is a sort of data reduction process. The effective data reduction at the beginning of the process, postulated by the wearable recorder implementation, can be achieved by:

--putting the most reduction-effective procedures in front of the processing chain,

--putting the most frequently used procedures in front of the processing chain.

The second experiment consisted in a quantitative measurement of the information stream reduction ratio for each subroutine in the ECG processing chain, obtained by comparing the expected throughput of the outputs and the inputs.
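The sketch below illustrates one way such a throughput comparison can be carried out, under the assumption that the reduction ratio is expressed as the output-to-input throughput ratio in percent; the subroutine and its figures are hypothetical.

```cpp
// Minimal sketch of the second experiment's measurement, assuming the data
// reduction ratio r is the output-to-input throughput ratio of a subroutine,
// expressed in percent. The figures below are hypothetical.
#include <iostream>

double dataReductionRatio(double bytesInPerSecond, double bytesOutPerSecond) {
    return 100.0 * bytesOutPerSecond / bytesInPerSecond;
}

int main() {
    // Hypothetical subroutine: consumes a 12-lead, 500 sps, 2-byte raw stream
    // and emits a short per-beat descriptor (3 beats/s, 200 bytes each assumed).
    double rawStream      = 12.0 * 500.0 * 2.0;  // input, bytes per second
    double beatDescriptor = 3.0 * 200.0;         // output, bytes per second
    std::cout << "r = " << dataReductionRatio(rawStream, beatDescriptor) << " %\n";
}
```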

C. Concepts of software architecture

Asynchronous computing and the data buses concept were considered as two alternative approaches to the new architecture design.

Asynchronous computing is based on the different validity periods of metadata. In a system with variable data validity intervals, the triggering procedure works individually for each diagnostic procedure and is located at the end of the corresponding processing path. Since a path usually contains multiple procedures in a chain, each data request uses the existing metadata as long as they are within their validity period; otherwise the request is passed on to the preceding procedure (Fig. 3). Since for some metadata the validity period is longer than for the final data, only a fraction of the triggers reaches the beginning of the processing chain, avoiding unnecessary processing of a huge amount of raw data.

[FIGURE 3 OMITTED]
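A minimal sketch of this request-driven behaviour is given below; the node structure, names and validity intervals are assumptions introduced for illustration and do not reproduce the tested software.

```cpp
// Illustrative sketch of the asynchronous, validity-period-driven data flow:
// a request issued at the end of the path reuses cached metadata while still
// valid and only propagates upstream when the cache has expired.
#include <chrono>
#include <functional>
#include <iostream>
#include <string>

class ProcedureNode {
public:
    ProcedureNode(std::string name, double validitySeconds,
                  std::function<std::string(const std::string&)> compute,
                  ProcedureNode* upstream = nullptr)
        : name_(std::move(name)), validity_(validitySeconds),
          compute_(std::move(compute)), upstream_(upstream) {}

    std::string request() {
        using clock = std::chrono::steady_clock;
        auto now = clock::now();
        if (hasCache_ && std::chrono::duration<double>(now - stamp_).count() <= validity_)
            return cache_;                        // reuse metadata within validity period
        std::string upstreamData = upstream_ ? upstream_->request() : "";
        cache_ = compute_(upstreamData);          // otherwise trigger recomputation upstream
        stamp_ = now;
        hasCache_ = true;
        return cache_;
    }

private:
    std::string name_;
    double validity_;
    std::function<std::string(const std::string&)> compute_;
    ProcedureNode* upstream_;
    std::string cache_;
    std::chrono::steady_clock::time_point stamp_;
    bool hasCache_ = false;
};

int main() {
    // Example path: raw signal -> beat detection -> heart rate estimation.
    ProcedureNode raw("raw signal", 0.002, [](const std::string&) { return std::string("raw samples"); });
    ProcedureNode beats("beat detection", 0.8, [](const std::string& x) { return "beats(" + x + ")"; }, &raw);
    ProcedureNode rate("heart rate", 5.0, [](const std::string& x) { return "HR(" + x + ")"; }, &beats);
    std::cout << rate.request() << '\n';
}
```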

Data buses are inter-procedure information channels described by statistical parameters of the data estimating the expected throughput (Fig. 4). Each data flow was assigned a throughput level combining the average data stream when used, the frequency of use (depending on the data refresh rate) and the probability of use (depending on the frequency of related disease occurrence).

[FIGURE 4 OMITTED]
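The following sketch shows one possible way to estimate such a throughput level, assuming the three factors combine multiplicatively; the bus names and figures are hypothetical.

```cpp
// A minimal sketch of how each data bus could be assigned an expected
// throughput from the three factors named above; the multiplicative
// combination and the example figures are assumptions for illustration.
#include <iostream>

// Expected throughput (bytes/s) of an inter-procedure data bus.
double expectedBusThroughput(double avgBytesPerUse, double usesPerSecond,
                             double probabilityOfUse) {
    return avgBytesPerUse * usesPerSecond * probabilityOfUse;
}

int main() {
    // Hypothetical buses: raw-signal access vs. a per-beat meta-data bus.
    double rawBus  = expectedBusThroughput(12.0 * 2.0, 500.0, 1.00);
    double beatBus = expectedBusThroughput(250.0, 3.0, 0.88);
    std::cout << "raw-signal bus: " << rawBus << " B/s, "
              << "beat meta-data bus: " << beatBus << " B/s\n";
}
```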

Optimization of the data buses restricts access to the unprocessed data representation for subroutines of high peak throughput and high frequency of use. Such procedures were consequently redesigned towards the use of a common and very reliable data interface. The particular challenge is the consolidation of all functions having access to the raw signal at their inputs (Fig. 1).

The software architecture rearrangements are constrained by the logical flow of the ECG diagnostic procedure. In some rare cases the data processing chain has to follow a specified order, providing first the general information (e.g. that a heart beat was detected) and then more precise details (e.g. the morphology type or wave lengths). Within these constraints, the reduction-effective and frequently used procedures were identified in the second experiment.

Results

Experimental results

The experiment on the incorrect interpretation probability (see Methods, section A) resulted in a priority level, computed from the statistical reliability and usage description, attributed to each basic ECG interpretation procedure (Table 1).
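The paper attributes the priority levels of Table 1 without stating an explicit formula; the fragment below is a purely hypothetical scoring sketch showing how such a ranking could be derived from the measured parameters, favouring frequently used, strongly reducing and reliable procedures.

```cpp
// Hypothetical sketch of a priority score combining the measured parameters;
// the weighting below is an assumption for illustration only.
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

struct Procedure {
    std::string name;
    double delta;    // outcome relative inaccuracy, fraction
    double epsilon;  // probability of false outcome, fraction
    double r;        // data reduction ratio, fraction of input stream
    double p;        // probability of use, fraction
};

int main() {
    // Approximate values taken from Table 1 (converted from percent to fractions).
    std::vector<Procedure> procs = {
        {"Signal quality assessment", 0.10,  0.033, 0.20, 0.97},
        {"Heart beat detection",      0.015, 0.025, 0.70, 1.00},
        {"Pacemaker pulse detection", 0.01,  0.083, 0.70, 0.03},
    };
    // Assumed score: frequently used, reliable and strongly reducing procedures
    // should be moved towards the front of the processing chain.
    auto score = [](const Procedure& q) { return q.p * (1.0 - q.epsilon) / q.r; };
    std::sort(procs.begin(), procs.end(),
              [&](const Procedure& a, const Procedure& b) { return score(a) > score(b); });
    for (const auto& q : procs) std::cout << q.name << '\n';
}
```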

The final architecture of the ECG interpretation software is based on the data buses concept and, after being optimized for reliability and early reduction of the data stream, contains three raw signal access points (Fig. 5):

--common interface for signal quality estimation, baseline estimation and pacemaker detection procedures,

--heart beat detection filtered input,

--waves measurement and axis determination filtered off-line buffer (provided also for ST measurement and P-averaging, not considered here).

[FIGURE 5 OMITTED]

The group of functions accessing the raw signal issues a complete description of the heart beat (bus 2), which is not the diagnostic outcome but contains all meta-data necessary for further processing, so the raw signal is no longer needed.

These data appear occasionally, once per heart beat, but even for a heart rate as high as 180 bpm, the data stream is 8.2 times lower than for a 12-lead, 500 sps raw signal. Should the remaining interpretation procedures be performed by another node of a distributed network, breaking the local processing here is a convenient solution. The architecture optimization was performed on standard ECG interpretation software provided for tests by a regional manufacturer.
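As a back-of-the-envelope check, such a ratio follows directly from the two stream rates; the per-beat descriptor size used below is an assumption chosen only to show how the figure arises, not a value taken from the paper.

```cpp
// Rough check of the raw-signal to beat-description stream ratio;
// the descriptor size is an assumed, illustrative value.
#include <iostream>

int main() {
    double rawRate        = 12.0 * 500.0;   // 12-lead, 500 sps raw signal (samples/s)
    double beatsPerSecond = 180.0 / 60.0;   // 180 bpm
    double descriptorSize = 244.0;          // assumed sample-equivalents per beat description
    double beatRate = beatsPerSecond * descriptorSize;
    std::cout << "raw / beat-description stream ratio: " << rawRate / beatRate << '\n';  // ~8.2
}
```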

The structured source code was written in the C++ programming language. The original target application was an interpretive bedside ECG recorder, and the purpose of the optimization was the manufacturer's interest in migrating to a wearable computer platform. The third experiment aimed at a separate assessment of the original and the modified architectures by two factors:

--average reduction of data rate at the subsequent processing stages,

--average inaccuracy and error probability for selected diagnostic parameters.

Table 2 compares the average data reduction ratio at subsequent stages of the interpretation process. The right column highlights the difference between the original and optimized architectures, and shows that significant data reduction was achieved by the modified architecture at the early stages of the interpretation process.

Since the processing stages could not be set in the same way for the original and for the modified software, to ensure identical testing conditions we used the processing time as an estimate of the interpretation progress. The processing stages were set at every 20 percent of the total interpretation time. Such a relative approach compensates for the variability of processing time among particular ECG files and for the average processing time, which is shorter for the redesigned software.

The result accuracy of the modified architecture was tested according to the international standards requirements (14). The quantitative results for both architectures are summarized in Table 3. The comparison of the diagnostic reliability of isolated procedures (Table 1) with the corresponding results of the whole processing chain (Table 3) leads to the conclusion that, in the case of the optimized architecture, the overall reliability of each parameter is much less affected by the remaining procedures of the ECG interpretation chain.

Discussion

The work presented in this paper was motivated by recent changes in cardiac monitoring techniques towards the application of modern digital communication technology. The classical approach to the ECG interpretation processing chain was revised and important software architecture modifications were proposed to overcome two principal drawbacks:

--necessity of raw signal access on the advanced processing stages,

--cumulative error propagation resulting from data dependencies in the processing chain.

Both aspects were thoroughly studied and the research results were applied to real interpretive software, taking advantage of the opportunity of co-operation with an ECG equipment manufacturer. The modular software was modified only at the subroutine interconnection level, without changes or adjustment of the mathematical methods. The main result is the relative improvement of diagnostic outcome accuracy and data stream reduction, rather than their absolute values. Therefore, any manufacturer may check his software for concordance with the guidelines issued herein. The aim of our research was fully achieved. We proved that software architecture optimization improves the interpretation in that it:

--moves reduction-effective functions to the front of the processing chain and consequently reduces the inter-procedure data flow, thus lowering the communication costs in the case of distributed processing,

--reduces the cumulative error propagation by the parallel use of multiple short processing chains instead of one long chain,

--reduces the interpretation processing time and the required computational power, and thus extends the autonomy time of wearable devices.

Acknowledgment

This scientific work was financed by the State Committee for Scientific Research resources in the years 2004-2007 as research project No. 3 T11E 00127.

References

(1.) Chiarugi F, Trypakis D, Kontogiannis V, Less PJ, Chronaki CE, Zeaki M, et al. Continuous ECG monitoring in the management of pre-hospital health emergencies. Comput Cardiol 2003: 30; 205-8.

(2.) Pinna GD, Maestri R, Gobbi E, La Rovere MT, Scanferlato JL. Home telemonitoring of chronic heart failure patients: novel system architecture of the home or hospital in heart failure study. Comput Cardiol 2003: 30; 105-8.

(3.) Banitsas KA, Georgiadis P, Tachakra S, Cavouras D. Using handheld devices for real-time wireless teleconsultation. Proceedings of the 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEMBS '04); 2004 Sept 1-5; 2: 3105-8.

(4.) Bar-Or A, Healey J, Kontothanassis L, Van Thong JM. BioStream: a system architecture for real-time processing of physiological signals. Proceedings of the 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEMBS '04); 2004 Sept 1-5; 2: 3101-4.

(5.) IBM electrocardiogram analysis program physician's guide (5736-H15). 2nd edition IBM. 1974.

(6.) HP M1700A Interpretive cardiograph physician's guide. 4th edition. Hewlett-Packard. 1994.

(7.) DRG MediArc Premier IV operator's manual version 2.2. 1995.

(8.) ECAPS-12C User guide: interpretation standard revision A. Nihon Kohden. 2001.

(9.) CardioSoft Version 6.0 Operator's Manual. GE medical systems information technologies, Inc: Milwaukee; 2005.

(10.) Paoletti M, Marchesi C. Low computational cost algorithms for portable ECG monitoring units. IFMBE Medicon 2004. Proceedings of the Mediterranean Conference on Medical and Biological Engineering; 2004 July 31- August 5; Naples, Italy; 2004. p. 231.

(11.) Moody G. MIT/BIH Arrhythmia database distribution. Massachusetts Institute of Technology, Division of Health Science and Technology: Cambridge, MA; 1993.

(12.) Willems JL. Common standards for quantitative electrocardiography. 10th CSE Progress Report. ACCO publ: Leuven; 1990.

(13.) Straszecka E, Straszecka J. Uncertainty and imprecision representation in medical diagnostic rules. IFMBE Medicon 2004. Proceedings of the Mediterranean Conference on Medical and Biological Engineering; 2004 July 31-August 5; Naples, Italy; 2004. p. 172.

(14.) IEC 60601-2-47. Medical electrical equipment: Particular requirements for the safety, including essential performance, of ambulatory electrocardiographic systems. 2001.

Piotr Augustyniak

AGH University of Science and Technology, Krakow, Poland

Address for Correspondence: Piotr Augustyniak, AGH University of Science and Technology, 30 Mickiewicza Ave., 30-059 Krakow, Poland. Phone: +48 12 617 47 12, Fax: +48 12 634 15 68, E-mail: august@agh.edu.pl
Table 1. Basic ECG interpretation procedures, their statistical parameters and attributed priority levels

Procedure name               [delta], %   [epsilon], %   r, %    p, %   Priority level
Signal quality assessment        10           3.3          20      97          1
Pacemaker pulse detection        <1           8.3          70       3          4
Heart beat detection             1.5          2.5          70     100          2
Baseline estimation              3            <1           20      97          3
Heart rate estimation            <1           <1            1     100          1
Heart beat classification        10           3            50      88          1
Waves measurement                3            5           100      85          2
Axis determination               3            5           300      85          3
Dominant rhythm detection        0            8.5          1.5    100          1
Arrhythmia detection             0           10            1.3     80          2

ECG--electrocardiogram, [delta]--outcome relative inaccuracy, [epsilon]--probability of false outcome, r--data reduction ratio, p--probability of use

Table 2. Average data reduction ratio (%) on subsequent stages of interpretation process

Interpretation progress          Data reduction related to raw signal, %      Data reduction
(% of total processing time)     Original architecture   Optimized architecture     gain, %
  0                                      100                     100                   0
 20                                       78                      47                  31
 40                                       54                      31                  23
 60                                       32                      22                  10
 80                                       14                      12                   2
100                                        8                       8                   0

Table 3. Diagnostic parameters quality achieved by the original and the optimized architectures

                                Original architecture, %     Optimized architecture, %
Interpretation domain           [delta]      [epsilon]       [delta]      [epsilon]
Pacemaker pulse detection         2.8           9.3            1.5           9.0
Heart beat detection              2.5           3.5            1.7           2.9
Baseline estimation               4.3           1.3            4.3           1.3
Heart rate estimation             1.0           1.2            1.0           1.2
Heart beat classification        14.0           7.1           12.0           4.0
Waves measurement                 5.1           7.5            3.3           5.3
Axis determination                6.3           7.8            3.7           5.1
Dominant rhythm detection         0.0          10.5            0.0           8.8
Arrhythmia detection              0.0          13.0            0.0          11.8

[delta]--outcome relative inaccuracy, [epsilon]--probability of false outcome