Reliability-based rearrangement of ECG automated interpretation chain

ABSTRACT
Objective: Sequences of electrocardiogram (ECG) interpretation procedures from various manufacturers of ECG-dedicated software were studied with respect to data flow and the reliability of intermediate results. The results motivated us to design a new system architecture that considers error propagation at subsequent stages of ECG processing and reduces the data stream at the initial stages.
Methods: The proposed architecture has a network topology and consists of procedures interconnected by data buses. Each node and isolated sub-networks were tested against the MIT-BIH and CSE standard databases and described by the incorrect-result probability and the data-reduction efficiency. The optimized solution also considers the probability of procedure use and the probability of a useless outcome. The best-performing network was selected and compared to the original sequential interpretation chain.
Results: The optimized architecture moves reduction-effective functions to the front of the processing chain, reduces cumulative error propagation through the parallel use of multiple short processing chains, and reduces the interpretation processing time and the required computational power. Depending on the interpretation domain, the reduction of outcome relative inaccuracy was up to 87% (from 2.8% to 1.5%) for pacemaker pulse detection or 70% (from 6.3% to 3.7%) for wave axes determination.
Conclusion: Significant improvements in automated ECG interpretation were achieved by rearranging the processing chain alone, without any change to the processing methods. Reduction of the data stream at early processing stages is particularly advantageous in wireless interpretation systems, since task sharing involves minimum exploitation costs.
Key words: E-health, home care, pervasive ECG monitoring, distributed systems, ubiquitous computing
Automatic electrocardiogram (ECG) interpretation algorithms are currently implemented in various devices, and a fast-growing domain of particular interest is small wearable cardiomonitors. Autonomous recorders performing monitoring, interpretation and reporting in real time (1-4) raise new requirements for the architecture of interpretive software. Although our studies included interpretive algorithms from various manufacturers (5-9), all investigated applications repeat a very similar architecture pattern originating from the upgradeable-modules concept (Fig. 1). This pattern reflects functional growth and is very convenient for manufacturers, since diverse user-dedicated software is built from the same modules (10).
[FIGURE 1 OMITTED]
The alternative concept, presented in this paper, aims at the optimization of data reliability and consequently at the improvement of diagnostic outcome reliability. We demonstrate that, under limited-resources conditions, effective reduction of the data stream and avoidance of unnecessary operations allow resources to be relocated towards more relevant calculations and thus raise the overall quality.
As a result of our investigations of existing interpretive software for electrocardiographs, in this paper we generalize the rules concerning an optimal architecture for maximum data reliability.
The proposed rearrangement of the processing chain consists in reducing its length and placing the most reliable procedures at the front. Keeping the overall computation cost unchanged is a very important advantage for wearable implementations.
In the course of our research on a wearable implementation of the ECG interpretation process, we had to rearrange the interpretive software structure. This opened the opportunity for reliability-based investigations of an optimal data flow and interpretation-chain architecture. We consider diagnostic subroutines as black boxes and do not attempt to modify the mathematical foundations of the interpretive algorithms. The probability of each function call was estimated from the occurrence of the particular heart disease in the database.
The theoretical considerations were developed into an experimental implementation allowing selective saving and assessment of meta-data. Our prototype was tested with the use of two references:
--standard ECG databases: MIT-BIH (11) and CSE (12), recommended for testing software performance,
--standard interpretive software designed to be embedded in a stand-alone ECG machine.
A. Pursuit of the incorrect interpretation probability
In the ECG interpretation chain, each subroutine is based on heuristic assumptions and consequently has an intrinsic non-zero probability of inappropriate processing and incorrect outcome. Because the software manufacturer is not able to foresee all combinations of possible signal-recording conditions and possible heart diseases, correct performance is not guaranteed even by very thorough testing procedures. Subsequent procedures in a typical processing chain reuse the results of the previous steps, so misinterpretations and inaccuracies propagated along the chain may compensate or accumulate (13). Unfortunately, statistically speaking, the correct and accurate result is a singularity in a cloud of all possible outcomes, making the error-cumulative scenario much more probable.
The first experiment aimed at estimating, for each interpretive procedure, a set of statistical parameters representing its functionality and its dependence on preceding processing stages (Fig. 2) by the values of:
--[delta]--outcome relative inaccuracy (%),
--[epsilon]--probability of false outcome (%),
--r--data reduction ratio,
--p--probability of use (%) (depending on the frequency of related disease occurrence).
[FIGURE 2 OMITTED]
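As an illustration, the four parameters above can be attached to each black-box procedure and composed along a serial chain. The composition rules in this sketch (independent failures, multiplicative reduction) are our own simplifying assumptions, not formulas from the study; the numeric values come from Table 1.

```python
from dataclasses import dataclass

@dataclass
class Procedure:
    name: str
    delta: float    # outcome relative inaccuracy (fraction)
    epsilon: float  # probability of false outcome (fraction)
    r: float        # data reduction ratio (input stream / output stream)
    p: float        # probability of use (fraction)

def chain_false_outcome(chain):
    """Probability that at least one procedure in a serial chain fails,
    assuming independent failures (an illustrative composition rule)."""
    ok = 1.0
    for proc in chain:
        ok *= 1.0 - proc.epsilon
    return 1.0 - ok

def chain_reduction(chain):
    """Overall data reduction of a serial chain: ratios multiply."""
    total = 1.0
    for proc in chain:
        total *= proc.r
    return total

# Values from Table 1, with percentages converted to fractions
beat_detection = Procedure("Heart beat detection", 0.015, 0.025, 70, 1.00)
classification = Procedure("Heart beat classification", 0.10, 0.03, 50, 0.88)

chain = [beat_detection, classification]
print(chain_false_outcome(chain))  # about 0.054: worse than either step alone
print(chain_reduction(chain))      # 70 * 50 = 3500.0
```

The first print illustrates why long chains are penalized: the false-outcome probability of the pair already exceeds that of either procedure alone.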
B. Investigations of the data reduction efficiency
Bearing in mind the wearable target implementation, high reliability was the principal, but not the only, optimization factor. From a statistical viewpoint, ECG interpretation is a kind of data-reduction process. The effective data reduction at the beginning of the process, postulated by the wearable-recorder implementation, can be achieved by:
--putting the most reduction-effective procedures in front of the processing chain,
--putting the most frequently used procedures in front of the processing chain.
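A minimal sketch of such front-loading: among procedures whose logical-flow prerequisites are already scheduled, the one with the highest expected reduction effectiveness r * p is picked first. The dependency map and the restriction to four procedures are hypothetical; the r and p values follow Table 1.

```python
# Reduction ratio r and probability of use p for a few procedures (Table 1);
# the prerequisite map below is hypothetical, for illustration only.
procs = {
    "signal_quality": (20, 0.97),
    "beat_detection": (70, 1.00),
    "classification": (50, 0.88),
    "axis_determination": (300, 0.85),
}
prereq = {"classification": "beat_detection",
          "axis_determination": "beat_detection"}

def schedule(procs, prereq):
    """Greedy front-loading: repeatedly pick the eligible procedure
    (prerequisite already scheduled) with the highest r * p."""
    order, done = [], set()
    while len(order) < len(procs):
        eligible = [n for n in procs
                    if n not in done and (n not in prereq or prereq[n] in done)]
        best = max(eligible, key=lambda n: procs[n][0] * procs[n][1])
        order.append(best)
        done.add(best)
    return order

print(schedule(procs, prereq))
# → ['beat_detection', 'axis_determination', 'classification', 'signal_quality']
```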
The second experiment consisted in a quantitative measurement of the information-stream reduction ratio for each subroutine in the ECG processing chain, comparing the expected throughput of the outputs and the inputs.
C. Concepts of software architecture
Asynchronous computing and the data-buses concept were considered as two alternative approaches to the new architecture design.
Asynchronous computing is based on the different validity periods of metadata. In a system with variable data-validity intervals, the triggering procedure works individually for each diagnostic procedure and is located at the end of the corresponding processing path. Since the path usually contains multiple procedures in a chain, each data request reuses existing metadata as long as it is within its validity period; otherwise the request is transmitted to the previous procedure (Fig. 3). Since for some metadata the validity period is longer than for the final data, only a fraction of the triggers reaches the beginning of the processing chain, avoiding unnecessary processing of a huge amount of raw data.
[FIGURE 3 OMITTED]
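The trigger-propagation scheme of Fig. 3 can be sketched as a chain of nodes with individual validity periods: a request reuses cached metadata while it is still valid and only otherwise travels upstream. Node names and validity values here are invented for the illustration.

```python
class Node:
    """One procedure in the chain; it recomputes only when its cached
    metadata has outlived its validity period (a minimal sketch)."""
    def __init__(self, name, validity, source=None):
        self.name, self.validity, self.source = name, validity, source
        self.cached_at = None        # logical time of last computation
        self.computations = 0

    def request(self, now):
        # Cached metadata still valid: reuse it, do not trigger upstream.
        if self.cached_at is not None and now - self.cached_at < self.validity:
            return
        # Otherwise propagate the trigger towards the signal source...
        if self.source is not None:
            self.source.request(now)
        # ...and recompute this node's metadata.
        self.computations += 1
        self.cached_at = now

# Chain: raw acquisition -> heart-beat description -> rhythm report
raw = Node("raw", validity=1)
beats = Node("beats", validity=5, source=raw)
rhythm = Node("rhythm", validity=2, source=beats)

for t in range(10):                  # one report request per time unit
    rhythm.request(t)
print(raw.computations, beats.computations, rhythm.computations)  # 2 2 5
```

Only 2 of the 10 requests reach the raw-signal node, mirroring the claim that just a fraction of the triggers propagates to the beginning of the chain.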
Data buses are inter-procedure information channels described by statistical parameters of the data estimating the expected throughput (Fig. 4). Each data flow was assigned a throughput level combining the average data stream when in use, the frequency of use (depending on the data refresh rate) and the probability of use (depending on the frequency of related disease occurrence).
[FIGURE 4 OMITTED]
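A bus's throughput level can be estimated, for example, as the product of the three factors named above. The formula, the 12-bit sample width and the pacemaker-bus record size are our assumptions for this sketch; the 3% probability of use for pacemaker data follows Table 1.

```python
def expected_throughput(bits_per_transfer, refresh_rate_hz, prob_of_use):
    """Expected bus throughput in bits/s: size of one transfer, times how
    often it is refreshed, times the probability the consumer needs it."""
    return bits_per_transfer * refresh_rate_hz * prob_of_use

# Raw 12-lead signal bus: 500 sps per lead; the 12-bit sample width is
# our assumption for illustration. Always used (p = 1).
raw_bus = expected_throughput(12 * 12, 500, 1.0)   # 72,000 bits/s

# Pacemaker-pulse bus: hypothetical 64-bit event records refreshed about
# once per beat (~3 Hz); probability of use 3% as in Table 1.
pm_bus = expected_throughput(64, 3, 0.03)

print(raw_bus, pm_bus)
```

The several-orders-of-magnitude gap between the two buses is what motivates restricting raw-signal access to as few procedures as possible.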
Optimization of the data buses restricts access to the unprocessed data representation for subroutines with high peak throughput and high frequency of use. Such procedures were consequently redesigned to use a common and very reliable data interface. A particular challenge is the consolidation of all functions having access to the raw signal at their inputs (Fig. 1).
The software-architecture rearrangements are constrained by the logical flow of the ECG diagnostic procedure. In some rare cases the data processing chain has to follow a specified order, providing first the general information (e.g., that a heart beat was detected) and then more precise details (e.g., morphology type or wave lengths). Within these constraints, the reduction-effective and frequently used procedures were identified in the second experiment.
The experiment on the incorrect interpretation probability (see II.A) resulted in a priority level, computed from the statistical reliability and usage description, attributed to each basic ECG interpretation procedure (Table 1).
The final architecture of the ECG interpretation software is based on the data-buses concept and, after optimization for reliability and early reduction of the data stream, contains three raw-signal access points (Fig. 5):
--common interface for signal quality estimation, baseline estimation and pacemaker detection procedures,
--heart beat detection filtered input,
--waves measurement and axis determination filtered off-line buffer (provided also for ST measurement and P-averaging not considered here).
[FIGURE 5 OMITTED]
The group of functions accessing the raw signal issues a complete description of each heart beat (bus 2), which is not the diagnostic outcome but contains all the meta-data necessary for further processing, so the raw signal is no longer needed.
These data appear only occasionally, once per heart beat, but even for a heart rate as high as 180 beats per minute the data stream is 8.2 times lower than for a 12-lead 500 sps raw signal. Should the remaining interpretation procedures be performed by another node of a distributed network, breaking the local processing at this point is a convenient solution. The architecture optimization was performed on standard ECG interpretation software provided for tests by a regional manufacturer.
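The arithmetic behind the 8.2x figure can be checked by solving for the implied size of one beat description; the 12-bit raw sample width is our assumption, while the remaining numbers come from the text.

```python
BITS_PER_SAMPLE = 12                 # assumed raw sample width
raw_bps = 12 * 500 * BITS_PER_SAMPLE # 12 leads x 500 sps = 72,000 bits/s

beats_per_s = 180 / 60               # worst-case 180 beats per minute
reduction = 8.2                      # stream reduction reported in the text

# Implied size of one complete heart-beat description on bus 2:
beat_descriptor_bits = raw_bps / reduction / beats_per_s
print(round(beat_descriptor_bits))   # about 2927 bits, i.e. ~366 bytes
```

Under this assumption, a few hundred bytes per beat suffice to carry all meta-data downstream, which is plausible for wave boundaries, amplitudes and classification labels.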
The structured source code was written in the C++ programming language. The original target application was an interpretive bedside ECG recorder, and the purpose of the optimization was the manufacturer's interest in migrating to a wearable computer platform. The third experiment aimed at separate assessment of the original and the modified architectures by two factors:
--average reduction of data rate at the subsequent processing stages,
--average inaccuracy and error probability for selected diagnostic parameters.
Table 2 compares the average data reduction ratio at subsequent stages of the interpretation process. The right column highlights the difference between the original and the optimized architectures, and shows that significant data reduction was achieved by the modified architecture at the early stages of the interpretation process.
Since the processing stages could not be set identically for the original and the modified software, to keep the testing conditions comparable we used the processing time as an estimate of the interpretation progress. The processing stages were set at every 20 percent of the total interpretation time. This relative approach compensates for the variability of the processing time of particular ECG files, and for the average processing time, which is shorter for the redesigned software.
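The relative-progress normalization can be sketched as follows: each file's own total processing time is divided into 20% checkpoints and the remaining data size is read off at each one. The timeline values are hypothetical, chosen to reproduce the original-architecture column of Table 2 for a single file.

```python
def reduction_at_stages(timeline, n_stages=5):
    """timeline: chronologically ordered (time, data_size) samples for one
    ECG file. Returns the data size, as % of the initial size, at every
    20% of that file's own total processing time, so that files with
    different processing times can be averaged together."""
    total_t = timeline[-1][0]
    initial = timeline[0][1]
    stages = []
    for k in range(n_stages + 1):
        target = total_t * k / n_stages
        size = [s for t, s in timeline if t <= target][-1]  # latest sample so far
        stages.append(round(100.0 * size / initial))
    return stages

# Hypothetical file finishing after 7.5 s; sizes chosen to match the
# original-architecture column of Table 2.
timeline = [(0.0, 1000), (1.5, 780), (3.0, 540), (4.5, 320), (6.0, 140), (7.5, 80)]
print(reduction_at_stages(timeline))  # [100, 78, 54, 32, 14, 8]
```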
The result accuracy of the modified architecture was tested according to international standards requirements (14). The quantitative results for both architectures are summarized in Table 3. Comparing the diagnostic reliability of isolated procedures (Table 1) with the corresponding results of the whole processing chain (Table 3) leads to the conclusion that, in the optimized architecture, the overall reliability of each parameter is much less affected by the remaining procedures of the ECG interpretation chain.
The work presented in this paper was motivated by recent changes in cardiac monitoring techniques towards the application of modern digital communication technology. The classical approach to the ECG interpretation processing chain was revised, and important software-architecture modifications were proposed to overcome two principal drawbacks:
--necessity of raw signal access on the advanced processing stages,
--cumulative error propagation resulting from data dependencies in the processing chain.
Both aspects were thoroughly studied, and the research results were applied to real interpretive software, taking the opportunity of co-operation with an ECG equipment manufacturer. The modular software was modified only at the subroutine-interconnection level, without changes or adjustment of the mathematical methods. The main result is the relative improvement of diagnostic-outcome accuracy and data-stream reduction, rather than their absolute values. Therefore, any manufacturer may check his software for concordance with the guidelines issued herein. The aim of our research was fully achieved. We proved that the software-architecture optimization improves the interpretation in the following ways:
--it moves reduction-effective functions to the front of the processing chain and consequently reduces the inter-procedure data flow, thus lowering the communication costs in the case of distributed processing,
--it reduces cumulative error propagation by the parallel use of multiple short processing chains instead of one long chain,
--it reduces the interpretation processing time and the required computational power, and thus extends the autonomy time of wearable devices.
This scientific work was financed from State Committee for Scientific Research resources in the years 2004-2007 as research project No. 3 T11E 00127.
(1.) Chiarugi F, Trypakis D, Kontogiannis V, Less PJ, Chronaki CE, Zeaki M, et al. Continuous ECG monitoring in the management of pre-hospital health emergencies. Comput Cardiol 2003: 30; 205-8.
(2.) Pinna GD, Maestri R, Gobbi E, La Rovere MT, Scanferlato JL. Home telemonitoring of chronic heart failure patients: novel system architecture of the home or hospital in heart failure study. Comput Cardiol 2003: 30; 105-8.
(3.) Banitsas KA, Georgiadis P, Tachakra S, Cavouras D. Using handheld devices for real-time wireless teleconsultation. Proceedings of the 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEMBS '04); 2004 Sept 1-5; 2004; 2: 3105-8.
(4.) Bar-Or A, Healey J, Kontothanassis L, Van Thong JM. BioStream: a system architecture for real-time processing of physiological signals. Proceedings of the 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEMBS '04); 2004 Sept 1-5; 2004; 2: 3101-4.
(5.) IBM electrocardiogram analysis program physician's guide (5736-H15). 2nd edition. IBM; 1974.
(6.) HP M1700A interpretive cardiograph physician's guide. 4th edition. Hewlett-Packard; 1994.
(7.) DRG MediArc Premier IV operator's manual, version 2.2. 1995.
(8.) ECAPS-12C User guide: interpretation standard revision A. Nihon Kohden. 2001.
(9.) CardioSoft Version 6.0 Operator's Manual. GE medical systems information technologies, Inc: Milwaukee; 2005.
(10.) Paoletti M, Marchesi C. Low computational cost algorithms for portable ECG monitoring units. IFMBE Medicon 2004. Proceedings of the Mediterranean Conference on Medical and Biological Engineering; 2004 July 31-August 5; Naples, Italy; 2004. p. 231.
(11.) Moody G. MIT/BIH arrhythmia database distribution. Massachusetts Institute of Technology, Division of Health Science and Technology: Cambridge, MA; 1993.
(12.) Willems JL. Common standards for quantitative electrocardiography. 10th CSE Progress Report. ACCO publ: Leuven; 1990.
(13.) Straszecka E, Straszecka J. Uncertainty and imprecision representation in medical diagnostic rules. IFMBE Medicon 2004. Proceedings of the Mediterranean Conference on Medical and Biological Engineering; 2004 July 31-August 5; Naples, Italy; 2004. p. 172.
(14.) IEC 60601-2-47. Medical electrical equipment: particular requirements for the safety, including essential performance, of ambulatory electrocardiographic systems. 2001.
AGH University of Science and Technology, Krakow, Poland
Address for Correspondence: Piotr Augustyniak, AGH University of Science and Technology, 30 Mickiewicza Ave., 30-059 Krakow, Poland. Phone: +48126174712 Fax: +48126341568 E-mail: august@agh.edu.pl
Table 1. Basic ECG interpretation procedures, their statistical parameters and attributed priority levels

Procedure name               [delta], %  [epsilon], %  r     p, %  Priority level
Signal quality assessment    10          3.3           20    97    1
Pacemaker pulse detection    <1          8.3           70    3     4
Heart beat detection         1.5         2.5           70    100   2
Baseline estimation          3           <1            20    97    3
Heart rate estimation        <1          <1            1     100   1
Heart beat classification    10          3             50    88    1
Waves measurement            3           5             100   85    2
Axis determination           3           5             300   85    3
Dominant rhythm detection    0           8.5           1.5   100   1
Arrhythmia detection         0           10            1.3   80    2

ECG - electrocardiogram, [delta] - outcome relative inaccuracy, [epsilon] - probability of false outcome, r - data reduction ratio, p - probability of use

Table 2. Average data reduction ratio (%) on subsequent stages of interpretation process

Interpretation progress        Data reduction related to raw signal, %      Data reduction
(% of total processing time)   Original architecture  Optimized architecture   gain, %
0                              100                    100                      0
20                             78                     47                       31
40                             54                     31                       23
60                             32                     22                       10
80                             14                     12                       2
100                            8                      8                        0

Table 3. Diagnostic parameters quality achieved by the original and the optimized architectures

                              Original architecture, %   Optimized architecture, %
Interpretation domain         [delta]     [epsilon]      [delta]     [epsilon]
Pacemaker pulse detection     2.8         9.3            1.5         9.0
Heart beat detection          2.5         3.5            1.7         2.9
Baseline estimation           4.3         1.3            4.3         1.3
Heart rate estimation         1.0         1.2            1.0         1.2
Heart beat classification     14.0        7.1            12.0        4.0
Waves measurement             5.1         7.5            3.3         5.3
Axis determination            6.3         7.8            3.7         5.1
Dominant rhythm detection     0.0         10.5           0.0         8.8
Arrhythmia detection          0.0         13.0           0.0         11.8

[delta] - outcome relative inaccuracy, [epsilon] - probability of false outcome