Data-driven process and operational improvement in the emergency department: the ED dashboard and reporting application.

EXECUTIVE SUMMARY
Emergency departments (EDs) in the United States are expected to provide consistent, high-quality care to patients. Unfortunately, EDs are encumbered by problems associated with the demand for services and the limitations of current resources, such as overcrowding, long wait times, and operational inefficiencies. While increasing the effectiveness and efficiency of emergency care would improve both access and quality of patient care, coordinated improvement efforts have been hindered by a lack of timely access to data.
The ED Dashboard and Reporting Application was developed to support data-driven process improvement projects. It incorporated standard definitions of metrics, a data repository, and near real-time analysis capabilities. This helped acute care hospitals in a large healthcare system evaluate and target individual improvement projects in accordance with corporate goals. Subsequently, there was a decrease in "arrival to greet" time--the time from patient arrival to physician contact--from an average of 51 minutes in 2007 to the goal level of less than 35 minutes by 2010.
The ED Dashboard and Reporting Application has also contributed to data-driven improvements in length of stay and other measures of ED efficiency and care quality. Between January 2007 and December 2010, overall length of stay decreased 10.5 percent while annual visit volume increased 13.6 percent. Thus, investing in the development and implementation of a system for ED data capture, storage, and analysis has supported operational management decisions, gains in ED efficiency, and ultimately improvements in patient care.
Emergency departments are critical to the delivery of healthcare for Americans. EDs have long been the destination of choice for acute medical conditions such as major trauma, strokes, and heart attacks, and they are increasingly the entry point for patients, handling nearly 50 percent of hospital admissions (Elixhauser and Owens 2006; Owens and Elixhauser 2006). In addition, EDs continue to serve as a safety net for millions of Americans. Between 1996 and 2006, annual ED visits increased 32 percent, reaching approximately 120 million visits, yet the number of hospital EDs decreased nearly 5 percent (Pitts et al. 2008). This has placed an enormous strain on the capacity resources of EDs. Reports of ED crowding are increasing, as are the associated negative consequences, such as ambulance diversion, prolonged patient wait times, increased patient complaints, decreased staff satisfaction, decreased physician productivity, and suboptimal clinical outcomes (Derlet and Richards 2000; Richards, Navarro, and Derlet 2000; Liu, Hobgood, and Brice 2003; Eckstein and Chan 2004; Richardson 2006; Pines and Hollander 2008; Pines et al. 2008).
While reducing crowding could alleviate these negative consequences, simply adding capacity to meet demand is not feasible for most facilities. Accordingly, there have been serious calls to increase the effectiveness and efficiency of emergency care (CFEC/BHCS/IOM 2007). Most improvement efforts have focused on the creation and implementation of a series of projects with a specific throughput focus (e.g., faster triage; improved lab turnaround for troponin), a targeted clinical entity (e.g., acute ST elevation myocardial infarction), or a particular strategy (e.g., order sets) (Bradley et al. 2006; Hwang et al. 2010; Wiler et al. 2010). However, these projects have demonstrated that improvement in the ED is a moving target; each change to the ED workflow can reveal new bottlenecks in the overall ED throughput. This cycle will continue until improvement efforts are based upon dynamic data that reflects the current state of all ED activity.
The development of a tool that provides continuous measurement of the entire ED throughput experience, such as the ED Dashboard and Reporting Application created by HCA Inc., would expose deficiencies in the current system. This tool would subsequently provide a window into the whole system, allowing for rapid assessment of change within the ED and identifying the targeted areas in need of improvement. However, major barriers to the creation of such a tool are the standardization of definitions for ED metrics and establishment of a repository for data that would allow for near real-time analysis. For the ED Dashboard and Reporting Application to be successful as both a process improvement project as well as a business and operational tool, significant resources had to be dedicated to the selection, refinement, and deployment of standard ED metrics.
Defining ED Metrics
In order to provide safe, timely, efficient, and cost-effective emergency care, providers must have access to data that allows them to gauge and measure their performance. Yet until recently (Welch et al. 2006; CMS and TJC 2010; Welch et al. 2010), there has been a lack of national agreement on definitions for ED metrics. At the time this project was developed, there was an ongoing discussion about what processes to measure and how to measure them (Graff et al. 2002; Lindsay et al. 2002). While the Centers for Disease Control and Prevention regularly reported select national data (CDC 2010), other professional groups had yet to come to a consensus about proposed ED performance measures. As a result, a variety of interpretations existed for even the most straightforward of measures. For example, the definition of "arrival time" ranged from the time of registration, to the time the patient requests services, to the patient's self-report of arrival time. The lack of a structured, unified definition of the event, or of a process to capture that event, effectively impeded consistent data collection, comparative analysis, and process improvement.
In 2004, HCA recognized that this lack of industry agreement was an impediment to the operational efficiency of healthcare providers. Thus, the decision was made to invest in the infrastructure necessary to gather, standardize, and deploy standard definitions as a tool for process improvement. An ED Dashboard team was developed to help select the metrics. This team chose metrics that would allow for facility-specific analysis, deliberately limiting the number of measures in order to maintain focus on ED performance improvement. Five standard metrics were selected for inclusion in the ED Dashboard: arrival date/time; placed in bed date/time; MD/DO/PA/NP initiates contact/greets patient date/time; date/time patient dispositioned; and date/time patient physically leaves ED.
The establishment of universal definitions for these measures was a considerable challenge. Development of a system-wide consensus standard for ED measures as a part of the ED Dashboard and Reporting Application was accomplished by soliciting input from ED leadership from across the company and incorporating guidance from the current emergency management literature and CMS (CMS and TJC 2010). The definitions developed for the ED Dashboard were highly relevant to the field in general and were in agreement with subsequent definitions proposed by various professional groups, such as the Emergency Nurses Association (ENA), American College of Emergency Physicians (ACEP), and the Emergency Department Benchmarking Alliance (EDBA) (Welch et al. 2006).
However, these definitions required several revisions in order to maximize consistency in reporting. For example, a lack of specificity in the definition of "greeted by the provider time" led to a variety of interpretations by facility staff. This variation in turn limited the usefulness of the data set in comparisons between facilities. Revisions were made, incorporating specific changes in response to ongoing and concurrent feedback gathered from users via e-mail, conference calls, site visits, and observations. In short, proposed definitions of metrics were implemented and evaluated in a preliminary form during development. Revised definitions, shown in Exhibit 1, were subsequently incorporated into the final ED Dashboard application.
The process of defining and refining measures highlighted the need for both specificity and flexibility within the system. The effort to define standard definitions for metrics was crucial to the acceptance of the ED Dashboard and Reporting Application into the specific ED workflow of individual facilities. The remainder of this article discusses the components of the ED Dashboard and Reporting Application, the implementation process, and associated results. Also presented are lessons learned and conclusions about the usefulness of the ED Dashboard and Reporting Application in quality improvement programs.
HCA Inc. is one of the largest healthcare providers in the United States, managing 163 hospitals, 112 surgery and endoscopy centers, and more than 600 physician practices, oncology centers, and imaging facilities in 23 states and England. HCA-affiliated facilities (collectively, "HCA") include general community, suburban, and rural hospitals as well as academic health centers and tertiary-referral hospitals. These facilities provide approximately 5 percent of the major hospital services in the United States. The patient population served by these facilities is highly diverse; 38 percent of HCA primary service areas are considered multicultural (US Census 2000; Hospital Corporation of America 2006).
HCA has 180 distinct EDs. In addition to community-based EDs, these include 18 freestanding ED centers, 2 academic ED centers, 5 pediatric ED centers, and 4 critical access ED centers. There are 3 EDs designated as Level 1 trauma centers, 13 designated as Level 2 trauma centers, 9 designated as Level 3 trauma centers, and 12 designated as Level 4 trauma centers. In 2009, HCA EDs handled nearly 6 million patient visits, or an average daily visit volume of over 15,000. Average annual volume per ED was approximately 32,000.
Data were collected through each facility's Meditech portal as well as through another ED documentation system, T SystemEV (www.tsystem.com; Dallas, TX). Data were collected both on paper and electronically as part of the patient care experience and documentation of that experience. Those facilities without electronic documentation recorded the data on paper, and those data were subsequently entered into Meditech. Nursing staff as well as providers (physicians, physician assistants, and nurse practitioners) and registration staff all contributed to the data-collection process.
Performance tolerances were established for individual data elements at the corporate level (for example, maximum time for arrival to discharge). These tolerances were developed based on evaluation of current metrics across the company while respecting both target goals and outliers. Facilities were permitted to set more restrictive tolerances; however, those tolerance levels were only used for internal monitoring.
Access to the database was granted using a tiered security protocol. Access was granted to any individual expressing a desire or need to know ED throughput as a part of their work duties, including ED directors and leadership, ED medical staff leaders, hospital leadership, quality and risk personnel, and ancillary services personnel including laboratory and imaging. The tiered security protocol restricted the level of detail given to users according to role. For instance, all authorized users had access to summary data. In contrast, only the ED director had access to daily patient-level data at his or her facility.
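A tiered protocol of this kind can be modeled as a role-to-fields mapping that filters report detail before it reaches the user. The sketch below is purely illustrative; the role names and report fields are hypothetical, not HCA's actual protocol:

```python
# Minimal sketch of role-based filtering of report detail.
# Role names and field names are illustrative assumptions.
SUMMARY_FIELDS = {"facility", "month", "mean_arrival_to_greet_min"}
PATIENT_LEVEL_FIELDS = SUMMARY_FIELDS | {"patient_records"}

ROLE_FIELDS = {
    "ed_director": PATIENT_LEVEL_FIELDS,    # daily patient-level detail
    "hospital_leadership": SUMMARY_FIELDS,  # summary data only
    "ancillary": SUMMARY_FIELDS,
}

def filter_report(report, role):
    """Return only the fields the given role is authorized to see."""
    allowed = ROLE_FIELDS.get(role, set())
    return {k: v for k, v in report.items() if k in allowed}

report = {"facility": "A", "month": "2010-06",
          "mean_arrival_to_greet_min": 31,
          "patient_records": ["..."]}
print(sorted(filter_report(report, "hospital_leadership")))
# ['facility', 'mean_arrival_to_greet_min', 'month']
```

The point of the design is that every authorized user queries the same repository; only the projection of the data changes by role.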
As collected data were self-reported, it was essential to include the ability to check and correct data entries that were missing, incorrect, or otherwise not feasible (e.g., arrival time after departure time). Patient records with missing, out-of-tolerance, or negative values for any of the data elements were automatically tagged for review and editing. To edit the data, the ED director or authorized designee referenced the patient chart to correct missing or out-of-tolerance data in the ED Dashboard; an outline of this process is presented in Exhibit 2. Any missing data in the dashboard utility would exclude that patient from the particular measure; each data point was based on the number of patients that had complete and accurate data for that measure.
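The automated tagging described above amounts to a simple rule check run over each record. A minimal sketch, with an illustrative tolerance value and hypothetical field names (not HCA's actual settings):

```python
from datetime import datetime

# Hypothetical tolerance, in minutes, for arrival-to-leave (illustrative only)
MAX_ARRIVAL_TO_LEAVE_MIN = 24 * 60

def flag_for_review(record):
    """Return the reasons a record should be tagged for review,
    mirroring the checks described in the text: missing timestamps,
    negative intervals, or out-of-tolerance values."""
    arrival = record.get("arrival_time")
    leave = record.get("leave_time")
    if arrival is None or leave is None:
        return ["missing timestamp"]
    minutes = (leave - arrival).total_seconds() / 60
    if minutes < 0:
        return ["negative interval (arrival after departure)"]
    if minutes > MAX_ARRIVAL_TO_LEAVE_MIN:
        return ["out of tolerance"]
    return []

# A record with arrival after departure is tagged for review, not silently dropped
bad = {"arrival_time": datetime(2010, 1, 1, 12, 0),
       "leave_time": datetime(2010, 1, 1, 11, 0)}
print(flag_for_review(bad))  # ['negative interval (arrival after departure)']
```

Tagged records go to the ED director or designee for chart-based correction rather than being discarded, which preserves the total-population principle of the database.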
Auditing and improvement of the database occurred at every level within the organization. As more electronic documentation was introduced, the data integrity improved, as did the ability to audit real time and retrospectively. Notably, each facility was given the autonomy and responsibility to collect data and implement successful process improvement programs to meet corporate goals. This encouraged facilities to collect data consistently and with integrity and to make prudent use of the data to improve processes and correct problems. Facilities' efforts to collect accurate data were supported by the previously discussed standard definitions, communication of clear goals from corporate leadership, and technological support as necessary.
All of the edited, self-reported data from the ED Dashboard were de-identified and imported nightly into the reporting application. Additional company-collected information, such as financial, human resources, and satisfaction data, was also incorporated into the reporting application for use in customizable analyses.
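The nightly de-identification step is not specified in detail in the text. One common approach, sketched here with hypothetical field names, is to drop direct identifiers and substitute a one-way hashed visit key so visits can still be counted without being traced back to a patient:

```python
import hashlib

# Illustrative identifier fields; the actual HCA field set is not described
IDENTIFYING_FIELDS = {"name", "mrn", "ssn"}

def deidentify(record, salt="nightly-salt"):
    """Strip direct identifiers and attach a salted one-way hash as the
    visit key (a sketch of a generic de-identification step, not HCA's
    documented process)."""
    clean = {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}
    clean["visit_key"] = hashlib.sha256(
        (salt + str(record.get("mrn", ""))).encode()
    ).hexdigest()[:16]
    return clean

record = {"name": "Test Patient", "mrn": "12345",
          "arrival_time": "2010-06-01T08:00"}
clean = deidentify(record)
print(sorted(clean))  # ['arrival_time', 'visit_key']
```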
All reports generated through this application followed the same general structure. Turnaround time (TAT) could be calculated for any of 11 combinations of arrival, triage, bed, greet, disposition, or leave times (e.g., arrival to bed, greet to disposition). Reports were divided into four categories: TAT ED, TAT Radiology, TAT Lab, and TAT Executive. TAT ED segmented the data set according to criteria such as time (shift/hour, time period, day of week, delay code), patient attribute (gender, age, race, language, payer class), provider (primary care provider, midlevel, ED physician, emergency services provider), or other (diagnosis, D/C disposition, triage level). TAT Radiology reported turnaround time for certain radiology tests; this function will be expanded into its own dashboard in future revisions. TAT Lab reported turnaround time for a sample test representing each lab modality. TAT Executive reported data summaries by month, year, trend, and so on. An example report menu display is shown in Exhibit 3. All reports could be presented as graphs, and these graphs could be printed or downloaded in an editable format.
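The report logic described above reduces to computing interval statistics between pairs of milestone timestamps over the complete-data subset, consistent with the exclusion rule for missing data. A minimal sketch (function and field names are illustrative):

```python
from datetime import datetime
from statistics import mean

# Ordered ED milestones as named in the text
MILESTONES = ["arrival", "triage", "bed", "greet", "disposition", "leave"]

def mean_tat(records, start, end):
    """Mean turnaround time in minutes from the `start` to the `end`
    milestone, using only records with both timestamps present
    (incomplete records are excluded from the measure)."""
    deltas = [
        (r[end] - r[start]).total_seconds() / 60
        for r in records
        if r.get(start) is not None and r.get(end) is not None
    ]
    return mean(deltas) if deltas else None

visits = [
    {"arrival": datetime(2010, 5, 1, 8, 0), "greet": datetime(2010, 5, 1, 8, 30)},
    {"arrival": datetime(2010, 5, 1, 9, 0), "greet": datetime(2010, 5, 1, 9, 26)},
    {"arrival": datetime(2010, 5, 1, 10, 0), "greet": None},  # excluded
]
print(mean_tat(visits, "arrival", "greet"))  # 28.0
```

Segmented reports (by shift, payer class, provider, and so on) follow the same pattern, with the record set filtered by the chosen attribute before the interval statistic is computed.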
Prior to the ED Dashboard, available data was fragmented and gathered in a nonstandard manner. Consistent data collection was often stymied by the burden of data management. Available data was therefore limited to a segment of the population or a distinct time period as collected to support a process-change effort in a particular facility. Early efforts to standardize data definitions and reporting processes uncovered wide variation, significant outliers, and erroneous data. These observations led to the development of key principles for the proposed ED Dashboard and Reporting Application: standardized metrics, company-established goals, consistent data collection, inclusion of total population, ability to scrub data after collection, and methods to include or exclude noted outliers from analysis.
The ED Dashboard and Reporting Application, including the established standard definitions of metrics (Exhibit 1), was fully implemented in 2006. Facilities were expected to meet the corporate goal for arrival to greet time of less than 45 minutes; this goal was later revised to 35 minutes. While the ED Dashboard was not intended as a self-contained improvement program, it provided access to data for sub-cycles within the arrival to greet parameter: arrival to triage, triage to bed, and bed to greet. Facility leadership was instructed to use the data to specifically identify problem areas within their individual workflows and implement process improvement activities to decrease arrival to greet time. Overall arrival to greet time decreased from 51 minutes in 2007 to 28 minutes in 2010, as shown in Exhibit 4.
Exhibit 5 displays arrival to leave time (shown in the exhibit as bars) and annual visit volume (shown as a line) for 2007 through 2010. Arrival to leave time, also known as length of stay (LOS), decreased from an average of 200 minutes in 2007 to 179 minutes in 2010. During this time, annual visit volume increased: 5,116,100 visits in 2007; 5,246,400 visits in 2008; 5,593,500 visits in 2009; and 5,801,432 visits in 2010.
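The reported 10.5 percent decrease in length of stay follows directly from the averages above; a quick arithmetic check (purely illustrative):

```python
# Length-of-stay improvement: (200 - 179) / 200 = 10.5 percent
los_2007_min, los_2010_min = 200, 179
pct_decrease = (los_2007_min - los_2010_min) / los_2007_min * 100
print(f"{pct_decrease:.1f}%")  # 10.5%
```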
[Exhibit 4 omitted]
The ED Dashboard and Reporting Application was intended as a tool to measure, analyze, and store ED throughput data in order to aid in the deployment of process improvement programs. Challenges to the development and implementation of the ED Dashboard and Reporting Application ranged from the practical (creating standard definitions, technical limitations to data capture) to the abstract (fears about additional complications to the ED workflow or distraction from finding solutions to existing problems). Yet overcoming these challenges resulted in a robust database that could be used for the development of both process and operational improvement programs.
[Exhibit 5 omitted]
The ED Dashboard and Reporting Application provided near real-time access to data that allowed for the deployment of various interventions to improve ED throughput. Early in its existence, the ED Dashboard and Reporting Application was utilized as a process improvement tool, answering specific questions about throughput based on diagnosis, patient characteristics, and other criteria. Using the continuous, near real-time data-processing ability of the ED Dashboard and Reporting Application, facilities could monitor the average time from arrival to greet, identify specific problems within sub-cycles (arrival to triage, triage to bed, bed to greet), and target improvement activities accordingly. For instance, arrival to bed time for a particular facility may include time to triage, time spent waiting in the lobby, and time spent with registration. Through the monitoring and analysis capabilities of the ED Dashboard and Reporting Application, this facility could determine if overall goals were being met and, if problems existed, whether triage, waiting, or registration caused the delays.
The ED Dashboard and Reporting Application was not in itself an improvement project but rather allowed for the implementation of specific interventions, such as immediate bedding, rapid triage, rapid medical evaluation for low-acuity patients, or expansion and renovation of an emergency department in the context of a facility benefit. The access to facility-specific data allowed for the targeting and testing of changes to maximize benefit, aiding in the implementation of the right improvements for a particular facility. The associated decrease in LOS underscores the usefulness of this project and the value of reliable and easily accessible data. Additionally, the measured decrease in LOS occurred while overall ED visit volume increased by 13.6 percent, suggesting that the targeted improvement projects facilitated by the ED Dashboard increased ED efficiency and allowed facilities to respond to and accommodate changes in volume.
In addition to acting as a repository for ED throughput data, the ED Dashboard also provided consensus about standard terminology. Through this project, HCA pushed forward to define specific metrics, later comparing these standards to other national groups (Welch et al. 2006; CMS and TJC 2010; Welch et al. 2010). The consequences of this are twofold. First, definition of standard metrics facilitated the clear collection of data from all ED patients across a diverse array of facilities. This allowed for the identification of opportunities for improvement amidst the noise of individual hospital characteristics, imperfect data, and a persistent lack of industry agreement on definitions. Second, as national standards are adopted into reporting requirements (National Quality Forum 2009; CMS and TJC 2011), the ED Dashboard is already positioned for compliance.
This project is not without limitations. The HCA ED Dashboard and Reporting Application was an ambitious and costly project, requiring a large investment of resources. At the time of development, there were no commercially available systems that would function as desired, and the entire system had to be built using available internal information technology resources. Thus, the dedication and commitment of corporate leadership was essential to the project. As such, the size and structure of the HCA system was an advantage that might limit generalizability. In the intervening years, however, several vendors have developed products with similar functionality. Therefore, providers considering a similar system must now evaluate the "make or buy" decision; we recommend verifying that a commercially purchased system accepts data from any source, allows for editing, and produces near real-time reports.
Through the implementation process, we also learned that the amount of user effort and engagement necessary to complete this project should not be underestimated. Facility-level users were asked to collect and edit data as well as aid in definition development. As the ED workflow is highly demanding even without additional responsibilities, many facility managers expressed concern that these activities would necessitate the hiring of additional personnel or assignment of additional work duties. Thus, extra effort was taken to engineer this project with the end user in mind and alleviate these concerns. The tool was generally simple and easy to use even for individuals with limited comfort using databases. Reports were built to answer common questions, and the ease of accessing data and reports reinforced the use of the application. As the utility of the application in improving patient care became more apparent, acceptance was nurtured through widespread education in combination with consistent support from executive leadership. Eventually, the program reinforced itself, as improvement activities based on this application were successful and in turn drove increased demand for more data.
Ultimately, the ED Dashboard and Reporting Application can help to increase understanding of the effect of process changes and business decisions on the quality and efficiency of patient care. Through the measurement of the entire ED population, the ED Dashboard puts comprehensive, actionable information into the hands of those who need it. The robustness of the database has allowed queries to grow in complexity and has revealed how changes in ED throughput affect operational metrics or how operational changes alter flow metrics. As a result, the ED Dashboard is becoming a powerful tool to support operational management decisions. While this function will be investigated more fully in the future, facilities have already utilized dashboard metrics as part of their marketing and patient engagement materials, from billboards to mobile applications, to promote short wait times within local markets. Likewise, executive leaders have utilized this application as a tool for evaluating capital investment and hiring assessments as well as indicating where changes in ED size, type, bed number, staffing, or other resources would have the maximum patient benefit. When correlated with satisfaction data, ED Dashboard data could also indicate changes that may improve overall patient care. Altogether, the use of the ED Dashboard as an operational tool will help illuminate changes that could advance ED services as both a critical product line and a portal to other services.
Prior to this program, we observed that process improvement in the ED was often stymied by a lack of data; programs could not be effectively targeted without a comprehensive understanding of ED throughput and the variables that affect ED throughput. Similarly, the burden of data collection--from technology limitations to staffing requirements--interfered with process improvement projects. Investing in a tool that aided the collection, storage, analysis, and near real-time availability of ED throughput data provided the necessary transparency for improvement.
While this project required a large commitment of resources, the efforts to clearly define measures, ensure an accurate database, and provide near real-time access to data and reports allowed for improvement efforts to be highly focused and reactive to the current state. This robust information enabled the entire system to respond to and grow with changes in volume while making continuous improvements to patient wait times. The successful creation and implementation of this project aided in both process improvement and operational management decisions, further supporting the creation of similar tools to drive gains in ED efficiency and ultimately improve patient care.
Thank you to HCA Information Technology and Services for their assistance in the technical development of the ED Dashboard and Reporting Application.
Bradley, E. H., J. Herrin, Y. Wang, B. A. Barton, T. R. Webster, J. A. Mattera, S. A. Roumanis, J. P. Curtis, B. K. Nallamothu, D. J. Magid, R. L. McNamara, J. Parkosewich, J. M. Loeb, and H. M. Krumholz. 2006. "Strategies for Reducing the Door-to-Balloon Time in Acute Myocardial Infarction." N Engl J Med 355 (22): 2308-2320.
Centers for Disease Control and Prevention (CDC). 2010. "National Hospital Ambulatory Medical Care Survey." Retrieved August 18. www.cdc.gov/nchs/ahcd.htm.
Centers for Medicare & Medicaid Services and The Joint Commission (CMS and TJC). 2011. "Specifications Manual for National Hospital Quality Measures." QualityNet. Retrieved June 3. www.qualitynet.org.
--. 2010. "Specifications Manual for National Hospital Quality Measures." QualityNet. Retrieved August 18. www.qualitynet.org.
Committee on the Future of Emergency Care in the United States Health System, Board on Health Care Services, and Institute of Medicine (CFEC/BHCS/IOM). 2007. Hospital-Based Emergency Care: At the Breaking Point. Washington, DC: National Academies Press.
Derlet, R. W., and J. R. Richards. 2000. "Overcrowding in the Nation's Emergency Departments: Complex Causes and Disturbing Effects." Ann Emerg Med 35 (1): 63-68.
Eckstein, M., and L. S. Chan. 2004. "The Effect of Emergency Department Crowding on Paramedic Ambulance Availability." Ann Emerg Med 43 (1): 100-105.
Elixhauser, A., and P. Owens. 2006. Reasons for Being Admitted to the Hospital through the Emergency Department, 2003, HCUP Statistical Brief #2. Rockville, MD: Agency for Healthcare Research and Quality. www.hcup-us.ahrq.gov/reports/statbriefs/sb2.pdf.
Graff, L., C. Stevens, D. Spaite, and J. Foody. 2002. "Measuring and Improving Quality in Emergency Medicine." Acad Emerg Med 9 (11): 1091-1107.
Hospital Corporation of America. 2006. HCA Diversity and Inclusion 2006 Report. Nashville, TN: HCA, Inc.
Hwang, H., K. Baumlin, J. Berman, N. K. Chawla, D. A. Handel, K. Heard, E. Livote, J. M. Pines, M. Valley, and K. Yadav. 2010. "Emergency Department Patient Volume and Troponin Laboratory Turnaround Time." Acad Emerg Med 17 (5): 501-507.
Lindsay, P., M. Schull, S. Bronskill, and G. Anderson. 2002. "The Development of Indicators to Measure the Quality of Clinical Care in Emergency Departments Following a Modified-Delphi Approach." Acad Emerg Med 9 (11): 1131-39.
Liu, S., C. Hobgood, and J. H. Brice. 2003. "Impact of Critical Bed Status on Emergency Department Patient Flow and Overcrowding." Acad Emerg Med 10 (4): 382-85.
National Quality Forum. 2009. National Voluntary Consensus Standards for Emergency Care: A Consensus Report. Washington, DC: National Quality Forum.
Owens, P., and A. Elixhauser. 2006. Hospital Admissions That Began in the Emergency Department, 2003, HCUP Statistical Brief #1. Rockville, MD: Agency for Healthcare Research and Quality. www.hcup-us.ahrq.gov/reports/statbriefs/sb1.pdf.
Pines, J. M., and J. E. Hollander. 2008. "Emergency Department Crowding Is Associated with Poor Care for Patients with Severe Pain." Ann Emerg Med 51 (1): 1-5.
Pines, J. M., S. Iyer, M. Disbot, J. E. Hollander, F. S. Shofer, and E. M. Datner. 2008. "The Effect of Emergency Department Crowding on Patient Satisfaction for Admitted Patients." Acad Emerg Med 15 (9): 825-31.
Pitts, S. R., R. W. Niska, J. Xu, and C. W. Burt. 2008. "National Hospital Ambulatory Medical Care Survey: 2006 Emergency Department Summary." Natl Health Stat Report (7): 1-38.
Richards, J. R., M. L. Navarro, and R. W. Derlet. 2000. "Survey of Directors of Emergency Departments in California on Overcrowding." West J Med 172 (6): 385-88.
Richardson, D. B. 2006. "Increase in Patient Mortality at 10 Days Associated with Emergency Department Overcrowding." Med J Aust 184 (5): 213-16.
US Census. 2000. "US Census Report." www .census.gov.
Welch, S. J., B. R. Asplin, S. Stone-Griffith, S. J. Davidson, J. Augustine, and J. Schuur. 2010. "Emergency Department Operational Metrics, Measures and Definitions: Results of the Second Performance Measures and Benchmarking Summit." Ann Emerg Med 58 (1): 33-40.
Welch, S., J. Augustine, C. A. Camargo Jr., and C. Reese. 2006. "Emergency Department Performance Measures and Benchmarking Summit." Acad Emerg Med 13 (10): 1074-80.
Wiler, J. L., C. Gentle, J. M. Halfpenny, A. Heins, A. Mehrotra, M. G. Mikhail, and D. Fite. 2010. "Optimizing Emergency Department Front-End Operations." Ann Emerg Med 55 (2): 142-60, e141.
James E. Bliffen, FACHE, corporate performance improvement manager, MedStar Health, Columbia, Maryland; and Benjamin Rosenthal, FACHE, CPHQ, corporate director, PI Measurement and Reporting, MedStar Health, Columbia, Maryland
Many organizations, including our own health system, struggle with the development of standard definitions of metrics and ongoing maintenance of a data repository with real-time analysis capabilities. On any system-wide application install and many performance improvement projects, much of the initial work is devoted to this effort. Often, we are as pleased to have created the working groups and processes to reach agreement on the definitions and metrics as with any initial outcome demonstrated by the data. Accordingly, it is gratifying to see that the investment of significant time and resources for data management within a large system such as HCA has led to real and sustained gains in efficiency and improvements in patient care processes and related outcomes.
The choice of emergency departments (EDs) to demonstrate this accomplishment seems particularly apt as they tend to operate discretely within hospitals and most face similar and well-documented issues of overcrowding and inappropriate use. Focused as they are on immediate and high-level response to medical emergencies, ED practitioners do not always worry so much about delays and other process issues for nonurgent care. However, service demand can vary widely during any 24-hour period, providing opportunities during slow periods to review and scrub the process data being collected. This is a critical factor in obtaining reliable, usable data for analysis. Too often, in our experience observing patient flow in the ED, a complete understanding of processes is distorted by the lack of accurate data collection during critical times. With so many individuals responsible for recording data points at various locations and times, it is almost impossible to know what was missed until too late. HCA makes a strong effort to capture and record accurate and inclusive process data on every patient and, equally important in our view, establishes systemwide performance tolerances for the individual data elements. We believe this was crucial to the successful results obtained.
Nevertheless, we question whether EDs without computerized record keeping could allocate the resources necessary for such a comprehensive data collection effort. Many of the requested metrics are milestones needed from physicians and nurses who are necessarily focused on the patient emergency at hand, not longer-term process improvement. Our system of hospitals will attempt to collect much of this data automatically from a new electronic medical record application. As we work with clinicians and administrators on the design and build, we expect to establish their ownership of the data and commitment to process improvement using it. HCA found it took considerable time for the program to reinforce itself, but that the results now speak for themselves. We believe the publication of this work will help us in the establishment of data repositories with standard definitions acceptable to our ED professionals.
It would be interesting to know from HCA how many of their EDs are regularly using the database for local performance improvement work, which provider and patient characteristics are demonstrating the greatest variation, and whether the desire to oversee progress in real time might lead to new functional reporting relationships that complement the traditional ED management structure.
Suzanne Stone-Griffith, RN, MSN, CNAA, assistant vice president of emergency services, Clinical Services Group, HCA; Jane D. Englebright, PhD, RN, chief nursing officer and vice president, Clinical Services Group, HCA; Dickson Cheung, MD, MBA, MPH, attending physician, Sky Ridge Medical Center, CarePoint, PC; Kimberly M. Korwek, PhD, medical writer, Clinical Services Group, HCA; and Jonathan B. Perlin, MD, PhD, MSHA, FACP, FACMI, chief medical officer and president, Clinical Services Group, HCA
For more information about the concepts in this article, please contact Dr. Perlin at firstname.lastname@example.org.
"HCA," "Company," "we," "our" or "us," as used herein refers to HCA Inc. and its affiliates unless otherwise stated or indicated by context.
EXHIBIT 1
Standard Metric Definitions Utilized in the ED Dashboard

Arrival Date/Time: The date/time the patient arrives/signs into the ED; the time when the patient actually arrives in the ED.
* Should be when the patient arrives (before they sit down), not when they complete their sign-in form or are acknowledged by a nurse/greeter.
* Time stamp clocks should be used at the front door and back door (EMS) to document arrival time. Clocks should be matched to Meditech or an atomic clock.

Triage Date/Time: The date/time when triage begins (by a nurse).
* Not data collection by a tech/paramedic.
* Can be the rapid triage time.

Placed in Bed Date/Time: The date/time the patient is placed into a bed, stretcher, or chair in the ED treatment area.
* Can be any space where a medical screening exam (MSE) can be initiated, even if it is not a permanent space (i.e., front, back, hallway, triage room).

MD/DO/PA/NP Initiates Contact/Greets Patient Date/Time: The date/time the MD/DO/PA/NP has initial contact to begin the MSE or rapid evaluation.
* When the MD makes a connection with the patient.
* Does not include watching the patient walk into the room or when the physician picks up the chart.

Date/Time Patient Physically Leaves ED: The date/time that the ED patient physically leaves the ED treatment/bed area, regardless of destination.