An on-line cost analysis system.
In other words, administrators are asking us to cut costs and be more productive. But what are our costs? Which tests are the most expensive? Which are the least expensive? What does it cost us on average to service DRG 20?
Our laboratory's on-line cost analysis system is now generating the right answers. We call it on-line because it provides, as near as possible, the real-time, up-to-the-minute cost of tests. This kind of system will be an invaluable laboratory management tool in coming years.
St. Luke's is a 473-bed, university-affiliated hospital located on the eastern edge of Cleveland. The department of pathology has approximately 115 full-time equivalents, and our workload was just over 5,000,000 CAP units last year.
Our particular laboratory information system (LIS) uses a 16-bit minicomputer--not a tabletop microcomputer--with 500Kb of semiconductor memory. There are four 50-megabyte disk drives and a single 188-megabyte drive. The system presently supports nearly 100 total peripherals. It is not small.
We wanted computer programs that would perform cost analyses along the lines of our manual cost analysis procedures. We also sought automated updating of test costs as costs of supplies, labor, and other items changed. The manual analyses required hours of tedious work and were highly prone to mathematical errors. Our goal was to cut the time and increase the accuracy.
It's clear that the fewer places a single piece of information is stored, the easier it is to keep it up to date. That's the approach we took. So, for example, when we update the cost of 10 X 75 mm test tubes, that single entry is referred to by the computer and extended to all methods using the tubes. The system actually holds pointers, not data. It's rather like a librarian who may not know all the answers but does know where to find them.
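The pointer scheme described above can be sketched in a few lines. The item codes, prices, and test definitions here are hypothetical, but they show the key idea: methods store catalog numbers and quantities, never prices, so one inventory update flows through to every test.

```python
# Single source of truth: one cost entry per inventory item.
# (All codes and prices below are invented for illustration.)
cost_inventory = {
    "TUBE-1075": 0.04,   # 10 x 75 mm test tube, cost per unit
    "REAGENT-A": 0.12,   # reagent cost per mL
}

# Methods hold pointers (catalog numbers) plus quantity per test, not prices.
methods = {
    "GLUCOSE": [("TUBE-1075", 1), ("REAGENT-A", 2.0)],
    "BUN":     [("TUBE-1075", 2), ("REAGENT-A", 1.5)],
}

def supply_cost(test_code):
    """Extend the current inventory prices to one test's supply cost."""
    return sum(qty * cost_inventory[item] for item, qty in methods[test_code])

# Updating the tube price in ONE place changes every method that uses it.
cost_inventory["TUBE-1075"] = 0.05
print(round(supply_cost("GLUCOSE"), 2))  # 1*0.05 + 2.0*0.12 = 0.29
```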
Wading into cost analysis can become very confusing very fast. One way to retain equilibrium is to focus on the two types and three levels of costs.
The types are direct and indirect, and both must be examined exhaustively to calculate the real cost of a test. Direct costs are the actual expenses for supplies, reagents, technologist time, and instrumentation required to perform a test. They are fairly easy to calculate. Indirect costs are the problem. We have to examine and include the cost of running quality control and purchasing QC materials. We also must account for such nonrevenue time as holidays, vacations, sick leave, and continuing education. And there are miscellaneous costs--we always have 4 X 4 gauze around, but to which tests do we assign it? Last but not least, we must factor in the assigned hospital and laboratory overhead.
It will be argued that quality control is a direct cost. I agree, but it still must be handled as an indirect cost, because there is no straight one-to-one relationship between patient tests and quality control. So we calculate the total cost of quality control for a procedure and then prorate it to each patient test.
The three levels of costs are global, section-specific, and test-specific. Global costs apply to all tests in the laboratory. These include hospital-assigned overhead and such laboratory overhead as the lab director's and lab manager's salaries, clerical wages, office expenses, and the cost of running the LIS. Global costs are divided by total laboratory workload to arrive at a cost per workload unit, then allocated to each test in the lab according to its CAP weight. These calculations can be based either on the previous year's workload or on the lab workload updated to the time of the cost analysis.
The second level of costs, section-specific, includes average salaries and other items that vary from chemistry to hematology to microbiology: for example, general supplies, quality control, and travel/education. These are divided by total section workload, then allocated to each test in the section according to its CAP weight. Finally, test-specific costs include the specific supplies and reagents used on each test, and instrument costs. While some of these distinctions can blur a bit, they should be intuitively obvious for the most part.
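The global and section allocations described above reduce to a cost-per-workload-unit rate at each level, multiplied by the test's CAP weight. A minimal sketch, with invented dollar figures and workloads:

```python
# Hypothetical annual figures for illustration only.
global_overhead = 600_000.0        # hospital-assigned + lab overhead
total_lab_workload = 6_000_000.0   # CAP units, whole laboratory

section_costs = 150_000.0          # e.g. one section's salaries, supplies, QC
section_workload = 2_000_000.0     # CAP units, that section

def allocated_cost(cap_weight):
    """Global + section-specific cost allocated to one test by CAP weight."""
    global_rate = global_overhead / total_lab_workload   # $ per CAP unit
    section_rate = section_costs / section_workload      # $ per CAP unit
    return cap_weight * (global_rate + section_rate)

# A test carrying a CAP weight of 5.0 units:
print(round(allocated_cost(5.0), 3))  # 5 * (0.10 + 0.075) = 0.875
```

Test-specific costs (supplies, reagents, instrument) are then added on top of this allocated amount.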
Figure I displays calculations of hospital and laboratory overhead costs per test at a hypothetical 500-bed hospital, based on an annual lab workload of 6,000,000 CAP units. Note that this total covers all work performed, including quality control, standards, blanks, and repeats. So if quality control represents 30 per cent of all testing, 1,800,000 CAP units would be missed in billing. Remove the 1,800,000 units from the workload, and overhead costs per test increase correspondingly, as the adjusted calculations indicate in Figure II.
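The adjustment between Figures I and II is simple arithmetic: removing the non-billable QC units from the workload raises the overhead rate per billable unit. The overhead figure below is assumed; the workload numbers follow the text.

```python
overhead = 600_000.0      # hypothetical annual overhead, dollars
workload = 6_000_000.0    # all work performed, CAP units (per the text)
qc_fraction = 0.30        # QC share of all testing (per the text)

rate_all = overhead / workload               # $/CAP unit over all work
billable = workload * (1 - qc_fraction)      # 4,200,000 billable units
rate_billable = overhead / billable          # $/CAP unit over billable work

print(round(rate_all, 4), round(rate_billable, 4))  # 0.1 0.1429
```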
Figure III shows how instrument and quality control costs per test are calculated in many cases. The instrument cost combines lease cost or depreciation and maintenance costs. There are two quality control costs. The direct cost is the actual expense of running the quality control. The other cost is the expenditure for QC materials.
These calculations become tricky when dealing with discrete analyzers and quality control materials with many components. In general, we find the percentage of total patient results accounted for by the particular test and apply that to the annual instrument cost and the total number of QC tests on the instrument (to obtain the direct QC cost). To calculate QC material costs, we first compile the total number of patient results for which the material was used over a 12-month period. From this total usage and the annual material cost, we are able to calculate the cost per patient result.
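The proration just described can be sketched as follows. Every count and dollar figure here is hypothetical; the structure mirrors the three per-test costs: instrument, direct QC, and QC material.

```python
# Hypothetical annual figures for a shared discrete analyzer.
annual_instrument_cost = 24_000.0   # lease/depreciation + maintenance
total_patient_results = 120_000     # all patient tests on the instrument
total_qc_tests = 18_000             # QC runs on the instrument
qc_cost_per_run = 0.50              # direct cost of running one QC test

qc_material_cost = 3_600.0          # annual cost of one QC material
results_using_material = 90_000     # patient results the material covered

# Share of the instrument's work done by one particular test:
test_results = 30_000
share = test_results / total_patient_results          # 0.25

instrument_per_test = annual_instrument_cost * share / test_results
direct_qc_per_test = total_qc_tests * share * qc_cost_per_run / test_results
material_per_result = qc_material_cost / results_using_material

print(round(instrument_per_test, 3),   # 0.2
      round(direct_qc_per_test, 3),    # 0.075
      round(material_per_result, 3))   # 0.04
```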
Our computer system's workload statistical files contain information on the number of patient tests, patient repeats, QCs, blanks, standards, and QC repeats in a fairly clear and accessible format. In preparing a cost analysis, the system looks at the latest 12-month statistical period--through February 1984, for example, if the analysis is performed in March.
A test is primed for future analyses when we enter its name, supplies, reagents, instrument name if any, quality control materials, and any miscellaneous cost items. All one needs to know in order to do this is the test code; the cost inventory catalog numbers of the reagents, supplies, and QC materials; and the volume or number of each item per test.
During an actual analysis, the computer asks only for the test code and where to print the report. It locates the global and section overhead data and technologist salary cost and then examines the cost inventory. It finds each item, checks the cost per workload unit, and calculates the cost per test.
Under our old manual system of performing a cost analysis on a single test, the section supervisor spent 30 to 60 minutes in initial preparation and calculations. The assistant laboratory manager then spent 10 to 15 minutes reviewing and checking the math, and a secretary put in another 15 minutes typing the finished copy. The total value of this time, including fringe benefits, was roughly $18. This is probably a conservative figure, considering that the lab manager and his assistant usually review the data more than once and return some of the material to the supervisors for corrections. Therefore, the total cost of analyzing 300 procedures under our manual system was at least $5,400.
Each test analysis can now be performed by a supervisor in less than 10 minutes. Add five minutes for the lab manager or assistant to review the data, and the cost for a typical procedure is about $3. For more than 300 procedures, the cost is $900. That's a saving of $4,500--or about 83 per cent--for the initial analysis.
Subsequent computer analyses are essentially cost-free. It takes less than 30 seconds to run each, as opposed to another 30 to 45 minutes ($10) under a manual system.
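The savings arithmetic above checks out directly:

```python
# Per-test labor values from the text (fringe benefits included).
manual_per_test = 18.0     # supervisor + assistant manager + secretary
computer_per_test = 3.0    # supervisor + reviewer

procedures = 300
manual_total = procedures * manual_per_test      # $5,400
computer_total = procedures * computer_per_test  # $900
saving = manual_total - computer_total

print(saving, round(saving / manual_total * 100))  # 4500.0 83
```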
In addition to the obvious time savings, staff resistance to the drudgery of doing cost analyses disappears. The final report is easy to read and free from typographical errors. As a result, the reviewer need only check the content rather than proofread the typing or the arithmetic.
The system also supports prospective analysis of tests that are not currently on the laboratory's menu. For these tests, the system solicits the CAP weight, the projected number of patient and QC tests per year, and the number of results. Once these data are entered, the prospective analysis proceeds the same as for any other test. This is one of the system's most popular features. It supports decisions regarding sendouts versus in-house testing and supplies rational charges for new tests.
The system must accommodate a number of exceptions. Most laboratory sections perform quality control on a test-by-test basis, but microbiology and the blood bank do it on a section basis. So we enter their QC budgets in the section-specific area. These costs are assigned to all the tests in the section according to their workload value.
For the future, we plan to add the capability of temporarily changing any parameters and having the system immediately recalculate the cost. We also plan a complete DRG analysis system that will check the cost of all patient tests for a given Diagnosis Related Group. Both these programs are scheduled for implementation in late 1984.
The system has been eminently successful in our institution. Other hospitals may feel quite differently about calculating costs, and our system may or may not fill their needs. But the system has a great deal of inherent flexibility; as we experiment, we find that it will tolerate many different approaches to cost analysis.
A major advantage is the ability to spotlight areas where costs are high and should be reduced. The ability to undertake prospective analysis is another distinct advantage. Should your laboratory stop doing theophylline tests on the HPLC and acquire a new drug monitoring instrument? Put the data in the computer, enter the number of tests, and decide for yourself. It takes about 15 minutes if you have the reagent, instrument lease, and maintenance costs.
The system is written in Ratfor (Rational Fortran). Our hospital is negotiating to sell it to our computer vendor for general distribution. We also hope to have a microcomputer version available later this year.
In the seven months since its implementation at St. Luke's, the system has slashed the time devoted to test cost evaluations by 75 per cent. That's cost-effective cost analysis.
Title Annotation: updates costs per test and simplifies complex calculations
Author: Whitehouse, Clyde R.
Publication: Medical Laboratory Observer
Date: Mar 1, 1984