
The past as prologue: a look at the last 20 years.

In 1969, a new journal entered the growing field of laboratory medicine. Medical Laboratory Observer arrived at an auspicious and crucial time, for the next two decades would be among the most productive and turbulent in the field's relatively brief history.

After the tumultuous year of 1968, with all the political turmoil and student unrest, 1969 seemed to offer new beginnings. There was a new administration in Washington, the Vietnam war was winding down from the intensity of 1968, and a spirit of gloom about the nation's future was turning to one of cautious optimism.

Those of us in lab medicine were reflecting with some satisfaction on the achievements of the preceding two to three decades and looking forward with much enthusiasm to developments that were in their early stages or just over the horizon.

The introduction of quality control concepts and procedures to the clinical laboratory had been one of the major achievements of the previous 20 years. Before the advent of this system of monitoring performance, the laboratory staff depended on its collective analytical skill, but this did not guard against errors due to deterioration of reagents and standards and other sources of bias. The staff had to rely on complaints from clinicians to alert it to such problems. As the staff became busier with growing test volume, however, close contact between the laboratory and clinicians diminished.

With the introduction of their quality control chart to clinical chemistry in 1950, Levey and Jennings gave the laboratory its first dependable internal system of monitoring the reliability of test results. Then, in 1958, medical technologists Freier and Rausch presented the first comprehensive daily quality control program for laboratories.
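The Levey-Jennings idea can be illustrated in a few lines: results on a control material are plotted against limits derived from that material's established mean and standard deviation. The sketch below is a minimal modern illustration, not the authors' original procedure, and the glucose control values are invented for the example.

```python
import statistics

def levey_jennings_limits(baseline):
    """Derive Levey-Jennings chart limits (mean +/- 2 SD and 3 SD)
    from a baseline run of control-material results."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return {
        "mean": mean,
        "warn": (mean - 2 * sd, mean + 2 * sd),    # 2 SD "warning" limits
        "action": (mean - 3 * sd, mean + 3 * sd),  # 3 SD "action" limits
    }

def flag(result, limits):
    """Classify a daily control result against the chart limits."""
    lo3, hi3 = limits["action"]
    lo2, hi2 = limits["warn"]
    if result < lo3 or result > hi3:
        return "out of control"
    if result < lo2 or result > hi2:
        return "warning"
    return "in control"

# Invented glucose control values (mg/dL), for illustration only.
baseline = [98, 101, 100, 99, 102, 97, 100, 101, 99, 103]
limits = levey_jennings_limits(baseline)
print(flag(100, limits))  # in control
print(flag(120, limits))  # out of control
```

In daily use, each control result is added to the chart so that drifts and shifts become visible at a glance, which is what gave the laboratory its first internal warning system independent of clinician complaints.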

As the reliability of laboratory results grew, so did respect for the results. This led medicine to depend more extensively on the laboratory.

Another singular achievement, in 1957, was to have a profound effect on clinical chemistry and medicine as a whole in the decade to come. That year Technicon unveiled the first commercial continuous-flow automatic analyzer-the AutoAnalyzer, designed by Leonard Skeggs.

This approach placed reasonably reliable, rapid analysis of several major blood constituents within the reach of most hospital laboratories. It also enabled laboratories to produce large quantities of data in a short period of time with a reasonable amount of labor. Thus it heralded a technological revolution in clinical chemistry.

Multichannel automated analysis soon followed, bringing on another new development-biochemical screening. In 1966, Ralph Thiers and his associates suggested the following strategy: If biochemical analyses can be performed rapidly, accurately, and economically, why not include them in the initial examination of a patient, testing for a number of biochemical parameters in serum? Unsuspected disease might be detected earlier, and the workup of the patient expedited.

This was indeed an exciting prospect, and many took up the challenge. Screening batteries became part of the admission examination in hospitals and clinics. Mass screening of well populations was also proposed by some as a means of disease detection.

Another revolution in the making came from an unexpected direction. The Berson-Yalow studies on the state of ¹³¹I-insulin in plasma led to their development of a radioimmunoassay of plasma insulin. From the concept of this analytical approach came the competitive binding assay, an innovation that helped place almost all biologically important substances-including hormones, vitamins, polypeptides, pressor amines, and pharmaceutical agents-within range of analysis and greatly increased the scope and range of clinical laboratory testing.

Other important changes were taking place. Ouchterlony's double immunodiffusion method in gel led to the development and application of immunoelectrophoresis by Grabar and Williams in 1953; now the clinical laboratory had a sensitive method for the detection and quantitation of a large variety of proteins. In 1966, Laurell introduced "rocket" immunoelectrophoresis, which greatly reduced analytical time.

Chromatography, discovered by Tswett in 1906 during a study of plant pigments, led to the development of gas-liquid partition chromatography by Martin and James in 1952. This powerful analytical tool was still largely unexploited by laboratory medicine in 1969, but new developments were on the horizon.

Moving-boundary electrophoresis, introduced by Tiselius in 1937, paved the way for electrophoresis on stabilized media, zone electrophoresis, and paper and cellulose acetate electrophoresis-powerful tools for the separation and identification of proteins.

Many other analytical, conceptual, and scientific advances were reshaping the environment and the work of the clinical laboratory. The Beckman DU spectrophotometer appeared in the 1950s and dominated the field of absorption spectrophotometry. Flame photometers also came into use in the 1950s, permitting ready and rapid determinations of sodium and potassium, previously measured by laborious chemical methods and therefore sadly neglected. The atomic absorption spectrometer, which appeared soon after, made calcium and magnesium determinations much more feasible and rapid.

For the most part, the examples I cited are drawn from clinical chemistry. Many outstanding advances occurred in other areas of laboratory medicine in the 1950s and 1960s. Among these were the development of cytogenetics and the discovery of several major chromosomal abnormalities, including the Philadelphia chromosome found in 1960 by Nowell and Hungerford in chronic myelogenous leukemia.

The major histocompatibility complex (the HLA system) was uncovered during this period by Dausset et al and others, leading to greatly improved prospects for organ transplant surgery.

In 1965, Gold and Freedman reported a protein, carcinoembryonic antigen (CEA), thought to be specific for colon cancer. That ushered in an era of "tumor markers," which would become more important in the 1970s and 1980s.

Computers had moved into the clinical laboratory by 1969, offering the promise of much improved data processing and management. With the development of integrated solid state electronic circuits in the early 60s, sophisticated computers could be produced at lower cost, making possible their entry into the clinical laboratory. In 1969, the task of developing "interactive systems" that could communicate directly with laboratory personnel and instruments (on line) was just beginning.

These are only a few examples of the scientific and technological advances, innovations, and discoveries that characterized the field of laboratory medicine in the two decades preceding the birth of MLO. Such advances also underlay a number of other developments.

For example, an improved understanding of the pathophysiological basis of diabetes mellitus and other endocrine disorders followed Berson's and Yalow's notable 1959 breakthrough on insulin assay. Similarly, the introduction of flame photometry and its rapid and precise estimation of serum K+ concentration undoubtedly saved the lives of many patients in diabetic coma and in postoperative states who previously would have died from unsuspected hypokalemia.

The advent of Medicare and Medicaid in 1965 and passage of the Clinical Laboratory Improvement Act of 1967 marked the beginning of a new era of government interest, participation, and involvement in health care and laboratory medicine in the United States, an involvement that was to grow in the decades to come.

This, in brief, was the situation in 1969, when MLO was launched. Physicians and scientists in lab medicine took justifiable pride in the advances that had been made in the previous two decades, a time filled with the promise of continued progress, discovery, and innovation.

Could we have predicted back then what was to come in the next two decades? In some areas, probably yes. It would have been fairly safe to forecast further advances in automation, perhaps even to the extent of a fully automated chemistry section. We might also have predicted that multiphasic screening would become much more highly developed and very successful, but here we would have been wrong.

Computers would have been given a rosy future in the clinical laboratory, and we might have justifiably expected new technological developments in chromatography, spectroscopy, and other analytical areas. Finally, it would not have been hard to prophesy that government intervention in medicine and laboratory medicine would increase.

Automation forged ahead strongly in the 1970s and 1980s. The rigidity of the early multi-channel continuous-flow systems gave way to new ones with discrete sampling and random access, allowing much more flexibility. While the earlier systems were outstanding for admission screening and multiphasic screening, they did not meet the needs of many hospital laboratories for greater flexibility and versatility in high-volume batch chemical analysis.

Du Pont's ACA system and the Hycel Mark X had made their appearance in 1968 and were being introduced into clinical laboratories 20 years ago. They were followed by the ABA Systems in 1971, Technicon's SMAC in 1973, Beckman's Astra in 1978, the Kodak 400 in 1979, and Technicon's Chem-1 in 1985, among others.

Most of these offered considerable advantages over earlier systems in their flexibility, versatility, durability, and direct on-line computer operation. By 1989, this field was quite well established, with a fine variety of instrumental and conceptual approaches to suit the requirements of individual laboratories.

Laboratory robotics began to blossom only in the 1980s. Robots were introduced in this decade to mechanize specimen preparation steps, relieving technologists of those tasks. This type of automation is bound to find more use in the next decade for many routine tasks, including transportation.

Computers boomed in the clinical laboratory during the 1970s, chiefly because of advances in microprocessor technology and the consequent development of interactive systems that could communicate directly with both personnel and instruments. New data processing capabilities revolutionized almost all large clinical chemistry labs and made possible the development of the versatile automated analytical systems that characterized this era.

The 1970s might well have been termed "The Decade of the Computer." Comparatively few laboratories, however, had adopted total laboratory information systems (LIS) by the mid-1980s, probably because of the anticipated expense. Widening LIS use is certainly on the horizon for the 1990s.

Mass screening and admission testing, which had shown such promise earlier, went into decline. Several studies in the 1970s in Britain, Australia, and the United States demonstrated that the yield from multiple biochemical screening on admission to hospitals and clinics was discouragingly low and did not justify the expense. No positive effects on outcome were noted. There were only negative effects: Such screening increased expenses and length of stay rather than decreasing them as hoped. By the 1980s, most clinical laboratories had abandoned this approach.

Ion-selective electrodes began to appear in the clinical laboratory in the early 1970s. Of course, the glass electrode pH meter-the grandfather of all electrodes-had by then been a fixture in the laboratory for two or three decades. In 1967, Ross introduced a calcium-selective electrode that made its way in the early 1970s into the laboratory for determination of ionized calcium, a long-sought goal.

Na+ and K+ electrodes followed in the latter 1970s, as did electrodes for other blood constituents, including glucose, ammonia, and urea. After initial problems with interference, some of these methods (notably for Na+ and K+) quickly replaced flame photometry in many laboratories. In some instances, the new methods were incorporated in automated analyzers. The next step was attempted transcutaneous monitoring of O₂, CO₂, and the electrolytes, an effort that will proceed into the 1990s.

Practical instrumentation for gas and liquid chromatography also arrived in the early 1970s. This approach-with its high resolution, sensitivity, rapidity, and ability to provide accurate simultaneous quantitation-offered great advantages over other types of separation analysis. Accurate and sensitive laboratory measurements of a large number of therapeutic agents were made possible.

As a result, therapeutic drug monitoring was much improved. Now effective levels of such drugs as the cardioactive agents lidocaine and procainamide and anticonvulsive agents could be monitored reliably, and treatment with these agents became safer and more effective.

One of the signal breakthroughs in science and technology in these decades was the introduction of hybridoma technology by Kohler and Milstein, permitting the ready production of monoclonal antibodies for a large number of biological uses. Antibodies with specific recognition for specific antigenic determinants could now be prepared. This advance has had an immense impact on immunochemistry and other areas of laboratory medicine, especially immunology. It has improved the specificity of immunoassays and contributed to the blossoming field of tumor markers.

For decades, the clinical laboratory has assisted in the diagnosis and management of cancer. The "Holy Grail" has been a test that would reliably signal the presence of cancer-always positive in its presence, always negative in its absence. This goal has eluded us and may always continue to do so. Since Gold's discovery of carcinoembryonic antigen, alluded to earlier, a new burst of interest developed in what came to be known as tumor markers. The rise of monoclonal antibody technology spurred this interest further, and a new goal emerged: To find serum or urine proteins specific for different types of cancer.

Though CEA proved on further study not to be specific for colon cancer, it still became very useful in following the treatment of such cancer, indicating the cancer's continued presence after therapy, its decrease or increase, and the volume of "tumor burden."

Many other tumor markers-CA 125 in ovarian cancer, CA 15-3, and DF-3, for example-have also been discovered. Most of these are onco-fetal antigens, but there are also specific enzymes, receptors, and ectopic hormones shed by tumors and considered markers. While none of these has yet been useful in screening, due to inadequate specificity, many are helpful in monitoring therapy.

Quality control methodology improved notably through the work of Westgard and Groth, who introduced new multiple rules and procedures. One of their objectives has been to reduce interlaboratory variation and thus improve transferability of laboratory data between centers, a highly worthwhile goal achieved in Scandinavia but not yet in this country. The work of Gilbert and Fraser created new interest in analytical goals in clinical chemistry, and the College of American Pathologists took up this cause.
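The multirule idea can be sketched with a toy check. This is a simplified illustration using only two of the published rules (1-3s and 2-2s) applied to invented z-scores; the full Westgard scheme includes further rules such as R-4s, 4-1s, and 10-x.

```python
def westgard_check(z_scores):
    """Apply two Westgard multirules to a run of control results
    expressed as z-scores (deviations from the mean, in SDs).

    1-3s: reject the run if any single result exceeds 3 SD.
    2-2s: reject if two consecutive results exceed 2 SD on the same side.
    """
    for z in z_scores:
        if abs(z) > 3:
            return "reject (1-3s)"
    for a, b in zip(z_scores, z_scores[1:]):
        if (a > 2 and b > 2) or (a < -2 and b < -2):
            return "reject (2-2s)"
    return "accept"

print(westgard_check([0.5, -1.2, 1.8]))  # accept
print(westgard_check([0.4, 2.3, 2.1]))   # reject (2-2s)
print(westgard_check([0.2, -3.4, 0.1]))  # reject (1-3s)
```

Combining rules in this way catches both random error (a single extreme outlier) and systematic error (a sustained shift to one side of the mean) with fewer false rejections than a single 2 SD limit alone.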

Quality control broadened into the concept of quality assurance, which addressed not only analytical quality but also reporting, data management, and the use of tests and data. This helped lead us back to the interface between the lab and clinical medicine and an interest in how tests are used and results interpreted.

Two events that accelerated this movement were the observations by Griner and Liptzin of large-scale overuse and misuse of laboratory tests in a teaching hospital and the publication of a monograph by Galen and Gambino that introduced decision analysis to a laboratory medicine audience worldwide. Normal values and normal ranges and their ambiguities gave way to reference values and reference ranges.

Many additional advances occurred during these productive decades in which medicine grew more dependent on the laboratory for its advancement as a science. Signal achievements occurred in other areas of laboratory medicine besides chemistry. Microbiology became increasingly automated and dependent on rapid analytical methods for identifying organisms. Virology greatly expanded, as did immunology.

Blood banking became transfusion medicine, with an accent on a wide variety of blood resources for specific therapeutic needs. Cell markers-specific cell proteins acting as antigenic determinants-became useful in hematology and histopathology for identifying specific cell types.

Above all, the "New Biology" began to enter laboratory medicine in the late 1980s, with all the new concepts and tools of molecular biology, and it offered unique promise of enriching all areas of laboratory medicine in the 1990s.

Predictably, government intrusion into the clinical laboratory increased during these years. The nationwide introduction of Diagnosis Related Groups (DRGs) in 1983 imposed a new cost consciousness on medicine, the hospital industry, and clinical laboratories. The hospital laboratory became less of a profit center and more of a cost center.

This made it more imperative for the laboratory director to provide strong justification when proposing acquisition of new equipment and introduction of new tests. It also spawned a "survival manual" for clinical laboratories.

These cost-containment measures, along with an increase in the so-called "managed care" HMOs and PPOs, shifted much of the health care dispensed by hospitals to ambulatory care settings, with a predictable impact on the hospital laboratory.

The dramatically rising work volume characteristic of many laboratories in the 1970s slowed to a more manageable pace of increase in the late 1980s. Yet well-managed laboratories continued to flourish and to perform their crucial role in support of patient care more effectively than ever before. After the exciting advances of the 1970s and 1980s, the future of laboratory medicine never looked brighter.
COPYRIGHT 1989 Nelson Publishing

Article Details
Title Annotation:history of laboratory medicine
Author:Benson, Ellis S.
Publication:Medical Laboratory Observer
Date:Jul 1, 1989