A history of computers and computerized imaging.

The computer has revolutionized diagnostic imaging, making possible techniques such as computed tomography, magnetic resonance imaging, sonography and computed radiography. This article traces the historical development of computers and demonstrates how their brief association with the radiologic sciences has transformed diagnostic medicine.

Until the early 1970s, radiographic imaging was accomplished basically the same way it had been since Wilhelm Conrad Roentgen first discovered the x-ray: A beam of x-ray energy was transmitted through an object and a sensitive image receptor recorded the results. Improvements had been made in the way the image was generated and recorded, but the basic process had not changed.

In the early 1970s, however, radiography underwent a revolution. The incorporation of computers into the image generation process during the 1970s was one of the biggest advances in diagnostic imaging since the discovery of the x-ray itself. The changes that computers have fostered, including the rise of new imaging techniques such as computed tomography and magnetic resonance imaging, continue even today.

This article discusses the development of the computer and examines its impact on radiology during the past 25 years.

Development of the Computer

Man's need to count, categorize and record information has expanded with his increasing societal evolution and complexity. Early man used his fingers to count and classify elementary objects. Aristotle was the first to suggest that our base-10 number system evolved simply because humans have five fingers on each hand.[1] When early man ran out of fingers and toes to count with, he probably progressed to groups of sticks or stones.

As mankind progressed and organized itself into tribes, societies, governments and economic systems, counting and methods of recording data became increasingly important. As a result, our ancestors created mechanical devices to aid them in their calculations. The abacus is an example of one early counting device. Although "abacus" is a Greek word and the Greeks are known to have used the device around 500 B.C., historians have credited its invention to the Babylonians.[1]

Another significant development in the history of computing was the establishment of the concept of zero. The Hindus, circa 800 A.D., were the first to describe this important idea.[1] The establishment of zero clarified the value of a number's position and made it possible to express all numbers by a combination of 10 symbols.

As the years passed, Eastern mathematical developments filtered into Europe. Arabic notation was introduced to Europe circa 1150. The invention of the printing press in the 1400s permitted the widespread dissemination of Arabic notation and other mathematical principles throughout the world.

Galileo (1564-1642) made inestimable contributions to the sciences in general and particularly to computing. He linked mathematics with the physical sciences,[2,3] making it possible to quantify and prove abstract physical principles.

John Napier (1550-1617), the Scottish mathematician who devised logarithms, also constructed a device for counting. Called the "Napier bones," the device consisted of a series of cylinders with numbers imprinted on them. By turning the cylinders, Napier was able to multiply and divide numbers.[4]

Based on Napier's early work, William Oughtred developed the slide rule in the 1620s. The slide rule, which continued to be used well into the 20th century, was a direct predecessor of the analog computer.[4]

French philosopher and mathematician Blaise Pascal (1623-1662) developed a computing machine that involved separate counters. When one of the counters reached its maximum capacity, it would rotate to zero and the counter to its left would increase its count by one unit.[5] Today, the name Pascal is associated with a high-level computer programming language.[6]
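
The carry principle at the heart of Pascal's machine is easy to illustrate in modern terms. The short Python sketch below is purely illustrative -- the function name and list-of-digits representation are assumptions for demonstration, not a description of Pascal's actual gearing:

```python
# Illustrative sketch of cascading decimal counters with carry:
# each "wheel" holds a digit 0-9; rolling past 9 resets it to zero
# and advances the wheel to its left by one unit.

def add_to_counter(wheels, amount):
    """Add `amount` to the rightmost wheel, propagating carries left.
    `wheels` is a list of decimal digits, most significant first."""
    i = len(wheels) - 1
    carry = amount
    while i >= 0 and carry:
        total = wheels[i] + carry
        wheels[i] = total % 10    # this wheel rolls over past 9
        carry = total // 10       # the wheel to its left advances
        i -= 1
    return wheels

print(add_to_counter([0, 9, 9], 1))   # -> [1, 0, 0]
```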

Gottfried von Leibniz (1646-1716), recognizing the need for faster and more accurate computations, continued to improve counting machine design. Whereas Pascal's machine was basically an adding machine, von Leibniz produced a machine that could multiply, divide and extract roots.[6] He also advocated the adoption of the binary number system for machine calculations.

In the mid-1800s, Englishman and computer pioneer Charles Babbage (1792-1871) designed a prototype of the modern computer. The concept for his "analytical engine" included a "store" where all necessary mathematical variables would be readied for operation and a "mill" where the actual calculations would be performed. In modern computers, these devices are known as the memory and the central processing unit.

The system of input for Babbage's analytical engine was a series of punched cards.[7,8] Babbage did not invent the punch card system; he adapted and improved the system developed by Joseph Marie Jacquard (1752-1834). Jacquard, a weaver, used punched cards to control weaving looms.

Babbage's work was futuristic. He and a friend, Lady Ada Lovelace, struggled to devise a method for programming the analytical engine. They worked on the machine for years but never completed it, partly due to the lack of an adequate power supply.

In an ironic twist, Babbage's work was not appreciated during his own lifetime. It was not until after the first electronic digital computer was developed, based on designs he promulgated in the 19th century, that Babbage's genius was recognized.

Herman Hollerith (1860-1929) was an American engineer who shared Babbage's love of tinkering. Indeed, Hollerith patented many inventions in addition to advancing the process of computing.

In 1890, Hollerith was asked by Dr. John Shaw Billings of the U.S. Census Office to devise a method to tabulate the 1890 census in a timely manner. The 1880 census had taken nine years to tabulate by hand. Hollerith was able to construct efficient calculating machines based on a punch card input system. His machines were so successful that the 1890 census was counted within 12 months.[9] Hollerith later founded a business called the Tabulating Machine Company, which made great strides toward revolutionizing business office machines. In the mid-1920s, long after Hollerith had sold his interest in the company, it changed its name to the International Business Machines Corporation -- better known today as IBM.

Although simple counting methods had progressed, scientists needed more sophisticated devices to solve complex equations. A large contribution was made by an engineer at the Massachusetts Institute of Technology, Vannevar Bush. In 1930, Bush and several colleagues built what was essentially an analog computer. The machine weighed 100 tons, contained nearly 200 miles of wire and required 2000 vacuum tubes.

Bush's scientific contributions were substantial, but his political acumen was of greater benefit to later computer development.[1] As one of the premier engineers of the day, he accurately interpreted the international political climate of the 1930s and met with President Franklin Roosevelt to advocate the organization of the U.S. scientific community in preparation for war. Convinced by the persuasive Bush, Roosevelt made him director of the newly created National Defense Research Committee. One of Bush's first acts was to enlist the support of the nation's universities.

A crucial problem for the U.S. Army during World War II was the calculation of accurate ballistic tables for artillery pieces. This was a challenging assignment for any mathematician and was the perfect project for a computer, which could factor in all of the variables and necessary computations. One of the universities involved in the war effort was the Moore School of Electrical Engineering at the University of Pennsylvania. It had acquired a differential analyzer -- Bush's 1930 invention -- to help calculate the firing tables.

John Mauchly, a physicist, was a graduate student and faculty member at the Moore School. There, Mauchly met J. Presper Eckert, a fellow graduate student and lab instructor. Both Mauchly and Eckert were conducting research that involved highly complex equations. In addition, both were working on the Army's ballistic tables using the differential analyzer. The accuracy of the firing tables produced with the differential analyzer was in doubt, but there was no alternative method.

Mauchly and Eckert began discussing the construction of a new type of electronic counting device -- a digital counting machine. Their idea was met with initial criticism because it would require thousands of vacuum tubes. The machine would be unreliable, critics argued, because the tubes would be subject to failure. Eckert overcame this objection by proposing to operate the tubes at less than full power.

Fervent work on the machine began in the spring of 1943. Electrical components were collected, inspected, modified and incorporated into the design. In May 1944, Mauchly and Eckert's creation was ready for its first test. It would be considered a success if, when fed a mathematical problem, the machine could store the information, process it according to programmed instructions and then provide a result. It succeeded.

Mauchly and Eckert's machine consisted of 40 separate modules, 17,468 vacuum tubes, 1500 electrical relays, 70,000 resistors and 10,000 capacitors. It weighed 30 tons, took up 1800 square feet and was 100 feet long, 10 feet high and 3 feet deep.[1] It was named ENIAC -- Electronic Numerical Integrator and Calculator -- and it was the first digital computer.[10]

Aided by John von Neumann, Mauchly and Eckert later produced a second computer, the EDVAC (Electronic Discrete Variable Automatic Computer). This computer's program was stored internally, based on von Neumann's concept.

The UNIVAC (Universal Automatic Computer) was produced in the 1950s, again by Mauchly and Eckert. The U.S. Census Bureau purchased UNIVAC, completing the circle begun 60 years earlier in the bureau's quest to streamline its number-crunching efforts.

In terms of physical size, these early computers were goliaths. They also had a voracious appetite for electricity, leading to the early conclusion that computer use would be limited to the government or big business. Further developments in the 1960s and 1970s, however, resulted in an explosion of computer technology that eventually affected all phases of American life.

Key among these developments were the invention of the transistor, the production of the integrated circuit and the introduction of the silicon chip.[10] Transistors replaced bulky, heat-producing vacuum tubes and were more reliable. Building on the transistor's small size, the integrated circuit reduced the size of the computer even further. Finally, with the invention of the silicon chip, it became possible to etch an entire central processing unit, as well as complex electrical circuits, onto an area the size of the head of a pin.

As computers became more reliable and less expensive, researchers began to investigate their potential utility in medicine. Computerization held particular promise for the medical imaging sciences.

Historical Perspective of Medical Imaging

Medical imaging's genesis came on Nov. 8, 1895, in the laboratory of a professor at the University of Wurzburg in Germany. Wilhelm Conrad Roentgen was experimenting with a Hittorf-Crookes tube that emitted cathode rays and caused certain materials to luminesce. Because he thought their effect might be obscured by the strong luminescence of the tube itself, he carefully covered the tube with pieces of black cardboard. After testing the tube to his satisfaction, Roentgen was preparing to interrupt the current when he suddenly noted a faint flickering glow on a nearby bench.[11] Excitedly, Roentgen lit a match and to his great surprise discovered that the source of the mysterious light was a barium platinocyanide screen lying on the bench.

Roentgen spent the next seven weeks in his laboratory, even taking his meals and sleeping there, attempting to validate his discovery. When asked what he thought upon observing the fluorescent light, Roentgen replied, "I did not think, I investigated."[12] During this time, he wrote to a friend, "I have discovered something interesting but I do not know whether or not my observations are correct."[11]

One evening, Roentgen persuaded his wife to be the subject for an experiment. He placed her hand on a cassette loaded with a photographic plate and made an exposure of 15 minutes. On the developed plate, the bones of her hand were clearly visible. Next, Roentgen prepared a manuscript titled "On a New Kind of Ray, a Preliminary Communication," which he submitted to the Wurzburg Physical Medical Society on Dec. 28, 1895.[13] In the manuscript, Roentgen coined the term "x-rays" to distinguish them from other known rays.[14] He also showed that many materials were transparent to x-rays, although in widely varying degrees. Roentgen had discovered a new type of radiation which, upon leaving the Crookes tube, was capable of passing through certain solids that are opaque to ordinary light.[15]

Roentgen sent copies of his article and prints of the x-ray picture to several well-known physicists. One of the scientists showed the prints to his father, who was a newspaper editor. The father immediately prepared an elaborate article on the revolutionary discovery, and within days the news had spread throughout the world. Almost overnight, Roentgen became "the focus of international praise, condemnation, and curiosity."[11]

A historical footnote to Roentgen's discovery is that the first x-ray picture actually was produced almost six years earlier. On Feb. 22, 1890, Professor Arthur Goodspeed of the University of Pennsylvania was demonstrating the properties of a Crookes tube to William Jennings, a photographer. Jennings had stacked several unexposed photographic plates next to the tube, on top of which were two coins, his trolley fare. When Jennings later developed the plates, some were mysteriously fogged and one contained an image of two round discs. The plates were filed away and forgotten for nearly six years. Only after Roentgen's discovery did the two men re-create the setting and grasp the magnitude of the observation that they had failed to make.

The first public demonstration of x-rays before a scientific body occurred Jan. 23, 1896, when Roentgen addressed the Wurzburg Physical Medical Society. He invited a famed anatomist, Albert Von Kolliker, to have his hand photographed by the rays. After an excellent x-ray picture of the hand was shown, the anatomist led the assemblage in three cheers for the discoverer and proposed that the new rays be called "Roentgen's Rays."[11] In 1901, Roentgen received the first Nobel Prize in physics for his discovery.

The x-ray's discovery prompted a flurry of scientific research worldwide. In 1896, fluoroscopy was detailed by the Italian physicist Enrico Salvioni. He called his invention a cryptoscope and said it "consisted of a tube with a fluorescent screen at one end and an opening for the eyes at the other."[11]

American Thomas Edison plunged into radiography with great enthusiasm. His laboratory tested 8000 substances for fluorescence and found calcium tungstate to be superior. He coated a rigid material with this chemical, designed an aperture for the eyes and covered it with a hood. Edison first demonstrated his device at the 1896 National Electrical Exposition in New York, calling it the Edison fluoroscope.[16]

On Jan. 27, 1896, Arthur Wright, director of Yale University's Sloane Physical Laboratory, produced the first intentional radiograph in America. He placed a lead pencil, a pair of scissors and a 25-cent piece on cardboard-covered bromide photographic paper and exposed them for 15 minutes. The first clinical radiograph was made on Feb. 3, 1896, when Edwin Brant Frost, a professor of astronomy at Dartmouth, was asked by his physician brother to radiograph the fractured forearm of a patient.[11]

Early researchers in radiology used emulsion-coated glass plates to record their images. In 1896, John Carbutt of Philadelphia manufactured the first glass plate designed specifically for x-ray imaging.[17] The plates were fragile, expensive, cumbersome and provided a low maximum density. Despite these problems, glass plates were used until 1913, when Eastman Kodak introduced a single-emulsion cellulose nitrate film that permitted lower exposures. In 1918, Kodak introduced double-emulsion film, and in 1924 the company introduced film with a cellulose acetate safety base. In 1936, Ansco introduced non-screen x-ray film for extremity work, and in the 1950s a polyester film base replaced cellulose acetate.[17]

The intensifying screen was designed by Edison, who used calcium tungstate as a phosphor. The first screens had many flaws, but the process was perfected in the 1920s.[11] In 1922, Eastman Kodak introduced an intensifying screen that incorporated calcium tungstate crystals in a cellulose binder, coated on a cellulose sheet for support.[11] Intensifying screens composed of a barium lead sulfate phosphor were introduced in 1948, and in the early 1970s research on rare-earth phosphors in color TV tubes resulted in the development of rare-earth intensifying screens for diagnostic radiography. These new green-light or blue-light emitting screens were made of phosphors with high x-ray absorption, and x-ray films were matched to the respective color output of the screens. Crossover control screen and film technology, combined with new tabular-grain film emulsions, continues to improve screen-film imaging.[17]

Another great advance in radiography occurred in 1913 when German radiologist Gustav Bucky described the use of a stationary honeycombed grid diaphragm placed between the patient and the x-ray film to reduce fogging from scatter radiation. In 1916, Hollis Potter described a rotating circular disk grid. The Potter-Bucky grid was introduced in 1921.

The introduction of the rotating anode x-ray tube also helped advance radiography. In 1914, two independent researchers in the United States and Germany applied for patents for a rotating anode tube. In 1915, William Coolidge patented a tube in which the electron beam was held in place by a magnetic field while the entire tube was rotated.[13] It was not until 1929, however, that Philips designed the first commercially available rotating anode tube.

A dilemma that perplexed fluoroscopy researchers for years was how to provide an image bright enough for daylight vision. At the 1941 meeting of the Radiological Society of North America, W. Edward Chamberlain proposed borrowing the image amplification techniques used in the electron microscope and television for use in medical fluoroscopy.[18] It was not possible to use daylight vision until the development of the image intensifier by Coltman in 1948.

In 1937, physicist C.F. Carlson invented xerography, a new kind of photocopying process. Soon afterward, xerographic methods were applied as a substitute for film radiography -- a procedure known as xeroradiography.[16] Carlson's rights to the process were exploited around 1947 by the Haloid Company, which later became the Xerox Corporation. In 1956, the first commercially available xeroradiographic system for medical imaging was developed.[19] In the xeroradiographic process, a blue and white reflective paper image provided extraordinary soft tissue detail. Xeroradiographic paper replaced direct exposure mammographic film in the 1960s. Today, high-detail single-screen, single-emulsion film systems are used for mammography.[17] The last xeroradiographic system was produced in 1990.

Conventional tomography was performed first by Karol Mayer of Poland, who used a moving x-ray tube to examine the heart in 1914.[20] However, a Parisian physician, Bocage, first correctly described and planned the concept in 1921. Ziedses des Plantes, a Dutchman, made the first clinical application of this concept in 1935. He also coined the terms tomography, planigraphy and zonography.[21]

Computerized Imaging

Conventional x-ray imaging has several limitations, including superimposition of structures lying in the x-ray beam's path, poor soft tissue differentiation and a lack of quantitative analysis of the x-ray beam attenuation characteristics of the object being radiographed.[22]

During the past 100 years, the profession has developed methods to reduce the effect of superimposition. One method is to take more than one projection or to take a routine series of projections; a second method is tomography. Other methods include part motion techniques, such as breathing techniques and the wagging jaw technique. None of these methods is completely successful, nor do they provide for quantitative analysis of the attenuated x-ray beam. In addition, soft tissue differentiation is poor.

Godfrey Hounsfield, an engineer at Electric and Musical Industries (EMI) in Great Britain, was interested in computer applications in pattern recognition from various electronic signals. His work led him to investigate the possibility of creating an image via the computer, based on attenuation measurements of the beam passing through matter. Hounsfield's research led to the development of the first clinical computed tomography scanner in 1972.[23] He shared the 1979 Nobel Prize in Physiology or Medicine with Allan Cormack, another contributor to CT development.

Cormack, a physicist working in South Africa, was interested in solving beam attenuation problems with varying tissue characteristics in radiation therapy.[23] Not only was he able to solve the necessary mathematics for these problems, his solutions also were incorporated for use by the computer in CT scanners.

With CT, it became possible to take many attenuation measurements from many angles and reconstruct an image of the area of interest. The CT scanning and reconstruction methods effectively eliminated the problem of superimposed structures. In addition, soft tissue differentiation was remarkably improved.
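
To make the reconstruction idea concrete, the following Python sketch performs simple, unfiltered back-projection: attenuation profiles gathered at many angles around a toy object are smeared back across the image plane and summed. It illustrates only the general principle -- it is not Hounsfield's or Cormack's actual algorithm, the helper names (`project`, `backproject`) are assumptions, and clinical scanners use filtered back-projection or iterative methods.

```python
# Unfiltered back-projection sketch of the basic CT idea.
import numpy as np
from scipy.ndimage import rotate

def project(image, angle_deg):
    """Simulate one set of parallel-beam attenuation measurements."""
    return rotate(image, angle_deg, reshape=False, order=1).sum(axis=0)

def backproject(profiles, angles_deg, size):
    """Smear each measured profile back across the image and sum."""
    recon = np.zeros((size, size))
    for profile, angle in zip(profiles, angles_deg):
        smear = np.tile(profile, (size, 1))        # constant along each ray
        recon += rotate(smear, -angle, reshape=False, order=1)
    return recon / len(angles_deg)

# Toy phantom: a bright square on a dark background.
phantom = np.zeros((64, 64))
phantom[24:40, 24:40] = 1.0
angles = np.linspace(0.0, 180.0, 36, endpoint=False)
profiles = [project(phantom, a) for a in angles]
reconstruction = backproject(profiles, angles, 64)
```

Even this crude approach recovers the rough shape of the object; high-pass filtering each profile before smearing it back yields the sharper filtered back-projection used clinically.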

Computed tomography also made it possible for researchers to manipulate images and perform quantitative analysis of x-ray attenuation. Information from a series of x-ray attenuation measurements could be presented in various contrast settings, thereby increasing diagnostic efficacy.
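
The contrast manipulation described above corresponds to what is now called window width and level adjustment. The sketch below illustrates the idea under assumed names and values; the function `window_image` and the specific window settings are illustrative, not a particular scanner's implementation:

```python
# Present the same attenuation data at different contrast settings.
import numpy as np

def window_image(values, level, width):
    """Map attenuation values to 0-255 display grays; values below the
    window display as black, values above it as white."""
    lo, hi = level - width / 2.0, level + width / 2.0
    scaled = (values - lo) / (hi - lo)
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)

# The same slice viewed with a narrow soft tissue window and a wide
# bone window emphasizes different structures without rescanning.
slice_data = np.random.uniform(-1000, 1000, size=(64, 64))  # stand-in data
soft_tissue_view = window_image(slice_data, level=40, width=400)
bone_view = window_image(slice_data, level=500, width=2000)
```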

With the introduction of CT, the method of recording information changed. An x-ray tube still was used, but intensifying screens and film no longer were necessary. Instead, the information for the image was acquired under computer control using electronic radiation detectors (information acquisition), stored in the computer for manipulation according to programmed procedures (image generation) and projected onto a cathode ray tube according to programmed protocols (image display). The image then could be transferred to film for interpretation by a radiologist.

Since the early 1970s, CT capabilities have improved dramatically. Today, it is common for spiral CT scanners to complete examinations in seconds, compared with the several minutes required by the first CT machine. Image quality also has improved significantly as computer power and capability have increased.

The Future

The introduction of the computer revolutionized diagnostic imaging, making new approaches and new modalities possible. Today, computers have been linked successfully to all imaging modalities. Magnetic resonance imaging, ultrasound, nuclear medicine and digital subtraction angiography all rely on computer technology, and research continues on the cost-effective use of computed radiography systems for diagnostic x-ray.

With computer-assisted imaging, it is possible to generate clear images of virtually any anatomical region. In addition, dedicated three-dimensional image reconstruction computers can manipulate and display raw image data in virtually any plane. Computed radiography systems can even compensate for all but the most egregious exposure errors in radiographic technique.[24]
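
A rough sense of how such exposure compensation can work is given by the sketch below, which rescales raw detector values from their measured histogram so that an under- or overexposed plate still fills the useful display range. The percentile choices and function name are illustrative assumptions, not the algorithm of any particular computed radiography system:

```python
# Histogram-based rescaling of raw detector data (illustrative only).
import numpy as np

def auto_rescale(raw, low_pct=1.0, high_pct=99.0):
    """Stretch the useful part of the histogram across 0-255 grays."""
    lo, hi = np.percentile(raw, [low_pct, high_pct])
    scaled = (raw.astype(float) - lo) / max(hi - lo, 1e-6)
    return (np.clip(scaled, 0.0, 1.0) * 255).astype(np.uint8)

underexposed = np.random.poisson(20, size=(64, 64))  # low-signal plate
display_image = auto_rescale(underexposed)           # still spans full range
```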

No single development since the discovery of the x-ray itself has contributed as significantly to the profession of radiologic technology as computer-generated imaging. Computers will continue to play a major role in diagnostic imaging in the future.

References

[1.] Shurkin JN. One, two, many. In: Engines of the Mind. New York, NY; W.W. Norton & Co; 1984:20-96.

[2.] Drake S. Galileo -- Pioneering Scientist. University of Toronto Press; 1990:233-240.

[3.] Asimov I. The search for knowledge. In: Understanding Physics. New York, NY: Barnes & Noble; 1993:8-9.

[4.] Horsburgh EM, ed. Handbook of the Napier Tercentenary Celebration or Modern Instruments and Methods of Calculation. Charles Babbage Institute Reprint Series for the History of Computing. Tomash Publishers; 1982:14,155.

[5.] Hazelton R. Blaise Pascal -- The Genius of His Thought. Philadelphia, Pa: Westminster Press; 1974:53-55.

[6.] Sanders DH. Preparing computer programs. In: Computers Today. New York, NY: McGraw-Hill; 1983:34,378.

[7.] Hyman A. Charles Babbage -- Pioneer of the Computer. Princeton, NJ: Princeton University Press; 1982:164-173.

[8.] Lindgren M. Glory & Failure: The Difference Engines of Johann Muller, Charles Babbage & George Scheutz. Cambridge, Mass: MIT Press; 1990:225-235.

[9.] Austrian GD. Herman Hollerith: Forgotten Giant of Information Processing. New York, NY: Columbia University Press; 1982:67-68.

[10.] Hunter TB. Development and architecture of the modern digital computer. In: The Computer in Radiology. Rockville, Md: Aspen Publications; 1986:7-23.

[11.] Eisenberg RL. Radiology: An Illustrated History. St. Louis, Mo: Mosby Year Book; 1992:22-97.

[12.] Glasser O. Wilhelm Conrad Rontgen and the Early History of the Rontgen Ray. Springfield, Ill: Charles C. Thomas Publisher; 1934:11.

[13.] Bruwer AJ. Classic Descriptions in Diagnostic Roentgenology, Vol. I. Springfield, Ill: Charles C. Thomas Publisher; 1964:25.

[14.] Glasser O. Dr. W.C. Rontgen. Springfield, Ill: Charles C. Thomas Publisher; 1945:23-46.

[15.] Selman J. X-rays (Roentgen rays). In: The Fundamentals of X-ray and Radium Physics. Springfield, Ill: Charles C. Thomas Publisher; 1985:156.

[16.] Grigg ERN. The Trail of the Invisible Light. Springfield, Ill: Charles C. Thomas Publisher; 1964:437-438.

[17.] Cullinan A, Cullinan J. Recording the radiographic image. Radiol Technol. 1995;66:241-244.

[18.] Cullinan A, Cullinan J. Early fluoroscopic imaging. Radiol Technol. 1994;66:123.

[19.] Bushong S. Concepts of radiation. In: Radiologic Science for Technologists -- Physics, Biology & Protection. 5th ed. St. Louis, Mo: Mosby Publishing; 1993:12.

[20.] Bruwer AJ. Classic Descriptions in Diagnostic Roentgenology, Vol. II. Springfield, Ill: Charles C. Thomas Publisher; 1964.

[21.] Berrett A, ed. Modern Thin-section Tomography. Springfield, Ill: Charles C. Thomas Publisher; 1973.

[22.] Hounsfield GN. Computed medical imaging. (Nobel lecture.) JCAT. 1980;4:665-674.

[23.] Seeram E. Computed tomography: an overview. In: Computed Tomography-Physical Principles, Clinical Applications & Quality Control. Philadelphia, Pa: W.B. Saunders; 1994:6-10.

[24.] Mixdorf M, Tortorici M. Practical considerations in computed radiography. Seminars in Radiologic Technology. 1993;1:18.

Michael A. Mixdorf, M.Ed., R.T.(R), is an assistant professor in the Department of Radiological Sciences, College of Health Sciences, University of Nevada, Las Vegas.

Ray E. Goldsworthy, M.S., R.T.(R), is an assistant professor in the Department of Radiological Sciences, College of Health Sciences, University of Nevada, Las Vegas.

Reprint requests may be sent to the American Society of Radiologic Technologists, Publications Department, 15000 Central Ave. SE, Albuquerque, NM 87123-3917.

© 1996 by the American Society of Radiologic Technologists.