
A Look Back at the 20th century...Spectrometers.

The 20th century proved to be the most revolutionary in metalcasting's history. With the dawn of the new millennium and as a tribute to the people and technological innovations that have shaped our industry's past, present and future, Modern Casting is taking a look back at some of the most important contributions of the century.

In an effort to be quality-conscious, foundries always have looked for ways to ensure accurate composition data of their molten metal. If the chemistry of a melt is the slightest bit wrong, the probability of defective castings is increased. But determining what that chemistry really is--quickly, accurately and at a reasonable cost--presents the foundryman with a major challenge and a considerable responsibility.

With the introduction and advancement of the spectrometer over the years, foundries have been able to pour castings with the confidence that the alloy and residual content is accurate. Spectrometers offer foundries unmatched automated chemical analysis, making them one of the most dependable and essential pieces of equipment in the metalcasting industry.

Evolution of Discovery

The invention of the spectrometer was accidental. In 1666, Sir Isaac Newton inserted a glass prism in a beam of sunlight that was shining into his darkened room through a hole in the shutter. The light that came out of the prism was not white but was of seven different colors: red, orange, yellow, green, blue, indigo and violet. Newton called the spreading into rays "dispersion," and he called the different colored rays the "spectrum." When the light rays were passed again through a second prism, they turned back into white light and were geometrically widened. If only one ray was passed through the prism, it came out the same color as it went in. Newton concluded that white light was made up of seven different-colored rays, which were dispersed through the prism because they were refracted at different angles. His work was taken only so far, though.
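Newton's conclusion, that each color bends through the prism by a different amount, can be sketched with Snell's law. The refractive indices below are typical textbook values for crown glass and are assumptions for illustration, not figures from the article:

```python
import math

# Snell's law at an air-glass boundary: n_air * sin(t1) = n_glass * sin(t2).
# Illustrative (assumed) indices for crown glass at two ends of the spectrum.
N_RED = 1.513      # refractive index for red light
N_VIOLET = 1.532   # refractive index for violet light

def refraction_angle_deg(incidence_deg, n):
    """Angle of the ray inside the glass for light entering from air (n ~ 1.0)."""
    return math.degrees(math.asin(math.sin(math.radians(incidence_deg)) / n))

theta_red = refraction_angle_deg(45.0, N_RED)
theta_violet = refraction_angle_deg(45.0, N_VIOLET)

# Violet is refracted more strongly (smaller angle inside the glass) than red,
# so the colors fan out into the spectrum Newton observed.
print(f"red: {theta_red:.2f} deg, violet: {theta_violet:.2f} deg")
```

Because the angular difference between the colors is small, long optical paths (and later, fine gratings) were needed to spread the spectrum enough to measure individual lines.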

Newton considered light to be particulate in nature, while his contemporary, Christiaan Huygens, considered it to be a wave phenomenon (similar to sound waves). Huygens' wave theory of light eventually triumphed over Newton's particle theory.

Between 1814 and 1824, Joseph Fraunhofer built the first spectroscope, or "spectrum viewer," by placing a prism before the eyepiece of a telescope to study the dim light of "heavenly" objects. He carefully measured the dark lines in the solar spectrum and indicated the more important lines by letters. This system of nomenclature is still used for these lines, such as the D, or yellow, sodium line. These dark lines in the solar spectrum are due to absorption of continuous radiation by vapors of the elements.

In 1860, Robert Bunsen and Gustav Kirchhoff demonstrated how they could identify useful elements such as iron, copper and lead, or sodium and potassium, in potential ores, by the colors that powdered specimens sprinkled into a Bunsen burner flame produced. They could estimate concentrations by the intensities of the colors compared to those from chemically analyzed standards to within 40-50%.

Three developments in the 1890s made it easier to explore the spectral lines in the ultraviolet range:

* using quartz prisms and lenses or gratings instead of glass as the dispersing element;

* using photographic plates as a recording mechanism;

* replacing Bunsen burners with DC arcs as source units.

Frank Twyman and Adam Hilger produced the world's first commercial spectrograph in 1900 (Fig. 1). It was built in a mahogany housing for stability and had an adjustable-width entrance slit, a 20-cm focal length quartz collimating lens, a Cornu quartz prism, a 20-cm quartz camera lens and a plate holder for a 3.25 x 4-in. photographic plate. The instrument was useful for analyzing nonferrous materials, but the 20-cm focal length produced an unresolved continuum with specimens containing heavy metals such as iron. Until 1912, spectrographs were mainly purchased by universities to identify and measure spectral lines of newly discovered elements. The first industrial application in the U.S. is credited to W.H. Bassett at Anaconda Research Laboratories, Waterbury, Connecticut, who used a 170-cm Hilger Littrow spectrograph with a spark source to analyze brass and other copper alloys and to control their composition.

Birth of Spectrometry

In 1936, Thanheiser and Heyes reported the first photoelectric detection of spectra using photocells, marking the birth of spectrometry. As a result of radar detection research during World War II, the development of photomultiplier tubes, which had the capability of taking a few photons of light and generating millions of amplified or stored electrons, led to the development of spectrometers. These machines could analyze several elements in minutes. "A fellow didn't have to be a chemist or physicist to conduct an analysis with these machines," said Alex Lofquist, Angstrom, Inc., who began his career in the spectrometer field with Applied Research Laboratories (ARL), Glendale, California, more than 40 years ago. "The first ones were highly electromechanical."

In the late 1930s, U.S. manufacturers began producing spectrographs using a grating instead of a prism as the dispersing element. In 1937, Maurice Hasler, the founder of ARL, produced the first grating spectrograph for the Geological Survey of California. In 1938, Baird & Assoc., Cambridge, Massachusetts, produced its first Eagle 3-in grating spectrograph with 15,000 grooves/in., and Jarrell Ash Corp., Waltham, Massachusetts, delivered its first 21-ft Wadsworth grating spectrograph to the General Electric Co.'s River Works in Lynn, Massachusetts, for quality control in the production of aircraft engines.

Due to the expense of early spectrometers, large steel plants and primary aluminum smelters were the only metal producers that could afford the first models. During the late 1940s, spectrometers began to find their way into the foundry industry, which was seeking to address a glaring need for quality assurance. "The physical properties of castings are largely dependent upon chemistry," said Jerry Spencer, a veteran of the spectrometer industry. "Metal chemistry, along with heat treating, annealing and other post-casting operations, determine how a casting will perform, and if there is too much or too little of one element, defects will occur."

Prior to spectrometers, foundry metallurgists used a welder's arc and a spectroscope to estimate chemical composition with the human eye. Also, many foundries had their samples analyzed by outside laboratories, which was time-consuming, thereby causing melts to be poured without knowledge of the chemical composition until several hours or even days later. When foundries discovered errors in their metal chemistries, the castings would be scrapped at a high cost, or worse, the customer would receive a product below quality standards. The only alternative to having the melt sample analyzed was melting certified ingot, which also carried a high price tag.

Baird introduced spectrometer technology to iron and steel industries, while ARL initiated its use in the aluminum industry. According to Lofquist, the first foundry to conduct gray iron analysis was Motor Castings, West Allis, Wisconsin. Alcoa Automotive Castings, New Kensington, Pennsylvania, was the first in the aluminum industry to utilize a spectrometer, Lofquist added. Other notable foundries in the dawning of the spectrometer include General Motors' Central Foundries, Saginaw, Michigan, which created its own spectrographic standards; Clow Water Systems Co., Coshocton, Ohio; Dominion Castings, Ltd., Hamilton, Ontario, Canada; Wabash Aluminum, Wabash, Indiana; John Deere Waterloo Works Foundry, Waterloo, Iowa; and Simon Saw & Steel, Lockport, New York.

With the ability to conduct analyses in-house with spectrometers, foundries could pour metal with full confidence of the exact chemical composition. Knowing that they were receiving consistent, quality castings was an invaluable asset in the eyes of casting customers. Other advantages of spectrometer use included:

* the avoidance of lost heats;

* minimization of alloy additions;

* ferrosilicon savings;

* the ability to integrate new alloys.

Over the Years

With casting customer specifications becoming tighter over the years, foundries that owned spectrometers required more of them, while those without the luxury of an in-house spectrometer searched for ways to acquire one. In the early 1960s, the vacuum spectrometer was introduced. This development offered the ability to determine carbon, sulfur and phosphorus, as well as better wavelengths for other elements.

The accuracy of spectrometers was improved with advances in spark excitation sources. While early spark sources operated at 60 Hz, advancements in source technology allowed operation at 300-400 Hz. This increased the speed at which analyses were returned.

Using an atomic, or optical, emission spectrometer with spark discharge, it was possible, even in the 1960s, to analyze a steel sample for more than 12 elements in less than 3 min. This previously undreamed-of analysis time awakened the interest of metal manufacturers. Short analysis times during melting, together with the reliability of analytical results, led to the transformation of metallurgical processes.

In 1972, mobile spectrometers for onsite use were developed. These instruments made it possible to sort and inspect scrap in less than 5 sec, thus opening up a whole new range of applications for spectrometers, while providing for greater insight into scrap content before melting.

Improvements in the instruments themselves also were substantial. When first used, spectrometers had to be installed in temperature-controlled laboratories. As the instruments became smaller, environmental requirements lessened to the point that some could be used on the foundry floor. Other improvements occurred in sample-taking and preparation. Use of a high-energy preburn, which homogenizes the surface of the sample, improved accuracy and sped up analysis.

The cost of foundry spectrometers has dropped dramatically over the past several years because of:

* replacement of vacuum-tube technology with solid-state devices, significantly reducing cost and size while increasing speed;

* improvements in optical component manufacturing, such as the transition from ruled gratings to holographic gratings;

* the ability to produce smaller spectrometers with superior optical resolution;

* the almost geometric improvements in PC technology, allowing lower cost, smaller size and a high degree of software control of basic instrument functions.

Impact of Computers

A major milestone in the further development of spectrometers was the use of computers. In the late 1960s, computers first were used to control spectrometric measurements and convert analog signals into concentrations, thus shortening analysis times. More importantly, human error was excluded from the evaluation process. As computers became smaller, more efficient and less expensive, they became more widely used in spectrochemical analysis. In addition to taking over measurements and evaluation, computers also are used to control the spectrometers, monitor the functions of the system and transmit analysis results to other data stations. Today, a spectrometer without a computer is inconceivable.
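The conversion from measured signal to concentration rests on the same idea Bunsen and Kirchhoff used: comparing line intensities against chemically analyzed standards. A minimal sketch of that calibration step follows; the function names, standards and intensity values are illustrative assumptions, not from any actual instrument:

```python
# Hypothetical sketch: turning a measured spectral-line intensity into a
# concentration via a linear calibration curve fitted to analyzed standards.

def fit_calibration(standards):
    """Least-squares line c = a*I + b from (intensity, concentration) pairs."""
    n = len(standards)
    si = sum(i for i, _ in standards)
    sc = sum(c for _, c in standards)
    sii = sum(i * i for i, _ in standards)
    sic = sum(i * c for i, c in standards)
    a = (n * sic - si * sc) / (n * sii - si * si)
    b = (sc - a * si) / n
    return a, b

def intensity_to_concentration(intensity, a, b):
    """Apply the fitted calibration curve to an unknown sample's intensity."""
    return a * intensity + b

# Illustrative standards for one element: (relative line intensity, wt%).
standards = [(120.0, 0.10), (240.0, 0.20), (610.0, 0.50), (1190.0, 1.00)]
a, b = fit_calibration(standards)
print(f"estimated concentration: {intensity_to_concentration(480.0, a, b):.3f} wt%")
```

Real instruments use per-element calibration curves (often nonlinear, with inter-element corrections), but the principle of mapping intensity to concentration through standards is the same.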

Prior to 1970, spectrometer readouts came out on a strip-chart recorder or an analog dial. Analog readouts proved much faster and more accurate than spectrographic analysis. Numbers were available immediately and could be transferred directly to the melt station by telephone. Core memory, which was a mere 4 kilobytes back in the 1960s, has expanded, while hard drives are able to store programs that used to be fed into the instruments on paper tape.

With a modern spectrometer, it takes less than a minute to completely analyze a sample for chemical composition.

Expanding Business

One advantage of spectrometers, to which some foundries can credit their longevity, is their ability to expand a shop's capabilities. In the early 1950s, Clow Corp., a gray iron pipe foundry with three facilities, did not require analysis of its melts. When it realized that it could produce longer pipe with ductile iron, which had recently been discovered, the foundry purchased a spectrometer. "With ductile iron, it was necessary to analyze the composition of the melt," said Sam Clow, a former technical director for the foundry. "We were able to enter a new market and produce more feet of pipe with ductile iron, which was great for us since our pipe was sold by the foot."
COPYRIGHT 2000 American Foundry Society, Inc.

Article Details
Author: Bastian, Kevin M.
Publication: Modern Casting
Date: Oct. 1, 2000
