
Quantized atom.

Now that Rutherford had formulated the nuclear atom (see 1911), it was possible to view the hydrogen atom as consisting of a nucleus (bearing a charge of +1) and a single electron (with a charge of -1) circling it.

One could argue that the electron, as it circled the nucleus, in effect oscillated from side to side. This oscillation, according to Maxwell's equations, should result in electromagnetic radiation. But if this were so, the electron would lose energy as it circled and would spiral into the nucleus.

The Danish physicist Niels Henrik David Bohr (1885-1962) tried to solve this problem by applying quantum theory to the atom. The electron, he decided, could not radiate energy except in intact quanta, each of which represented a large amount of energy on the atomic scale. Consequently the electron, when it radiated, would lose a large packet of energy and would not spiral into the nucleus gradually but would drop very suddenly to a lower orbit nearer the nucleus.

It would do this each time it radiated a quantum of energy. Eventually it would reach the lowest orbital state, below which it couldn't fall, and it would then emit no more energy.

In reverse, if the atom absorbed energy, the electron would suddenly rise to a higher orbit, and this would continue with further absorption of energy until it left the atom altogether, at which time the atom would be ionized, becoming a fragment with a positive charge equal in size to the number of electrons that had been boiled off, so to speak.
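Asimov's account here is qualitative, but the picture it describes has a standard quantitative form in the Bohr model: the electron in the n-th orbit has energy E = -13.6 eV / n², so the ground state (n = 1) is the lowest orbit, and raising the electron to n → infinity (E → 0) ionizes the atom. A minimal sketch of that arithmetic (the formula and the 13.6 eV figure are standard Bohr-model physics, not quoted from this article):

```python
# Bohr-model hydrogen energy levels: E_n = -13.6 eV / n^2.
# The lowest orbit (n = 1) is the ground state, below which the
# electron cannot fall; absorbing energy raises n in discrete jumps.

RYDBERG_EV = 13.6  # ground-state binding energy of hydrogen, in eV

def energy_level(n):
    """Energy (eV) of the electron in the n-th Bohr orbit."""
    return -RYDBERG_EV / n ** 2

# Removing the electron entirely (n -> infinity, E -> 0) ionizes the
# atom, leaving a fragment with a charge of +1.
ionization_energy = 0.0 - energy_level(1)

for n in range(1, 5):
    print(f"n={n}: E = {energy_level(n):.2f} eV")
print(f"Ionization energy from the ground state: {ionization_energy:.1f} eV")
```

The discreteness is the whole point: the electron's energy can take only these values, so it absorbs or emits only the exact differences between them.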

As electrons rose to higher orbits and fell to lower orbits, they would radiate only certain wavelengths, and under other conditions, absorb those same wavelengths, as Kirchhoff had shown over half a century before (see 1859).

The presence of many electrons rising and falling in orbits might confuse the issue, but hydrogen, with its single electron, should be easier to handle.

Indeed, hydrogen has a simple spectrum, giving off radiation at a series of wavelengths that can be related to each other by a rather simple equation.

This equation had been worked out by a Swiss physicist, Johann Jakob Balmer (1825-1898), in 1885. It had not seemed to have much significance at the time, but now Bohr could choose orbits for a hydrogen electron that would yield just those wavelengths that the hydrogen spectrum displayed.
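The article does not quote Balmer's equation, but in its modern (Rydberg) form it reads 1/λ = R(1/2² - 1/n²) for n = 3, 4, 5, ..., where R is the Rydberg constant for hydrogen. In Bohr's picture each line corresponds to the electron dropping from orbit n to orbit 2. A short sketch of the calculation (constant and formula are standard physics, not taken from this article):

```python
# Balmer's 1885 formula for hydrogen's visible spectral lines, in its
# Rydberg form: 1/lambda = R * (1/2^2 - 1/n^2), for n = 3, 4, 5, ...
# In Bohr's model, each line is the electron dropping from orbit n to orbit 2.

RYDBERG = 1.0968e7  # Rydberg constant for hydrogen, per metre

def balmer_wavelength_nm(n):
    """Wavelength (nm) of the line emitted in the jump from orbit n to orbit 2."""
    inverse_wavelength = RYDBERG * (1 / 2 ** 2 - 1 / n ** 2)
    return 1e9 / inverse_wavelength  # convert metres to nanometres

for n in range(3, 7):
    print(f"n={n} -> 2: {balmer_wavelength_nm(n):.1f} nm")
# The n=3 jump gives the red H-alpha line, near 656 nm.
```

That the whole visible hydrogen spectrum falls out of one simple formula with integer n is exactly the regularity Bohr's discrete orbits explained.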

Bohr's suggestion wasn't perfect. There were fine details of the hydrogen spectrum that it couldn't account for. There was also no explanation of why the electron, when it was in a particular orbit and oscillating back and forth, did not lose energy. If it couldn't give off an entire quantum, why didn't it stop oscillating? Bohr's suggestion, however, was the first application of quantum theory to the atom and was enormously important for that reason. The imperfections were gradually removed in succeeding years, and for his work, Bohr was awarded the Nobel Prize for physics in 1922.

COPYRIGHT 1994 HarperCollins Publishers
No portion of this article can be reproduced without the express written permission from the copyright holder.
Copyright 1994 Gale, Cengage Learning. All rights reserved.

Article Details
Author:Asimov, Isaac
Publication:Asimov's Chronology of Science & Discovery, Updated ed.
Article Type:Reference Source
Date:Jan 1, 1994
