Backing up 'back prop.' (back propagation in neural networks)

In the early 1970s, a neural network training procedure known as back propagation was developed by three independent sets of researchers. Since then, back propagation has become the predominant neural network approach to studying brain function.

Critics argue the system is biologically implausible and a poor model of how circuits of brain cells handle information. But one of the originators of back propagation, computer scientist Paul J. Werbos of the National Science Foundation in Washington, D.C., says there is a future for the system as a model of how humans learn.

Back propagation networks contain a layer of input units, a layer of output units and an intermediate or "hidden" layer of units. With repeated trials, the hidden layer takes on response properties that best accomplish the computational task being learned. During training, error signals are sent back through the network to adjust the strength of connections between all processing units in order to push the system toward a predetermined output. A recent experiment produced hidden-unit responses to visual input that closely matched electrical responses of monkey brain cells critically involved in vision (SN: 3/5/88, p.149).
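The training procedure described above can be sketched in a few dozen lines of code. The following is a minimal illustrative example, not a reconstruction of any network from the article: a tiny network with two inputs, two sigmoid hidden units, and one output learns the XOR mapping by repeatedly sending error signals backward to adjust connection strengths. All sizes, learning rates, and variable names here are assumptions chosen for brevity.

```python
# Minimal back-propagation sketch (illustrative assumptions throughout):
# one hidden layer, sigmoid units, squared-error loss, per-example updates.
import math
import random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Weights: 2 inputs -> 2 hidden -> 1 output; last entry of each row is a bias.
W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
W2 = [random.uniform(-1, 1) for _ in range(3)]

def forward(x):
    """Forward pass: input -> hidden layer -> output unit."""
    h = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1] + W1[j][2]) for j in range(2)]
    y = sigmoid(W2[0] * h[0] + W2[1] * h[1] + W2[2])
    return h, y

def train_step(x, target, lr=0.5):
    """One trial: compute output, then send the error signal backward."""
    h, y = forward(x)
    # Error signal at the output unit (derivative of squared error * sigmoid)
    d_out = (y - target) * y * (1.0 - y)
    # Propagate the error backward through the output weights to each hidden unit
    d_h = [d_out * W2[j] * h[j] * (1.0 - h[j]) for j in range(2)]
    # Adjust connection strengths to push the system toward the target output
    for j in range(2):
        W2[j] -= lr * d_out * h[j]
        W1[j][0] -= lr * d_h[j] * x[0]
        W1[j][1] -= lr * d_h[j] * x[1]
        W1[j][2] -= lr * d_h[j]   # hidden bias
    W2[2] -= lr * d_out           # output bias
    return (y - target) ** 2

# Repeated trials on the XOR task, which needs the hidden layer to be solved
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
initial_loss = sum((forward(x)[1] - t) ** 2 for x, t in data)
for _ in range(5000):
    for x, t in data:
        train_step(x, t)
final_loss = sum((forward(x)[1] - t) ** 2 for x, t in data)
```

After training, the total squared error over the four patterns is lower than at initialization; the hidden units have taken on response properties that make the task solvable, which is the point the article's monkey-vision experiment illustrates.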

Conventional wisdom, Werbos says, holds that information flows forward from cell to cell in the mammalian brain, but does not retrace its steps in the back propagation manner. "But I believe there is a biological basis to all this work in neural networks," he says. There are indications, for instance, of a backward flow of electrical processing among glia, poorly understood cells in the brain that serve as a kind of glue holding neurons in place. Glia may provide a biological basis for back propagation, says Werbos.

The challenge for computer modelers, he says, is to design back propagation learning rules that work faster than the relatively slow systems now in use, and to develop networks that learn about the environment without the external guidance of error signals.
COPYRIGHT 1988 Science Service, Inc.

Article Details
Author: Bower, Bruce
Publication: Science News
Date: Aug 6, 1988

