
Rethinking the mind: Cognitive science faces a philosophical challenge

By Bruce Bower, Science News, Oct 17, 1992

John R. Searle sees gaping cracks in the edifice of the mind constructed by cognitive scientists. Searle, a philosopher at the University of California, Berkeley, inspects the mental rules, representations, and computer programs that buttress the cognitive citadel with the eye of a skeptical contractor. Watch out for falling bricks, he warns; the structure lacks the mortar of consciousness to hold it together.

"More than anything else, it is the neglect of consciousness that accounts for so much barrenness and sterility in psychology, the philosophy of mind, and cognitive science," Searle asserts.

Although Searle's remark will win him no popularity contests among scientists of the mind, it nevertheless reflects the recently renewed interest in deciphering the nature of consciousness. From a variety of perspectives, scientists are now trying to define more clearly what they mean when they refer to "conscious" and "unconscious" mental activity.

Searle first rankled cognitive scientists in 1980 when he published his widely cited "Chinese Room" argument, an attack on the notion, promoted by advocates of "strong artificial intelligence," that the mind corresponds to a computer program implemented in the hardware of the brain.

Searle compared the computers favored by artificial intelligence enthusiasts to a person who does not speak Chinese but sits in a room with Chinese dictionaries and a filing system. If an outsider slips questions written in Chinese under the door, the person uses the reference works to compose answers in Chinese. Responses emerging from the room might prove indistinguishable from those of a native Chinese speaker, Searle contended, even though the person toiling in the Chinese Room understands neither the questions nor the answers.

The moral of this exercise: A system such as a computer can successfully employ a set of logical rules without knowing the meaning of any of the symbols it manipulates using those rules.
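In programming terms, the point fits in a few lines. The sketch below is a hypothetical illustration, not Searle's own example: it answers questions by pure string lookup, following its rules flawlessly while representing nothing about what the symbols mean.

```python
# Hypothetical illustration of the Chinese Room: the "rule book" pairs
# symbol strings with symbol strings. The program applies it perfectly
# while containing nothing that represents what the symbols mean.
RULE_BOOK = {
    "你好吗?": "我很好, 谢谢.",          # "How are you?" -> "Fine, thanks."
    "今天天气怎么样?": "今天天气很好.",   # "How's the weather?" -> "It's nice."
}

def chinese_room(question: str) -> str:
    # Pure symbol matching: the rule fires on the shape of the input,
    # never on its meaning.
    return RULE_BOOK.get(question, "请再说一遍.")  # "Please repeat that."

print(chinese_room("你好吗?"))  # a fluent reply from a system that understands nothing
```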

Supporters of strong artificial intelligence view the Chinese Room as a flimsy sanctuary from the argument that a properly programmed computer possesses a "mind." Philosopher Daniel C. Dennett of Tufts University in Medford, Mass., calls Searle's analogy simplistic and irrelevant. A computer program that could hold its own in a conversation would contain layers of complex knowledge about the world, its own responses, likely responses of a questioner, and much more, Dennett contends. Indeed, computers have displayed a growing conversational prowess in the last several years. Their increasingly deft dialogue stems from the interactions among various strands of information, each of which comprehends nothing on its own, Dennett maintains.

Put another way, proper programming transforms a bunch of unreflective parts into a thinking system, whether they reside in a mainframe or a human skull.

For Searle, the Chinese Room debate lies behind him as he aims his new assault on what he calls the "much deeper" mistake of cognitive scientists -- their neglect of conscious experience. He describes his views in the December 1990 BEHAVIORAL AND BRAIN SCIENCES (with sometimes heated responses from more than 30 cognitive scientists) and in his book The Rediscovery of the Mind (1992, MIT Press).

Cognitive science tends to regard the mind as a collection of relatively independent faculties, or modules, that contain unconscious rules for language, perception, and other domains of thought, Searle argues. Consciousness, in the sense of being able to attend to, reason about, or describe these rules, rarely figures in these theories. Other facets of the unconscious, such as memories and repressed conflicts, sometimes enter awareness but more often influence thought and behavior in surreptitious ways, according to cognitive researchers.

Searle spurns this approach, with its reliance on what he calls a "deep unconscious" unable to pierce the surface of awareness. Mental life consists of conscious states and those neurophysiological processes that, under the right circumstances, generate conscious states, he argues. Most brain states that participate in mental life do not reach consciousness, but they must have the capacity to do so, Searle proposes. He dubs this formulation the "Connection Principle."

For example, the unconscious intention to satiate hunger with food simply reflects some biological aspect of the brain's workings that has the capacity to produce conscious appetite and food-seeking behavior in certain situations, Searle asserts. Unconscious processes totally divorced from awareness, such as the transfer of chemical messengers from one brain cell to another, harbor no intentions and do not meet Searle's criteria for "mental."

Mental life emerges as an inherent feature of the brain, just as liquidity is a feature of water, in Searle's view.

Moreover, consciousness feeds off an individual's singular point of view, thus rendering it subjective and not reducible to traditional objective measurements of behavior, he maintains. Investigators of consciousness must strive to understand "the first-person point of view," Searle says.

His model treats consciousness as an on-off switch -- one is either conscious or not. But once the switch goes on, the brain produces consciousness in a broad range of intensities. Even dreams are a mild form of consciousness, Searle asserts, though they fall far short of full-blown alertness.

The center of conscious attention contrasts with "peripheral consciousness," he adds. For example, while focused on writing an article, a person retains peripheral awareness of the shirt on his or her skin, the feel of computer keys, and numerous other thoughts and sensations.

Searle treats the unconscious mental conflicts and desires described by Sigmund Freud as cases of "repressed consciousness," because they typically bubble to the surface, although often in disguised form. Most beliefs, worries, and memories also operate outside awareness, with the potential for entering consciousness, and thus follow the Connection Principle, he notes.

But the unconscious does not consist of fully formed thoughts, Searle asserts. The brain contains "brute" biological processes that create the wide variety of conscious experiences, he says; in essence, the brain builds consciousness on the spot, rather than hauling it out of storage.

Cognitive researchers make a mistake comparable to that of scientists more than a century ago who erroneously believed that the leaves of a plant turn toward the sun because the plant wants to survive, he maintains. Biologists later learned that secretions of a specific hormone direct the movements of a plant's leaves, not a floral "decision" to catch as many rays as possible.

In the same way, the brain's neurophysiology produces certain types of conscious experience without making any inferences or following any rules, according to Searle.

Consider a visual phenomenon known as the Ponzo illusion. When two parallel, horizontal lines of equal length lie between two vertical lines that converge toward the top, the parallel line on top looks longer. Cognitive psychologists who specialize in perception suggest that the brain computes sensory information using unconscious rules that sometimes produce optical distortions. One theory holds that the Ponzo illusion may result from two unconscious inferences: first, that the top parallel line lies farther away because it's closer to the converging lines, and second, that the top line extends farther because it's farther away.
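Spelled out as a calculation, the two-inference account runs roughly as follows. The size-constancy scaling rule and the numbers in this sketch are illustrative assumptions, not a published model.

```python
# Illustrative sketch of the two-inference account of the Ponzo illusion.
# The scaling rule and the numbers are assumptions for demonstration only.

def perceived_length(retinal_length, assumed_distance):
    """Size-constancy scaling: an object casting the same retinal extent
    is judged longer when it is assumed to lie farther away."""
    return retinal_length * assumed_distance

retinal_length = 1.0  # both horizontal lines project identical retinal extents

# Inference 1: the top line sits closer to the converging verticals,
# so the visual system assumes it lies farther away.
bottom = perceived_length(retinal_length, assumed_distance=1.0)
top = perceived_length(retinal_length, assumed_distance=1.2)

# Inference 2: same retinal size plus greater assumed distance implies
# a longer line; hence the illusion.
print(f"bottom: {bottom:.2f}, top: {top:.2f}")  # top > bottom
```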

Searle offers an alternative theory: Still-unclear brain processes handle parallel and converging lines in such a way as to produce the conscious experience of the Ponzo illusion. "Nonconscious operations of the brain know nothing of inference, rule following, or size and distance judgments," he argues.

Searle also attacks the notion of unconscious rules of "universal grammar" championed by linguists such as Noam Chomsky of the Massachusetts Institute of Technology in Cambridge. Chomsky and others have theorized that the ability of healthy children to learn readily the language of their community and other natural human languages -- but not logically possible "artificial" languages -- shows that the brain contains an innate "language-acquisition device" consisting largely of grammatical rules that are unavailable to conscious thought.

Searle, an ardent foe of universal grammar for more than 15 years, agrees that human brains contain a biological capacity for language acquisition that limits the type of languages we can learn. But proposing language rules that lie beyond the grasp of consciousness makes as little sense as proposing a universal visual grammar that tells us, "If it is infrared, don't see it, but if it's blue, it's okay to see it," Searle holds. The brain's visual system simply limits what sort of colors humans can see.

The Berkeley philosopher says it often proves tempting to theorize about thought processes unavailable to consciousness, especially when studying complex abilities such as language learning. But "brute neurophysiology," not hidden rules, translates perceptions and language into thought and behavior, Searle contends.

Connectionist computers, also known as neural networks, work on this principle, he notes. Some connectionist models convert meaningful input into meaningful output by mathematically altering the sensitivity of connections between processing units rather than by manipulating rules or symbols. Neural networks may still fail as models of the mind, but they avoid the quicksand of "deep unconscious rules" that sucks down cognitive science, Searle says.
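A minimal sketch shows the idea; the tiny network and training loop below are generic textbook choices, not a model endorsed by Searle or his critics. After training, the system's entire competence resides in numerical connection strengths, and no rule or symbol appears anywhere in it.

```python
import math
import random

# Minimal connectionist sketch (a generic textbook network, assumed for
# illustration): input becomes output through weighted connections whose
# sensitivities are adjusted by training. No rules or symbols anywhere.
random.seed(1)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

H = 3                                                                # hidden units
w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(H)]  # w1, w2, bias
w_o = [random.uniform(-1, 1) for _ in range(H + 1)]                  # weights + bias

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR mapping

lr = 0.5
for _ in range(20000):
    x, target = random.choice(data)
    # Forward pass: nothing but weighted sums and squashing functions.
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    y = sigmoid(sum(w_o[i] * h[i] for i in range(H)) + w_o[H])
    # Backward pass: nudge each connection's sensitivity to shrink the error.
    d_y = (y - target) * y * (1 - y)
    d_h = [d_y * w_o[i] * h[i] * (1 - h[i]) for i in range(H)]
    for i in range(H):
        w_o[i] -= lr * d_y * h[i]
    w_o[H] -= lr * d_y
    for i in range(H):
        for j in range(2):
            w_h[i][j] -= lr * d_h[i] * x[j]
        w_h[i][2] -= lr * d_h[i]

for x, target in data:
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
    y = sigmoid(sum(w_o[i] * h[i] for i in range(H)) + w_o[H])
    print(x, "->", round(y, 2), "(target", target, ")")
```

The trained network reproduces the XOR mapping even though no component stores a rule such as "output 1 when the inputs differ"; the competence is distributed across connection strengths.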

Searle's critique wins little favor with some cognitive scientists. His demand for conscious accessibility to all unconscious mental states "is arbitrary and pointless," contends MIT's Chomsky.

One current version of universal grammar theory, proposed by Chomsky and his colleagues, describes specific types of mental rules and representations handled automatically by a computational system in the brain. These theoretical mental guidelines help explain why people extract meaning from certain expressions (such as "John is easy to catch") but find the wording of other expressions confusing or somehow wrong (as with "John is easy to be caught"), according to Chomsky.

Searle's abandonment of unconscious rules for language understanding in favor of unknown neurophysiological properties presents a prescription for scientific confusion, Chomsky asserts.

Moreover, Searle's contention that unconscious mental states possess intentions or goals appears erroneous, holds psychologist Maria Czyzewska of Southwest Texas State University in San Marcos. Psychological researchers currently assume that the unconscious contains information that lies mainly outside conscious control but that shapes thoughts and behavior, despite lacking a preset agenda, she points out.

For example, multiple-choice or recall tests containing words on a previously studied list evoke conscious memories. However, volunteers often unintentionally use the same words on tests that do not ask for those words, such as lists of word fragments that can be completed in a number of ways (SN: 11/17/90, p.312). The latter memories receive the label "implicit," or unconscious.

Implicit responses do not spring from unconscious intentions, nor do they fit a scheme that admits only consciousness and the neurophysiological states that produce it, Czyzewska contends.

Another critique comes from physiologists Walter J. Freeman and Christine A. Skarda, both of the University of California, Berkeley. Neurophysiological processes do not cause mental states, conscious or otherwise, as Searle proposes; they are mental states, Freeman and Skarda argue.

Background electrical activity in the brain reflects a chaotic process, in the mathematical sense, they theorize (SN: 1/23/88, p.58). The brain uses this flexible energy state to organize massive numbers of brain cells instantaneously in response to sensory information, the scientists contend. They have reported that chaotic electrical activity appears in both the olfactory and visual cortex of rabbits.
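"Chaotic in the mathematical sense" can be illustrated with a system far simpler than a brain. The logistic map below, a standard textbook example rather than Freeman and Skarda's model, is fully deterministic yet drives nearly identical starting states apart within a few dozen steps.

```python
# The logistic map, a standard textbook example of mathematical chaos
# (not Freeman and Skarda's brain model): a deterministic rule whose
# trajectories never settle down and diverge from almost identical starts.

def logistic(x, r=3.9):            # r = 3.9 puts the map in its chaotic regime
    return r * x * (1.0 - x)

a, b = 0.400000, 0.400001          # two nearly indistinguishable initial states
for _ in range(40):
    a, b = logistic(a), logistic(b)

# After 40 steps the millionth-of-a-unit difference has grown to order one.
print(abs(a - b))
```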

Self-organized, unpredictable electrical activity throughout the brain incorporates past experiences and creates an unconscious "space of possibilities" for handling further experiences, Skarda maintains. As the brain takes in new stimuli, various patterns of chaotic activity assume consciousness in the form of thoughts and memories, she contends, rather than emerging from an unconscious storage bin, as in digital computer models of memory and learning.

"Brains are less like libraries than like nurseries and farms," Freeman notes.

Even if Freeman proves right, cognitive scientists have yet to develop a green thumb for cultivating true insight into the mind, Searle responds. They invoke a bevy of invisible rules and computer programs in the brain that only stoke intellectual chaos, he holds.

No simple scientific remedies will cure what ails cognitive science, Searle remarks, but researchers must remember that "the brain is the only thing in our skulls, and the brain causes consciousness."
COPYRIGHT 1992 Science Service, Inc.
