
Consciousness raising.

Theories abound regarding the vexing nature of conscious experience

Like a chameleon that splashes a new array of colors across its skin each time it scurries from one place to another, consciousness -- the awareness of oneself and the surrounding world, as well as the sense of free will -- takes on a unique look and texture from one person to the next. Indeed, the chameleon's iridescent repertoire seems narrow compared to such conscious adventures as savoring a good meal, harboring envy toward a successful colleague, making plans for the arrival of a baby, and pondering the meaning of life.

For much of the 20th century, scientists have shunned the slippery, subjective trappings of consciousness that intrigued their predecessors in the 1800s. Instead, psychologists concentrated on the external rewards and punishments that shape behaviors (an approach known as behaviorism), while neuroscientists and biologists searched for a few glimmers of insight into how the brain works.

In another academic corner, philosophers continued their long-standing practice of devising logical arguments -- "thought experiments" and "possible world" scenarios -- to explain the relationship of mind to brain.

Over the last 30 years, these separate disciplines, as well as several others, have gradually banded -- some might say straggled -- together under the umbrella of "cognitive science" to study the mind and how it emanates from the brain's tangle of neurons and gray matter.

Several developments helped spur this cognitive convergence: Neuroscientists gained a better understanding of brain-cell function and the duties of various brain structures; powerful new digital computers gave rise to a search for "artificial intelligence" and theories of the brain as a highly complex computer; and psychologists and linguists delved into unconscious directives guiding language, memory, perception, and other remarkable mental feats.

Philosophers ambled out of scholarly seclusion to apply cognitive research findings to their thoughts about thought itself.

Only recently, however, has the study of consciousness gained widespread respectability in cognitive science, which has long concentrated on the unconscious mental world.

"For the first time in many years, cognitive scientists of all kinds are interested in consciousness and its relation to the unconscious," says psychologist John Kihlstrom of the University of Arizona in Tucson. "We now have some scientific tools to study this issue."

This trend provides hope that cognitive scientists may someday develop more sophisticated theories of the mind based on data drawn from diverse disciplines. If so, investigations of memory, perception, and other mental activities would undoubtedly undergo radical changes.

For now, of course, scientists of the mind disagree plenty about the nature of consciousness. To make matters stickier, investigators routinely accuse one another of misunderstanding and misinterpreting their respective theories and data. But they share a general affinity for the philosophical concept of materialism, which holds that the brain and its billions of neural minions somehow give rise to the mind.

In contrast, 17th-century French philosopher René Descartes expounded the long-dominant opposing belief that the mind, or "soul," exists separately from the physical brain and interacts with sensory information shuttled by the brain to a central location -- the pineal gland -- from which conscious experience arises. Scientists have yet to unravel the function of the pineal gland, but it no longer passes for the seat of consciousness.

Nonetheless, cognitive scientists, who sensibly reject Descartes' argument for an ethereal mind, often erroneously cling to his notion that the brain harbors a kind of central theater where its diverse contents "all come together" in an unconscious dress rehearsal for conscious experience, argues philosopher Daniel C. Dennett of Tufts University in Medford, Mass. The "stream of consciousness" that seems to connect us to ourselves and the world around us, long espoused by philosophers and psychologists, evaporates in the harsh light of cognitive science findings, he asserts.

Dennett offers instead a "multiple drafts" model of consciousness, which he presents in his book Consciousness Explained (1991, Little, Brown and Co.). Dennett and psychologist Marcel Kinsbourne of Boston University elaborate on this model and defend it against critics in the June BEHAVIORAL AND BRAIN SCIENCES.

Dennett essentially holds that the brain unconsciously processes numerous streams of information simultaneously. The streams sometimes coalesce, sometimes conflict, but undergo continual revision, or "editing," as time passes and the brain gathers new information. Thus each stream represents a temporary draft that may or may not contribute to a conscious experience.

If, indeed, the brain creates multiple, ever-changing interpretations of experience, researchers cannot pin down the first appearance of a conscious thought to a precise moment, and the distinction between what happens before and after the onset of consciousness remains cloudy, Dennett contends. For this reason, he rejects popular theories that the brain rushes to "fill in" missing information before consciousness occurs, as exemplified by the fact that one perceives a coherent visual field despite a natural blind spot in the retina of both eyes (SN: 4/27/91, p.262). Likewise, he disputes the notion that the brain creates bogus memories after consciousness commences -- say, wrongly recalling that a passing jogger wore glasses because she instantly brought to mind a friend who wears glasses.

Dennett compares human consciousness to an evolved "virtual machine," a sort of computer software program that shapes the activities of its hardware -- the brain. The logical structure of the virtual machine relies on flexible rules that can incorporate one or more drafts into consciousness, fostering the deluded intuition that a single stream of consciousness pours forth, he asserts.

Feelings and experiences unique to each person arise from the bundles of innate and learned dispositions deposited in networks of neurons throughout the brain, he proposes. For instance, a person's perception of the red color of a Santa Claus suit goes beyond simply seeing red; it emerges from the acuteness of his or her color vision, comparisons of the red suit with stored representations of other red items, and doubtless other red-related drafts in the brain.

Over the years, the brain's virtual machine composes the shifting representations of an individual's "self," which are based largely on social experiences, Dennett argues. The self exists as a crucial fiction for getting around in the world, not a real thing; if all goes well, the created self endows its owner with the capacity for free will and moral responsibility, he holds.

Several scientific findings undermine the idea of a unified, time-specific consciousness, Dennett points out. Consider the "color phi," or apparent motion phenomenon. Volunteers watch a screen on which two dots separated by a small space briefly flash in rapid succession, creating the impression of a single dot that moves from one point to the other. If experimenters present two dots of different colors -- say, red followed by green -- the single red dot appears to move and transform itself into a green dot before the second flash occurs.

When a lot happens in a short time, as in color phi, the brain makes simplifying assumptions, Dennett argues. The brain processes that calculate the color of the second dot and falsely determine that motion took place occur at about the same time and influence a third process, which concludes that the first dot moved over and changed color on the way. In what seems like a stream of consciousness to the observer, drafts in the brain rapidly construct an interpretation of what happened that does not coincide with the actual sequence of observed events.

Perceptual "filling in," or an instantaneous memory revision by the brain, cannot explain color phi, because the various drafts that make up the phenomenon do not reach awareness at precisely the same moment, Dennett contends.

He and Kinsbourne also use the multiple drafts model to reinterpret provocative studies directed by physiologist Benjamin Libet of the University of California, San Francisco. In one set of studies, Libet's team observed a delay of up to one-half second before volunteers reported feeling a tingle in their right hand, whether experimenters induced the tingle by electrically stimulating the appropriate spot in the brain or by delivering a mild electrical pulse to the right hand. This result provoked considerable surprise, since direct stimulation of the brain should enter awareness more quickly than a tingle that travels from the hand to the brain.

No good explanation exists for these results, Libet says. Perhaps people perceived the sensations as occurring equally fast because both stimuli produce a distinct electrical reaction about one-fiftieth of a second after being delivered, he speculates.

Libet's explanation proves inadequate, Dennett argues, because consciousness does not occur at an absolute time in a central brain location. Several brain processes, or drafts, must interpret jolts to brain tissue and skin. Hasty cerebral editing of these drafts produces a subjective sense that a tingle on the hand occurred as quickly as a tingle express-delivered by the brain.

Dennett also remains skeptical of Libet's claims to have shown that the decision to flex a finger begins unconsciously about one-third of a second before awareness of the decision, although volunteers can consciously veto the act of flexing before it actually occurs (SN: 4/26/86, p.266). According to Libet, this finding suggests that free will, if it exists, selects among and controls unconscious urges rather than initiating those urges.

Again, Libet charted self-reports that reflected volunteers' inaccurate intuitions that they had decided to flex a finger at a specific time, Dennett contends. Depending on how researchers test and question a study participant, they will tap into different cerebral drafts, and the volunteer's experience and self-reports of what happened when will vary, he asserts.

Libet considers Dennett's multiple drafts model a "philosophical construction" that has yet to yield testable scientific theories. But some investigators point to related evidence that supports the model.

For example, studies of brain-damaged and healthy people find that diverse types of sensory information do not converge on a single brain structure, notes neurologist Antonio R. Damasio of the University of Iowa College of Medicine in Iowa City. Many brain systems generate the sense of self as well as the false intuition that experience or consciousness happens at one site, Damasio maintains. However, brain activity in each system may undergo integration to produce multiple coherent versions of experience, he asserts.

A perceptual illusion discovered in 1976 taps into different drafts of the same conscious experience, contends psychologist Andy Young of Durham University in England. When volunteers watch a videotape of a person mouthing the sound "ga," which is synchronized with the sound "ba" on the soundtrack, most report hearing the sound "da." An explanation for the "fusion" of the visual and auditory signals into a kind of compromise sound remains unclear, Young says. "Da" corresponds to one draft of the experience, but another draft -- "ba" -- emerges if participants close their eyes during the screening, relying only on hearing, he asserts.

Consciousness of this type evolved as a side effect in brains capable of seamlessly processing numerous plans and memories independently of immediate environmental concerns, argues psychologist Bruce Bridgeman of the University of California, Santa Cruz. The many plans correspond to multiple drafts, and awareness arises only when it serves a function, such as perceiving particular objects in the world, Bridgeman holds. He rejects the notion of consciousness as a monitor of mental life.

Perhaps Dennett's most controversial assertion involves the claim that human consciousness stems from a virtual machine existing in the brain's voluminous system of interconnected neurons. This argument, a staple of many artificial intelligence advocates and sometimes referred to as "strong AI," holds that a properly programmed computer would possess a conscious self.

Thinking and consciousness -- as well as conscience -- indeed apply to computers as well as humans, argues neurobiologist Andrei S. Monin of the Russian Academy of Sciences in Moscow. Thinking systems, whether human or machine, employ step-by-step processes or programs -- what mathematicians call algorithms -- to receive and remember information, Monin contends in the Aug. 1 PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES. Some set of algorithms in the human brain compares one's knowledge, intentions, decisions, and actions with new information to produce consciousness, or "co-knowledge," he theorizes.

The first consciousness algorithm separates the individual from the surrounding environment and creates self-awareness, according to Monin. A program for self-identification and other aspects of consciousness can conceivably be worked out for modern computers, he asserts.

An algorithm must also exist for generating conscience, or the sense that some acts are good and others evil, Monin adds. This algorithm undoubtedly responds to various social and family influences -- and it may play little or no role in the consciousness of some criminals and despots -- but artificial intelligence researchers should begin to experiment with "conscience programs" built into computers, Monin contends.

"It's not that you can't imagine a conscious robot," Dennett tells those who doubt such assertions, citing the friendly R2D2 and C3PO from "Star Wars" and the ominous HAL in "2001: A Space Odyssey." "It's that you can't imagine how a robot could be conscious."

Some critics charge that Dennett himself fails to grasp a fundamental problem: Achieving conscious understanding and insight requires something other than computations and algorithms. In his book The Emperor's New Mind: Concerning Computers, Minds, and the Laws of Physics (1989, Oxford University Press), Roger Penrose argues that quantum physics' subatomic forces, which operate in indeterminate and unprogrammable ways, interact within the brain to produce the rich variety of conscious experience. Modern physics currently cannot explain the quantum leaps the brain makes on the road to feeling an emotion or discerning sophisticated mathematical relations, says Penrose, a mathematical physicist at the University of Oxford in England.

John Eccles, a neurophysiologist at the Max Planck Institute for Brain Research in Frankfurt, Germany, and a Nobel laureate in medicine, agrees with Penrose's dismissal of strong AI. But unlike most researchers, he attempts to update Descartes' theory of a nonmaterial mind.

Consciousness began to evolve more than 200 million years ago in the earliest mammals, Eccles maintains in the Aug. 15 PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES. It proved advantageous as a way to experience the surrounding world and guide behavior beyond simple reflexes, he proposes. Mammalian brains evolved increasingly complex and numerous bundles of pyramidal cells, or "dendrons," capable of interacting through quantum physics with a nonphysical mental world composed of "psychons," each corresponding roughly to an idea or feeling, Eccles proposes.

This theory only accounts for "simple consciousness," Eccles adds. The unique experience of human self-consciousness lies beyond scientific understanding, at least for now, he concludes.

Eccles maintains a minority viewpoint, but even some investigators who consider the mind a product of the brain still reject Dennett's multiple drafts model. His approach "explains away" consciousness by assuming it can never really be pinned down to a particular time or place, these critics assert.

Psychologists Stephen M. Kosslyn of Harvard University and Olivier Koenig of the University of Geneva in Switzerland offer an alternative theory in their book Wet Mind: The New Cognitive Neuroscience (1992, The Free Press). Consciousness serves as a kind of check that signals whether different brain states are in proper balance and mesh properly, they contend.

Neural discharges of electricity in different areas of the brain that process the same stimulus apparently emit a shared electromagnetic rhythm that helps to stitch together a unified representation of the stimulus, Kosslyn and Koenig note. In this way, for example, neural discharges underlying representations of "ball," "red," "large," and "moving to the right," link up at the appropriate time. Consciousness has the same relation to these interconnected discharges as a chord does to individual notes played on a guitar: It exists as an interaction of brain events that produce consonance or dissonance.

If neural balance reigns, only the contents of consciousness -- an object in view, the meaning of a statement, and so on -- reach awareness, according to Kosslyn and Koenig. When dissonance arises, people become aware of and reflect on their conscious state.

The two psychologists agree with Libet that conscious experiences lag slightly behind the brain events that evoke them. Thus, brain processes that work rapidly -- the calculation of an object's orientation, for example -- evade awareness, while slower processes, which coordinate perception, memory, and movement, enter consciousness more easily, they argue.

Other investigators accept Dennett's insistence on multiple drafts in the brain but still maintain that some type of "chief editor" ties them together for conscious presentation.

The brain takes one or more drafts and constructs a stream of consciousness that goes through constant flux and change, holds psychologist Max Velmans of the University of London in England. Although usually thought of as necessary for choice, learning, memory, planning, reflection, and creativity, consciousness performs none of those functions, Velmans proposes in the December 1991 BEHAVIORAL AND BRAIN SCIENCES. Unconscious brain activity guides these abilities and produces the potential for focused attention and awareness of the results of automatic brain processes, he argues.

"Consciousness neither interacts with the brain nor can it be reduced to a state or function of the brain," Velmans contends. Yet consciousness serves a purpose, he adds -- it allows an individual to experience enough of the world to endow his or her survival with a sense of purpose. In a nutshell, consciousness gives us the will to get on with our lives, even though unconscious processes orchestrate our thoughts and feelings.

Some investigators take Velmans' argument a step further by treating consciousness as a useless by-product of the brain. In fact, the term "consciousness" captures nothing over and above an individual's experiences, argues philosopher Stephen Priest of the University of Edinburgh in Scotland.

"The onus is on the advocate of consciousness to prove that it exists," he writes in Theories of Mind (1992, Houghton Mifflin).

Abundant evidence points to the operation of various mental domains that regulate bona fide conscious states, responds Arizona psychologist John Kihlstrom. Neodissociation theory, developed in the 1970s by Stanford University psychologist Ernest R. Hilgard, offers a useful framework for understanding the relation of conscious to unconscious mental activity, Kihlstrom contended at the recent annual meeting of the American Psychological Association in Washington, D.C.

Hilgard characterized the mind as a set of separate units or subsystems that monitor, organize, and control different aspects of mental functioning. Ideally, the various units communicate both with each other and with an "executive ego," which translates information into conscious awareness and intentions, much like the central cerebral theater derided by Dennett. If lines of communication between the executive ego and various subsystems break, divisions in consciousness occur, according to Hilgard's theory.

One such division appears among brain-damaged patients with "blindsight," Kihlstrom maintains. In these cases, people report no ability to see objects in front of them but nearly always offer correct guesses about the location, form, and orientation of these objects. Brain damage apparently disrupts one of two visual subsystems, he asserts; appropriate responses to visual stimuli remain, but the subjective sense of seeing things in the world fades.

A communication breakdown between an intact subsystem and the executive ego explains the alterations of consciousness provoked by hypnosis, Kihlstrom contends. For instance, a hypnotized volunteer reports no pain upon submerging her hand in a bucket of ice because the appropriate subsystem processes the painful stimulus without sending a memo to the executive ego. The subsystem still generates its own effects, such as increasing the volunteer's heart rate as her hand gets colder, despite the absence of subjective pain.

Neodissociation theory also explains why the rapid or incomplete presentation of a stimulus, such as a word, often creates an unconscious, or "implicit," memory in the absence of any conscious recollection of that stimulus, Kihlstrom proposes. A briefly flashed word enters a verbal subsystem before it can hook up with the executive ego, he argues; thus, indirect cues, such as an ambiguous fragment of the same word with several letters missing, quickly evoke the previously viewed word by tapping into the unconscious subsystem where it resides. Research on implicit memory has mushroomed in the past several years (SN: 11/17/90, p.312).

Multiple personality disorder, a condition usually stoked by severe abuse during childhood, appears to involve two or more different executive egos that take turns controlling conscious thoughts and actions, Kihlstrom suggests. Ordinarily, people display awareness of their various social roles -- son, spouse, parent, scientist, and so on. One's ability to juggle these many roles dissolves in multiple personality disorder, he says.

Another group of psychologists, in the spirit of Dennett's multiple drafts model, eschews the concept of a central personality or executive ego. Instead, they argue that the average person creates multiple "selves" that go beyond the social roles Kihlstrom cites. As social companions and situations change, the individual creates -- consciously and unconsciously -- a fundamentally different personality, rather than a variation on a basic, underlying personality, maintain psychologists such as Hazel R. Markus of the University of Michigan in Ann Arbor.

Unlike multiple personality disorder, these many selves are not oblivious to one another and do not have separate memories, according to these researchers.

Although the argument for multiple selves remains a minority viewpoint among psychologists, it strikes another blow at the central theater of consciousness Dennett hopes to demolish.

Computer scientist Drew McDermott of Yale University summarizes the multiple drafts model in this way: "I am a character in a story my brain is making up. Consciousness is a property I have by virtue of my brain's attributing it to me. My story doesn't have to cohere completely to be useful."

If brains make up your sense of self as a useful fiction, canons of morality and spirituality may need an overhaul, McDermott contends.

"If people are valuable, it is not because they are imperishable souls connected to bodies only for a brief sojourn," he asserts. "For now, we just have to take it as a postulate that creatures that invent conscious selves are to be cherished and protected more than other information-processing systems."
COPYRIGHT 1992 Science Service, Inc.

Article Details
Title Annotation: First of two articles
Author: Bower, Bruce
Publication: Science News
Date: Oct 10, 1992

