
Conceptual conflict: Mitochondria solving age-old questions.

Religion and science are offspring of the same impulse to understand what it's all about, but, like ill-matched siblings with incompatible characters, they can be at peace with each other when in separate rooms but easily brawl when sharing the same place.

Religion, at least when it's in a good mood, can be warm and supportive--giving meaning and purpose to life in the grandest of terms, giving support and encouragement, friendly and emotional. One of its character flaws, however, is that in its intermittent disputes with science, it has the most difficult time owning up when it is wrong. Just look at the retreat of religion into the petulant "He made it in six days to look as if it took ten billion years!" Perhaps this obduracy arises because it's old and venerable and science is young and brash; perhaps it's a belief that love means never having to say you're sorry.

Science, for all its cold rationality, its rejection of purpose and meaning, its nit-picking passion for collecting facts, does not have this character flaw; it has no problem--at least when all the facts are assembled--in saying to religion, "Sorry, I was wrong."


One of the areas where they cannot avoid each other is origins: where did the universe come from? Where did people come from? They have brawled over these two topics since science was kick-started back to life a few hundred years ago.

For a long time the bickering went something like this:

"The universe started suddenly with light!" "Nonsense, it always existed!"

"The human race started suddenly with the first two people in one place!" "Humbug, we came about as groups of humanoids all over the world gradually evolved into modern humans!"

Science has already gracefully conceded the first point: "Sorry, I was wrong, you were right! It did start suddenly, and light was the main event--I calculate the ratio as ten billion bits of light to each bit of matter."

Science is also coming around on the second point. It's not quite sure about it yet, but a great step in this direction appeared on page 31 of the January 1, 1987 issue of Nature, one of the most prestigious scientific journals in the world, under the heading "Mitochondrial DNA and Human Evolution." While the work was highly technical, its conclusions were starkly shocking:

"Mitochondrial DNAs from 147 people, drawn from five geographic regions, have been analyzed by restriction mapping. All of these mitochondrial DNAs stem from one woman who is postulated to have lived about 200,000 years ago...."

The authors, Rebecca L. Cann, Mark Stoneking and Allan C. Wilson, working at the University of California, Berkeley, had overcome a long and arduous course--not the least of their obstacles being the fulfillment of Nature 's very strict standards--to stake their claim to a spot in the history books.

What it took to get to that point, and the reaction and rejection they received from the "old bones" paleontologists, has been documented in Michael H. Brown's The Search for Eve: Have Scientists Found the Mother of Us All? (Harper & Row, NY, 1990).

While this is not the place to get into details, we can at least lay down the general outline of what they accomplished.


While most have a vague idea of what DNA is (or at least have heard about it), mitochondria probably need a little introduction.

Each of the trillions of cells that make up the body is divided into compartments that allow incompatible processes to be kept apart. The practical wisdom of industry suggests why: a manufacturing complex--which is pretty much what a cell is--would have an overwhelming problem with quality control if duplicating computer programs onto floppy disks happened in the same quarters as burning coal to power an electric generator. Keeping such incompatible processes in separate areas makes a lot of sense.

One of the great advances in the evolution of living systems occurred when a cell lineage stumbled on the great advantages of compartments and went on to become the common ancestor of all higher forms of life. The other lineages remained as simple bacteria, which to this day have no inner compartments and which, metaphorically, still duplicate their computer disks right next to the furnace.

The largest of these cell compartments is the nucleus, which is packed full of DNA. Industrially, the DNA is equivalent to hundreds of thousands of computer disks (genes) loaded with the instructions needed to program the industrial robots (proteins) that run all the myriads of processes in the industrial complex. The nucleus keeps the master disks safely stored away (chromosomes) and makes duplicates of them (messenger RNA) to send out to where they are needed in the running of the cell.

The mitochondria are usually the second largest compartment in the cell (some cells have one big one, most have lots of smaller ones). The mitochondria are the industrial equivalents of central power plants that burn fuel (glucose and fat) to generate power (ATP) for distribution to the other centers, including powering the computer-department labors of the nucleus.

All higher cells (eucaryotes) have these two compartments: the nucleus for information storage, duplication and dispersal, and the mitochondria for central power generation.

An idea that was shockingly revolutionary as recently as the early nineties--but is now almost universally accepted--is that mitochondria are descendants of bacteria (procaryotes)--that the discovery of the advantages of keeping computer disks and coal in separate compartments involved a large simple cell (which was perhaps energetically inefficient) getting invaded by a smaller bacterium (which was energetically more efficient). While this infection was probably disruptive at first (even fatal), eventually the two learned to live together in mutual harmony--the big cell doing all the work of finding the fuel, the symbiotic bacterium, the proto-mitochondrion, doing all the work of burning it up.

This insight caught on quickly because mitochondria are just like bacteria; they have their own little piece of DNA (only tens of disks-worth of information compared to the hundreds of thousands in the nucleus) and they multiply just as bacteria do: They get bigger and bigger, then split into two, with each "daughter" mitochondrion receiving its copy of the mitochondrial DNA. It is this which makes mitochondrial DNA so useful in the exploration of human lineage: Its lineage is quite independent of that of the nuclear DNA.

Matrilineal descent

The second point that makes mitochondrial DNA such a useful tool involves the way human beings are made--recall from Biology 101 that this involves the fusion of an egg cell from the mother with a sperm cell from the father.

The egg cell is huge; it has thousands of mitochondria and bulging fuel stocks all primed and ready to power the development of the new embryo. In cell terms, the egg is a big fat blimp floating lazily along, waiting for destiny to arrive.

If that destiny is not to be the flush of the menses, it will start with a single sperm piercing the egg and sparking the fabulously intricate process that ends up with a human being.

For the sperm cell, this moment of destiny does not come by waiting; the sperm has to take the gold--there is no prize for second place--in an Olympic marathon. As the run is equivalent to that from Moscow to Beijing via Mount Everest in competition with a hundred million others, the sperm can be no fat blimp; it is instead a stripped-down, sleek torpedo--just a head with its precious consignment of nuclear DNA from the father, and a powerful tail powered by massive mitochondria to push it ahead of the pack.

The single sperm that triumphs sends its head and tail to quite different destinies.

The head merges with the egg and injects the father's nuclear DNA. Inside, this combines with the mother's and is packed away into the nucleus of the cell, now a zygote, ready to provide all the information needed in the construction of a human being.

The tail of the sperm, on the other hand, exhausted from its magnificent effort, drops away, its job done, and disintegrates. The result of this sacrificial effort is that none of the father's mitochondria gets into the egg--all the mitochondria in the zygote, and the human being it eventually turns into, come from the mother.

This also makes mitochondrial DNA very useful in studying lineage: all the DNA in the mitochondria in your cells--be you male or female--came from your mother. Furthermore, your mother's mitochondrial DNA all came from her mother--your grandmother--and hers from your great-grandmother, and hers from your great-great-grandmother, etc. All the way back into deepest time.

No sex, thank you

Yet another inducement for scientists to shift the study of human ancestry from fossilized bones to the DNA lab is that mitochondria don't indulge in sex.

Sex is the great mixer; it takes 50 percent of your dad's nuclear DNA and combines it with 50 percent of your mother's DNA to create a whole new 100 percent that is you. Then, in making your sex cells, it scrambles together (recombines) the contents of each chromosome from your dad with the corresponding chromosome from your mom. That's why kids are different from their parents and their grandparents; sex keeps mixing things up in each generation.

This is the greatest thing about sex (from the lineage's point of view, at least): you get a totally different combination each generation. This blending of characters, however, is the worst thing about sex from the study-of-lineage point of view--tracing things back in time through the lineage is impossibly complicated after only a few generations.

Mitochondria don't do sex, so the copy of mitochondrial DNA which is passed on down the generations is an exact copy every time. Well, almost exact. Very, very occasionally (once in thousands of years, perhaps) a mistake is made in duplication and the DNA is changed. Most of the time, these mistakes foul things up and are quickly eliminated from the lineage. If the error is not disruptive (a neutral mutation) and happened in the formation of an egg cell, this little change can be passed on down the lineage from mother to daughter, in the matrilinear lineage.
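
This copy-with-rare-errors process is simple to mimic. Here is a toy simulation--a made-up 400-base genome and a deliberately exaggerated per-base mutation rate, chosen just to show neutral changes accumulating down a matrilineage:

```python
import random

def inherit(mito_dna, mutation_rate=1e-4, rng=random):
    """Mother-to-daughter transmission: an exact copy, except that very
    rarely a base is miscopied (assumed here to be a neutral mutation)."""
    dna = list(mito_dna)
    for i in range(len(dna)):
        if rng.random() < mutation_rate:
            dna[i] = rng.choice([b for b in "ACGT" if b != dna[i]])
    return "".join(dna)

random.seed(1)
eve = "ACGT" * 100                      # a toy 400-base mitochondrial genome
lineage = [eve]
for _ in range(1000):                   # a thousand mother-daughter links
    lineage.append(inherit(lineage[-1]))

differences = sum(a != b for a, b in zip(eve, lineage[-1]))
print(f"bases differing from the founder after 1000 generations: {differences}")
```

With these made-up numbers the distant descendant typically differs from the founder by a few dozen bases; counting such differences, run in reverse, is what lets the molecular clock tick backwards.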

It is these neutral changes that enable scientists to probe deep time.

Assuming that the rate of change, estimated to be 2 to 4 percent every million years, is constant--a contentious assumption, but one that only alters the time scale--it is possible to calibrate a "molecular clock." For example, if two lineages differ by 0.3 percent, then their last common ancestor lived roughly 100,000 years ago.
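
The clock arithmetic itself is a one-liner; this toy calculation uses the article's figures (real calibrations are, of course, far more involved):

```python
def divergence_time_years(percent_difference, percent_per_million_years):
    """Molecular-clock arithmetic: time back to the last common ancestor
    is the observed divergence divided by the (assumed constant) rate
    at which divergence accumulates."""
    return percent_difference / percent_per_million_years * 1_000_000

# The article's example: a 0.3 percent difference at 3 percent per
# million years puts the common ancestor about 100,000 years back.
print(divergence_time_years(0.3, 3.0))   # ~100,000 years

# The quoted 2-4 percent range brackets the estimate:
print(divergence_time_years(0.3, 2.0))   # ~150,000 years
print(divergence_time_years(0.3, 4.0))   # ~75,000 years
```

Note how the contentious rate assumption only stretches or shrinks the time scale; the shape of the family tree is unaffected.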

Search for Eve ...

The Berkeley group devised a technique to isolate large quantities of mitochondrial DNA from placentas (or afterbirths, the few big chunks of human flesh that are regularly chucked away) collected from a wide variety of women representing all the races. The changes in the mitochondrial DNA were identified by snipping them into little pieces with special bacterial enzymes that are very sensitive to DNA patterns--the "restriction mapping" technique.

The assumptions they made in interpreting their results were that a particular change only happened once in history (a very reasonable assumption based on what is known) and "that the giant tree that connects all human mitochondrial DNA mutations by the fewest number of events is most likely the correct one for sorting humans into groups related through a common female ancestry," as Dr. Rebecca L. Cann put it in her excellent overview, "The Mitochondrial Eve," in the Natural Science section of The World & I, (September 1987, Article #13469).

From their data they constructed a lineage that could explain the global distribution of neutral mutations. Combining this with the molecular-clock estimates and with what is known about the timing of human migrations, they concluded that the best explanation of their data was that every human being can trace their lineage back to one woman who lived in Africa about 150,000-300,000 years ago, a woman quickly dubbed "the mitochondrial Eve."

As Dr. Cann is careful to point out, their data does not prove "that all humans stem from a single female ancestor," since the mitochondrial Eve is not necessarily the very first human ancestress. There is the "Smith" phenomenon to take into account, the one that plagues telephone-directory creators--one lineage can thrive at the expense of others (though, of course, that is a patrilineal phenomenon). There could have been a group of ancestral women, all of whose matrilineal lines died out except for one: the mitochondrial Eve, whose DNA got passed down to every human being living today. It only takes one all-sons generation to stop a matrilineage dead in its tracks, just as an all-daughters one will end a family name.
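
The dying-out of matrilineages is easy to watch in a toy simulation--a Galton-Watson branching process with entirely made-up parameters (each woman has two children, each independently a daughter with even odds):

```python
import random

def surviving_matrilineages(founders=100, generations=100, seed=0):
    """Toy sketch of matrilineage extinction: each founding mother starts
    her own line; every woman has two children, each independently a
    daughter with probability 1/2. One all-sons generation ends a line."""
    rng = random.Random(seed)
    women = [1] * founders                 # living women per matrilineage
    for _ in range(generations):
        women = [sum(rng.random() < 0.5 for _ in range(2 * n))
                 for n in women]           # daughters born to each line
    return sum(1 for n in women if n > 0)

print(surviving_matrilineages())  # typically only a few of the 100 lines remain
```

Even with every lineage equally fit, chance alone prunes almost all of them--no superiority of the surviving "Eve" line is required.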

But the research is certainly getting close to the original ancestress. Close enough, perhaps, for science to apologize to religion for deriding the Adam and Eve concept so scathingly in the past.

In the July 1997 issue of Scientific American, the work on mitochondrial DNA had progressed far enough for the presentation of a tentative map showing how human beings spread out to populate the planet as revealed by their DNA. (Figure 1)

... and Adam

What about the men?

While there is no such thing as a mitochondrial Adam, there is another route. Sex determination--whether the zygote will develop into a boy or a girl--depends on what sex chromosome came from the father in his 50 percent: an X-chromosome will make a girl, a Y-chromosome a boy. Mothers always contribute an X chromosome: so girls are XX and boys are XY.

Boys get their Y from their dad, and he got his from his dad, and he got his from his dad, etc., etc., in a patrilineal lineage back in time.

Strangely enough, this sex chromosome doesn't get involved in sex. The X and Y that end up in a boy are so different that they don't scramble together the way the two X's do in girls. So, just like the matrilineal mitochondrial DNA in women, the Y-chromosome DNA in men is patrilineally passed on unchanged from generation to generation. Almost unchanged, that is, as it too can slowly collect neutral mutations which can be passed on. These are being studied and you can confidently expect this headline to appear one day: "Scientists find Y-chromosome Adam."

Surrogate parents

It should be noted that science's apology is conditional: while both now agree that there was an Adam and Eve, there is still a lot of debate and disagreement as to exactly how they got there--religion still has a very difficult time with the relationship to the great apes.

Religion is going to have to unbend, sooner or later, as the mitochondrial patterns found in chimps are closely related to the patterns of mutations found in humans, which implies that the zygote that developed into Eve got its mitochondria from a chimp-like ... what?

I hesitate to use the word "mother" here as it has the implication of like to like, equal to equal. As Eve is, by definition, the first human woman, this source of mitochondria cannot be human or a "mother" in the sense of equals. But, as this female-source-of-mitochondria stood in the position of a mother to Eve, the term "hominid mother-surrogate" is appropriate.

While this does not give the definitive answer in the theological debate on, "Did Adam have a navel?" it suggests, at least, that Eve had one.

The mitochondrial linkage suggests that Eve's hominid mother-surrogate and modern-day chimps had their last common ancestor a few million years ago. Research into this is currently a hot topic.

If Eve must have had a chimp-like mother-surrogate to get her mitochondria from, you can bet that Adam must have had a father-surrogate to get his Y chromosome from.

While I have yet to see any evidence collected on this subject, bets are that the father-surrogate to Adam was also a proto-human hominid like the mother-surrogate (though, in all likelihood, they came from different lineages, since same plus same generally produces same and Adam and Eve as the first humans were, by definition, different from their parent-surrogates).

While this is speculation beyond the bounds of where experiment has reached so far, it does give hope that one day science and religion will stop their bickering about how people originated and agree that they were both partially right and both partially wrong.

Nervous OS 1.0

The lowest levels of the nervous hierarchy are quite well understood, at least from the outside. At the bottom is the pattern of ion flows across a neuron's membrane, which the neuron sends down its axon--its "output" extension--as a signal that influences the other cells the axon abuts onto, which in some cases means tens of thousands of other neurons. Massive parallelism is in great evidence.

The best understood aspect of how the mind works is the sensory input--how information about the environment makes it to the level of "awareness," which, in this discussion, encompasses a dog seeing a cat and racing in for the kill.

The way the senses work is that a sensory neuron responds to a "bit" of information about the environment such as red photons, a sound frequency, a pressure differential or a chemical concentration, etc.--the senses we call sight, hearing, touch, pain, smell and taste.

Some organisms also possess senses beyond these, such as sensitivity to magnetic and electrical fields, but we humans show little evidence for such sensitivities--perhaps made up for, if spiritual experiences can be taken into account, by the ability to perceive spirits such as Jesus and Mary, and ghosts both benign and malignant. (1) But enough of such speculation; we shall stick to the senses we share with all mammals.

A sensory cell is rarely quiescent. It is usually firing off a series of electrical impulses down its output axon. On stimulation by a bit of information about the environment--such as a bunch of red photons--the pattern of firing changes, and a different pattern of impulses is sent off down the axon.

This can be likened to the serial connection used in computers--a modem is a good example--where the pattern of bits is sent out one at a time.

This serial pattern-change might represent a minimal piece of information--a bit, a sensory pixel, so to speak--such as "red detected."

These sensory pixels are analogous to the particles at the bottom rung of the hierarchy of matter. This pattern change in the serial firing of the sensory neuron is at the very bottom of the sensory hierarchy.

The next level in the sensory hierarchy is also quite well understood. Sets of neurons--which, in the case of the eye, are not even in the brain but in the neural nets of the retina--allow these pixels of sensory information to interact with each other with all the possibilities of interference, both constructive and destructive, so well described by complex numbers. The super-systems created by these interactions are the sensory atoms, the next level of the sensory hierarchy. In the eye, for instance, these atoms of sense are items of information such as contrast changes, color gradients, etc.

This level of representation of the environment reads: A transition from deep red to light yellow was detected.
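
This pixel-to-atom step can be caricatured in a few lines--a hypothetical threshold detector over a made-up scan line; real retinal circuits are vastly more subtle:

```python
def contrast_changes(brightness, threshold=50):
    """Combine neighboring sensory 'pixels' into 'atoms': report the
    positions where brightness jumps sharply between neighbors."""
    return [i for i in range(1, len(brightness))
            if abs(brightness[i] - brightness[i - 1]) >= threshold]

# A made-up scan line: dark pixels, then a sharp step up to bright ones.
scanline = [10, 12, 11, 200, 205, 202]
print(contrast_changes(scanline))  # [3] -- one contrast transition detected
```

The output is no longer raw pixels but a higher-level item of information: "a sharp transition at position 3"--a sensory atom built from interacting sensory pixels.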

Further up the hierarchy of programmed processing are the nets of neurons that send parallel patterns of firing along their axons to other cells involved in the next level of processing. The hierarchy of visual processing is probably the best-described of the senses, at least in the bottom-up sense of looking at things.

In the brain, super-neural-nets, such as the cortical columns, allow the parallel input from such sources as the optic nerve to interact and form higher super-systems. The internal representations of these super-systems include shapes, such as a square.

Much of the early vision information processing--massively parallel--involves simple logic such as is found in regular computers. A simple example is AND: Are both inputs on? Yes or no.

I recall reading somewhere that all the basic logic functions can be accomplished by arrays of NOT-AND, or NAND--just the "yes or no" of AND flipped to its opposite, to "no or yes." The primary levels of the visual cortex do something about as simple: many such outputs are combined into the detection of lines or patches of the same color.
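
The universality of NAND is easy to demonstrate in a minimal sketch (in Python here, of course; the visual cortex computes with neurons, not code):

```python
def nand(a, b):
    """NOT-AND: AND's "yes or no" flipped to "no or yes"."""
    return not (a and b)

# Basic logic functions, each built from NAND alone:
def not_(a):    return nand(a, a)
def and_(a, b): return nand(nand(a, b), nand(a, b))
def or_(a, b):  return nand(nand(a, a), nand(b, b))

# Exhaustive check over every possible input:
for a in (False, True):
    assert not_(a) == (not a)
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
print("all gates built from NAND check out")
```

Since NOT, AND, and OR can express any truth table, arrays of one simple element suffice in principle--which is what makes the comparison to early cortical processing tempting.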

The "you" doing the seeing thinks you are "seeing the outside world." But it's a virtual reality; it's a simulation. Just as my legal copy of Windows thinks it is running on a real Intel chip while it is being "fooled" by a simulation: it is actually running in the virtual reality generated by Virtual PC running on Mac OS X running on ... assembly code running on a real Motorola chip.

You think you are "seeing reality" when you open your eyes. But it's a simulation: What is actually happening is intricately-pulsing neuron nets lighting up and fading. But the simulation sure looks real!

The visual cortex seems to be physically organized into columns of cells in which the sensory atoms integrate into more sophisticated entities. These columns of cells fire in correlated patterns when they "perceive" things such as horizontal and vertical lines, areas of color, etc.

Sensory representations have been ascribed a selection process akin to the external Darwinism of classical evolutionary theory.

This so-called Neural Darwinism has gained supporters in recent years, notably Edelman and his selection of neural representations by elimination. The law of survival in the sensory hierarchy is survival of the fittest representation. Here "fittest" implies "being a useful way of representing the reality" of the being doing the sensing--"useful" connoting the old biological mandate to survive and reproduce. A sensory image that indicates food when the reality is a cliff is, in this sense, not at all useful.

This perspective is supported by what little is known about learning. The infant animal has its neurons connected in a way that can be characterized as "everyone is connected to everyone else." This plasticity is somewhat limited, of course, by the genetic constraints on the development of the brain. But there is not enough room in a trillion chromosomes--let alone the twenty-three of our species--to determine every one of the ways in which a quadrillion cells can connect with each other. In the totally plastic state, this number would be factorial-quadrillion, which is so huge I have no idea how to calculate it.
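
For what it's worth, factorial-quadrillion can at least be sized up: the number itself is hopelessly beyond direct computation, but Stirling's approximation gives its logarithm in one line of arithmetic:

```python
import math

n = 10 ** 15  # a quadrillion

# Stirling's approximation: ln(n!) ~= n*ln(n) - n.
# Dividing by ln(10) converts to a count of decimal digits.
log10_factorial = (n * math.log(n) - n) / math.log(10)
print(f"factorial-quadrillion has roughly {log10_factorial:.3g} digits")
```

Roughly 1.5 x 10^16 digits--merely writing the number out would take more symbols than there are cells to connect, which is the author's point in numerical dress.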

Then there is the "stuff" that falls into quantum probability forms (QPF): in the nervous system this seems to involve the synchronized firing of neural nets. Are they also falling into quantum probability forms? And, if so, what might be generating the quantum probability forms for them to "fall into"?

One possibility is the attendant, behind-the-throne glial cells that surround and embrace the well-understood neurons. As no function except nourishment has been ascribed to these mysterious "neuroglia, especially the astrocytes, oligodendroglia and microglia," as Yahoo has it, we will not be stepping on anyone's toes.

Could RNA have a role in carrying the linear programs in the nervous system? Sure. Ten minutes with Google and I came up with this:

At learning, a sequence of events leads to a fixation of memory: information-rich modulated frequencies, field changes, transcription into messenger RNA in both neuron and glial, synthesis of proteins in the neuron, give a biochemical differentiation of the neuron-glial unit in millions, a readiness to respond on a common type of stimulus.

At retrieval, it is the simultaneous occurrence of the three variables: electrical patterns, the transfer of RNA from glial to neurons, and the presence of the unique proteins in the neuron, which decide whether the individual neuron will respond or not. (2)

In neurons, localized RNAs have been identified in dendrites and axons; however, RNA transport in axons remains poorly understood.... It is concluded that the specific delivery of RNA to spatially defined axonal target sites is a two-step process that requires the sequential participation of microtubules for long-range axial transport and of actin filaments for local radial transfer and focal accumulation in cortical domains. (3)


(1.) Beautifully exemplified in the universally understandable tale of Scrooge and his helpers in A Christmas Carol by Dickens.

(2.) Hyden, H., "The question of a molecular basis for the memory trace." In Pribram, K. H., & Broadbent, D. E. (eds.) Biology of Memory. New York: Academic Press, 1970, p. 116.

(3.) Ilham A. Muslimov, Margaret Titmus, Edward Koenig, and Henri Tiedge, "Transport of Neuronal RNA in Axons," The Journal of Neuroscience, June 1, 2002, 22(11): 4293-4301.

This article was excerpted from a book by Richard LLewellyn Lewis, PhD, The Unity of the Sciences, V.1: Do Proteins Teleport in an RNA World? (International Conference on the Unity of the Sciences, 2005), 297 pages, $24.95 [1-931166-24-2].

Richard LLewellyn Lewis, PhD

Richard LLewellyn Lewis, who holds a doctorate in biochemistry, is an author based in New York City.
COPYRIGHT 2006 News World Communications, Inc.
No portion of this article can be reproduced without the express written permission from the copyright holder.

Article Details
Title Annotation:NATURAL SCIENCE
Author:Lewis, Richard LLewellyn
Publication:World and I
Date:Nov 1, 2006
