
Are human research subjects safe enough? (Ethical Aspects).

Medical science owes much to countless contributions of committed and competent clinical investigators.

But only those willing to trample human rights resist reasonable restrictions on risks to which human research subjects are exposed.

Imagine a world where randomized clinical trials and double-blind studies using human subjects are not linchpin parts of medical research. Applications of new knowledge and technology would be haphazard and dangerous. We would never learn whether good results and bad are due to new treatments or extraneous factors.

Penicillin, Paxil, patent medicines and magic potions would be on drugstore shelves alongside ginseng and St. John's wort. U.S. Food and Drug Administration controls and prescriptions would be meaningless. Why bother?

How would we know what to approve and prescribe and what not to? The daring act of consulting a physician would be even more dangerous than it already is.

Safety concerns

Human research must continue. But important popularized and politicized questions about safety of human research subjects must be answered.

In one poll, 83 percent of Americans agree that new drugs should be tested in humans but only 24 percent are very confident that patients in clinical trials are not treated as guinea pigs. (1)

There is nothing new about failing to safeguard the safety of human research subjects. In fact, teachers of 21st century principles-based neo-ethics use some classic examples to illustrate the age-old conflict between utilitarian concern for the common good, the concepts of human rights and personal autonomy, and virtues such as justice, fairness and truthfulness.

For example:

* During World War II, Nazi physicians experimented on concentration camp inmates. As a medical student, I watched a magnificent fluoroscopic study of the swallowing mechanism. As an aside, we were told that this excellent teaching tool was confiscated from the Nazis, that the fluoroscopy time far exceeded well-known safe limits for exposure to X-rays and that the subject undoubtedly died of throat or thyroid cancer within a few years.

* In Alabama in 1932, U.S. researchers began the now infamous Tuskegee syphilis study to better understand the long-term course of untreated syphilis. They enrolled 400 black men and 200 controls, most poorly educated, and did not inform the study subjects of available treatments. The study continued until 1972. In the early years of the study, the available treatments (arsenic and bismuth) were not very effective. But penicillin became widely available in the 1940s and was found to cure syphilis. This knowledge was withheld from the study group.

* In the 1970s, in the Danish Obesity Project, clinical researchers wanted to compare the effects of jejunoileal bypass with medical treatment of morbid obesity. Some patients in the medical treatment group were told that surgery was not an option because liver biopsy findings showed fatty infiltration. That was not true.

Horror stories

Either exploitation of human research subjects is more common now than in previous times, or it is simply better exposed. Either way, the number of contemporary horror stories is unacceptable. Two recent publicly exposed cases blew this issue wide open.

In 1999 at the University of Pennsylvania, an 18-year-old patient with OTC (ornithine transcarbamylase) deficiency died during a trial of gene therapy. The adenovirus vector used to deliver the therapeutic gene into the patient's bloodstream may have attacked his liver, causing the fulminant hepatitis and accompanying complications that led to his death.

In 2001 at Johns Hopkins, a researcher wanted to understand why some people respond to inhaled irritants by developing asthma and some don't. He enlisted healthy volunteers to inhale a chemical irritant, hexamethonium. One of the volunteers, a technician at the Johns Hopkins Asthma and Allergy Center, died. Allegedly, researchers meant well but failed to adequately research the toxic properties of hexamethonium.

Sometimes a medical scientist hot on the trail of a breakthrough discovery can become a zealot. Nothing must stand in his or her way. Nothing. Colleagues believe that is what happened to the principal investigator in a study approved by the University of Oklahoma that was stopped in 2000 after a nurse reported abuse of human subjects.

The greatest danger to human subjects is today's definition of legitimate research. Clinical investigators may be financially tied to a company not interested in testing a theory, but rather in marketing products.

In Nebraska in 1998, for $460, college students ingested the active ingredient of a popular insecticide spray. The purpose was to defuse accusations that the chemical is harmful to humans. No one was hurt. But studies in which investigators are paid to reach a dictated conclusion are not research. Rather, they are _____. Actually, we need a new term for what they are because such studies are definitely not clinical research.

Efforts to protect human research subjects began in 1946 when the Nuremberg Code was written as a guideline for the Nuremberg Military Tribunal that tried Nazi leaders accused of war crimes. (2) Subsequent efforts (3,4) built on this beginning.

The Nuremberg Code contains the first use of the term "informed consent." Other important principles it lists include:

* A favorable risk/benefit ratio in study design

* Careful attention to qualifications of clinical investigators

* Freedom for the subject to withdraw from the study at any time


In 1974 in the U.S., federal regulations established local Institutional Review Boards (IRBs) as the frontline mechanism to protect human research subjects. (5) Each institution in which clinical research is conducted must have an IRB charged with approving or disapproving research requests and monitoring progress of approved projects. Today, IRBs are monitored by the Office of Human Research Protections (OHRP) of the Department of Health and Human Services (HHS). (6,7)

In several recent instances, IRBs were faulted for lax, careless, unenthusiastic or incomplete performance. IRBs have responded not with denials, but with efforts to improve performance.

Is it truly a surprise that collegial members of a mandated committee fed up with red tape requirements and only marginally acquainted with the subject matter brought before them often do little more than rubber stamp every research request on the agenda?

To be fair, today's IRBs are overwhelmed with research requests because recent breakthroughs create numerous opportunities for human research. Examples include:

* Mapping of the human genome coupled with available recombinant DNA technology

* The urgent need to find another approach to infectious disease as we near the end of the antibiotic era

* Testing of new inventions such as devices used in transplantation and replacement procedures

* New challenges such as the SARS epidemic

* The drug industry's constant quest for new products

But difficulties are not excuses. If members of an IRB cannot rise above personal frustration or if an institution cannot or will not commit adequate resources and time, then fewer, if any, clinical research projects should be done in that institution.

"IRBs are under greater scrutiny than ever before," says Karen Maschke, PhD, of the Hastings Center. "Public trust is shaky," she says, "and to restore it IRBs must show that they are well trained in the ethical aspects of human subjects research and committed to protecting the rights of individuals who agree to participate in the human research enterprise." (8)

The following considerations seem especially critical:

* What selection criteria are used when IRB members are chosen? In addition to a broad multidisciplinary base, IRBs should include individuals knowledgeable about and experienced in using the scientific method on human subjects.

* Who provides the IRB with background information about each research request?

If necessary, bring in a relevant expert (not a member of the research team) to provide needed information and help IRB members ask the right questions. If such expertise is not locally available, perhaps the IRB should think twice about its ability to adequately monitor the progress of the study.

* Are conflicts of interest acknowledged? Is the purpose of the clinical trial or double-blind study to confirm the efficacy of a new treatment or the safety of a new treatment known to be efficacious? Or is the purpose to lend credence to a new device, drug or procedure that a company wants to market or that the investigator wants to use in his or her practice?

* How is the progress of approved studies monitored? Don't ask for periodic routine reports as a government agency might. Rather, insist on exception reporting. Insist that clinical investigators report adverse incidents and unexpected results and document them immediately.

* How are IRB members oriented? Are IRB members impressed with the importance of their obligation? Or are they told, "Don't worry, this won't take too much of your time. It's just another requirement we have to meet."

More regulation of human research can be expected. Some provisions of HIPAA regarding uses of personal medical information are an example. And a brand-new accrediting program, the Partnership for Human Research Protection, Inc. (PHRP) Accreditation Program, is just starting to take applications and conduct surveys. PHRP is a collaborative effort of the Joint Commission on Accreditation of Healthcare Organizations and the National Committee for Quality Assurance.

If red tape and alphabet soup worked, safety of human subjects would have been assured years ago. Things won't truly get better until clinical researchers themselves re-discover the true meaning of clinical research.


(1.) Harris Interactive poll, February 2002, cited in "At Your Own Risk," Time, April 22, 2002, p. 49.

(2.) Nuremberg Code, 1946. In Encyclopedia of Bioethics, Volume 4. Edited by Warren T. Reich. The Free Press, New York, 1978, pp. 1764-1765.

(3.) Declaration of Helsinki: Ethical Principles for Medical Research Involving Human Subjects. World Medical Association, June 1964; last revised October 2000.

(4.) The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. Washington, D.C., U.S. Government Printing Office, 1978.

(5.) Code of Federal Regulations 45 Part 46. U.S. Department of Health and Human Services (HHS). 1974. Revised June 18, 1991, reprinted March 15, 1994.



(8.) Karen Maschke, editor, IRB: Ethics and Human Research. The Hastings Center, Garrison, N.Y. Personal communication.

Richard E. Thompson, MD, is president of Thompson, Mohr and Associates in Springfield, Mo. Previously, he was an adjunct instructor of ethics at the Ethics Institute, St. Petersburg College, St. Petersburg, Fla. He can be reached by phone at (417) 889-8853 or by e-mail at
COPYRIGHT 2003 American College of Physician Executives

Article Details
Author: Thompson, Richard E.
Publication: Physician Executive
Date: Jul 1, 2003

