
Toxicogenomics: roadblocks and new directions. (Standards).

The toxicogenomics research community may be "building a Tower of Babel" unless it devises ways to communicate and share data meaningfully among assorted research groups and across research platforms. Or so Bennett Van Houten, acting science advisor for the NIEHS Toxicogenomics Research Consortium, told a newly created National Research Council (NRC) panel that convened 6 February 2003 to discuss focused efforts to generate useful information in toxicogenomics.

Established at the behest of NIEHS director Kenneth Olden and deputy director Samuel Wilson, the Committee on Emerging Issues and Data on Environmental Contaminants provides a public forum for communication among stakeholders about new evidence and concerns not just in toxicogenomics but also in environmental toxicology, risk assessment, exposure assessment, and related fields.

Standardization of experiments, vocabularies, and other activities within and across DNA microarray platforms is critical to toxicogenomics, said consortium coordinator Brenda Weis. "Currently, there are no standard protocols for toxicogenomics," she said, adding that gene annotation is one of the biggest challenges facing the field. For example, pAC3, a vector commonly used to clone DNA fragments, has been cited 59 different ways in assorted research papers. [For more on the topic of standardization, see "Data Explosion: Bringing Order to Chaos with Bioinformatics," p. A340 this issue.]

Although researchers must be trained in any new standards before they can be implemented, the effort will be worth it: such standards will ensure that results are credible, that full data sets and annotations are usable, that data from public repositories are accessible, and that data sets are permanently available, said speaker Chris Stoeckert, an associate professor of genetics at the Penn Center for Bioinformatics at the University of Pennsylvania. Projects such as Minimum Information About a Microarray Experiment, a workgroup of the Microarray Gene Expression Data Society, are already working on standardization.

Standardization poses many complex challenges, however. For example, microarray users must integrate such efforts with existing ontology endeavors. Standardizing protocols across a single platform even within an individual company can be difficult when labs are scattered across the country, said speaker Donna Mendrick, vice president and scientific director for toxicology at Gene Logic.

Ultimately, researchers may be required to provide images along with data to enhance quality control, Stoeckert said, although the microarray community is divided on this point, because the images are valuable but expensive to store and distribute. No decisions have yet been made as to whether the NIEHS consortium will reject data that don't meet whatever standards are eventually enacted, Van Houten reported.

Other issues spring from standardization problems. A federal liaison group organized by the NIEHS has identified four potential roadblocks to the optimal use of toxicogenomics, Wilson reported: premature use of information without strong scientific justification, communication of flawed interpretations of data by the scientific community, failure to educate stakeholders (including the public) on the new science, and failure to fill information gaps. "We need to ensure experiments are in line with guidelines [being developed by the toxicogenomics community] for evolution of the field," Wilson said. He discussed the need for a "roadmap" of the field so that progress can be evaluated against expected outcomes. Risk assessment-oriented evaluations of new chemicals and drugs are also a priority.

Challenges exist for virtually all conceivable applications of toxicogenomics technologies, be they industry efforts to build robust predictive models for specific agents across a single platform or academic endeavors to collect comprehensive amounts of data. There are substantial concerns about how toxicogenomics data are going to be used and shared, how proprietary databases will be managed, and how potential privacy issues will be handled.

To address many of these issues, the NRC committee is evaluating three project proposals generated by committee working groups. These studies would examine the potential impacts and limitations of emerging technologies--genomics, proteomics, toxicogenomics, and bioinformatics--on risk assessment, environmental decision making, toxicology research, and public health, as well as whether current knowledge bases and tools fit the needs of scientific researchers and public health policy makers and workers. The standing committee itself will not conduct these studies but will recommend them for separate NRC approval and funding.
COPYRIGHT 2003 National Institute of Environmental Health Sciences

Article Details
Author: Wakefield, Julie
Publication: Environmental Health Perspectives
Date: May 15, 2003

