
An educator's guide to scientifically based research: documenting improvements in student performance & supporting the NCLB Act.

Dear Readers:

Schools and school districts are under increasing pressure to make sure federal dollars are used to implement programs and practices proven effective by scientifically based research. The No Child Left Behind Act of 2001 (NCLB) mentions scientifically based research 110 times. The phrase has become so common that it has earned its own acronym: SBR.

So ... what is scientifically based research, or SBR? Who is responsible for doing it? And how can school and district leaders determine the adequacy of SBR claims for a particular program or practice?

To help you answer these questions and address these issues, AEL is happy to bring you this Educator's Guide to Scientifically Based Research, which:

* provides a definition of SBR

* discusses the importance of SBR to educational practice

* describes who should conduct SBR

* provides a checklist for evaluating research evidence

* lists steps for conducting your own SBR

* offers guidance for identifying high-quality researchers

We'd like to thank the following sponsors for helping to make the distribution of this guide possible: Texas Instruments, American Education Corporation, netTrekker and Inspiration Software.

We trust you will find this guide useful in helping teachers and students in your schools and classrooms benefit from rigorous research.

Sincerely,

Doris Redfield, Director of the Institute for the Advancement of Research in Education at the Appalachia Educational Laboratory (AEL)

(AEL houses one of ten federally funded educational research and development laboratories. Dr. Redfield's experience as a teacher, administrator, researcher, and contractor to the U.S. Department of Education positions her well to speak to the realities of evaluating and implementing SBR in the real world of schools.)

WHY IS SCIENTIFICALLY BASED RESEARCH IN EDUCATION SUCH A HOT TOPIC TODAY?

It starts with the emphasis placed on SBR by the No Child Left Behind Act, which mandates that school improvement plans incorporate strategies based on scientifically based research; that corrective action include professional development based on scientifically based research; and that Title III grantees use English language acquisition approaches based on scientifically based research (Johnson, 2003). This edited Guide provides educators with strategies for evaluating the extent to which products, services, and programs are supported by scientifically based research, and for conducting scientifically based research when doing so is desirable.

In addition to SBR's four basic steps--observe, hypothesize, collect data/evidence, and draw conclusions--scientific research in education does six things (Committee on Scientific Principles for Education Research, 2002):

1. Poses significant questions. Scientific educational research seeks to answer meaningful questions--questions that, when answered, will make an important difference that contributes to student learning. An important question relative to educational products, programs, and services is, "Will it help teachers teach more effectively or help students learn more or better?"

2. Links to relevant theory. Products, services, and programs that link to theory take findings from prior, relevant research into account. New products should provide logical building blocks for extending what is already known about effective teaching and learning.

3. Uses valid tools. Valid tools are instruments and procedures that are capable of accurately measuring the effects of educational interventions (products, programs, and services).

4. Rules out alternative explanations. Scientific research studies are designed to rule out as many explanations as possible, other than the intervention, that may have contributed to the research findings. The report of the research study should clearly explain how potential alternative explanations were counteracted. Common research designs differ in how well they can do this:
* Experiments. Experimental designs include at least two groups: an experimental group that receives the "treatment" or experimental intervention, such as a new program, product, or service, and a control or comparison group that does not.

* Quasi-experiments. Quasi-experiments also include experimental and control/comparison groups. However, the participants in these studies are not randomly selected or randomly assigned to the experimental versus control/comparison group(s).

* Other approaches. Techniques such as correlational analyses, case studies, and survey research are neither experimental nor quasi-experimental; they do not, by themselves, demonstrate cause-and-effect relationships between the "treatment" and the outcome.

5. Produces findings that can be replicated. Researchers need to describe the methods, instruments, and analyses they used so that another party could replicate the study. This also allows reviewers of the research, such as education practitioners, to evaluate the integrity of the research methods and findings.

6. Survives scrutiny. This may mean that a product developer or publisher contracts with an independent third party to conduct product effectiveness research. When a company's research is not conducted by an independent party, it should, at least, be reviewed by independent and credible third parties.


QUALITY CONTROLS: HOW SHOULD RESEARCH EVIDENCE BE EVALUATED?

In addition to the six tenets of scientifically based research, key phrases from the definition of scientifically based research provided by NCLB provide useful guidance for evaluating the quality of research evidence:

Is it Relevant? Does the evidence provided by the researchers or developers relate to an issue of importance to your needs, particularly student learning needs?

Does the research evidence provided by the developers link to and flow from relevant theory and theory-based research?

Do the research procedures, analyses, and findings support the researchers'/developers' claims?

Is it Rigorous? If the researchers claim a causal relationship between the intervention and an outcome measure such as student achievement, did they include a control or comparison group in the study, in addition to the experimental group?

Were the study participants randomly selected and/or randomly assigned to experimental versus control/comparison groups?

Is sufficient information provided to determine whether the research design, instruments, and procedures are appropriate for answering the research questions posed by the researchers/developers?

Were the research instruments and procedures applied with consistency, accuracy, and for the purpose intended by the developers of the instruments and procedures? For example, norm-referenced achievement tests were not developed to determine the extent to which students master certain educational objectives. Rather, they are intended to rank-order students on their knowledge of the information on the test.

Is it Systematic? Was the research conducted using carefully planned and logical steps so that "what" questions are clearly and defensibly answered? For example, what interventions, or combinations of interventions, applied with what degree of intensity contribute to measurable differences in learning?

Is it Objective? Did an independent, third party conduct the research? If not, did an independent third party review it?

Is it Reliable? Could the same researchers repeat the study and obtain the same or highly similar results?

Could other researchers replicate the study's methodology and obtain the same or highly similar results?

Is it Valid? Were the research instruments and procedures accurate in their measurements?

Were the data analyses appropriate for yielding accurate results?

Are the research conclusions clearly linked to the research data?

SEVEN STEPS TO CONDUCTING SCIENTIFICALLY BASED RESEARCH

Briefly, the SBR process includes:

1. Based upon the best available information (e.g., sound theory, prior rigorous research, and/or empirical observation), formulate a hypothesis about the effect of one variable (the independent or "causal" variable), such as a particular instructional strategy, on another variable (the dependent or outcome variable), such as student achievement. An example hypothesis might be that when third grade students are exposed to 100 hours of XYZ software for increasing reading comprehension, their scores on a test of reading comprehension will increase.

2. If possible, randomly select participants for the study. Also, if possible, randomly assign individuals to either the experimental or the control/comparison group(s). If random selection and/or assignment are possible, you will have the makings of an experimental study. If not, then you will be conducting a quasi-experiment. Either way, you must have both an experimental group and a control or comparison group.

3. If you are interested in measuring change over time, administer a pretest to both the experimental and control/comparison groups. This is especially important if you are unable to randomly assign participants to groups. Be sure that the pretest is reliable and valid for the purpose at hand. Information about the reliability and validity of commercially available instruments should be available in the accompanying technical manual or in reference books such as the Buros Mental Measurements Yearbook (Plake & Impara, 2001) or Tests in Print (Murphy et al., 2002).

4. Apply the treatment intervention to the experimental group, being careful to plan and document the nature, specific elements, length, intensity, and context of the treatment. This will allow for replication.

5. Re-measure (i.e., post-test) both the experimental and control/comparison groups, using the pretest measure or a measure that has been demonstrated statistically to be equivalent to the pretest measure. It is important to document the test-retest reliability of the measure.

6. Analyze the results of the measurements of the experimental and control/comparison groups on the pre- and post-test measures. A statistics specialist can help determine the most appropriate types of statistical analyses and tests to conduct. Ultimately, an effect size should be calculated. Effect sizes indicate the practical significance of statistical findings. Large effect sizes tend to be 1.0 or greater, medium effect sizes center around .50, and effect sizes of .25 or less are generally considered small. (A brief worked sketch of an effect size calculation follows these steps.)

7. Write a report of the findings that includes a description of the rationale for the study; findings from prior research that contributed to the study's underlying hypothesis; the research procedures and instruments that were used, including information about their reliability and validity; demographic information about the participants in the study, as well as information about how they were selected and how they were assigned to groups; how the results were analyzed; the results of the analyses, including effect sizes; and conclusions that can be supported by the data yielded by the study.
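For readers who want to see what steps 2 and 6 might look like in practice, the short sketch below uses Python and purely hypothetical data: the student roster, the post-test scores, and the choice of Cohen's d as the effect size statistic are illustrative assumptions, not prescriptions from this guide, and a statistics specialist should still confirm the appropriate analysis for your study.

```python
import random
import statistics

# Step 2 (illustrative): randomly assign a hypothetical roster of 40 students
# to an experimental group (receives the intervention) or a control group.
students = [f"student_{i:02d}" for i in range(1, 41)]
random.shuffle(students)
experimental_group, control_group = students[:20], students[20:]

# Step 6 (illustrative): calculate an effect size (Cohen's d) from hypothetical
# post-test scores: the difference in group means divided by the pooled
# standard deviation of the two groups.
experimental_scores = [78, 85, 90, 72, 88, 81, 79, 93, 84, 77,
                       86, 80, 75, 91, 83, 89, 74, 82, 87, 76]
control_scores = [70, 74, 68, 81, 73, 77, 65, 79, 72, 69,
                  75, 71, 66, 80, 78, 67, 73, 70, 76, 64]

def cohens_d(treatment, comparison):
    """Effect size: difference in means divided by the pooled standard deviation."""
    n1, n2 = len(treatment), len(comparison)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(comparison)
    pooled_sd = (((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(comparison)) / pooled_sd

effect_size = cohens_d(experimental_scores, control_scores)
print(f"Effect size (Cohen's d): {effect_size:.2f}")
```

Interpreted against the benchmarks in step 6, a value near 1.0 would be considered large, a value near .50 medium, and .25 or less small; a full analysis would also incorporate the pretest data and appropriate tests of statistical significance.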

IDENTIFYING HIGH QUALITY RESEARCHERS

If instead of conducting your own research, you plan to contract with a research consultant or firm, it is important to get references and establish that they have credentials recognized by professional education researchers--for example, advanced degrees with training in research design and methodology, as well as active membership and participation in scholarly societies such as the American Educational Research Association (AERA) or the American Evaluation Association (AEA). The What Works Clearinghouse (WWC), established by the U.S. Department of Education to review research and evaluate the evidence of effectiveness for specific educational approaches and interventions, plans to establish an evaluator registry of researchers who agree to abide by the WWC standards for conducting product and program effectiveness research.

WHAT IS AEL?

Founded in 1966 as a not-for-profit corporation, AEL provides rigorous research, professional development, and consulting services to clients in the education arena. Services include research design and implementation, comprehensive research reviews, intensive product and program evaluations and field trials, consulting services, and award-winning professional development programs.

The name AEL originated with the first program the corporation operated--the Appalachia Educational Laboratory--but today's AEL is national in scope and provides a range of services to private and government agencies.

For information about AEL, contact us at aelinfo@ael.org or call 800-624-9120.

ABOUT THE AUTHOR

Dr. Doris Redfield is Vice President for Research and Director of the Regional Educational Laboratory at AEL, one of 10 federally funded educational research and development laboratories. AEL works with education practitioners, researchers, and policymakers to study and promote strategies for student success. Prior to joining AEL, Dr. Redfield served as chief of research, evaluation, and assessment with the Virginia Department of Education. She has published more than 50 articles, book chapters, and technical reports and made over 100 paper presentations at professional, refereed conferences.

LINKING YOUR PATH TO SBR

Please go to either www.districtadministration.com or www.ael.org to access the SBR Planning Tool for Educators, which covers both state or locally developed products/programs and commercial products/programs.

The comprehensive chart includes:

* Steps to Conducting Your Own Research

* Steps to Contracting for Research Services

* What Educators Should Look for When Developers/Companies Conduct Their Own Research

* What Educators Should Look for When Developers/Companies Contract with Independent Researchers

SBR Sponsors

Texas Instruments is a market leader in education technology, providing a wide range of advanced classroom tools that enable students and teachers to interactively explore diverse topics across curriculum areas, connecting students with real-world experiences. TI's products include a wide range of innovative handheld technology, software, and data-collection devices. TI is proud to support education research through initiatives like this SBR Guide. http://education.ti.com/.

The American Education Corporation publishes the A+nyWhere Learning System®, a curriculum courseware program that provides research-based, integrated curriculum for grade levels 1-12 for Reading, Mathematics, Language Arts, Science, Writing, and Social Sciences. A+LS® contains computer adaptive, companion academic skill assessment testing tools that are aligned to state and national academic standards. Learn more at www.amered.com.

netTrekker is the award-winning academic search engine specifically designed for school use. Through fast and easy access to more than 180,000 educator-selected, relevant online resources, netTrekker enhances teaching and learning. netTrekker links resources directly to each state's academic standards and benchmarks, making it an excellent tool for integrating technology and standards into daily curricula. Learn more at www.nettrekker.com.

Inspiration Software was recently named to the Inc. 500, the ranking of America's fastest-growing private companies. We have also won many awards from educators, educational magazines, and computer learning labs. To learn more, visit http://www.inspiration.com/awards.html.

THE FAQ SHEET

Educators and administrators can find definitions of scientifically based research, and practical reasons for it--not least that SBR is now a requirement for programs supported with NCLB funding--here:

NCLB Definition of SBR: Research that involves the application of rigorous, systematic, and objective procedures to obtain reliable and valid knowledge relevant to education activities and programs. (Please go to www.TK.com for the complete legal definition.)

Why is Scientifically Based Research Important?

SBR ensures that programs, services, and instruction constitute the most effective approaches for individual students and their needs. Education practices that are grounded in scientific research are practices that are built on the best available research findings and are tested to determine with which students they are effective and for which purposes. Best practices should be proven practices.

Who Should Conduct Scientifically Based Research?

Anyone who claims that a product, program, or service causes or contributes to certain outcomes--such as improvements in student performance--needs supporting evidence for their claims. The best evidence conforms to the six tenets of scientifically based research. Educators have a responsibility to ask for such evidence and evaluate its quality. This advice also applies if a product, program, or service is developed "in house," by a school district, department of education, or university, for example.

CRITERIA FOR SCIENTIFIC RESEARCH

No Child Left Behind -- National Research Council

* Has educational relevance -- Links to relevant theory; poses significant questions

* Relies on reliable and valid measures -- Uses reliable and valid measurement instruments and procedures

* Uses rigorous, systematic, objective methods -- Rules out alternative explanations

* Studies are presented in sufficient detail and clarity to allow for replication -- Findings can be replicated

* Has been accepted by a peer-reviewed journal or approved by a panel of independent experts using rigorous, objective, and scientific review -- Survives the scrutiny of independent, expert reviewers


Reprint information: To order copies of this guide for your staff, please contact: Lisa Marie Smith, District Administration Magazine, Ismith@promediagrp.com, 203-663-0103
SBR bulk-copy pricing: 10-50 copies $2.95/copy
 51-200 copies $2.45/copy
 201-1,000 copies $1.95/copy
 1,001 + copies $1.50/copy
COPYRIGHT 2004 Professional Media Group LLC
No portion of this article can be reproduced without the express written permission from the copyright holder.
