
The McREL database: a tool for constructing local standards.

McREL's compilation of data from the national reports will be a valuable resource for schools and districts as they develop local content-area standards.

As standards take on a central role in school reform, American educators at the state and local levels have a clear charge: create standards that can be used in today's schools, but move those schools to new levels of achievement. This is clearly stated in the report of the National Education Standards and Improvement Council to the National Education Goals Panel:

It is critically important that a core set of standards be defined that makes sense when communicated to the public and to teachers, students, and school systems. Both NESIC and the states have the responsibility to see that these standards make sense together. Cumulatively, the standards must be feasible to implement within the daily and long-term operation of schools, and they should be adequate to achieve the purposes of schools and the premise of American education (NESIC 1993, p. 5).

In creating such statements, schools and districts frequently turn to the "standards documents" from the various subject-specific organizations. The National Council of Teachers of Mathematics, for example, has published Curriculum and Evaluation Standards for School Mathematics (NCTM 1989), and the American Association for the Advancement of Science has issued Benchmarks for Science Literacy (AAAS 1993).

A Difficult Task Made Easier

Constructing standards from these documents, however, is not a simple matter. To articulate standards that reflect local priorities, and that "make sense together" as suggested by the National Education Standards and Improvement Council, schools and districts will need to invest a great deal of work. For one thing, the various documents vary conceptually in a number of important ways. That is, the manner in which standards are described by a document in mathematics, let's say, might be quite different from the manner in which standards are defined in a document that focuses on science.

Additionally, for some content areas, multiple documents identify standards, and each document might take a slightly - or greatly - different perspective on the subject. For example, in addition to the NCTM's recommendations, the following documents describe standards for mathematics:

* Benchmarks for Science Literacy by the American Association for the Advancement of Science 1993.

* Mathematics Assessment Framework by the National Assessment of Educational Progress 1992.

* What Work Requires of Schools: A SCANS Report for America 2000 by the Secretary's Commission on Achieving Necessary Skills 1991.

* Workplace Basics: The Essential Skills Employers Want by Carnevale, Gaines, and Meltzer 1990.

In short, establishing standards based on the national documents means that schools or districts must first identify what they mean by standards, as well as the format that their standards will take. Next, they must systematically analyze all the documents and, then, translate the information into a format and conceptual base compatible with their own. Fortunately, some help is available.

With funding from the Office of Educational Research and Improvement, the Mid-continent Regional Educational Laboratory (McREL) in Aurora, Colorado, is currently analyzing all relevant documents - standards drafts as well as relevant subject-area materials - across the various content areas. The end result, we hope, will be a database that schools and districts can easily access.

From the outset, we realized that whatever position we took regarding the nature and function of standards would automatically rule out other positions. Consequently, we set out to define standards in such a way that even a school or district that might take a differing position on standards would still find our work useful. We determined at least three key issues on which our position needed to be very clear: (1) content versus curriculum standards, (2) content versus performance standards, and (3) the need for levels of standards.(1)

Content Versus Curriculum Standards

A number of documents that we analyzed combined content and curriculum standards, yet did not distinguish between the two. In simple terms, a content standard describes what students should know and be able to do; a curriculum standard describes what should take place in the classroom. Specifically, curriculum standards address instructional technique or recommended activities as opposed to knowledge and skill in themselves.

To understand the difference between the two, consider the following statements from the National Council of Teachers of Mathematics framework (NCTM 1989):

a. Use estimation to check the reasonableness of results.

b. Describe, model, draw, and classify shapes.

Element a describes a skill or an ability that a person might use to solve a real-life problem. For example, you might use estimation to check the reasonableness of your calculations about how much wood you would need to buy to build a fence around your backyard. On the other hand, it is difficult to imagine many situations - whether academic or day-to-day - that would require the ability to model, draw, or classify shapes (element b). Rather, this kind of activity is best described as an instructional device to help students understand shapes or to provide a way for them to demonstrate their understanding of shapes. Therefore, it is a curriculum standard.

Our model emphasizes content standards because they describe the goals for individual student achievement, while curriculum standards provide information that contributes to reaching these goals. Additionally, curriculum standards - which usually focus on activities or techniques - if interpreted rigidly, could leave teachers with little or no room for instructional diversity.

Content or Performance Standards?

A significant controversy within the developing science of standards-based education is whether standards should be content- or performance-based. Those who take a clear content position describe standards in terms of knowledge and skill to be acquired; those who take a performance position define standards in terms of tasks in which students demonstrate knowledge and skill.

Where the content position focuses on clearly defined knowledge and skill, the performance position presumes that knowledge or skill is adequately defined once it is embedded in a task, even though that task is necessarily a narrower application of the knowledge. A content standard in science, for example, might specify that students should understand the characteristics of ecosystems on the earth's surface. The performance standard for that piece of knowledge would specify the level of accuracy and the facts, concepts, and generalizations about ecosystems on the earth's surface that a student must understand to be judged as having obtained a suitable level of achievement. The performance standard would also put that knowledge in a specific context by stating a form for presenting the information - for example, an essay, a simulation, or an oral report with accompanying graphics. As the National Education Standards and Improvement Council notes:

Performance standards indicate "both the nature of the evidence (such as an essay, mathematical proof, scientific experiment, project exam, or combination of these) required to demonstrate that content standards have been met and the quality of student performance that will be deemed acceptable...." (NESIC 1993, p. 22).

We believe that performance standards are a critical component of a comprehensive standards-based approach to schooling. Performance standards and content standards, in fact, have a hand-in-glove relationship. In short, even though the McREL database focuses on content standards, we assume that schools and districts can and will use it as the basis for constructing complementary sets of performance standards.

The Need for Levels of Standards

Even a cursory review of the standards generated by different groups reveals very different perspectives on the level of generality at which standards should be stated. Here's an example from the Consortium of National Arts Education Associations (1994, p. 34):

* Understand[s] the arts in relation to history and cultures.

In contrast, a draft document from the National History Standards Project (1994, p. 84) lists the following:

* Know[s] the causes of the Civil War.

The history example is obviously more specific than the one from the arts. In addition, the history document provides a much more detailed level of subcomponent information for its standards than does the arts document. The level of specificity at which standards are articulated is critical, because the level of generality adopted by a school or district will affect the level of detail within the standards, the kind of comprehensiveness the standards aim for, and the number of standards produced.

Our approach is to articulate standards at a general level, yet define specific subcomponents - or benchmarks - at various developmental levels. To illustrate, consider the following content standard for a very general area within mathematics: demonstrates number sense and an understanding of number theory. Benchmarks appropriate at the high school level might include:

* understands characteristics of the real number system and its subsystems,

* understands the relationship between roots and exponents, and

* models numbers using three-dimensional regions.

Benchmarks appropriate for middle school might include:

* understands the relationship of decimals to whole numbers,

* understands the relationship of fractions to decimals and whole numbers,

* understands the basic difference between odd versus even numbers,

* understands the basic characteristics of mixed numbers, and

* models numbers using number lines.

Benchmarks, then, describe the specific developmental components of the general domain identified by a standard. Theoretically, benchmarks could be identified at all grade levels. The trend, however, seems to be toward a few key levels. Our database provides benchmarks at four levels, roughly corresponding to grades K-2 (Level I), 3-5 (Level II), 6-8 (Level III), and 9-12 (Level IV).
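
Stated another way, a standard in this scheme is a general statement with benchmarks grouped by developmental level. For readers who find it helpful to see that structure spelled out, the sketch below (in Python, purely for illustration) represents the number-sense standard above in that form; the field names are illustrative and do not reflect the internal format of the McREL database.

```python
# A standard represented as a general statement plus benchmarks grouped by level.
# Field names are illustrative; the benchmarks are those quoted in the text above.
number_sense_standard = {
    "standard": "demonstrates number sense and an understanding of number theory",
    "benchmarks": {
        "Level III (grades 6-8)": [
            "understands the relationship of decimals to whole numbers",
            "understands the relationship of fractions to decimals and whole numbers",
            "understands the basic difference between odd versus even numbers",
            "understands the basic characteristics of mixed numbers",
            "models numbers using number lines",
        ],
        "Level IV (grades 9-12)": [
            "understands characteristics of the real number system and its subsystems",
            "understands the relationship between roots and exponents",
            "models numbers using three-dimensional regions",
        ],
    },
}

for level, items in number_sense_standard["benchmarks"].items():
    print(f"{level}: {len(items)} benchmarks")
```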

The Format of the McREL Database

In all, our study has resulted in 157 different standards and their related benchmarks constructed from 22 national documents, organized into 9 major categories:

* Science: 34 standards, 507 benchmarks

* Mathematics: 8 standards, 125 benchmarks

* U.S. History: 37 standards, 143 benchmarks; World History: 31 standards, 138 benchmarks; Historical Perspective: 1 standard, 12 benchmarks

* Geography: 18 standards, 251 benchmarks

* Communication and Information Processing: 5 standards, 125 benchmarks

* Thinking and Reasoning: 6 standards, 68 benchmarks

* Working with Others: 5 standards, 48 benchmarks

* Self-Regulation: 5 standards, 56 benchmarks

* Life Work: 7 standards, 68 benchmarks

As currently formatted, our database provides highly detailed information for each standard and benchmark. Consider the following example:

5. Understands the concept of regions.

Level III

* Understands criteria that give a region identity (for example, central focus of a region, physical and cultural characteristics). (NI,56-57;SE,18;DI,10.3.1)

"Understands the concept of regions" appears as the fifth standard in the geography section, and the benchmark shown is at Level III, signifying that it is appropriate for grades 6-8. In parentheses, next to the benchmark, is a citation log. The citations specify the documents in which the benchmark appears and the explicitness of the benchmark within those documents. For example, the letter "N" indicates that the benchmark is found in the Geography Assessment Framework for the 1994 National Assessment of Educational Progress. "I" indicates that it is implicit in that document. ("E" means that the benchmark is explicit in the document.) The numerals "56-57" identify the page numbers on which the benchmark is implicitly stated. The code "SE,18" indicates that the benchmark is explicitly stated (E) on page 18 (18) of the document National Geography Standards (S) from the Geography Education Standards Project. Finally, the symbol "DI,10.3.1" indicates that the benchmark is very closely related to another benchmark in the McREL database. That particular benchmark is under the standard number 10 at level III and is the first bulleted item.

Obviously, this brief illustration does not thoroughly explain the coding system used in the McREL database. It does, however, provide a sense of the level of detail present within that database. Specifically, using the McREL system, schools or districts can identify benchmarks and the national documents in which those benchmarks are implicitly or explicitly stated. Additionally, they can identify the interrelationship between benchmark elements.
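
For readers who want to work with the citation logs programmatically, the sketch below (Python, offered only as an illustration) shows one way the codes described above might be parsed. It assumes only the conventions named in this article (a letter or numeral identifying the source document, an "E" or "I" explicitness flag, and either page numbers or a standard.level.item cross-reference); the function and field names are illustrative choices, not part of the McREL system.

```python
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class Citation:
    document: str             # letter or numeral identifying the source document, e.g. "N" or "S"
    explicit: bool            # True if the benchmark is explicit ("E"), False if implicit ("I")
    pages: Optional[str]      # page number or range, e.g. "56-57"
    cross_ref: Optional[str]  # cross-reference to another McREL benchmark, e.g. "10.3.1"

def parse_citation_log(log: str) -> list:
    """Parse a citation log such as "NI,56-57;SE,18;DI,10.3.1"."""
    citations = []
    for entry in log.split(";"):
        code, value = entry.split(",", 1)
        # A dotted standard.level.item value is a cross-reference within the database;
        # anything else is treated as a page number or page range.
        is_cross_ref = bool(re.fullmatch(r"\d+\.\d+\.\d+", value))
        citations.append(Citation(
            document=code[0],
            explicit=(code[1] == "E"),
            pages=None if is_cross_ref else value,
            cross_ref=value if is_cross_ref else None,
        ))
    return citations

# The citation log from the geography benchmark quoted above:
for citation in parse_citation_log("NI,56-57;SE,18;DI,10.3.1"):
    print(citation)
```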

Setting Up a Standards-Based System

Ultimately, McREL has created a tool that schools and districts can use to construct their own standards, benchmarks, and an accompanying set of performance tasks. Although many complex issues are involved in setting up a standards-based system, we have identified four key questions to consider:

1. How many standards and benchmarks will we articulate? In our work thus far, we have reported 1,541 benchmarks embedded within 157 standards. Clearly, a school or district could not expect a student to demonstrate competence in all of these (although they may be a part of instruction). Sheer numbers would make such a system untenable. Given that there are 180 days in the school year and 13 years of schooling (assuming students go to kindergarten), that leaves only 2,340 school days available to students. To address all benchmarks in the McREL database, students would have to learn and demonstrate mastery of a benchmark every 1.5 school days, or more than three benchmarks every week.

Obviously, a school or district will have to select from the standards and benchmarks. A reasonable number of benchmarks is about 600, distributed in roughly the following way:

* Level I (grades K-2): 75 benchmarks

* Level II (grades 3-5): 125 benchmarks

* Level III (grades 6-8): 150 benchmarks

* Level IV (grades 9-12): 250 benchmarks
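
The arithmetic behind these figures is easy to check. The short calculation below (again in Python, for illustration only) reproduces it using the numbers cited above.

```python
# Total instructional days: 180 days per year over 13 years (kindergarten through grade 12).
school_days = 180 * 13                        # 2,340 days
total_benchmarks = 1541                       # benchmarks identified in the McREL database

days_per_benchmark = school_days / total_benchmarks   # about 1.5 school days per benchmark
benchmarks_per_week = 5 / days_per_benchmark          # assuming a 5-day school week

print(f"{school_days} school days; one benchmark every {days_per_benchmark:.1f} days "
      f"(about {benchmarks_per_week:.1f} per week)")

# The more manageable distribution suggested above totals roughly 600 benchmarks.
suggested = {"Level I (K-2)": 75, "Level II (3-5)": 125,
             "Level III (6-8)": 150, "Level IV (9-12)": 250}
print(f"suggested total: {sum(suggested.values())} benchmarks")
```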

2. Will we consider all selected benchmarks necessary to demonstrate competence in a standard? One way to alleviate the problem of too many benchmarks is to consider benchmarks as exemplars rather than necessary components of a standard. Using this option, students would be held accountable for demonstrating mastery of a sample of the benchmarks within a level for a given standard. To illustrate, consider the science standard and its related benchmarks in Figure 1.

Students would be required to demonstrate competence in a selected number of benchmarks per level - for example, two out of the three benchmarks for Level I; two out of three for Level II; three out of five for Level III; and three out of five for Level IV. Using this approach, a school or district can incorporate more benchmarks, without exceeding the recommended limit of 600 benchmarks actually assessed. In the classroom, this approach gives teachers the flexibility to use those benchmark components that they judge as most applicable for their students. This approach, however, also results in less continuity of coverage within a content domain because different teachers will no doubt select different benchmark exemplars within the levels for a given standard.
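
To see how much the exemplar approach reduces the assessment load for a single standard, the rough calculation below uses the sampling rates suggested above and the benchmark counts from the Figure 1 science standard; nothing else in it comes from the database.

```python
# Benchmarks available versus benchmarks actually assessed for one standard,
# using the sampling rates suggested above (counts from the Figure 1 science standard).
available = {"Level I": 3, "Level II": 3, "Level III": 5, "Level IV": 5}
assessed  = {"Level I": 2, "Level II": 2, "Level III": 3, "Level IV": 3}

total_available = sum(available.values())   # 16 benchmarks in the standard
total_assessed = sum(assessed.values())     # 10 benchmarks a student must demonstrate
print(f"{total_assessed} of {total_available} benchmarks assessed "
      f"({total_assessed / total_available:.1%})")
```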

3. Will we report student performance using course grades or standards? While most schools and districts report student progress using summary grades at the conclusion of courses, current research and theory indicate that courses of the same title do not necessarily address the same content (Yoon et al., undated).

A school or district that retains traditional summary grades while implementing a standards-oriented approach would ensure the systematic distribution of benchmarks throughout the various courses within content areas. Any two courses with the same title would not only address the same benchmarks but also place the same relative importance on the benchmarks they cover. The percentage of the grade allotted to each benchmark would also be under the control of the school or district. Clearly, this practice would make course descriptions more precise and produce an equivalence between "identical" courses that is seldom found today.

Traditional grading practices and standards-based assessment, then, are not incompatible. Schools or districts simply must distribute and weight the standards identified across the various courses in a systematic, well-reasoned fashion.
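
As a concrete illustration of such distribution and weighting, the sketch below computes a single course grade from weighted benchmark scores. The weights and student scores are invented for the example; only the benchmark statements are taken from the middle school mathematics benchmarks listed earlier.

```python
# One possible way a district might weight benchmarks within a course grade.
# The weights (which sum to 1.0) and the percentage scores are purely illustrative.
course_weights = {
    "understands the relationship of fractions to decimals and whole numbers": 0.40,
    "understands the basic characteristics of mixed numbers": 0.35,
    "models numbers using number lines": 0.25,
}

student_scores = {
    "understands the relationship of fractions to decimals and whole numbers": 92,
    "understands the basic characteristics of mixed numbers": 78,
    "models numbers using number lines": 85,
}

course_grade = sum(course_weights[b] * student_scores[b] for b in course_weights)
print(f"weighted course grade: {course_grade:.2f}")   # 0.40*92 + 0.35*78 + 0.25*85 = 85.35
```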

A second option is to report student progress by benchmarks. That is, rather than assign a single grade to a course, a teacher would report progress in some way for each benchmark covered in the course. In effect, for assessment purposes only, each benchmark component would be considered independent of the others covered within the course. With this approach, schools and districts commonly employ rubrics as opposed to grades. A rubric is a description of the levels of understanding or skill for a given benchmark. Usually, one of the levels described within a rubric is designated as the targeted level of skill or knowledge.

A school or district that reports by benchmark most commonly relates a specific rubric score for each benchmark. For example, if a student received a score of 2 (out of 4) on a particular benchmark, the 2 would be recorded as the assessment of the student's performance on that benchmark. Reporting out by benchmarks would, of course, require a record-keeping system far different from that currently used in most schools and districts. Because teachers would be assigning a rubric score for each benchmark covered in their course rather than an overall grade, the number of scores assigned to each student would increase dramatically.
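
A record-keeping system of this kind amounts to a table of rubric scores indexed by student and benchmark rather than a single grade per course. The sketch below shows one simple way such records might be kept; the students and scores are invented, and the 4-point rubric with a designated target level follows the description above.

```python
# Rubric scores recorded per benchmark rather than as a single course grade.
# Students and scores are invented; benchmark statements come from the text above.
TARGET_LEVEL = 3   # the rubric level designated as the target on a 4-point scale

records = {
    "Student A": {"understands the concept of regions": 2,
                  "models numbers using number lines": 4},
    "Student B": {"understands the concept of regions": 3,
                  "models numbers using number lines": 3},
}

for student, scores in records.items():
    for benchmark, score in scores.items():
        status = "meets target" if score >= TARGET_LEVEL else "below target"
        print(f"{student}: {benchmark} -> {score}/4 ({status})")
```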

4. Must all students meet all standards? A major decision facing a school or district that wishes to emphasize content-area standards is whether students must meet a targeted level of knowledge and skills. This approach is reminiscent of the mastery learning emphasis of the 1970s and early '80s (see Levine et al. 1985) and the more recent OBE or outcome-based approach (Spady 1988). In the context of reporting rubrics described previously, a mastery or outcome-based approach would mean that students would be required to meet the targeted level of skill or knowledge (for example, receive a score of at least 3 on a 4-point rubric). If not, the student would receive additional instructional opportunities until he or she could meet the required proficiency. Such a system, of course, makes a more extreme demand on resources than does a traditional system in which no extra resources are used if a student does poorly in a course.

A variation on the theme of a comprehensive mastery or outcome-based approach is to require that students meet the performance standards on some, but not all, benchmarks. The benchmarks applied to all students would constitute a set of core requirements.

A Useful Snapshot

The Mid-continent Regional Educational Laboratory is developing a database that identifies content standards and benchmarks from the national standards documents in various content areas. Although we have had to work with draft documents in some areas and will not complete the database until the fall of 1995, our efforts thus far have resulted in what we believe is a useful snapshot of the nature and content of standards and benchmarks as described in the various national reports.(2)

1 For a detailed discussion of these and other issues, see Kendall and Marzano (1994, 1995).

2 School and district educators who wish to use our database can consult The Systematic Identification and Articulation of Content Standards and Benchmarks: Update (1995), available from McREL, and work through the issues described here on their own. Or, districts may work with McREL consultants, who characteristically train a small team of individuals within the district to guide teachers, administrators, and community members in addressing the issues raised in this article.

References

American Association for the Advancement of Science. (1993). Benchmarks for Science Literacy. New York: Oxford University Press.

Consortium of National Arts Education Associations. (1994). National Standards for Arts Education: What Every Young American Should Know and Be Able to Do in the Arts. Reston, Va.: Music Educators National Conference.

Kendall, J. S., and R. J. Marzano. (1994). The Systematic Identification and Articulation of Content Standards and Benchmarks: Update, January 1994. Aurora, Colo.: Mid-continent Regional Educational Laboratory.

Kendall, J. S., and R. J. Marzano. (1995). The Systematic Identification and Articulation of Content Standards and Benchmarks: Update, March 1995. Aurora, Colo.: Mid-continent Regional Educational Laboratory.

Levine, D. V., and associates. (1985). Improving Student Achievement through Mastery Learning Programs. San Francisco: Jossey-Bass.

National Council of Teachers of Mathematics. (1989). Curriculum and Evaluation Standards for School Mathematics. Reston, Va.: NCTM.

National Education Standards and Improvement Council. (1993). Promises to Keep: Creating High Standards for American Students. Report on the Review of Education Standards from the Goals 3 and 4 Technical Planning Group to the National Education Goals Panel. Washington, D.C.: National Goals Panel.

National History Standards Project. (March 1993). Progress Report and Sample Standards. Los Angeles: National Center for History in the Schools.

Spady, W. G. (1988). "Organizing for Results: The Basis of Authentic Restructuring and Reform." Educational Leadership 46, 2: 4-8.

Yoon, B., L. Burstein, and K. Gold. (Undated). Assessing the Content Validity of Teacher's Reports of Content Coverage and its Relationship to Student Achievement. (CSE Report No. 328). Los Angeles: Center for Research in Evaluating Standards and Student Testing, University of California, Los Angeles.

Authors' notes: This publication is based on work sponsored wholly, or in part, by the Office of Educational Research and Improvement, Department of Education. The content of this publication does not necessarily reflect the views of OERI or any other agency of the U.S. government.

The complete report is available on the Internet via Mosaic or other World Wide Web browsers. The standards and benchmarks can be searched and are linked by hypertext. The Uniform Resource Locator (URL) is http://www.mcrel.org.

Figure 1

Sample Science Standard with Accompanying Benchmarks from the McREL Database

Understands the forms energy takes, its transformations from one form to another, and its relationship to matter.

Level I

* Knows that the sun supplies heat and light to the earth. (CI,61;SE,23)

* Understands that an energy source, like a battery within a circuit, can produce light, sound, and heat. (CE,68;SE,23)

* Understands that an object in a beam of light can cast a shadow, while other objects might bend or transmit the light. (CI,73;SE,23)

Level II

* Knows that things that give off light often give off heat. (2E,62;CI,73;SE,30)

* Understands that mechanical and electrical machines give off heat; that light, sound, heat, and sparks can be produced in electrical circuits with batteries as an energy source. (2E,62;CI,61;SE,23)

* Knows that when warmer things are put with cooler ones, the warm ones lose heat and the cool ones gain it until they are all at the same temperature. (2E,62;CR,67)

Level III

* Understands that energy comes in different forms, such as light, thermal, electrical, kinetic (motion), and sound, which can be changed from one form to another. (2E,63;CE,62;SE,29)

* Understands that whenever the amount of energy in one place or form diminishes, the amount in other places or forms increases by the same amount. (2E,63;SE,35)

* Understands that energy comes to the earth from the sun as visible light and electromagnetic radiation; the amount and type of radiation depends on the absorption properties of the atmosphere. (CI,62;SE,30)

* Knows that light, which has color, brightness, and direction associated with it, can be absorbed, scattered, reflected, or transmitted by intervening matter; understands the concept of opacity and of refraction as the basis for the operation of lenses and prisms. (CE,73;SE,30)

* Knows that energy changes and physical or chemical changes can be measured in the form of heat. (CE,47;SE,30)

Level IV

* Understands that thermal energy in a material is related to a temperature change and consists of the disordered motions of its colliding atoms or molecules; the loss or gain of thermal energy by a given sample depends on the mass and nature of its material. (2E,63;CE,64;SE,35)

* Knows that any interactions of atoms or molecules involve either a net decrease in potential energy or a net increase in disorder (entropy) or both. (2E,63;CE,66;SE,36)

* Understands that transformations of energy usually produce some energy in the form of heat, which, by radiation or conduction, spreads into cooler places so that less can be done with the total energy. (2E,63;CE,66;SE,36)

* Knows that the characteristic energy levels associated with different configurations of atoms and molecules mean that light emitted or absorbed during energy transformations can be used to provide evidence regarding the structure and composition of matter. (2E,63;CI,62;SI,36)

* Knows that some changes of atomic or molecular configurations require an input of energy, whereas others release energy. (2E,63;CE,47;SI,36)

Robert J. Marzano is Deputy Executive Director, McREL Institute, and John S. Kendall is Senior Program Associate, Mid-continent Regional Educational Laboratory, 2550 S. Parker Rd., Suite 500, Aurora, CO 80014.
