
Incorporating pupil assessment into teacher preparation.

Abstract

Teacher educators sought to simultaneously improve the performance of teacher candidates and individual pupils in California schools by redesigning selected courses. This redesign represents a major shift in methodology for teacher education, away from the study of texts and exemplary case studies to the use of direct pupil measures as guides to seminar content.

Need for Improved Pupil Outcomes

Despite the billions of dollars spent on education each year, the American public, policy makers, and the business community have remained generally dissatisfied with the effectiveness of the nation's educational system. Over the last three decades, the inability of the average American pupil to read and do mathematics at proficient levels has risen to the top of education reform issues. Most recently, the National Assessment of Educational Progress reported that only 32% of fourth-graders could read at a proficient level and that test scores have remained essentially flat since 1970 (U.S. Department of Education, 2003). The Elementary and Secondary Education Act of 2001, No Child Left Behind, is a direct response to the nation's demand for improved student achievement. No Child Left Behind places an emphasis on state, district, school, and teacher accountability; among other provisions, it requires annual testing in grades 3 through 8. As federal and state accountability for student achievement increases, the need for qualified teachers has become a national area of focus. Under the guidelines of No Child Left Behind, every state is required to provide students with "highly qualified teachers" who are capable of planning, measuring, and reporting students' progress toward state standards and annual academic goals. If Dietel (2002) is correct, the new act will force educators to interpret and use information more effectively and will help schools measure their progress toward important goals.

The Role of Data-based Instruction and Teacher Preparation Programs

What makes an effective teacher can in part be determined by the quality of initial preparation (Darling-Hammond, 1998; Darling-Hammond, 2003; Laczko-Kerr & Berliner, 2003). As teacher quality comes under public and legislative scrutiny, teacher preparation is being viewed as an important link in producing teachers who are proficient in assessing student knowledge, monitoring student progress, and remediating areas of academic weakness (Koret Task Force, 2003; Moir, 2001; U.S. Department of Education, 2003; Ysseldyke, Thurlow, Kozleski & Reschly, 1998). While it appears that an essential function of teacher preparation programs is to promote understanding and appreciation of the powerful role assessment plays in teaching and learning, beginning teachers may view assessment merely as a means of legal accountability and discount the importance it has for their students' learning. Teacher educators have the important role of helping beginning teachers set instructional goals informed by assessment data and content standards, collect and analyze student performance data to assess progress, and use student products as data to guide instruction (Moir, 2001).

To date, there is little evidence that teacher preparation programs have incorporated data-based instruction into general and/or special education credentialing programs (Foegen, Espin, Allinder, & Markell, 2001). For teacher preparation programs to answer the national mandate to produce high-quality teachers, it seems vital to simultaneously improve the performance of teacher candidates and of the individual pupils in their classrooms by responding directly to the results of ongoing data collection.

The Relationship Between Data-based Instruction and Pupil Growth

Consistently, research studies support the premise that student assessment is an essential part of teaching, and that good teaching cannot exist without accurate, objective, and consistent formative assessment (Adkison & Tchudi, 2001; Dietel, Herman & Knuth, 1991; Fuchs, 1995; McLaughlin & Warren, 1995; Moir, 2001). When teachers are better informed of their pupils' learning progress and areas of difficulty, they can make appropriate decisions about what to remediate and which instructional methods will maximize pupils' learning. When applying assessment results, informed teachers consider instructional placement needs, diagnostic decisions, and formative evaluation plans (Fuchs, 1995). Assessment plays an integral part in teaching and learning. Expert teachers across the nation "use an array of assessment tools and strategies to better understand their students' academic needs, to target their instruction, to guide next steps, and then to document their students' achievement" (Moir, 2001, p. 1). In the expert teacher's classroom, assessment results inform instruction, ensuring that all pupils receive instruction responsive to their individual needs. Offering preservice and beginning teachers the "first steps" toward this expertise is the greatest current challenge of teacher preparation programs.

Project Overview

In this initial experiment, we focused on improving our instruction to special education interns in an on-the-job training program that serves 54 public school districts across a 43,000-square-mile region of California. The intern teachers who participated in these courses taught as special day class or resource teachers serving pupils with mild/moderate or moderate/severe disabilities in grades K through 12 while completing an education specialist credential through our university. Our special education faculty were unanimous in their concern for the achievement of pupils in the interns' classrooms and in their desire to develop a model program that would simultaneously improve our interns' skills and the achievement of their pupils. To do so, we redesigned the preparation curricula in our instructional strategies courses. This redesign represented a major shift in methodology for teacher education, away from the study of texts and exemplary case studies to the use of direct pupil measures as guides to seminar content. We were fortunate to receive support from the California Commission on Teacher Credentialing for this project. The revised course content centered on instructional interventions tailored specifically to the documented academic needs of pupils in the interns' classrooms, particularly in reading and math, and was adjusted as the data changed over the semester.

Faculty Involvement: Course and Fieldwork Links

This project required the involvement and effort of the entire teacher preparation program. Instructors assisted interns in understanding various assessments, learning how to test students, and interpreting the results. Since class time was limited and intern teachers needed on-site guidance, supervisors also provided assessment support. This support took the form of modeling the administration of a particular test with a student, providing testing materials, guiding the intern teacher in interpreting the results, and/or developing an appropriate instructional program. Having a course Web site, or individual sites, where faculty, supervisors, and school personnel could communicate about the intern teachers' pedagogical needs was critical. Supervisors provided information to the course instructors about specific intern needs; interns requested feedback outside of class meetings; instructors developed class agendas to include instruction in the specific areas noted on the site; and program directors could monitor program issues.

Course Format Redesigned

We targeted the Curriculum and Instruction courses in the first year of our effort and will be redesigning additional courses in the next year. Under the agreement between this university and the school districts, these interns, who are functioning as full-time special education teachers, are released one or two days each month for their coursework; thus, one of our challenges is limited instructional time with the interns. Since we see these candidates only six times per semester, we are very concerned about how much information we can introduce and how much the interns are able to accomplish in so short a time.

Again, this points to the need for all faculty and supervisors to work together in assisting interns, who come to us with a wide range of experiences, in understanding the connection between assessment and instructional approaches. It made sense, from an instructional perspective and from the interns' perspectives, to focus on their pupils' data as much as possible. Prior to the beginning of the semester, we offered a whole-day assessment workshop that included instruction in the administration and interpretation of formal measures (the Woodcock-Johnson III Tests of Achievement) and informal measures (curriculum-based measurements). We conducted a survey to determine what the intern teachers already knew about assessment and what they thought would help them become effective teachers in this area. This information provided us with immediate baseline data that we were able to respond to during the workshop.

Because many of our candidates found few assessment materials available at their schools, our department purchased reading, math, and language assessments for their use and demonstrated them that day. We also arranged with a grading software company to let our interns use its software during the year at no cost. During this one-day workshop, interns were trained in using the grading software to track and analyze their pupil data. Interns eagerly adopted the program for tracking their pupils and found that its reports and graphs provided very specific information about their pupils' growth. They then used these data to implement a more effective instructional program for their pupils. To keep the class assignments and sessions manageable, interns reported on only five students as representative of their classes. The interns recorded Woodcock-Johnson scores, curriculum-based measurement scores, and other informal reading and math scores. They forwarded this information to their instructor a week prior to the first class meeting so the instructor could use these data in developing the agenda for the class seminar. Over the course of the semester, interns submitted updated data (by fax or email) approximately every two weeks, and those data guided the course instructor in selecting material for the coming class session.
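
For readers who want a concrete picture of the kind of tracking described above, the following is a minimal, hypothetical sketch, written here in Python rather than in the commercial grading software the interns actually used. It records biweekly curriculum-based measurement scores for a few representative pupils and estimates each pupil's rate of growth; the pupil labels and score values are invented solely for illustration.

    # Hypothetical sketch of progress tracking; not the software used in this project.
    from statistics import mean

    # Invented curriculum-based measurement data: words read correctly per minute,
    # one probe roughly every two weeks.
    pupil_scores = {
        "Pupil A": [42, 45, 44, 49, 53, 55],
        "Pupil B": [18, 19, 22, 21, 25, 27],
    }

    def weekly_growth(scores, weeks_between_probes=2):
        """Least-squares slope of scores over time, expressed as gain per week."""
        xs = [i * weeks_between_probes for i in range(len(scores))]
        x_bar, y_bar = mean(xs), mean(scores)
        return sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, scores)) / sum(
            (x - x_bar) ** 2 for x in xs)

    for pupil, scores in pupil_scores.items():
        print(f"{pupil}: latest = {scores[-1]}, growth = {weekly_growth(scores):.2f} per week")

A rising trend over several probes suggests the current instructional program is working; a flat or falling trend signals that the intervention should be adjusted, which is the kind of decision the interns' biweekly data submissions were meant to inform.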

Each course session included large-group instruction, with practice in analyzing assessment data and designing intervention strategies, followed by large- and small-group discussions and one-on-one meetings with the instructor. As assessment and intervention strategies were introduced and practiced in class, interns gained the information needed to teach their own students. The structured discussions that followed allowed the candidates to share insights, clarify understandings, and learn from their colleagues. Meeting in small groups, they reviewed their own pupil data and planned instructional interventions with their colleagues. They took notes on their peers' suggestions and shared this information with the instructor during the one-on-one conference. This meeting allowed the instructor to provide feedback based on the most recent data and gave the intern teacher an opportunity to share insights and clarify any concerns.

Outcomes and Reflections

In summary, in alignment with the No Child Left Behind initiative, we engaged our candidates in discussions and analyses of their own pupil data in order to promote the growth of individual students. Faculty monitored the growth of the interns' own pupils through analysis of assessment data and, in doing so, influenced the professional development of the interns as teachers. Guiding the intern teachers in selecting instructional strategies to meet the needs of their pupils provided direct assistance to the interns while giving indirect service to their pupils. This effort to prepare special education teachers by focusing on the assessment and instruction of their own pupils represents our attempt to fashion a field-responsive model for postsecondary teacher education institutions.

We are committed to improving the quality of our teachers' professional preparation program by focusing our university instruction on the candidates' continual assessment of their own pupils' progress toward standards and benchmarks. At the conclusion of the courses, candidates shared in an open discussion that they felt more effective in their teaching because they were now relying on regular assessment data to refine their instructional methods. They also commented that they had more information to share with parents or guardians about how their pupils were progressing. At the end of the semester, interns were given an opportunity to reflect again on what they needed in order to be effective in the area of assessment. This time their comments showed a more sophisticated knowledge base: how to pinpoint what to assess, how to determine which assessments are more valuable than others, and how to incorporate quality teaching strategies. A few candidates stated they "needed everything that was taught in the courses this past year."

Our intent is to continue to refine this aspect of our special education teacher preparation program. We are considering adding a Web-based section to some courses that would include video instruction to provide additional support between class sessions. Having interns use their own pupil data seems to avoid the problem of relevance in teacher preparation classes: interns clearly see that their course instruction has been tailored to meet their needs. By analyzing pupil data from the previous academic year, we will determine what adjustments may be needed in our courses to ensure continued academic improvement for pupils being served by interns throughout our region and the state of California.

References

Adkison, S., & Tchudi, S. (2001). Reading the data: Making supportable claims from classroom assessment. English Journal, 91(1), 43-50.

Darling-Hammond, L. (2003). Keeping good teachers: Why it matters, what leaders can do. Educational Leadership, 60(8), 6-11.

Darling-Hammond, L. (1998). Teachers and teaching: Testing policy hypotheses from a national commission report. Educational Researcher, 27(1), 5-15.

Dietel, R., Herman, J.L., & Knuth, R.A. (1991). What does research say about assessment? North Central Regional Educational Laboratory. Retrieved June 10, 2003, from http://www.ncrel.org/sdrs/areas/stw_esyg4assess.htm

Dietel, R. (2002). Quality school portfolio helps educators use test data to improve learning. GSE&IS Forum, 3(4).

Foegen, A., Espin, C.A., Allinder, R.M., & Markell, M.A. (2001). Translating research into practice: Preservice teachers' beliefs about curriculum-based measurement. The Journal of Special Education, 34(4), 226-236.

Fuchs, L.S. (1995). Connecting performance assessment to instruction: A comparison of behavioral assessment, mastery learning, curriculum-based measurement, and performance assessment. ERIC Digest E530. Reston, VA: ERIC Clearinghouse on Disabilities and Gifted Education. (ERIC Document Reproduction Service No. ED 381984)

Laczko-Kerr, I., & Berliner, D.C. (2003). In harm's way: How undercertified teachers hurt their students. Educational Leadership, 60(8), 34-39.

Koret Task Force on K-12 Education (2003). Are we still at risk? Retrieved June 8, 2003, from Education Next: http://www.educationnext.org/2003/10.html

McLaughlin, M.J., & Warren, S.H. (1995). Using performance assessment in outcomes-based accountability systems. ERIC Digest E533. Reston, VA: ERIC Clearinghouse on Disabilities and Gifted Education. (ERIC Document Reproduction Service No. ED 381987)

Moir, E. (2001). Formative assessment as a strategy for growth. New Teacher Center Reflections, 4(2), 1-2.

Stamford, B.H. (2001). Reflections of resilient, persevering urban teachers. Teacher Education Quarterly, 28(3), 75-87.

U.S. Department of Education (2003). Introduction and overview: No child left behind--The law that ushered in a new era. Retrieved June 15, 2003, from U.S. Department of Education at: http://www.ncelb.gov/next/overview/index.html

Ysseldyke, J.E., Thurlow, J.L., Kozleski, E., & Reschly, D. (1998). Accountability for the results of educating students with disabilities: Assessment conference report on the new assessment provisions of the 1997 amendments to the Individuals with Disabilities Education Act. Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Retrieved June 25, 2003, from http://education.umn.edu/NCEO/OnlinePubs/awgfinal.html

Rita Mulholland, California State University, Chico; Michelle Cepello, California State University, Chico

Mulholland, Ph.D., is an assistant professor teaching curriculum, reading, special education, and technology courses, some of which are offered as online and Web-based courses. Cepello, Ed.D., is an assistant professor and director of the Moderate/Severe Education Specialist Intern program and the Level II Education Specialist program.
