Beyond assessment: performance assessments in teacher education.

Introduction

Over the last decade, teacher performance assessments (TPAs) have gained appeal in teacher education programs and teacher licensing, both for their innovative approaches to assessing teacher knowledge and skills and, more importantly, for their potential to promote teacher learning and reflective teaching. Studies of preservice teachers who have completed a TPA, portfolio assessments in particular, have examined learning outcomes for teachers and have generally found positive effects on their learning (Anderson & DeMeulle, 1998; Lyons, 1996, 1998a, 1998b, 1999; Snyder, Lippincott, & Bower, 1998; Stone, 1998; Whitford, Ruscoe, & Fickel, 2000).

Background

In 1998, the state of California passed legislation (SB2042) that would require teacher candidates enrolled in credential programs to successfully complete a teaching performance assessment to obtain a preliminary teaching credential. Programs had two options: they could administer the TPA designed by the state in consultation with the Educational Testing Service (ETS), or develop their own TPAs, provided they met the state's Assessment Quality Standards. This study was conducted as part of an investigation of the Performance Assessment for California Teachers (PACT), an alternative performance assessment designed and piloted in the spring of 2003 by a consortium of preservice teacher preparation programs throughout the state (all of which are post-baccalaureate programs with lengths ranging from two semesters to two years). (1) Understanding that high-stakes assessments ultimately drive instruction and learning, these programs opted to create and pilot their own performance assessment that was designed to be an authentic representation of teaching and also to reflect their program values and goals.

The PACT performance assessments are subject-specific portfolios of teaching (called "teaching events") with a standardized set of integrated tasks that ask teachers to document their planning, teaching, assessing, and reflecting around a series of lessons on a topic of their own choice. Preservice elementary teachers piloting the assessments in this study enacted two instructional units (comprising 4-5 hours of instruction) in literacy and mathematics in their student teaching placements. The PACT teaching events and scoring rubrics are aligned with the California Teacher Performance Expectations (TPEs) (2) for preservice teachers. They also focus on the assessment of individual students' needs and student learning outcomes as the basis for teachers to evaluate the success of their teaching decisions. (See Appendix A for an overview of the 2003 version of PACT's elementary teaching event.) (3)

This project provided a timely opportunity to examine the impact of a performance assessment on preservice teacher learning and teaching practice as well as the assessment's contribution to teacher education programs. Drawing on case studies of two teacher candidates who participated in the first year (2002-03) pilot of the PACT at one campus, this study disentangles what teacher candidates reported learning from completing the elementary teaching event from other sources of learning in their credential program, examines the way the learning and teaching contexts in which teacher candidates completed the assessment affected their learning experiences, and corroborates teacher self-report with observational data and evidence from lesson debriefs. A focus group and data from two surveys provide for greater generalizability of the findings and a comparison of the experiences of teacher candidates at one campus to those of candidates across campuses.

Literature Review

In the last decade, as TPAs (4) have come into more common use, the body of research concerning the validity of such assessments and their impact on teachers' professional growth has burgeoned. Some of these studies, in particular, research on the impact of the National Board certification process, have provided important insights into the kinds of learning outcomes that are associated with particular kinds of TPAs, as well as some of the conditions that are needed for teachers to benefit from a TPA.

There are three main genres of research on the impact of TPAs on teachers' professional growth. Teacher self-report studies (King, 1991; Athanases, 1994; Tracz, Sienty, & Mata, 1994; Tracz et al., 1995; Rotberg, Futrell, & Lieberman, 1998; Stone, 1998; and Sato, 2000) rely on what teachers report in interviews, focus groups, or surveys about their experiences with a TPA and subsequent changes in their teaching practice. In portfolio artifact studies (Lyons, 1998a, 1999; Snyder, Lippincott, & Bower, 1998), reflections, course papers, or other projects produced by teacher candidates are used as evidence of teacher learning. Finally, in group comparison studies, teachers who did and did not successfully complete a TPA (in this case, the NBPTS portfolio assessment) are compared with regard to their teaching performance (Bond, Smith, Baker, & Hattie, 2000; Darling-Hammond, Atkin, Sato, & Chung, forthcoming), reported learning (Lustick & Sykes, 2006), and student achievement gains (Cavalluzzo, 2004; Vandevoort, Amrein-Beardsley, & Berliner, 2004; Goldhaber & Anthony, 2005; Smith, Gordon, Colby, & Wang, 2005; Sanders, Ashton, & Wright, 2005). None of the studies in this last genre of research were concerned with preservice teachers.

While the findings of previous research on TPAs and portfolios are promising, the evidence documenting what preservice teachers learn from such assessments, and under what conditions, could be stronger. One weakness of previous research on TPAs is that the impact of the assessment cannot be easily disentangled from the multiple sources of teacher learning in preservice programs, such as coursework, field and practicum experiences, mentorship, and supervision. Furthermore, because observational data corroborating the impact of such assessments on teacher practice are scarce, there is little evidence that preservice teachers actually enact in their teaching what they report learning from completing a TPA. Well-designed research that can differentiate the contribution of the performance assessment from other sources of learning and that examines subsequent teacher practice would deepen our understanding of the impact of the assessment on teacher learning and practice.

Theoretical Framework

The idea that performance assessments can promote teacher learning is grounded in professional learning theories such as Schön's (1983) concept of "reflection in action," which posits that ordinary people and professional practitioners reflect on what they are doing in the process of carrying out an action and solving a problem. This conception of the "reflective practitioner" is consistent with Lee Shulman's (1987) conception of teaching as "pedagogical reasoning and action," which requires that teachers reason and think through pedagogical decisions in order to investigate, analyze, and solve problems rather than merely enact "best practices." The PACT teaching event explicitly prompts teachers to examine and reflect on a complete cycle of teaching, from planning a learning segment to evaluating student learning and devising changes in future practice. It thereby enhances their opportunities to reevaluate and revise their teaching practice and, in doing so, may evoke the "reflection in action" that Schön and Shulman believe underlies professional learning. Finally, this study builds on research on the use of performance assessments at the K-12 level to promote student learning and higher order thinking (Baxter, Glaser, & Raghavan, 1993; Darling-Hammond, Ancess, & Falk, 1995; Wiggins, 1998).

The Relevance of Teaching and Learning Contexts. Situated knowledge theory (Bruner, 1996; Greeno, Collins, & Resnick, 1996) and social constructivist theory (Gage & Berliner, 1998; Lave, 1988; Lave & Wenger, 1991) suggest that the teaching contexts in which teachers learn to teach may mediate the extent to which any intervention aimed at improving teachers' instructional practice can have an impact. Studies of novice teachers and their practicum experiences (Feiman-Nemser & Buchmann, 1983; Goodlad, 1990; Zeichner, 1992) have found that the social conditions in which beginning teachers learn to teach have an influence on what they learn from their experiences. Preservice teachers' learning contexts (program experiences) and teaching contexts (student teaching placements) were therefore explored in this study of teacher learning.

The Relevance of Support. Vygotsky's (1978) zone of proximal development (ZPD) suggests that the support of a more highly skilled "other" is needed for a learner to move from his current skill level to the desired level. The work of Tharp and Gallimore (1988) draws on the principle of the ZPD to explicate teaching as assisted performance. Thus, levels of support provided by cooperating teachers and supervisors were also examined in this study of preservice teacher learning.

Methods and Data Source

This study used a mixed-methods design to examine teacher learning and to isolate the impact of the PACT teaching event on unobservable outcomes (teacher knowledge, beliefs, and dispositions) as well as observable outcomes (instructional practice). The qualitative component consists of two case studies of elementary teacher candidates and a focus group of 23 teacher candidates at the same university ("Urban University") who had piloted the Elementary teaching event in the spring of 2003. The case studies involved three to four structured interviews with each of the two preservice teachers over a three-month period, three audio-taped classroom observations, and shadowing in university courses. Transcript and observational data from the case studies were analyzed within cases, using data from across the data corpus for confirmatory and contradictory evidence to determine what teacher candidates reported learning, discern changes in their knowledge or dispositions about teaching, identify the sources of their learning, and check whether their teaching practices reflected what they reported learning. Cross-case analyses were then conducted to discern patterns in learning reported and confirmed in teaching practices. The focus group transcript was analyzed to identify what candidates reported learning, their attitudes toward the teaching event, how their experiences were shaped by how the PACT was implemented, sources of support, and program components that prepared them for the teaching event. Finally, focus group participants' experiences with the teaching event were compared with the experiences of the case study subjects.

The quantitative component of this study includes results from the Teacher Reflection Survey completed by teacher candidates across the state participating in the 2003 PACT pilot. These survey results were used to determine how candidates at Urban University compared with candidates across campuses in terms of their demographic characteristics, attitudes about the teaching event, perceptions of support, and perceptions of program preparation to complete the teaching event.

Case Studies: Learning and Teaching Contexts--Tracy and Joy

Learning Contexts. Tracy and Joy both began their teaching credential programs during the fall of 2002, and at the time of the study, both were in the second semester of their programs. (See Table 1 for comparisons of Tracy's and Joy's learning and teaching contexts.) Both the intern program (Tracy) and the master's degree program (Joy) were cohort-based programs in which about 30 teacher candidates took all of their classes together during the first year, fostering a strong sense of collegiality and mutual support. Both Tracy and Joy described most of their courses as being very relevant and useful for preparing them for teaching.

Teaching Contexts. By the end of the first year, both Tracy and Joy had had an entire school year of student teaching experience. Faculty in Tracy's school had a great deal of autonomy and were not required to implement any particular curriculum programs. Tracy had had a little experience with independent lesson planning, but most of the content covered was predetermined by her cooperating teacher, and most of the lessons Tracy implemented during the second semester were planned collaboratively with her cooperating teacher.

In Joy's teaching placement, faculty members were required to implement a district-mandated literacy program, and there seemed to be heightened pressure to teach to the state content standards. The kindergarten class in which Joy completed her student teaching during the second semester was an "English only" class, but she later found out from her cooperating teacher that quite a few of her students were English learners whose parents wanted them to be immersed in English language instruction. Although Joy felt her cooperating teacher was a good model of effective teaching, she expressed a need for much more direction and guidance overall. Joy's cooperating teacher was also less flexible than Tracy's cooperating teacher about lesson planning, expecting her to use the same routines and methods that she used.

Implementation of the PACT Teaching Event. Tracy's entire cohort was required to complete the teaching event. The professor who co-taught Tracy's practicum seminar for the student teaching experience was highly familiar with the teaching event, its requirements, and scoring criteria. The teaching event was introduced to the cohort at the beginning of the spring semester in January of 2003. During the seminar, which met weekly for three hours, the instructors provided clarification of the teaching event tasks and prompts, and gave assignments that would allow candidates to complete the teaching event in parts. Cooperating teachers were also familiarized with the teaching event during the early part of the semester because it would replace the existing portfolio requirements that were formerly implemented with their guidance. Toward the end of the semester, students were required to submit a draft of one of their teaching event sections (literacy or math) in order to provide them with feedback before completing the entire teaching event.

Joy's master's degree program cohort was not one of the groups that were selected to pilot the teaching event. However, she volunteered to participate in the pilot "because she wanted the challenge." Initially, in order to become familiarized with the teaching event, Joy attended one of the practicum seminars that was piloting the teaching event. This seminar also happened to be taught by her supervisor. The members of this small seminar (with only four teacher candidates) were asked to pilot the teaching event in place of their regular portfolio requirements. However, the seminar met every few weeks and because of health problems, Joy did not make it to the second session, and thus was not privy to the scaffolds provided. In addition, Joy's supervisor was not very familiar with the teaching event prompts, requirements, or scoring criteria, and was therefore unable to provide as much guidance as Tracy's seminar instructors and supervisor. In sum, Joy's experience with the PACT teaching event was less scaffolded than Tracy's experience.

Support Provided by Program Faculty and Cooperating Teacher. Tracy felt very well supported by the faculty member who taught her practicum seminar, by her supervisor who provided feedback on multiple drafts of her teaching event, and by her cooperating teacher, who assisted her in planning her lesson sequences for the teaching event. Tracy also identified fellow teacher candidates in her cohort program as an important source of support in terms of sharing strategies for classroom management and teaching, and providing emotional support.

In contrast, Joy did not feel well supported by program faculty or by her cooperating teacher in completing the teaching event, which is understandable, given that her program was not one of the piloting cohorts. Joy's supervisor also taught several courses in the program and Joy's perception was that her supervisor was spread too thinly (by teaching responsibilities and supervision load) to provide the kind of support and attention that she needed. Like Tracy, Joy felt a strong connection to other students in her cohort; however, because her cohort was not one of the groups piloting the teaching event, she was not able to solicit support from fellow cohort members for understanding or completing the assessment.

Case Study Findings

Reported Learning Gains from the Teaching Event. Tracy and Joy both reported learning a number of teaching skills from the teaching event. These are listed in Table 2--Cross Case Study Findings. In order to determine whether these reported learning gains could be attributed to the teaching event or to their program experiences (including student teaching), the interview transcripts were analyzed to identify sources of learning and whether there were any overlaps in the learning gains accruing from program experiences and from the teaching event. These reported learning gains were then distinguished from those that could be attributed to the teaching event alone. Reported learning gains were then triangulated with evidence from interviews, classroom observations, and debriefs to determine whether any of the candidates' reported learning gains could be observed in their teaching practice.

Sources of Learning--Tracy. Three of the teaching skills that Tracy reported learning from the teaching event could also be traced to her program coursework or student teaching experience: planning curriculum units, the integration of content areas, and modifying lessons based on the assessment of student learning. However, Tracy had never actually taught any of her planned units in her student teaching placement, and she admitted that during her two "solo weeks" she would not have had the opportunity to independently plan and teach two multi-lesson learning segments in literacy and math had the teaching event not been required. Tracy noted that modifying lessons based on the assessment of student learning was a skill that had also been taught in her program courses.

On the other hand, the way Tracy talked about her analysis of a class set of student work for the assessment task of the teaching event suggests that this was not an activity that she had been doing deeply or regularly:
 The assessment piece [of the teaching event] was good. Having to
 really look at, like for the math, look at the group work. I mean,
 I really got into that. Tallied it all up in Excel, and made a
 graph! And that was kind of fun. It was like, "Oh. I could probably
 do this more often," this kind of thing. But you know, really
 digging into their work and looking for what was going on. I should
 make that more of a habit next year than I have this year, now that
 I know. (Tracy, Interview 3)


The learning gains Tracy reported that did not have any discernible overlaps with program experiences are reflective of the teaching event requirements. Tracy's attention to content standards can be traced back to the prompt in the planning task of the teaching event that asked candidates to state what content standards or English Language Development (ELD) standards their instructional plans addressed. Additionally, Tracy's attention to the alignment of assessments with plans is likely related to the assessment matrix that was an optional part of the planning task of the teaching event (but that was required by her seminar instructors). In this matrix, candidates were asked to list the type of assessment given for each lesson, the student learning goals assessed, feedback to students (if any), next steps in instruction, and accommodations for special needs students.

Sources of Learning--Joy. Of the nine teaching skills that Joy identified learning from the teaching event, three were skills that could also be traced back to her program or student teaching experiences: attention to ELs (English Learners), choosing teaching strategies to reflect students' needs, and awareness of content standards. The other six teaching skills that Joy reported learning from the teaching event did not have discernible overlaps with her university program or student teaching experience. As was the case for Tracy, the experience of planning and teaching a sequence of literacy and mathematics lessons was a novel experience for Joy. The PACT requirement to integrate a second content area with either literacy or mathematics (no longer required in newer versions of the PACT) was also a new experience. Modifying lessons based on the assessment of student learning can be traced to the teaching event's requirement for candidates to write daily reflection logs on their lessons and to report what changes they made to the subsequent lessons:
 It's helped me get focused, and kind of, I think it's helped me to
 see that you, there's a, the need for continuity, and to find a
 continuity in the lesson. But also to look at where they are--the
 assessment part, you know, look at where they are at the end of the
 day, and sort of, maybe, change things a little bit to find out
 where they need to go the next day. (Joy, Interview 2)


Joy also reported that the way she was asked to analyze the teaching videotape forced her to observe or look at her teaching "in a different, in a much deeper way, or a more reflective way."

The teaching skill that Joy gained from the teaching event that was the most noteworthy, however, stemmed from her experience with the instructional context task. When asked about her students' backgrounds and skill levels at the first interview in April, Joy was at a loss and said that she would need to ask her cooperating teacher for that information. Later, after having completed the instructional context task for the teaching event, which prompts teacher candidates to report on the characteristics of their students, Joy expressed the value of learning about her students in this way:
 I know that next year, if I teach, or this coming school year, I'm
 gonna get out my, the sheets that the parents fill out, you know,
 'How old is the child? And what is their nationality? And when did
 they come to the United States? And did they have other brothers
 and sisters?' Like, all these, what is their background? You know,
 it really helps you, I think, to understand your class and each
 child much better. And I think I'm gonna make that a real priority,
 where I really wouldn't have thought about doing that, or it would
 have been just too much to do. And I think that's really--I learned
 a lot from that. (Joy, Interview 4)


Changes in Teaching Practice--Tracy. In order to determine whether any of the reported learning gains were reflected in changes in teaching practice, classroom observation transcripts/notes and lesson debrief transcripts were analyzed for confirmatory and contradictory evidence. Of the five types of learning that Tracy reported, at least two were corroborated by her teaching practice. From the second observation and lesson debrief, it was evident that Tracy had begun to independently adjust her instructional plans based on her assessment of students' understanding:
 I'd observed that when we did the energy unit that we would read it
 as a group, and they just really had a hard time figuring out what
 the main idea was, versus the details ... And then last night when I
 checked their work, I noticed that a lot of them had answered the
 question, but they'd done it with details, and not with the main
 ideas ... That's why I led 'em through it so much today 'cause a lot
 of them sort of missed that yesterday. (Tracy, Interview 2)


In addition, the way Tracy talked about the success of her lessons in her second and third lesson debriefs indicates a shift from concerns with teacher activities and activity structures to a greater concern with student learning. Tracy also demonstrated an awareness of students' difficulties with academic language and showed that she was using evidence from monitoring students to inform her instructional decisions.

A second area of growth for Tracy that was corroborated by observational or lesson debrief data was the integration of a second content area and making connections between content areas to reinforce learning goals. During the second lesson debrief, Tracy talked about using spelling assignments to reinforce the vocabulary from the science text on ecosystems. In addition, during the third lesson debrief, Tracy indicated that she had drawn from students' prior knowledge from the science unit on ecosystems to make connections to a book they were reading for language arts on the Exxon Valdez oil spill.

A third area of growth in Tracy's knowledge of teaching (although she did not report this as a change in her teaching) could be traced in part to the requirements of the teaching event, with overlapping influences from her university course "Teaching Second-Language Learners." She gained a greater awareness of the need to learn strategies for reaching English language learners. The assessment task of the teaching event prompts candidates to analyze two students' learning over time, with one of the target students being an English Learner or a student with academic language difficulties.

Changes in Teaching Practice--Joy. Because Joy did not complete writing up the teaching event until after the third classroom observation of her teaching, it was not possible to observe changes in her teaching after completing the teaching event write-up. However, there was some evidence that the activities she had completed for the teaching event (planning and implementing two lesson series in literacy and math, collecting student work, and videotaping) up to the time of the third interview had had an impact on her teaching practice and ways of evaluating her own teaching practice. Of the nine types of learning Joy reported, at least two were corroborated by her teaching practice. First, there was a discernible shift in her concerns, from a focus on student engagement (lesson debriefs 1 and 2) to a concern also with student understandings (lesson debrief 3). Although she still referred to student engagement as the primary indicator of the lesson's success (or lack thereof), she also referred to what students seemed to understand in talking about what was successful or not successful about her lesson.

A second change in Joy's teaching that was evident in her planned lessons and lesson debriefs was her increased knowledge of students' backgrounds and learning needs, as well as an awareness of the need to use specific strategies to reach her English learners. Joy explained that she had designed the third lesson on sequencing (a re-enactment of The Little Red Hen) in order to provide her EL students with more opportunities to interact and practice oral expression:
 ... one of my target students is EL, and there were about five
 others I think that--so it changed how I would do the instruction.
 I think I'd be more aware of using more support and like, they
 [cooperating teachers] don't like the children to talk amongst
 themselves. They want them really quiet. But one of the big things
 for ELs is they need to talk, and they need to have conversations
 ... And that kind of was one of the reasons I did this, is because
 they could talk more and interact more ... (Joy, Interview 3)


A final change that was evident primarily from lesson debrief data was that Joy seemed to gain more independence in the way she reflected on her own teaching. During the debriefs of the first two lessons I observed, Joy repeatedly cited negative feedback received from her cooperating teacher in discussing the success of her lessons. During the third lesson debrief, she seemed to be using more of her own voice in the way she reflected on and evaluated her own practice.

Cross-Case Study Findings--Discussion

Learning Gains and Changes in Teaching Practice. When comparing the learning gains that Tracy and Joy reported, it was found that Joy reported learning a wider variety of teaching skills/knowledge from the teaching event. However, there was substantial overlap in what they both reported learning: (1) planning a sequence of related lessons focused on a central learning goal; (2) modifying lessons based on assessment of student learning; (3) integrating content areas; and (4) attending to content standards. Comparing changes in their teaching practices, there was also some overlap in how their teaching was impacted by their experiences with the teaching event: (1) a shift from concern with teacher activities, activity structures, or student engagement to a greater concern with student learning; and (2) an increased awareness of the need for strategies to reach English learners. Even if we consider only these last two learning gains as examples of what can be learned from the teaching event, they provide powerful evidence of the kind of reflective teaching preservice teachers are capable of when they are pushed to engage in activities such as those required by the teaching event. These findings lend support to the principle of "reflection-in-action" that Schön asserts is critical in professional decision-making, as well as to activity-based theories of learning. The enactment and documentation of an entire teaching cycle required by the teaching event increases the likelihood that preservice teachers will have an opportunity to learn about planning, teaching, and assessing in integrated and authentic ways that are based in practice, rather than having fragmented experiences with planning, videotaping, and assessing.

Attitudes about the PACT Teaching Event. When asked to describe their experiences with the teaching event and how they felt about it, Tracy and Joy had somewhat different reactions. Tracy did not have many complaints about the requirements of the teaching event and saw value in most activities involved in putting the teaching event together, although she did complain that on top of all her other assignments for her program courses, writing up the teaching event was quite a lot of work. Joy, on the other hand, had more negative feelings about the teaching event and a number of complaints about redundancy in the prompts, the amount of work involved in writing up the commentaries, and sorting out the details of what was required. At times, Joy sent email messages in which she expressed her frustrations and feelings of being overwhelmed. In the end, Joy felt that writing up the teaching event was "challenging," "rigorous," and akin to "giving birth to a baby." At the same time, she acknowledged that she had learned from the process and that she was still glad that she had participated in the pilot.

Possible Factors Related to Learning Outcomes and Attitudes. Although Joy had more negative feelings toward the teaching event, she also reported learning more from it than did Tracy. What might be some factors related to these differences? First, Tracy and Joy had very different levels of support from their program faculty and cooperating teachers, as well as very different levels of scaffolding in the process of constructing their teaching events. Given the lack of scaffolding and support Joy experienced, it is understandable that she would have such negative feelings about the teaching event. While these variations in support and scaffolding help explain the differences in attitudes about the teaching event, they do not seem proportional to the differences in reported learning gains.

Second, Tracy and Joy had a different set of constraints on their ability to engage in the types of activities required by the teaching event. While Tracy had at least some autonomy in the content and learning strategies she selected for the teaching event learning segments, Joy was much more limited by her cooperating teacher's expectations and established routines, as well as a district-mandated literacy program. Despite these limitations, during the final classroom observation (of the story reenactment), Joy was able to utilize a learning strategy that she had selected based on her assessment of her students' learning needs.

Third, Tracy and Joy had somewhat different program experiences even though they were enrolled in elementary credential programs at the same institution. For Joy, many of the activities that were required by the teaching event and the prompts for reflecting on student learning, instruction, and assessment were novel experiences. She had never been asked to investigate her students' ethnic, linguistic, socio-economic, and skill backgrounds; she had never had experience with planning and teaching an extended learning segment for literacy or math; she had never been asked to integrate content areas in her instructional plans. Thus, by actually experiencing these activities for the first time during implementation of her planned teaching event units, she was able to learn something new from them.

Tracy, on the other hand, had had previous experiences with some of the activities required by the teaching event. As part of her program coursework, she had conducted a school community investigation and was thus already quite familiar with students' backgrounds. In at least one of her courses, she had collected and examined student work. She had videotaped and analyzed her teaching reflectively at least twice before the teaching event. She had been taught to think about teaching as a "plan-teach-assess-reflect" cycle. Thus, it may be that because of such overlaps between the activities required in the teaching event and in her program, she did not report learning as much from the teaching event as did Joy.

In sum, it appears that the contribution of the teaching event to candidates' learning experiences is related to whether or not they have had previous experience with the activities required in the teaching event. The less overlap the teaching event tasks had with learning activities in the existing program, the more likely it was that the teaching event would contribute to candidates' learning. It is less clear whether levels of support and scaffolding, while critical in determining candidates' attitudes about the teaching event, are directly related to what candidates learn. One hypothesis is that low levels of support and scaffolding co-vary with opportunities to learn the skills promoted by the teaching event in the credential program. The teaching event may contribute more to a candidate's learning even with low levels of support because of less overlap with the learning opportunities available in the program. Additionally, it appears that learning from the teaching event may be dampened by school- and classroom-level constraints on teachers' instructional decisions, although even under strong constraints, Joy reported learning a great deal from the teaching event. Such constraints, however, may limit the teacher candidate's ability to reflect authentically on teaching decisions made independently and to enact what they have learned from the teaching event in their student teaching placements. A final hypothesis is that the teaching event contributes to candidates' learning in indirect ways by changing the program itself. In the case of Tracy's program, which provided strong supports and scaffolding for candidates as they completed the teaching event, the additional attention that the program paid to the teaching skills measured in the teaching event may have added to the candidates' learning experiences.

Another difference between Tracy's and Joy's experiences with the PACT was that Tracy's cohort was required to complete the teaching event as a component of the student teaching seminar, while Joy voluntarily participated in the PACT because she thought it might provide an interesting "challenge" (although she still perceived it as an assessment of her teaching, as evidenced by her concern with her scores). It may be that candidates' experiences with the PACT depend to some degree on how the assessment will be used--whether for a high-stakes licensure decision, for course/program completion, or simply as a formative tool for candidates' reflection on their teaching. Would Joy have been willing to be so open about weaknesses in her teaching had she known her credential depended on it? One thing to note is that Tracy scored significantly higher on her teaching events than did Joy, who passed the literacy teaching event with low rubric scores and received failing rubric scores on the mathematics teaching event. (5) Is Joy's teaching event a more authentic representation of her teaching than Tracy's because no real stakes were attached? This raises the question of how an assessment's purpose interacts with its uses in influencing teacher learning associated with the experience of completing the assessment.

Findings--Focus Group

At the end of the spring semester, 23 of the piloting candidates in Urban University's Internship program participated in a focus group. Because this focus group included only those in Tracy's cohort program, these findings do not necessarily represent the range of experiences that candidates across the elementary credential programs had with the teaching event.

Many candidates were honest in expressing their frustration with the teaching event. However, most of the complaints raised were related to the technical challenges of completing the teaching event (e.g., videotaping, formatting, redundancy in the task prompts, amount of writing involved) rather than with the content or activities required by the assessment. Candidates also faced constraints in their placements due to district-mandated texts, cooperating teacher expectations, established routines, and testing schedules. However, for a few candidates, the opportunity to plan and teach their own sequence of lessons provided a welcome relief from the more scripted lessons they normally taught:
 I actually went into the Houghton Mifflin program and got
 suggestions for books and activities to do, but I then steered away
 from the Houghton Mifflin, and I did my own, creative teaching, and
 the kids really enjoyed it because it wasn't the same format that
 they were used to. It wasn't the--'Okay, now we're going to do
 phonics for five minutes, then we're going to do this for five
 minutes'--and it wasn't as, I didn't feel like I was on stage as
 much as I am sometimes with the Houghton Mifflin program. And it
 seemed to be really relaxed in the classroom, and it was a nice
 change for those three days to actually feel like I was being
 creative, and I was actually interacting with the kids more ...


Not surprisingly, many of the learning gains reported by candidates resembled the learning gains reported by Tracy. They reported growth in a few specific skill areas: (1) assessing student learning; (2) planning inter-disciplinary lesson units; and (3) reflecting on their teaching based on student learning. Confirming what was observed from the case study, that Tracy's program learning experiences overlapped strongly with her learning experiences with the teaching event, one candidate noted that the program had prepared them well to assess and reflect: "I don't think I really needed much help with the assessment, and the reflections, because we had been doing that all semester, and for the past year." However, even though candidates had had previous experiences with assessing and reflecting, they still found the experience of those activities in the teaching event to be valuable:
 I think, for me, the most valuable thing was the sequencing of the
 lessons, teaching the lesson, and evaluating what the kids were
 getting, what the kids weren't getting, and having that be
 reflected in my next lesson, so I think that was the thing that
 really, I found value in, as kind of the
 'teach-assess-teach-assess-teach-assess.' And so you're constantly
 changing--you may have a plan or a framework that you have
 together, but knowing that that's flexible and that it has to be
 flexible, based on what the children learn that day.


Others found themselves paying greater attention to informal and formal assessments, writing down comments that students were making during class activities, as well as spending more time examining formal assessments.

Overall, participants in the focus group felt their program had prepared them well to complete the teaching event and that the assessment did not make extraordinary demands outside the scope of what they had been prepared to do. At the same time, even though some of the activities involved in the teaching event replicated some of their program experiences, candidates still found value in them and felt they had learned from the process of actually implementing what they had learned in their university courses.

Findings--Demographic Data and Teacher Reflection Survey

The purpose of the following analysis was to determine how representative the responses of piloting candidates at "Urban University" were in relation to the whole population of piloting candidates across 11 campuses in California. Responses to items on the Teacher Reflection Survey from piloting candidates at Urban University (N=30) were compared with responses of piloting candidates across campuses (N=527). (6) Because all piloting candidates in this study were enrolled in an elementary credential program, the experiences of this cohort of candidates are not necessarily representative of candidates teaching across grade levels. Demographically, most candidates in this program were white and female, and English was their primary language. This is consistent with the ethnic and linguistic backgrounds of elementary-level piloting candidates throughout the state.

In general, candidates' perceptions about sources of support for completing the teaching event at Urban University were more positive than perceptions of candidates across the state. At Urban University, cooperating teachers were rated the highest (on a 5-point Likert scale, with 1 = "not very helpful" and 5 = "very helpful") for providing support as candidates completed the teaching event. While 70 percent of Urban University candidates reported their cooperating teachers' support as being "helpful" or "very helpful" (Mean=4.07), only 44 percent of candidates across campuses rated their cooperating teachers' support as being "helpful" or "very helpful" (Mean=3.46). Ratings of support provided by supervisors at Urban University and across campuses were comparable, with 63 percent of candidates at Urban University and 60 percent of candidates across campuses rating their supervisors' support as being "helpful" or "very helpful." Ratings of support provided by university faculty were higher at Urban University, with 52 percent of candidates rating their professors' support as being "helpful" or "very helpful," while across campuses only 28 percent of candidates rated their professors' support as "helpful" or "very helpful." Overall, piloting candidates at Urban University (in Tracy's cohort program in particular) reported receiving more support from cooperating teachers and program faculty in completing their teaching events than did piloting candidates across all campuses.
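
To make these summary statistics concrete, the short sketch below shows how a mean rating and the percentage of "helpful" or "very helpful" responses can be tabulated from raw responses on the 5-point scale described above. The ratings list is hypothetical and purely illustrative; it is not the actual Teacher Reflection Survey data.

 # Hypothetical 5-point Likert responses for one source of support
 # (1 = "not very helpful", 5 = "very helpful"); illustrative only,
 # not the actual Teacher Reflection Survey data.
 ratings = [5, 5, 4, 4, 4, 3, 5, 4, 2, 4]

 n = len(ratings)
 mean_rating = sum(ratings) / n
 # "Helpful" or "very helpful" corresponds to ratings of 4 or 5.
 pct_helpful = 100 * sum(1 for r in ratings if r >= 4) / n

 print(f"N = {n}, mean = {mean_rating:.2f}, "
       f"percent helpful/very helpful = {pct_helpful:.0f}%")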

In terms of candidates' perspectives on their preparation to complete the teaching event, nearly all candidates at Urban University agreed or strongly agreed that their program had prepared them to complete both the literacy and mathematics teaching events. This compares with 63 percent of elementary candidates across institutions agreeing or strongly agreeing that their programs had prepared them to complete the literacy teaching event and 61 percent for the mathematics teaching event. (There were comparable levels of agreement for secondary candidates across institutions.) It appears that candidates at Urban University felt much better prepared for completing the teaching event than did candidates across campuses.

The "Teacher Reflection Survey" also measured candidates' perceptions of opportunities to demonstrate a variety of teaching skills in the teaching event. These survey items may be interpreted as representing candidates' attitudes about the assessment or their perceptions of the validity of the assessment for measuring their teaching skills. (See Appendix B for the distribution of responses for Urban University candidates on these items.) Sixty-five to 96 percent of candidates at Urban University agreed or strongly agreed that the teaching event provided them with opportunities to demonstrate their competencies across 13 survey items. For candidates across programs, only 40-60 percent of candidates agreed or strongly agreed that the teaching event provided the opportunity to demonstrate their teaching skills across the 13 survey items. (In subsequent pilot years, the proportion of candidates across campuses who reported that they learned important skills from the process of completing the teaching event was two-thirds. However, there remained wide variations across campuses in candidates' reported learning experiences.)

The difference in attitudes about the PACT between candidates at Urban University and candidates across campuses may be related to differences in the levels of support and preparation that candidates felt they had that first pilot year. Piloting candidates across campuses who reported high levels of support and preparation were significantly more likely than candidates who reported low levels of support and preparation to agree that the teaching event provided them opportunities to demonstrate their teaching knowledge and skills (Chi-Squares were significant at the .001 level on most items). Across survey items, 70-90 percent of candidates who reported high levels of support agreed or strongly agreed that the teaching event provided them opportunities to demonstrate their teaching knowledge and skills, while 40-60 percent of candidates who reported low levels of support expressed the same perceptions. Among those who reported greater levels of program preparation for the teaching event, 80-90 percent of candidates agreed or strongly agreed that the teaching event provided opportunities to demonstrate their teaching knowledge and skills across survey items, while only 5-10 percent of candidates who reported low levels of program preparation expressed the same perceptions. These findings support the case study findings that support and preparation for completing the teaching event are related to candidates' attitudes about the teaching event.
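
The association reported here between perceived support (or preparation) and agreement can be examined with a chi-square test of independence on a contingency table of support level by agreement level. The sketch below works through the standard computation on a made-up 2x2 table; the counts are illustrative assumptions, not the study's actual data or analysis code.

 # Chi-square test of independence on a hypothetical 2x2 table; the counts
 # are illustrative, not the actual PACT survey data.
 # Rows: high vs. low reported support; columns: agree/strongly agree vs.
 # disagree/strongly disagree that the teaching event provided the
 # opportunity to demonstrate a given teaching skill.
 table = [[45, 10],
          [25, 25]]

 row_totals = [sum(row) for row in table]
 col_totals = [sum(col) for col in zip(*table)]
 grand_total = sum(row_totals)

 chi2 = 0.0
 for i, row in enumerate(table):
     for j, observed in enumerate(row):
         expected = row_totals[i] * col_totals[j] / grand_total
         chi2 += (observed - expected) ** 2 / expected

 df = (len(table) - 1) * (len(table[0]) - 1)
 print(f"chi-square = {chi2:.2f}, df = {df}")
 # With df = 1, chi-square values above 10.83 correspond to p < .001.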

Conclusion

Results from the case studies and focus group strongly suggest that preservice teachers at Urban University who completed the teaching event in the 2002-2003 pilot year were able to learn from the assessment in important ways, including learning about students and addressing their specific learning needs, planning a sequence of connected lessons, assessing student learning, and modifying instruction based on those assessments. In addition, the case studies showed that teachers' self-reports of learning were corroborated, to some extent, in their teaching practice. Thus, the activities involved in constructing the teaching event seem to prompt them to think about teaching in new ways and to enact some of these new ideas in their practice.

This research, though limited in generalizability by a small sample in a somewhat unrepresentative program (during the first year of piloting), has important implications for preservice teacher education reform and suggests that performance assessments like the teaching event, when thoughtfully implemented, can be useful learning tools to strengthen the professional preparation of new teachers in ways that lead to more learner-centered, assessment-driven teaching. As we saw in Joy's case, the novel activities that Joy experienced in completing the teaching event filled certain gaps in her previous program experiences (e.g., learning about students, independently planning and teaching an extended learning segment). In this way, the teaching event seems to have contributed to Joy's learning experience in her credential program. Thus, for teacher credential programs that are organized by cohorts with varied program components, courses taught by different instructors, and field placements over which they have little control, the experience of completing a TPA like the teaching event may serve to provide a more standard set of teacher preparation experiences across a program.

From this study we also learn that there are some important factors that may mediate the influence of the teaching event on candidates' program learning experiences, ranging from opportunities to learn in the existing program and the support provided by supervisors and cooperating teachers to the constraints on teaching decisions that candidates face in their teaching placements. In addition, the proposed uses of the TPA (for high-stakes credentialing decisions, for course/program completion, or as a formative learning tool) may also influence teachers' learning experiences, although further investigation of this influence is needed. These findings have important implications for how teacher performance assessments should be implemented if adopted for use in teacher education programs.

This study also illustrates the impact of a top-down state mandate in teacher education in one local context, and shows the limits of such a reform. Although the new state law did lead some programs to make deep investments in creating and implementing the PACT extensively throughout their programs, others have done as much as possible to minimize the "colonization" of their program curriculum by the new TPA requirement. The impact of TPAs like the PACT on teacher education programs, on teacher learning, and ultimately on the quality of the teaching force will depend on the will of local actors to implement the mandated assessment in accordance with its intent as both a summative high-stakes assessment as well as a formative learning tool.

In October of 2006, California's mandate for new teacher candidates to successfully complete a TPA to obtain the preliminary teaching credential was reauthorized by California Senate Bill 1209. Enactment of the new law is scheduled to begin in July 2008. When the TPA requirement becomes an official part of the credentialing decision, it will be important to study the impact that the policy has on both credential programs and new teachers. This study suggests that TPAs like the PACT, when thoughtfully implemented, have potential as learning tools in teacher education, and that the inclusion of a TPA as a component of teacher education (whether at the preservice or induction level) may contribute to the teacher preparation experience in valuable ways.

Appendix A
Overview of Elementary Teaching Event (2002-03 version)

A. Instructional Context
 What to do: Provide relevant information about your instructional context and your students as learners of literacy and mathematics.
 What to submit: Task A.1, Instructional Context.

B. Planning Curriculum, Assessment, and Instruction in Literacy (TPEs 1, 2, 3, 4, 7, 8, 9, 10, 11)
 What to do: For literacy, select a series of lessons designed to promote students' comprehension and/or composition of text, with attention to relevant skills and strategies. In planning your literacy lessons or your mathematics lessons, you must show a relevant connection to another subject area. Create an instruction and assessment plan. Record daily notes and reflections on what happened.
 What to submit: Task B.1, Instruction and Assessment Plan--Overview and Commentary; Task B.2, Daily Instruction and Assessment Plans.

C. Implementing Instruction in Literacy (TPEs 1, 2, 4, 5, 7, 11)
 What to do: Review your plans and prepare to videotape your class. Identify opportunities to illustrate how you promote students' comprehension and/or composition of text. Videotape the lesson(s) you have identified. Review the videotape to identify one clip, no longer than 10 minutes, that portrays the required features of your teaching. Copy or upload this clip into a new videotape or file. Write a commentary that describes how your interactions with students reflect the required features of the task.
 What to submit: Task C.1, Videotape; Task C.2, Teaching Commentary.

D. Assessing Student Learning in Literacy (TPEs 1, 2, 3, 7, 13)
 What to do: Identify two focus students and collect at least three samples of their work, at least one of which must come from the learning segment. Write a commentary on the two focus students' learning progress.
 What to submit: Task D.1, Individual Student Learning Commentary.

E. Analyzing Teaching and Learning in Literacy (TPEs 2, 3, 4, 7, 10, 11, 13)
 What to do: Review your notes on the effectiveness of daily instruction, your videotape clip, and the student assessment data. Write a commentary analyzing your teaching during this learning segment in light of student learning.
 What to submit: Task E.1, Reflective Commentary.

F. Planning Curriculum, Assessment, and Instruction in Mathematics (TPEs 1, 2, 3, 4, 6, 7, 8, 9, 10, 11)
 What to do: For mathematics, select a series of lessons designed to build conceptual understanding, computational/procedural fluency, and mathematical reasoning skills. In planning your mathematics lessons or your literacy lessons, you must show a relevant connection to another subject area. Create an instruction and assessment plan. Record daily notes and reflections on what happened.
 What to submit: Task F.1, Instruction and Assessment Plan--Overview and Commentary; Task F.2, Daily Instruction and Assessment Plans.

G. Implementing Instruction in Mathematics (TPEs 1, 2, 4, 5, 7, 11, 13)
 What to do: Review your plans and prepare to videotape your class. Identify opportunities to illustrate how you build conceptual understanding, computational/procedural fluency, and/or mathematical reasoning skills. Videotape the lesson(s) you have identified. Review the videotape to identify one clip, no longer than 10 minutes, that portrays the required features of your teaching. Copy or upload this clip into a new videotape or file. Write a commentary that describes how your interactions with students reflect the required features of the task.
 What to submit: Task G.1, Videotape; Task G.2, Teaching Commentary.

H. Assessing Student Learning in Mathematics (TPEs 1, 2, 3, 7, 13)
 What to do: Write a commentary that uses assessment data to provide an achievement profile of the whole class and analyzes the extent to which the class met the learning goals.
 What to submit: Task H.1, Whole Class Learning Commentary.

I. Analyzing Teaching and Learning in Mathematics (TPEs 2, 3, 4, 7, 10, 11, 13)
 What to do: Review your notes on the effectiveness of daily instruction, your videotape clip, and the student assessment data. Write a commentary analyzing your teaching during this learning segment in light of student learning.
 What to submit: Task I.1, Reflective Commentary.

Note: This is the 2002-2003 version of the Elementary teaching event.
The most recent versions of teaching event materials can be found at
www.pacttpa.org.

Appendix B
Perspectives o f Urban University's Piloting Candidates on
Opportunities To Demonstrate Teaching Competencies on the Teaching
Event

Indicate your level of agreement: Completing the teaching event provided me the opportunity to demonstrate my . . .
(Scale: 1 = strongly agree, 2 = agree, 3 = disagree, 4 = strongly disagree. N = 27 for every item; percentages are computed out of 27, and means and standard deviations are computed over non-missing responses.)

3A. Subject-specific pedagogical skills for teaching literacy (elementary only):
    strongly agree 9 (33.3%), agree 17 (63%), disagree 1 (3.7%); mean 1.70, SD .542

3B. Subject-specific pedagogical skills for teaching mathematics (elementary only):
    missing 1 (3.7%), strongly agree 9 (33.3%), agree 15 (55.6%), disagree 2 (7.4%); mean 1.73, SD .604

4. Monitoring of student learning during instruction:
    strongly agree 8 (29.6%), agree 15 (55.6%), disagree 4 (14.8%); mean 1.85, SD .662

5. Integration and use of assessments:
    strongly agree 7 (25.9%), agree 15 (55.6%), disagree 5 (18.5%); mean 1.93, SD .675

6. Ability to make content accessible:
    strongly agree 6 (22.2%), agree 18 (66.7%), disagree 3 (11.1%); mean 1.89, SD .577

7. Ability to engage students in learning:
    strongly agree 10 (37%), agree 17 (63%); mean 1.63, SD .492

8. Developmentally appropriate teaching:
    strongly agree 8 (29.6%), agree 15 (55.6%), disagree 4 (14.8%); mean 1.85, SD .662

9. Ability to teach English learners:
    missing 1 (3.7%), strongly agree 4 (14.8%), agree 14 (51.9%), disagree 6 (22.2%), strongly disagree 2 (7.4%); mean 2.23, SD .815

10. Ability to learn about my students:
    strongly agree 7 (25.9%), agree 18 (66.7%), disagree 2 (7.4%); mean 1.81, SD .557

11. Instructional planning:
    strongly agree 12 (44.4%), agree 12 (44.4%), disagree 3 (11.1%); mean 1.67, SD .679

12. Use of instructional time:
    strongly agree 7 (25.9%), agree 17 (63%), disagree 3 (11.1%); mean 1.85, SD .679

13. Ability to construct a positive social environment in a classroom:
    strongly agree 5 (18.5%), agree 18 (66.7%), disagree 3 (11.1%), strongly disagree 1 (3.7%); mean 2.00, SD .679

14. Ability to grow as a professional:
    strongly agree 10 (37%), agree 13 (48.1%), disagree 4 (14.8%); mean 1.78, SD .698

Note: Though a total of 30 teacher candidates at Urban University
completed the Teacher Reflection Survey, only 27 completed this portion
of the survey.
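The means and standard deviations in Appendix B follow directly from the frequency counts. The short sketch below is a purely illustrative check, not part of the study's analysis, and its variable names are invented for this example: it recomputes the statistics for item 3A from the reported response distribution on the 1-4 agreement scale. For the two items with a missing response (3B and 9), the same calculation matches the reported means when the missing case is dropped from the denominator.

    # Illustrative check (not from the original study): recompute the mean and
    # sample standard deviation reported for item 3A in Appendix B from its
    # response frequencies (1 = strongly agree ... 4 = strongly disagree).
    from math import sqrt

    frequencies = {1: 9, 2: 17, 3: 1, 4: 0}   # item 3A: 9 strongly agree, 17 agree, 1 disagree

    n = sum(frequencies.values())                                  # 27 valid responses
    mean = sum(score * f for score, f in frequencies.items()) / n
    variance = sum(f * (score - mean) ** 2
                   for score, f in frequencies.items()) / (n - 1)  # sample (n - 1) variance
    sd = sqrt(variance)

    print(f"n = {n}, mean = {mean:.2f}, sd = {sd:.3f}")            # n = 27, mean = 1.70, sd = 0.542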


References

Anderson, R. S., & DeMeulle, L. (1998). Portfolio use in twenty-four teacher education programs. Teacher Education Quarterly, 25(1), 23-32.

Athanases, S. Z. (1994). Teachers' reports of the effects of preparing portfolios of literacy instruction. Elementary School Journal, 94(4), 421-439.

Baxter, G. P., Glaser, R., & Raghavan, K. (1993). Analysis of cognitive demand in selected alternative science assessments (Technical Report 382). UCLA Graduate School of Education, Los Angeles: Center for Research on Evaluation, Standards, and Student Testing.

Bond, L., Smith, T., Baker, W., & Hattie, J. (2000). The certification system of the National Board for Professional Teaching Standards: A construct and consequential validity study. Greensboro, NC: Center for Educational Research and Evaluation at the University of North Carolina at Greensboro.

Bruner, J. (1966). Toward a theory of instruction. Cambridge, MA: Harvard University Press.

California Commission on Teacher Credentialing (CCTC). (2002). SB2042: Professional preparation programs--Teaching performance assessment. Updated Feb.1, 2002. www.ctc.ca.gov/SB2042/TPA_FAQ.html.

Cavaluzzo, L. (2004). Is National Board Certification an effective signal of teacher quality? (National Science Foundation No. REC-0107014). Alexandria, VA: The CNA Corporation.

Cochran-Smith, M., & Lytle, S. (1999). Relationship of knowledge and practice: Teacher learning in communities. In A. Iran-Nejad & P. D. Pearson (Eds.), Review of Research in Education (Vol. 24, pp. 249-305). Washington, DC: American Educational Research Association.

Darling-Hammond, L., Ancess, J., & Falk, B. (1995). Authentic assessment in action: Studies of schools and students at work. New York: Teachers College Press.

Darling-Hammond, L., Atkin, M., Sato, M., & Chung, R. (forthcoming). Influences of National Board Certification on teachers' classroom practices, Final report. Commissioned by the National Board for Professional Teaching Standards. Arlington, VA: NBPTS.

Davis, C. L., & Honan, E. (1998). Reflections on the use of teams to support the portfolio process. In N. Lyons (Ed.), With portfolio in hand: Validating the new teacher professionalism (pp.90-102). New York: Teachers College Press.

Feiman-Nemser, S. & Buchmann, M. (1983). Pitfalls of experience in teacher preparation. (Occasional Paper No. 65). East Lansing, MI: Institute for Research on Teaching (Michigan State University).

Gage, N.L., & Berliner, D.C. (1998). Educational Psychology, 6th Ed. Boston: Houghton Mifflin.

Goldhaber, D., & Anthony, E. (2005). Can teacher quality be effectively assessed? Seattle, WA: University of Washington and the Urban Institute.

Goodlad, J. I. (1990). Teachers for our nation's schools. San Francisco: Jossey-Bass.

Greeno, J.G., Collins, A.M., & Resnick, L.B. (1996). Cognition and learning. In D.C. Berliner & R.C. Calfee (Eds.). Handbook of educational psychology (pp.15-46). New York: Macmillan.

King, B. (1991). Teachers' views on performance-based assessments. Teacher Education Quarterly, 18(3), 109-19.

Lave, J. (1988). Cognition in practice: Mind, mathematics, and culture in everyday life. New York: Cambridge University Press.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York: Cambridge University Press.

Lustick, D., & Sykes, G. (2006). National Board certification as professional development: What are teachers learning? Education Policy Analysis Archives, 14(5).

Lyons, N.P. (1996). A grassroots experiment in performance assessment. Educational Leadership, 53(6), 64-67.

Lyons, N.P. (1998a). Reflection in teaching: Can it be developmental? A portfolio perspective. Teacher Education Quarterly, 25(1), 115-27.

Lyons, N.P. (1998b). Portfolio possibilities: Validating a new teacher professionalism. In Lyons, N.P. (Ed.), With portfolio in hand (pp.247-264). New York: Teachers College Press.

Lyons, N.P. (1999). How portfolios can shape emerging practice. Educational Leadership, 56(8), 63-65.

Pecheone, R., & Chung, R.R. (2006). Evidence in teacher education: The performance assessment for California teachers. Journal of Teacher Education, 57(1), 22-36.

Rotberg, I.C., Futrell, M. H., & Lieberman, J. M. (1998). National Board certification: Increasing participation and assessing impacts. Phi Delta Kappan, 79(6), 462-66.

Sanders, W., Ashton, J. J., & Wright, P. S. (2005). Comparison of the effects of NBPTS certified teachers with other teachers on the rate of student academic progress. Retrieved from: http://www.nbpts.org/UserFiles/File/SAS_final_report_Sanders.pdf

Sato, M. (2000). The National Board for Professional Teaching Standards: Teacher learning through the assessment process. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, April 2000.

Schon, D. A. (1983). The reflective practitioner: How professionals think in action. New York: Basic Books.

Shulman, L. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1-22.

Smith, T., Gordon, B., Colby, S., & Wang, J. (2005). An examination of the relationship of the depth of student learning and National Board certification status. Office for Research on Teaching, Appalachian State University. Retrieved from: http://www.nbpts.org/UserFiles/File/Applachian_State_Study_Smith.pdf

Snyder, J., Lippincott, A., & Bower, D. (1998). The inherent tensions in the multiple uses of portfolios in teacher education. Teacher Education Quarterly, 25(1), 45-60.

Stone, B.A. (1998). Problems, pitfalls, and benefits of portfolios. Teacher Education Quarterly, 25(1), 105-114.

Tharp, R. G., & Gallimore, R. (1988). Rousing minds to life: Teaching, learning, and schooling in social context. Cambridge, UK: Cambridge University Press.

Tracz, S. M., Sienty, S., & Mata, S. (1994, February). The self-reflection of teachers compiling portfolios for national certification: Work in progress. Paper presented at the annual meeting of the American Association of Colleges for Teacher Education, Chicago.

Tracz, S. M., Daughtry, J., Henderson-Sparks, J., Newman, C., & Sienty, S. (2005). The impact of NBPTS participation on teacher practice: Learning from teacher perspectives. Educational Research Quarterly, 28(3), 35-50.

Vandevoort, L. G., Amrein-Beardsley, A., & Berliner, D. C. (2004). National Board Certified teachers and their students' achievement. Education Policy Analysis Archives, 12(46), 117.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes (M. Cole, V. John-Steiner, S. Scribner, & E. Souberman, Eds.). Cambridge, MA: Harvard University Press.

Whitford, B. L., Ruscoe, G., & Fickel, L. (2000). Knitting it all together: Collaborative teacher education in Southern Maine. In L. Darling-Hammond (Ed.), Studies of excellence in teacher education: Preparation in the undergraduate years (pp. 172-257). New York: National Commission on Teaching and America's Future & Washington, DC: American Association of Colleges for Teacher Education.

Wiggins, G.P. (1998). Educative assessment: Designing assessments to inform and improve student performance. San Francisco, CA: Jossey-Bass.

Zeichner, K. (1992). Rethinking the practicum in the professional development school partnership. Journal of Teacher Education, 43(4), 296-307.

Notes

(1) The PACT Consortium currently includes all eight of the University of California campuses, six Cal State University campuses, six private universities, and one district intern program. For a more detailed overview of the PACT project, the assessment design and scoring system, reliability and validity studies conducted to date, and a discussion of policy implications, see Pecheone and Chung (2006).

(2) The California Teaching Performance Expectations can be found in Appendix A of the Standards of Quality and Effectiveness for Teacher Preparation Programs (2001) on the California Commission on Teacher Credentialing web page (http://www.ctc.ca.gov/educatorprep/standards/AdoptedPreparationStandards.pdf).

(3) Updated versions of the PACT assessments can be found at http://www.pacttpa.org.

(4) Teaching performance assessments have also been called "portfolio assessments" in the literature. However, this paper distinguishes between the two, using "performance assessments" to refer specifically to task-based assessments and "portfolio assessments" to refer to more open-ended collections of teacher candidates' work.

(5) Both Tracy's and Joy's teaching events were scored at a regional scoring session staffed by raters recruited from PACT campuses across the region. Candidates' TEs were not scored by their own instructors or supervisors that year.

(6) Response rates are approximately 67% for Urban University and 75% across campuses.

Ruth R. Chung is a postdoctoral scholar in the School of Education at Stanford University, Stanford, California.
Table 1. Case Studies--Comparison of Learning and Teaching Contexts.

Background of teacher candidates
   Tracy: Age: early 30s; BA & MBA in marketing; had a little experience with tutoring and substitute teaching.
   Joy: Age: mid-30s; BA earned recently (in child development); had some experience with substitute teaching at preschool, teaching Sunday school, and counseling junior high students at church.

Program type
   Tracy: 4-semester intern program (cohort)
   Joy: 2-year master's program (cohort)

PACT implementation
   Tracy: Well scaffolded, but not well integrated with other courses; cooperating teachers aware; supervisors very involved in the process; Practicum Seminar instructor very familiar with the teaching event.
   Joy: Not well scaffolded, not integrated into other courses; cooperating teachers not aware; supervisor (who also taught the Practicum Seminar) not very involved in the process, not very familiar with the teaching event.

Student teaching
   Tracy: Full-year 3rd grade; part-time with 2 full-time solo weeks; urban, middle-SES school, majority of White students; cooperating teacher permitted some autonomy, lessons and units co-planned; cooperating teacher a mentor.
   Joy: Full year (Fall 4th grade, Spring kindergarten); part-time Fall, full-time Spring with 2 full-time solo weeks; urban, low-SES school, majority of students from minority ethnic groups; cooperating teacher not very flexible, routines are sacred; cooperating teacher not a mentor.

Table 2. Cross-Case Study Findings.

Attitude toward teaching event
   Tracy: Overall positive attitude
   Joy: Mixed feelings

Experience with the teaching event
   Tracy: Time-consuming, heavy workload, but not overly difficult
   Joy: "Rigorous"; "challenging"

Reported learning gains from teaching event
   Tracy: (1) Planning an extended learning segment*; (2) Modifying lessons based on assessment of student learning*; (3) Integrating content areas*; (4) Attending to content standards; (5) Aligning assessment with plans
   Joy: (1) Planning for continuity from lesson to lesson; (2) Modifying lessons based on assessment of student learning; (3) Attention to EL students*; (4) Learning about students; (5) Integrating content areas; (6) Attending to content standards*; (7) Analyzing video more reflectively; (8) Assessing student learning; (9) Choosing teaching strategies to reflect student needs (e.g., ELs)*

Observed changes in teaching knowledge, dispositions, or practice related to teaching event
   Tracy: (1) Shift from concern with teacher activities and activity structures to student learning; (2) Independently using assessment to guide instruction; (3) Awareness of need for strategies to reach ELs
   Joy: (1) Shift from concern with engagement only to student understandings; (2) Shift from dependence on cooperating teacher's feedback to independent reflection; (3) Increased knowledge of students' background and learning needs; (4) Awareness of need for strategies to reach ELs*

* Interaction/overlap with learning gains associated with program learning experiences.
