
Learning to notice: scaffolding new teachers' interpretations of classroom interactions.

Mathematics and science education reforms encourage teachers to base their instruction in part on the lesson as it unfolds in the classroom, paying particular attention to the ideas that students raise. This ability to adapt instruction in the moment requires that teachers be able to notice and interpret aspects of classroom interactions that are key to reform teaching. This paper defines what it means to "notice" in the context of reform and describes a multimedia tool designed to help teachers learn to do so. The authors then report on a study in which six mathematics and science teachers seeking secondary teaching certification used the software to examine teaching. The results suggest that use of the software helped the teachers to develop new ways to analyze instruction. Specifically, the teachers began to identify particular events in their classroom interactions as noteworthy, to more frequently use specific evidence to discuss these events, and to provide their own interpretations of these events. This research adds to our understanding of teacher cognition and also has implications for those who are designing and implementing teacher education in the context of reform.

* * * * * * * * * *

A key tenet of mathematics and science education reform is the creation of classroom environments in which teachers make pedagogical decisions in the midst of instruction. In the mathematics classroom, teachers and students are expected to listen carefully to one another's ideas, with teachers adapting their instruction, at least in part, based on the ideas that students raise (National Council of Teachers of Mathematics [NCTM], 2000; Smith, 1996). Similarly, teachers of inquiry-based science curricula are encouraged to listen to and interpret students' ideas and to use those ideas to help students investigate authentic questions (Hammer, 2000; van Zee & Minstrell, 1997; National Research Council [NRC], 1996; American Association for the Advancement of Science [AAAS], 1993). We claim that this view of teaching and learning requires that teachers develop new ways of noticing and interpreting classroom interactions. However, current programs of teacher education often do not focus on helping teachers learn to interpret classroom interactions. Instead, they focus on helping teachers learn to act, often providing them with instruction concerning new pedagogical techniques and new activities that they can use (Berliner, 2000; Day, 1999; Huling, Resta, & Rainwater, 2001; Niess, 2001; Putnam & Borko, 2000; Taylor, 2000). Although these techniques and activities are certainly important resources for new teachers, they do not necessarily ensure that teachers will learn to interpret classroom interactions in ways that allow for flexibility in their approach to teaching.

In this article, we report on an investigation of how teachers learn to notice and interpret classroom interactions. We begin by defining what it means to "notice" classroom interactions. Next, we describe a software tool intended to help teachers learn to do so. We then report findings from a study in which we investigated how the use of the software tool may have influenced preservice mathematics and science teachers' analysis of their classrooms. Our findings suggest that teachers who used the software tool began to analyze and interpret teaching in ways that we believe are recommended by current mathematics and science reform efforts.

LEARNING TO NOTICE IN THE CONTEXT OF REFORM

Our study is framed by two main areas of prior research. First, we draw on prior work to define what we mean by teachers' ability to notice classroom interactions. Second, we turn to research on the use of video in teacher education to describe the reasons why video might be a useful tool for supporting teachers' efforts to notice and interpret classroom interactions.

What Does It Mean to Notice?

While we argue that noticing and interpreting are important skills for teaching in the context of reform, we want to point out that discussion of these skills is not entirely new. Prior research has shown that some experienced teachers may engage in these practices already (Berliner, 1994). Furthermore, in these cases, teachers' ability to notice and interpret classroom interactions is something that is perceived as developing over time. In contrast, we claim that teacher education should support teachers in learning to notice. In addition, it may be that while some experienced teachers already have these skills, reform calls for noticing new things. Our goal here is to explore whether or not we can help teachers learn to notice, as well as learn to notice aspects of teaching and learning that are important to reform. We begin by defining what we believe it means for teachers to notice and interpret classroom interactions. Specifically, we propose three key aspects of noticing: (a) identifying what is important or noteworthy about a classroom situation; (b) making connections between the specifics of classroom interactions and the broader principles of teaching and learning they represent; and (c) using what one knows about the context to reason about classroom interactions.

Noticing involves identifying what is important in a teaching situation. One characteristic of noticing is learning to identify what is noteworthy about a particular situation. Teaching is a complex activity. In any given lesson, teachers need to attend to what students are doing and saying, how they are thinking about the subject matter, what analogies or representations to use to best convey important ideas, and what experiences to provide students to engage them in learning. Teachers cannot possibly respond to all that is happening in any given moment. Instead, teachers must select what they will attend to and respond to throughout a lesson. Frederiksen (1992) referred to this skill of noticing as making a "call-out." He developed this terminology when watching videotapes with teachers; the teachers would call out when they saw something of note in the video. We suggest that this term also applies to teachers during instruction when they notice something significant. The act of making a call-out signifies an ability to home in on what is important in a very complex situation. For example, in the midst of a lesson, a teacher might notice that students are working closely, explaining ideas to each other, and helping one another understand a difficult concept. Teachers can then use this information to decide how to proceed with the instructional plan. Similarly, Leinhardt, Putnam, Stein, and Baxter (1991) found that expert teachers have "check points" in mind during instruction that they use to assess the progress of a lesson and to decide how to proceed. An example check point is the teacher knowing to assess students' understanding of place value in the context of addition before moving on to subtraction. Similar to call-outs, being able to identify check points suggests that in some cases, more experienced teachers may be able to recognize what is important to attend to as a lesson is implemented. We believe that this skill is particularly important in the context of reform because lessons can no longer be planned completely in advance, and teachers must make many decisions in the midst of instruction about how to proceed.

Noticing involves making connections between specific events and broader principles of teaching and learning. A second characteristic of noticing is the ability to make connections between specific events and the broader ideas they represent. Prior research has shown that experts consider specific situations in terms of the concepts and principles that they represent (Chi, Feltovich, & Glaser, 1981; Glaser & Chi, 1988; Larkin & Simon, 1987). The same can be said for expert teachers (Copeland, Birmingham, DeMeulle, D'Emidio-Caston, & Natal, 1994; Hughes, Packard, & Pearson, 2000; Nelson, 1988; Peterson & Comeaux, 1987). When analyzing a video of a class discussion, for example, novice teachers generally provide only a literal description of the events they see. In contrast, expert teachers describe the segment in terms of issues related to teaching and learning, using phrases such as "The teacher really paid attention to the students' ideas" or "It looks like all students have access to learning in this classroom." Here, they connect the specific event that they see to a concept or principle they understand about teaching and learning (i.e., how students learn or equity). Similarly, Shulman (1996) referred to the importance of extrapolating from the specific to the general when he encouraged teachers and teacher educators to ask themselves, upon analysis of a teaching episode, "What is this a case of?" Responding to this question helps teachers look at a situation and recognize it as an instance of something, a principle of teaching and learning, rather than seeing each instance as an isolated event. Doing so can also bring the rhetoric of reform to life when teachers have a chance to view a "community of learners" or "inquiry learning" first hand.

Noticing involves using what one knows about the context to reason about a situation. A third feature of noticing involves using what one knows about the context to reason about situations. In other words, noticing classroom interactions is tied to the specific context in which one teaches, and it is within this arena that this ability should develop. This is in line with research that has found that as individuals gain more experience in a particular domain, they become more adept at making sense of situations they encounter within this domain (Chi, Glaser, & Farr, 1988; Lesgold et al., 1988). Thus, teachers must use their knowledge of the subject matter, knowledge of how students think of the subject matter, as well as knowledge of their local context to reason about events as they unfold. For example, teachers of science will more accurately reason about a classroom interaction from a science classroom than they will one from a literature or mathematics classroom. Likewise, physics teachers will better interpret students' understanding of force and motion than biology teachers would interpret a student's thinking about the same concepts. And finally, physics teachers are better able to interpret their students' thinking about physics than the thinking of a group of physics students from another teacher's classroom.

In defining what it means to notice, we want to highlight the importance of interpreting classroom interactions. Thus, how individuals analyze what they notice is as important as what they notice. Recently, several researchers have argued that there is value in teachers taking an interpretive stance to their analysis (Hammer, 2000; Putnam & Borko, 2000). Taking an interpretive stance means looking at a teaching situation for the purpose of understanding what happened, what students think about the subject matter, or how a teacher move influenced student thinking, as opposed to examining a situation for criticism or to take action. Although teaching is an action-oriented practice, we see value in having teachers learn to develop skills in interpretation that can then be used to inform pedagogical decisions. Our goal then is to support teachers in learning to first notice what is significant in a classroom interaction, then interpret that event, and then use those interpretations to inform pedagogical decisions. And we argue that these are requisite skills that reformers have in mind when they call for teachers to be flexible in their instructional plans as they are teaching.

DESIGNS FOR TEACHER LEARNING

The next question, then, is how to help teachers develop this ability to notice and interpret classroom interactions. We propose that video can be an effective tool towards this end. In particular, as we have described elsewhere (Sherin, 2002), video analysis affords teachers an opportunity to engage in a set of practices that are very different from what they do in the classroom. First of all, video offers a permanent record of classroom interaction (Latour, 1990). It can be paused and rewound so segments can be re-viewed on multiple occasions, with different perspectives each time if one desires. In addition, video can be collected, edited, and reorganized in a format that differs from its original presentation. This enables teachers to choose specific segments to view based on a particular goal. For instance, a teacher can view several segments of one student and closely examine this student's understandings. Or a teacher can choose to examine the discourse among students, find all instances of student-to-student discourse, and compare these interactions. Finally, video enables teachers to remove themselves from the demands of the classroom, such as having to take action and make instructional decisions in the moment, and to step back from the events in the classroom and examine them closely. Putnam and Borko (2000) suggested that teachers' actions in the classroom are constrained by familiar routines and that their thinking may have become routinized as well. They argued for teachers to have more experiences in which they can remove themselves from the classroom to study teaching and learning. We believe video analysis has the potential to afford this opportunity.

Despite this, few designers have taken advantage of the affordances of video to help teachers learn to interpret classroom instruction. Instead, video is often used to offer teachers images of expert practice or experts' analysis of teaching episodes with the goal of making experts' tacit knowledge available to novice teachers (Ethell & McMeniman, 2000). An important problem with this approach is that less experienced teachers are being given the "output of an experts' pattern recognition process" (Bransford, Franks, Vye, & Sherwood, 1989, p. 484) but are not developing the skills to make such conclusions themselves. In other words, they are told how experts think about problems, but they may not be able to recognize the specific features of problems to which experts know to attend.

More recently, however, video and multimedia environments have been developed that enable teachers to examine reform teaching (Bowers et al., 2000; Lampert & Ball, 1998; Marx et al., 1998). In a similar vein, we designed a software tool, Video Analysis Support Tool (VAST), to support teachers in developing an ability to notice and interpret aspects of classroom practice that are important to reform pedagogy. One unique aspect of VAST is that teachers are asked to do this analysis as they view video from their own classrooms. Thus, teachers use knowledge of their particular context (students, subject matter, curriculum, school, etc.) to reason about the segment they select to view. Following is a description of the specific features of the software (Figure 1) that are intended to scaffold teachers in noticing and interpreting classroom interactions.

Teachers are prompted to analyze three particular aspects of a classroom interaction--student thinking, teacher's roles, and discourse. In Figure 1, the Student Thinking tab is in the forefront. In this space, teachers comment about what they find noteworthy related to this issue. Users can click on the Teacher's Role or Discourse tabs as well and will see a similar screen in which to comment on noteworthy events related to these topics. These topics were selected because of their importance to reform pedagogy. Both mathematics and science reforms emphasize the importance of listening to students' ideas and using those ideas to inform instruction (Arvold, Turner, & Cooney, 1996; NCTM, 2000; NRC, 1996). In addition, reforms call for new roles for the teacher and emphasize the importance of teachers' reflection on their own thinking and actions. Finally, reform efforts encourage classroom discourse, both between the teacher and students and among students, as a means of supporting student learning. This design feature is intended to scaffold teachers in noticing noteworthy events related to these three areas and in making connections between the specific instances they identify and the broader categories of student thinking, teacher's roles, and discourse.

Teachers are scaffolded to identify noteworthy events of teaching, or call-outs. Specifically, users are prompted to begin their analysis of a video excerpt by responding to the question, "What do you notice?" This allows users to begin the analysis with what they find significant in the episode related to the three broad topics, rather than being told what the designers of the software find noteworthy.

Teachers are scaffolded to use evidence from the video to support call-outs. When users make a call-out about an event, they are essentially beginning to make an argument about something they saw or noticed as being worth reflecting on more deeply. Effective argumentation involves using evidence to support claims (Toulmin, 1958). When teachers claim to have identified a noteworthy event, they mark the point in the segment when that event occurred. In addition, the software includes a transcript of the segment so teachers can cut and paste individual statements as evidence. Prior work on video analysis has found that people watching video will often claim to see things that, upon replay, did not actually occur (Jordan & Henderson, 1995). Viewers have expectations about particular events in particular settings and use those as a frame for analysis. However, those expectations may in fact get in the way of allowing the viewer to see what actually happened. Further, identifying the actual events connected to a call-out provides a record to which the teacher can refer in discussions of the video with colleagues. This allows a group in a teacher education course to return to the video, examine the instance more closely, and discuss if it is in fact a case of the event the teacher identified.

Teachers are scaffolded to interpret the events they notice in the video. Taking an interpretive stance means that teachers focus on understanding why an event occurred or what influence a particular event had on student learning. It also means delving deeply into understanding what students understand about the subject matter and from where that understanding came. However, prior research (Copeland, et al., 1994; Sabers, Cushing, & Berliner, 1991; Sherin, 1998) has found that when teachers analyze video, they are often quick to make judgments about what was good or bad or what they should have done differently, focusing on evaluating pedagogy and solving pedagogical problems. Although there is value to teachers judging the effectiveness of their actions and examining alternative pedagogical techniques, there is also value to having teachers make sense of what occurred.

VAST scaffolds teachers to follow a prescribed sequence in their analysis. As Figure 1 illustrates, users are prompted to first make a call-out, then find evidence from the video to support the call-out, followed by an interpretation of the particular event selected as evidence. Upon interpretation, teachers can then add questions they might have about the call-out they identified, and they can use these questions to guide further analysis. The idea is to have teachers begin their analysis with something that stood out to them, an event that was noteworthy or surprising, and to then examine that event by finding an instance in the video to illustrate the point being made and to explain what they understand it to mean. This sequence is followed within the three main headings--Student Thinking, Teacher's Roles, and Discourse. With this design in mind, we will now describe a study in which a group of preservice mathematics and science teachers used VAST to analyze their practice, followed by the results of this study.
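To make the scaffolded sequence concrete, the sketch below models one analysis entry as a simple data structure. This is a minimal illustration in Python, not VAST's actual implementation; the names (CallOut, is_complete_chunk) and the sample content are hypothetical, though the evidence fields mirror the transcript cut-and-paste and event-marking features described above.

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical model of one VAST-style analysis entry; the real tool's design may differ.
    @dataclass
    class CallOut:
        topic: str                   # "Student Thinking", "Teacher's Roles", or "Discourse"
        noticing: str                # response to the prompt "What do you notice?"
        evidence: List[str] = field(default_factory=list)   # transcript excerpts or time stamps
        interpretation: str = ""     # sense-making about the event, not an evaluation
        questions: List[str] = field(default_factory=list)  # questions to guide further analysis

    def is_complete_chunk(c: CallOut) -> bool:
        # A complete analytic chunk pairs a call-out with specific evidence and an interpretation.
        return bool(c.noticing and c.evidence and c.interpretation)

    example = CallOut(
        topic="Student Thinking",
        noticing="A student questions whether the container affects the outcome.",
        evidence=['Transcript: "the glass and that rubber thing on top, all that might change the density"'],
        interpretation="The student appears to be trying to eliminate an outside variable.",
    )
    assert is_complete_chunk(example)

The point of the structure is simply that evidence and interpretation are attached to, and follow, the call-out, which is the order of analysis the tool prompts.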

RESEARCH DESIGN

Subjects

During the 2000-2001 academic year, we conducted a study with a group of interns enrolled in an alternative certification program at a midwestern university. The certification program is 11 months long, beginning in June and running through the following May. In the summer, the interns team teach summer school classes with experienced teachers in the morning and attend university classes in the afternoon. Then, beginning in the fall, they teach full time, while continuing to take courses in the evening toward certification. The program is intended for individuals making a mid-life career change who have expertise in mathematics or science.

Twelve individuals were seeking certification in secondary mathematics or science education during the 2000-2001 academic year, and all 12 participated in the study. In the fall of 2000, we randomly selected 6 of the 12 interns to participate in three sessions surrounding the use of VAST. Each session lasted approximately one hour and occurred once a month. The goal of the first session was to introduce the interns to the VAST software and provide them with experience using the software as a group. In the second session, the interns analyzed an example video of a classroom interaction using VAST and then discussed their analyses. Finally, in the third session, the interns discussed how they might use VAST to analyze their own teaching. In addition, we considered ways that this tool might be useful for a required assignment for the certification program, a written analysis of their teaching using video as the medium for analysis.

Data

At the end of the summer and fall class sessions, the interns were required to submit a written analysis of their teaching. The procedure was as follows. The interns were asked to videotape their instruction and to use the video as the source of reflection. They were then asked to complete a written analysis in which they discussed the teaching and learning that occurred, in a brief two- to four-page narrative. During the summer session, prior to the intervention with the experimental group, all 12 of the secondary education interns submitted written reflections of their teaching. Copies of the 12 reflections were collected and used as baseline data against which to compare additional analyses. At the end of the fall session, the 12 interns submitted a second video analysis. These analyses were submitted after the six interns in the experimental group participated in the three sessions with VAST and after all 12 interns had been teaching full-time for three months (1). This set of data allows us to examine the influence that VAST may have had on the experimental group's writing and to compare the writing of the experimental and control groups. Because VAST scaffolds the interns' analysis in particular ways, we thought we might see a difference in the way they wrote about their teaching, specifically in what they wrote about and how they wrote about it. In this article, we focus on aspects of how the interns analyze practice. We will report the findings on what the interns analyze elsewhere.

A FRAMEWORK FOR ANALYZING TEACHERS' ESSAYS

The goal of this project was to examine the development of the interns' ability to notice. When we began the study, we expected to see movement by all teachers toward more advanced ways of noticing and interpreting classroom practice. The interns were beginning to teach and we believed that this experience alone would influence how they analyze practice. However, we also thought that the interns who used VAST might improve in different ways in light of using the multimedia tool.

Based on research described previously, we began with the assumption that essays on one end of the trajectory would be organized around literal descriptions of events in the video segments in which the teacher tended to make judgmental statements. We assumed that the essays representing deeper analyses of instruction would be organized around principles of teaching and learning, would include connections between specifics in the segment analyzed and the concepts and principles they represent, and would be less judgmental and more interpretive in their analyses. Teachers' essays that displayed this more advanced way of analysis consisted of groups of sentences that we call an "analytic chunk," similar to what Hughes et al. (2000) referred to as an "argument chunk." This is a group of sentences in which the interns make a call-out that relates to a concept or principle of teaching and learning, support the call-out with specific evidence (2) from the video segment, and then interpret the event. Discussing their teaching in terms of these analytic chunks suggests that the teachers can identify what is important in a teaching segment and do not have to simply recall what occurred step by step. They have developed a discriminating eye for significant classroom interactions; they understand that analysis of teaching and learning involves identifying what is important rather than describing each event that took place.

As we began analyzing the essays, we realized that these end points in the trajectory did not take into account the fact that the interns would not automatically move from novice to more expert ways of noticing and interpreting classroom interactions. We hypothesized that there was likely to be an intermediate stage in which they exhibited some of the features of expert noticing and not others. Therefore, our proposed trajectory contained an intermediate level to account for the development the interns would likely experience. At this level, the interns' essays would either contain a mix of text in which they continue to describe and judge practice, as well as begin to write analytic chunks, or they would begin to make call-outs, but they would not provide specific evidence from the video segment nor would they interpret the events they identified. Thus, they would have some features of analytic chunks but not all. We call these series of sentences "incomplete analytic chunks." Finally, synthesis of the literature led us to believe that there are two levels of advanced analysis. One level is a kind of analysis in which teachers primarily interpret classroom interactions. They identify what is noteworthy, connect specific events with principles of teaching and learning, and interpret the events they identified. The other is a level beyond, in which teachers begin to make connections among their call-outs and the concepts and principles of teaching and learning, and begin to offer pedagogical solutions based on their interpretations.

From this, we created a trajectory of the levels of development for learning to notice and interpret classroom interactions (Table 1). Following is a description of each of the levels with example excerpts from the teachers' essays to illustrate how they were classified.

Level 1

At Level 1, teachers' writings were predominantly descriptive. The teachers described events as they occurred chronologically in the video. The following excerpt illustrates this style of writing:

I asked Ian what graph his group chose, and he answered that they had chosen Graph D. I asked him why they chose that graph. After a pause and some giggles from his other group members, they admitted that they had not formed their reasons. At that point, a student from another group, Kevin, raised his hand and said that he picked Graph C, the staircase-shaped function. Again, I asked him why and he explained that the Boy Scout would have to pause in between pulls, as he had to change hands. I asked him to point out where on Graph C would the pauses be. He indicated that the horizontal portions of the graph would be where the scout paused to switch hands. Several students nodded in agreement with his answer. (EC-summer)

In addition to chronological descriptions, teachers adopted an evaluative stance in their writing, focusing on what was good or bad, what went well or poorly, or what could or should have been done differently. The following excerpt illustrates an essay with an evaluative stance:

I was more or less occupying a fixed position. On viewing the video, I'm not sure whether this was the best approach under the given circumstances...I was not happy to hear myself. To me, I appeared rushed, and too loud. I did not give the students enough "wait time" to think and respond. I should have been more "silent" and listening. (UT-summer)

The dominant pattern at Level 1 is a literal description of what occurred in the classroom with evaluative, judgmental commentary on the pedagogy.

Level 2

Level 2 represents the intermediate stage in teachers' development. Because teachers were not likely to adopt all aspects of expertise in learning to notice and interpret teaching, this level contains two types of analyses. One type involves essays that are organized primarily around call-outs, but they consist of incomplete analytic chunks. As stated previously, an incomplete analytic chunk does not contain all of the elements of an analytic chunk. For example, it may have several call-outs but lack the corresponding evidence or interpretations. Following are two paragraphs from an essay that consisted primarily of incomplete analytic chunks.

One may observe during this videotape that the students were comfortable with the topic and were able to openly express their ideas. There was some disagreement among the students and some good-natured banter. Proper respect was accorded the designated speaker; the leader paraphrased and expanded upon the students' ideas. Ample opportunity was also given for the students to justify their conclusions and to add to their statements. All opinions were considered and were given equal weight. No opinion was discounted or discredited.

During the course of the discussion, all students participated. While the leader acknowledged volunteers, students who were reticent were also drawn into the fray. No student was put on the spot, but all were encouraged to participate. (BG-summer)

Both paragraphs begin with the teacher making a call-out about what she perceives as noteworthy in the video. In the first paragraph, the teacher points out that students openly expressed their ideas in class. She follows this with several sentences that provide general evidence for this claim. However, she describes what happened in the segment broadly and does not include references to specific student comments or actions nor does she interpret the events described. Therefore, this is an incomplete analytic chunk. The teacher follows this same pattern in the second paragraph, now focusing on student participation.

The second type of analysis at this level consists of a mix of writing styles, containing primarily descriptive and evaluative paragraphs, while also containing analytic chunks. Essays of this type belong in Level 2 because they are a mixture of both descriptive-evaluative chunks and analytic chunks. The teachers are beginning to make progress in being more interpretive in their analysis, but they continue to describe events as they unfold, suggesting that they are not yet consistent in homing in on significant events.

Level 3

At Level 3, the essays contain predominantly analytic chunks. However, essays at this level continue to contain judgmental statements about the events--whether they are good or bad, and in some cases, whether something else should or could have been done. Following is an example of a segment from an analysis that maintains the judgmental stance while also interpreting practice.

The lesson, however, broke down in several ways. The discussion was so lively that most of the students failed to do what I instructed them to do which was to plan their strategy and put something down on their notebook paper for me to grade. I saw a classroom full of wonderful thinkers so I also forgot about that written assignment. Also, I wanted to use what I call a "speaking ball" so that only one person who held the ball would speak. I was not strong enough about enforcing this rule. There was discussion all over the place, yet there was just one ball. I see this as both failure and success. The failure is due to the fact you cannot understand the words because so many students are talking and the success is due to the level of engagement. I felt I needed to continually repeat what the students were saying because many of the students could not hear what was being said. (ET-winter)

In this excerpt, the teacher begins with a call-out, noticing that the lesson "broke down." Highlighting this aspect of the lesson suggests that she is connecting the specific event she noticed to what she understands about curriculum and student engagement. She then supports the call-out with evidence, that the students failed to complete the written assignment and that she forgot about the assignment. She also refers to the use of a speaking ball and how it was not used as she intended. Following the evidence, she judges the lesson, focusing on what went poorly and what went well.

Level 4

The most dominant characteristic of essays at this level is that they contain only analytic chunks. Following are two paragraphs from one teacher's analysis to illustrate this style of writing. In this essay, the teacher divides the analysis into three sections--Student Thinking, Teacher's Role, and Discourse--and uses those headings to organize what he found noteworthy in the segment. Each of the two paragraphs begins with a call-out, highlighting specific events that represent these important concepts. Specifically, the teacher identifies a noteworthy student comment related to the content being taught, and he also notices specific actions he takes as a moderator to facilitate student learning. He supports each call-out with specific evidence from the video segment, and the evidence is followed by an interpretation of the events.

Next, I observe a student making a hypothesis, not about where the tubes might float in the cylinder, but how, if any, the additional weight of the vial could affect the final outcome. Martin starts questioning the fact that there are tubes being used in the experiment when he says, "Because the glass and that rubber thing on top, all that might change the...density of the stuff inside." I believe the student has analyzed a possible scenario in his mind and in doing so, is trying to eliminate an outside variable which is the container.

Next, I see myself acting as a moderator by rephrasing students responses so that the whole class could be aware of the current state of the conversation. This happens when I say, "Ok, Don said if we drop the tube in there we can figure out if they're the same thing by whether they stayed in the oil or stayed in the water. Oliver, tell him if he is right?" By defining and focussing the results of the experiment, I hope to increase the level of communication between myself and my students. (KM-winter)

A second characteristic of essays at this level is that they provide suggestions for how the teachers might deal with similar pedagogical challenges in the future. The important point is that these pedagogical solutions follow the interpretations of noteworthy events. In this next example, the teacher places the pedagogical solution immediately following an analytic chunk. This is a consistent pattern throughout his essay. The essay begins with the teacher highlighting students' unwillingness to participate, a call-out related to the issue of student engagement. The teacher then provides a specific reference to illustrate what he found noteworthy. He then interprets why the students were not engaged and then offers a potential solution for dealing with similar issues in the future.

Although several students volunteer answers to the initial question, many were reluctant to vote (a technique I employed to increase their engagement!) and relatively few took part in the first phase of the discussion. On the videotape, students are inactive, sometimes even leaning over or to the side in relaxed poses. As discussed above, I believe a major part of this disengagement may have been due to the timing of the lesson--they were tired after a long day--and the relatively high-level, abstract nature of the question presented. In the future, I would probably want to present this lesson earlier in a more concrete setting to alleviate these problems. Alternatively, I could employ group techniques to increase participation--for example, by having small discussion groups write their answers on whiteboards or on paper before sharing them with the class. (QL-winter)

The final significant characteristic at this level is that the call-outs in Level 4 analyses may also be connected conceptually. In some essays, the teachers organize their analysis around headings that identify concepts and principles of teaching and learning (e.g., Classroom Discourse and Questioning, Student Mathematical Ideas, Engagement and Student Learning, or Teacher Moves). Under each heading, they have several analytic chunks related to the particular topic. So, for each topic, they have three or four call-outs, with specific evidence and interpretation. In addition, teachers at this level may also refer to several specific events as evidence to support each call-out. This organization suggests that the teachers are making connections between the specific events they identify and important principles of teaching and learning.
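Read as a whole, the four level descriptions amount to a rough decision rule. The sketch below restates that rule in Python purely as a summary of the trajectory in Table 1; the feature names are our own shorthand (not codes from the study), and the actual coding was done by human raters reading full essays, not by software.

    # Shorthand restatement of the Table 1 trajectory; feature names are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class EssayFeatures:
        mostly_describes_and_evaluates: bool   # chronological description plus judgments
        has_incomplete_chunks: bool            # call-outs lacking evidence or interpretation
        has_complete_chunks: bool              # call-out + specific evidence + interpretation
        only_complete_chunks: bool             # no descriptive-evaluative passages remain
        connects_callouts: bool                # call-outs linked to shared concepts/principles
        offers_pedagogical_solutions: bool     # solutions offered after interpretation

    def assign_level(e: EssayFeatures) -> int:
        if e.only_complete_chunks and (e.connects_callouts or e.offers_pedagogical_solutions):
            return 4   # analytic chunks only, conceptually connected, with pedagogical solutions
        if e.has_complete_chunks and not e.mostly_describes_and_evaluates:
            return 3   # predominantly analytic chunks, but judgments persist
        if e.has_incomplete_chunks or e.has_complete_chunks:
            return 2   # incomplete chunks, or a mix of chunks with description and evaluation
        return 1       # description and evaluation only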

RESULTS

Using the previous definitions, a total of 24 written essays, two per intern, were coded by two coders, the first author and another researcher. The essays were coded blindly; the coders did not know whether an essay was written by an intern in the experimental or control group. Inter-rater reliability was 85%. Any differences between the two coders were discussed and resolved through consensus. After the essays were coded, each was placed at one of the four levels. Placement of teachers' essays in the four levels is presented in Table 2. This section describes the patterns of teachers' development in their written analyses of their practice.
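For readers unfamiliar with the statistic, inter-rater reliability here is simple percent agreement: the proportion of essays to which both coders assigned the same level. The short Python sketch below shows the computation with hypothetical level assignments rather than the study's actual codes; as described above, remaining disagreements were then resolved through discussion.

    # Percent agreement between two coders; the level assignments below are hypothetical.
    def percent_agreement(codes_a, codes_b):
        if len(codes_a) != len(codes_b):
            raise ValueError("Both coders must rate the same set of essays.")
        matches = sum(a == b for a, b in zip(codes_a, codes_b))
        return matches / len(codes_a)

    coder_1 = [1, 2, 2, 3, 4, 2, 1, 3, 2, 4, 3, 1, 2, 2, 3, 4, 2, 1, 3, 2]
    coder_2 = [1, 2, 3, 3, 4, 2, 1, 3, 2, 4, 3, 1, 2, 2, 3, 4, 2, 2, 3, 2]

    print(f"Agreement: {percent_agreement(coder_1, coder_2):.0%}")  # 90% for this hypothetical data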

Essays beginning at Level 1. All of the teachers who began in Level 1 moved along the trajectory. The three teachers in the experimental group who started in Level 1 (EC, UT, and FB) moved to Level 3, while the two teachers in the control group (SO and KX) moved to Level 2. All of the teachers' analyses placed in Level 1 were primarily descriptive and evaluative. Movement from Level 1 to all other levels suggests a shift from literal descriptions to a more analytic approach to understanding classroom interactions. It is not surprising to see that the two teachers in the control group moved to Level 2, where they began to become more analytic (i.e., making call-outs, supporting them with evidence followed by interpretation) because they were participating in an alternative certification program that encouraged reflection on teaching. However, as they gained experience and adopted a more analytic approach, they continued to describe and evaluate videos of their practice. In contrast, the teachers in the experimental group whose analyses started in Level 1 no longer relied on describing their practice as it unfolded over time, and they shifted to organizing their analyses around call-outs. Also unlike the writings of the control teachers, those in the experimental group began to interpret events they analyzed in the second analysis. However, like the control group, this group of teachers' writing continued to contain judgmental comments.

The significant difference between the writing of the teachers in the experimental group and that of the teachers in the control group who began at Level 1 is that the VAST users no longer described their practice as it unfolded over time. Instead, they organized their writing around call-outs. This suggests that they were more able to identify noteworthy events and connect them to principles of teaching and learning.

Essays beginning at Level 2. Three teachers in the experimental group began at Level 2 (BC, KM, and QL) and all of them moved beyond this level, one to Level 3 and two to Level 4. Four of the teachers in the control group began at Level 2 (BG, LT, ET, and DV). Two of the teachers remained at this level (BG and LT), while one of the control teachers moved to Level 3 (ET) and another moved to Level 4 (DV).

One teacher in the experimental group moved from Level 2 to Level 3 (BC). She no longer organized her essay around chronological descriptions of the video segment. Instead, she organized the analysis around call-outs, followed by evidence, and interpretation. However, she continued to evaluate as she reflected on her practice. The teacher in the control group who also moved to Level 3 (ET) had a similar organizational style to that of the experimental teacher described here. This suggests that VAST may not have necessarily helped teachers become less evaluative even as they became more interpretive in their analyses.

Two teachers in the experimental group who started at Level 2 moved to Level 4 (KM and QL). Both teachers organized their analyses solely by call-outs, supported them with specific evidence, and interpreted the noteworthy events. Unlike the teachers at the other levels, these two teachers' essays no longer contained comments about what was good or bad or what went well or poorly. Instead, they offered potential pedagogical solutions they might employ if they were to encounter similar teaching situations in the future, and these suggestions consistently followed the analytic chunks. Rather than evaluating the teaching and learning positively or negatively, they interpreted events and then provided solutions for similar pedagogical challenges in the future. Further, both of the teachers in this group also made connections between call-outs.

One of the teachers in the control group also moved to Level 4 (DV). She organized her analyses solely by call-outs, supported them with evidence, provided interpretations, and offered pedagogical solutions for future teaching situations. Further, she began making connections between call-outs. However, unlike the two teachers in the experimental group who advanced to Level 4, she used only general evidence, as opposed to specific evidence, to support her claims. In the following excerpts from her analysis, she began both paragraphs with call-outs, followed by general evidence, followed by an interpretation of the evidence.

As the discussion progressed, it was clear that the students were not only listening and participating in the discussion, but that they were truly interested in what was being discussed. Their interest in the discussion could be inferred from their shift from passive listening to active listening. Rather than just accepting what was being said by the teacher and the rest of the class, the students analyzed what was being told to them and challenged the statement they did not agree with.

As mentioned before, the students' initial interactions were solely with the teacher. However, as the discussion continued the students also began communicating directly with one another. As students heard things they did or did not disagree with, they would offer their opinion on the comment and would explain why they agreed or disagreed. They honestly questioned one another's thinking. Although they responded to one another's comments, the students still addressed the majority of their comments to the instructor and there was not a huge amount of direct student to student interaction. However, when the students felt strongly about a particular idea or opinion, they did direct their comments directly to the person whose comments they were responding to. (DV-winter)

The fact that this teacher used general evidence, unlike the experimental teachers at this level who referred to specific events, suggests that the scaffolds in VAST intended to support teachers in using more specific evidence were effective. One of the key aspects of VAST is a space where teachers can highlight and cut and paste portions of a transcript connected to the event they identify as being noteworthy. The fact that the teachers who used VAST referred to more specific evidence suggests that they were adopting more advanced ways of interpreting classroom interactions, namely, making connections between specific events and principles of teaching and learning.

Finally, two of the control teachers whose writing started at Level 2 remained at this level (LT and BG). They continued to identify call-outs but they did not support these call-outs with evidence or provide interpretations of events consistently. Further, one of the teachers maintained an evaluative stance in his writing. These two teachers never moved to providing substantive support of call-outs or to dropping judgmental comments.

In summary, we see that all six of the teachers in the experimental group moved to Levels 3 and 4, while only two of the teachers in the control group moved to these levels. Even in the case of the one teacher in the control group whose writing did move to Level 4, she did not provide specific evidence that we would argue is important to substantiate claims teachers make about what they perceive as being noteworthy events in teaching and learning.

DISCUSSION

These results suggest that the experimental teachers' experience with VAST had some influence on their analysis of their practice. First, VAST appears to have supported teachers organizing their analyses around call-outs rather than chronological descriptions of the events in the video. All of the teachers who used VAST moved to levels in which they no longer described events as they unfolded in the video segment. Instead, they identified and organized their essays around noteworthy events by making call-outs. Embedded in these call-outs were ideas about what the teachers found significant, and in some cases, the teachers gave labels or headings to the sections within the analysis to make clear the concept or principle that the events represented. Although teachers in the control group were also beginning to organize their essays around call-outs, they did so less consistently.

Second, the use of VAST resulted in teachers identifying specific evidence to support call-outs they made about teaching and learning. Research has shown that when people watch video, they often recall events that were not in fact in the video. In contrast, our analysis shows that VAST users were more likely to use specific evidence to support call-outs in their writing. Like more experienced teachers, they were making connections between specific events in the video segment and key ideas of teaching and learning.

Third, teachers who used VAST appeared to be more interpretive in their analyses. The teachers in the experimental group, who moved to Levels 3 and 4, were more likely to attempt to explain what students meant when they analyzed student thinking, how a teacher move may have influenced student understanding, or how the discourse evolved in the classroom. However, one trend in teachers' analysis of practice that VAST did not appear to have influenced was their tendency to judge their practice. Four of the six experimental teachers continued to evaluate their practice, even as they began to take a more interpretive stance. We hypothesize that part of the reason the teachers were likely to evaluate themselves in terms of good or bad was that these essays were part of a portfolio that was evaluated and used to determine if the teachers should be certified. It is not surprising then that the teachers would argue for what they did well and defend pedagogical decisions because they may have been trying to prove to the portfolio reviewers that they are competent teachers.

Furthermore, to be clear, we do not mean to disregard teachers' focus on evaluating their practice. On the contrary, Berliner (1994) proposed that one aspect of expertise in teaching is the ability to evaluate, and we agree that this is an important skill. Yet, while there is merit in teachers being able to quickly judge a situation for the purpose of taking action, research on teacher expertise also argues for the importance of teachers being skilled in interpreting classroom interactions. And this latter skill seems even more critical now in light of the demands of reform in mathematics and science teaching. Because reform calls for an adaptive style of instruction, the ability to notice and interpret what is happening is critical. Furthermore, we argue that this interpretation should, at the least, precede a teacher's evaluation of a situation. Thus, the goal should not be to immediately focus on whether one has made an effective pedagogical move, but rather to understand how that move responds to students' ideas, the subject matter being discussed, or another issue at hand. Along these lines, it appears that VAST did support the experimental teachers in becoming more interpretive in their analysis, even though they continued to evaluate themselves. All six of the VAST teachers moved to levels in which they organized the essay around call-outs, provided specific evidence, and interpreted the events. Perhaps the evaluations were more grounded since they came on the heels of interpretations.

One may argue that it is not surprising for teachers to move from Level 1 to higher levels in the trajectory because their experience teaching influenced their analysis. We acknowledge that learning to teach, while enrolled in the certification program, would influence how teachers analyze their practice. However, if VAST had little or no influence, then we would see all of the teachers moving along the trajectory in a more similar pattern. Instead, we see that all of the teachers who used VAST moved to Levels 3 or 4, while a majority of the teachers in the control group remained at Level 2. This suggests then that VAST may have been useful in accelerating the teachers' movement along the trajectory of learning to notice classroom interactions.

CONCLUSION AND IMPLICATIONS

Our goal in this article has been to discuss how the use of VAST influenced the ways in which a group of intern teachers reflected on their instruction. As we have suggested, we believe that VAST supported the interns in learning to notice classroom interactions in ways that are recommended by current mathematics and science reform efforts. We believe that this has implications both for future research on teacher cognition and for those who are designing and implementing teacher education and professional development. Here, we mention two specific implications of our work and then offer several issues that we hope to investigate in future research.

From the work presented here, we propose that using VAST influenced how teachers analyzed practice. We find this significant for several reasons. First, the intervention we designed was brief, consisting of three 1-hour sessions in which teachers learned to use VAST to analyze two videos of practice. This is particularly interesting because research on teacher learning suggests that professional development must be ongoing and sustained (Abdal-Haqq, 1995; Guskey, 2000; Little, 1993). However, here we see that a brief intervention, in the context of a larger program, can make a difference. Perhaps VAST appears to have influenced how teachers analyzed practice because the two programs were consistent in the knowledge and skills they were designed to help teachers develop. The certification program in which the interns were enrolled emphasized facilitating discourse in the classroom and using students' ideas to inform instruction. In this case, VAST may have provided teachers with a framework to help them analyze what they were being asked to do in the certification program.

Second, this research also illustrates what Salomon, Perkins, and Globerson (1991) referred to as the "effects of' technology. While VAST scaffolded the interns to analyze their practice in particular ways, the software did not organize the interns' analysis of their practice into a written product to be submitted as part of their final portfolio. Thus, there appears to be a subsequent cognitive residue for the interns working away from the software, in that they developed knowledge of the skills and strategies for analyzing practice as a result of using VAST and used that to guide their writing. Future research could examine more closely how teachers' partnership with a learning tool like VAST influences teachers' analysis of their practice.

While we find the results of this research encouraging, it also raises several questions for us. First, while we saw that new teachers were able to engage in practices that are believed to develop over many years of teaching, we wonder about the challenges of working with veteran teachers who are already "experts" at looking at classrooms in a particular way. Reform requires that teachers develop new routines and attend to new aspects of practice in new ways. One question we wish to consider is how to help veteran teachers make a shift in not only what, but also how, they notice and interpret classroom practice.

Second, the current work does not investigate how learning to notice influences teachers' instruction. However, this is a critical question. In the future, we hope to examine how the use of a software tool, with particular design principles, influences what teachers notice in the classroom and how they interpret those events while teaching. Specifically, we need to examine whether or not, and the degree to which, software tools like the one described herein support teachers in developing a new way of interpreting classroom interactions, not just by way of video, but also in the moment of instruction, so that they can be more flexible in their instructional plans.

Third, our focus in this work was to examine development in the ways in which teachers noticed and interpreted classroom interactions. Another important question for us is whether any changes occurred in what the teachers noticed over time. Specifically, current research on mathematics and science education reform suggests that teachers must pay particular attention to students' conceptions and to interpreting the ideas that students raise. In future research, we hope to examine how VAST can help teachers to focus their attention in this area.
Table 1

Trajectory of Development in Learning to Notice

                     Level 1                     Level 2

Dominant pattern     Describe and evaluate       Mixture of describe-and-evaluate
in writing                                       and complete analytic chunks,
                                                 or incomplete analytic chunks

                     Level 3                     Level 4

Dominant pattern     Complete analytic chunks    Complete analytic chunks,
in writing           and evaluate                connections among call-outs and
                                                 evidence, and identification of
                                                 pedagogical solutions

Table 2

Placement of Teachers in Trajectory

Level 1         Level 2         Level 3         Level 4

Experimental Group

EC -- summer                    EC -- winter
UT -- summer                    UT -- winter
FB -- summer                    FB -- winter
                BC -- summer    BC -- winter
                KM -- summer                    KM -- winter
                QL -- summer                    QL -- winter

Control Group

SO -- summer    SO -- winter
KX -- summer    KX -- winter
                BG -- summer
                BG -- winter
                LT -- summer
                LT -- winter
                ET -- summer    ET -- winter
                DV -- summer                    DV -- winter


Acknowledgements

This research was supported by a grant from the Braitmayer Foundation. The opinions expressed are those of the authors and do not necessarily reflect the views of the supporting agency. An earlier version of this article was presented at the annual meeting of the American Educational Research Association, New Orleans, LA, April 5, 2002.

The authors wish to thank Jean Kim for her assistance with data coding and analysis, as well as Allan Collins, Deidre LeFevre, and Sylvia Smith-DeMuth for their support of this project.

Notes

1. We want to be clear that the experimental teachers did not directly use VAST to write their narratives. However, if they did choose to use VAST to analyze their video, they could save notes from the analysis in a text file, and in the third session with the experimental teachers, we discussed how they might use these notes to assist them in completing the written assignment.

2. Teachers used either specific or general evidence. An excerpt was coded as specific evidence if the teachers wrote the details of the event, identifying the specific student or teacher statement or the specific content being discussed. In many cases, teachers included either an excerpt from a transcript that corresponds to the video segment or the time stamp on the videotape where the exact event occurred. If no details were provided in the description, the evidence was coded as general.

References

Abdal-Haqq, I. (1995). Making time for teacher professional development (Digest 95-4). Washington, DC: ERIC Clearinghouse on Teaching and Teacher Education.

American Association for the Advancement of Science. (1993). Benchmarks for science literacy. New York: Oxford University Press.

Arvold, B., Turner, P., & Cooney, T.J. (1996). Analyzing teaching and learning: The art of listening. The Mathematics Teacher, 89, 326-329.

Berliner, D.C. (1994). Expertise: The wonder of exemplary performances. In J.M. Mangieri & C.C. Block (Eds.), Creating powerful thinking in teachers and students: Diverse perspectives (pp. 161-186). Fort Worth, TX: Holt, Rinehart, & Winston.

Berliner, D.C. (2000). A personal response to those who bash teacher education. Journal of Teacher Education, 51(5), 358-371.

Bowers, J., Doerr, H.M., Masingila, J.O., & McClain, K. (2000). Multimedia Case Studies for Teacher Development: Case II: Making Weighty Decisions [CD-ROM].

Bransford, J.D., Franks, J.J., Vye, N.J., & Sherwood, R.D. (1989). New approaches to instruction: Because wisdom can't be told. In S. Vosniadou & A. Ortony (Eds.), Similarity and analogical reasoning (pp. 470-497). New York: Cambridge University Press.

Chi, M.T.H., Feltovich, P.J., & Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121-152.

Chi, M.T.H., Glaser, R., & Farr, M. (1988). The nature of expertise. Hillsdale, NJ: Lawrence Erlbaum.

Copeland, W.D., Birmingham, C., DeMeulle, L., D'Emidio-Caston, M., & Natal, D. (1994). Making meaning in classrooms: An investigation of cognitive processes in aspiring teachers, experienced teachers, and their peers. American Educational Research Journal, 31(1), 166-196.

Day, C. (1999). Developing teachers: The challenges of lifelong learning. London: Falmer Press.

Ethell, R.G., & McMeniman, M. M. (2000). Unlocking the knowledge in action of expert practitioners. Journal of Teacher Education, 51(2), 87-102.

Frederiksen, J.R. (1992). Learning to "see": Scoring video portfolios or "beyond the hunter-gatherer in performance assessment." Paper presented at the Annual Meeting of the American Educational Research Association, San Francisco.

Glaser, R., & Chi, M.T.H. (1988). Overview. In M.T.H. Chi, R. Glaser, & M.J. Farr (Eds.), The nature of expertise (pp. xv-xxvii). Hillsdale, NJ: Lawrence Erlbaum.

Guskey, T.R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin Press.

Hammer, D. (2000). Teacher inquiry. In J. Minstrell & E. van Zee (Eds.), Inquiring into inquiry learning and teaching in science (pp. 184-215). Washington, DC: American Association for the Advancement of Science.

Hughes, J.E., Packard, B.W., & Pearson, P.D. (2000). The role of hypermedia cases on preservice teachers' views of reading instruction. Action in Teacher Education, 22(2A), 24-38.

Huling, L., Resta, V., & Rainwater, N. (2001). The case for a third alternative. Journal of Teacher Education, 52(4), 326-338.

Jordan, B., & Henderson, A. (1995). Interaction analysis: Foundations and practice. Journal of the Learning Sciences, 4(1), 39-103.

Lampert, M., & Ball, D. (1998). Teaching, multimedia and mathematics: Investigations of real practice. New York: Teachers College Press.

Larkin, J.H., & Simon, H.A. (1987). Why a diagram is (sometimes) worth ten thousand words. Cognitive Science, 11, 65-69.

Latour, B. (1990). Drawing things together. In M. Lynch & S. Woolgar (Eds.), Representations in scientific practice (pp. 19-68). Cambridge, MA: MIT Press.

Leinhardt, G., Putnam, R.T., Stein, M., & Baxter, J. (1991). Where subject knowledge matters. In P. Peterson, E. Fennema, & T. Carpenter (Eds.), Advances in research on teaching (pp. 87-113). Greenwich, CT: JAI Press.

Lesgold, A., Rubinson, H., Feltovich, P., Glaser, R., Klopfer, D., & Wang, Y. (1988). Expertise in a complex skill: Diagnosing x-ray pictures. In M.T.H. Chi, R. Glaser, & M. Farr (Eds.), The nature of expertise (pp. 311-342). Hillsdale, NJ: Lawrence Erlbaum.

Little, J.W. (1993). Teachers' professional development in a climate of educational reform. Educational Evaluation and Policy Analysis, 15, 129-151.

Marx, R., Blumenfeld, P.C., Krajcik, J.S., & Soloway, E. (1998). New technologies for teacher professional development. Teaching and Teacher Education, 14(1), 33-52.

National Council of Teachers of Mathematics. (2000). Principles and standards for school mathematics. Reston, VA: Author.

National Research Council. (1996). National science education standards. Washington, DC: National Academy Press.

Nelson, K.R. (1988). Thinking processes, management routines and student perceptions of expert and novice physical education teachers. Unpublished doctoral dissertation, Louisiana State University, Baton Rouge, LA.

Niess, M.L. (2001). A model for integrating technology in preservice science and mathematics content-specific teacher preparation. School Science and Mathematics, 101(2), 102-109.

Peterson, P.L., & Comeaux, M.A. (1987). Teachers' schemata for classroom events: The mental scaffolding of teachers' thinking during classroom instruction. Teaching and Teacher Education, 3, 319-331.

Putnam, R.T., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4-15.

Sabers, D.S., Cushing, K.S., & Berliner, D.C. (1991). Differences among teachers in a task characterized by simultaneity, multidimensionality, and immediacy. American Educational Research Journal, 28(1), 63-88.

Salomon, G., Perkins, D.N., & Globerson, T. (1991). Partners in cognition: Extending human intelligence with intelligent technologies. Educational Researcher, 20(3), 2-9.

Sherin, M.G. (1998). Developing teachers' ability to identify student conceptions during instruction. In S.B. Berenson, K.R. Dawkins, M. Blanton, W.N. Coulombe, J. Kolb, K. Norwood, & L. Stiff (Eds.), Proceedings of the Twentieth Annual Meeting of the North American Chapter of the International Group for the Psychology of Mathematics Education. Columbus, OH: ERIC Clearinghouse for Science, Mathematics, and Environmental Education.

Sherin, M.G. (2002). New perspectives on the role of video in teacher education. Manuscript submitted for publication.

Shulman, L. (1996). Just in case: Reflections on learning from experience. In J.A. Colbert, P. Desberg, & K. Trimble (Eds.), The case for reflection: Contemporary approaches for using case methods (pp. 197-217). Boston: Allyn & Bacon.

Smith, J.P. (1996). Efficacy and teaching mathematics by telling: A challenge for reform. Journal for Research in Mathematics Education, 27(4), 458-477.

Taylor, P.M. (2000). When are we ever going to use this? Lessons from a mathematics methods course. School Science and Mathematics, 100(5), 252-255.

Toulmin, S. (1958). The uses of argument. Cambridge: Cambridge University Press.

van Zee, E., & Minstrell, J. (1997). Using questioning to guide student thinking. Journal of the Learning Sciences, 6(2), 227-269.