A pilot study of a problem-solving model for team decision making.

Abstract

Many schools have problem-solving teams that support teachers by helping identify and resolve students' academic and social problems. Although research is limited, it has been found that teams sometimes fail to implement problem-solving processes with fidelity, which may hinder the resolution of problems. We developed the Team-Initiated Problem Solving (TIPS) model to guide problem-solving processes of Positive Behavior Interventions and Supports (PBIS) Teams and the Decision Observation, Recording, and Analysis (DORA) instrument for measuring the fidelity of TIPS implementation. We conducted a TIPS Workshop for four elementary school PBIS Teams in North Carolina and Oregon and used DORA to assess the teams' implementation of TIPS processes in subsequent meetings. We found that DORA allowed us to measure fidelity of implementation and that teams implemented TIPS processes with fidelity following the workshop. Limitations of these findings, as well as implications for future research and practice, are discussed.

KEYWORDS: School teams, data-based decision-making, problem-solving, positive behavior interventions and supports

There is a long history of school personnel serving as members of problem-solving teams (Bahr & Kovaleski, 2006). Although these teams are known by different names, such as Teacher Assistance Teams (Chalfant, Pysh, & Moultrie, 1979); Prereferral Intervention Teams (Graden, 1985); Instructional Consultation Teams (Rosenfield & Gravois, 1996); and Instructional Support Teams (Kovaleski & Glew, 2006), their members share the common purpose of supporting teachers by helping to identify and resolve academic and social problems experienced by students, often within a curriculum-based measurement/response-to-intervention framework (e.g., Alonzo, Ketterlin-Geller, & Tindal, 2007; Batsche, Curtis, Dorman, Castillo, & Porter, 2007; Brown-Chidsey & Steege, 2005; Deno, 2005; Shinn, 1989; Sugai & Horner, 2009b).

The processes for effective team problem-solving have been a regular focus of consideration in the education and treatment of children. For example, Deno (2005) adapted the processes of the Bransford and Stein (1984) IDEAL problem-solving model (Identify problem-Define problem-Explore solutions-Apply chosen solution-Look at effects) to develop a data-based problem-solving model for use in schools (Problem identification-Problem definition-Designing intervention plans-Implementing intervention-Problem solution). By describing this as a data-based problem-solving model, Deno (2005) emphasized that access to relevant data is important for informing the various problem-solving processes and team members' related decision making.

In schools that implement School-wide Positive Behavior Interventions and Supports (SWPBIS), the team that identifies and addresses students' social behavior problems is known as the Positive Behavior Interventions and Supports (PBIS) Team (Lewis, Jones, Horner, & Sugai, 2010; Sugai & Horner, 2006, 2009a). Data that can inform the problem-solving processes and the decision making of PBIS Team members are typically drawn from the School-wide Information System (SWIS: Irvin et al., 2006; May et al., 2003). SWIS provides a methodology for defining and collecting data about student office discipline referrals (ODRs), as well as a Web-based computer application for entering hand-collected ODR data and producing predefined and custom reports concerning the ODRs (May et al., 2003). A school's PBIS Team members receive training and technical assistance in the use of SWIS from a SWIS Facilitator who is employed by the school district and who has previously participated in a two-and-a-half day SWIS Facilitator Training event conducted by SWIS developers/researchers (Horner et al., 2008). SWIS Facilitators begin their work with a school by reviewing the procedures currently in use for gathering ODR data, and working to ensure that problem behavior codes describe behaviors that are observable and represent behaviors that are mutually exclusive and exhaustive. The facilitator delivers a 3-hour SWIS training focused on data entry, report production, and data use, and returns to the school three additional times to work with PBIS Team members as they review and use data in team meetings.

In spite of the ordered process whereby SWIS Facilitators receive training from the SWIS developers/researchers, and PBIS Team members subsequently receive training from SWIS Facilitators, our follow-up attendance at some PBIS Team meetings revealed that team members are often idiosyncratic in the manner in which they use SWIS ODR data to inform problem-solving processes (e.g., how often meetings include a review of SWIS report data, specific reports reviewed, whether review of report data results in identification of one or more problems, specific methods used to analyze report data for the identification of problems, and so on). We also found that teams demonstrate varied levels of organizational skill regarding the management of the "structural" aspects of their meetings (e.g., how team members prepare for, conduct, and close their meetings; and how they monitor and document accountability for implementing decisions reached at previous meetings). Although research in this area is sparse, the failure to implement problem-solving processes with a high degree of fidelity is certainly not unique to PBIS Team members (e.g., Burns, Vanderwood, & Ruby, 2005; Doll, Haack, Kosse, Osterloh, & Siemers, 2005; Kovaleski, Gickling, Morrow, & Swank, 1999; Telzrow, McNamara, & Hollinger, 2000).

These issues convinced us that we could help PBIS Team members improve their meetings by developing a more formal problem-solving model--operationalized to be of particular use to PBIS Teams--that incorporated specific methods for analyzing data in a manner that would inform the model's processes and team members' decision making. In developing the model we chose to focus on processes that could help team members make decisions related to (a) school-wide problem behavior engaged in by a large number of students, often throughout the day; and (b) small-group problem behavior, engaged in by a smaller number of students, often in a particular location or at a particular time of day (e.g., cafeteria, playground, classroom, afternoon dismissal, etc.). We made this decision for two reasons. First, the problem-solving processes associated with addressing the behavior of individual students (e.g., conducting a functional assessment or functional analysis, developing and implementing a function-based behavior support plan) are well known and well documented (e.g., Crone & Horner, 2003; O'Neill et al., 1997), but this is not the case for school-wide and small-group processes. Second, PBIS Teams often "hand off" the responsibility for problem solving for individual students to some other school team, or to an individual (e.g., a school psychologist), while retaining the responsibility for problem solving for school-wide and small-group behavior issues.

The Team-Initiated Problem-Solving (TIPS) Model

The Team-Initiated Problem-Solving (TIPS) model is described in detail in a book chapter (Newton, Horner, Algozzine, Todd, & Algozzine, 2009) and in the manual used to conduct a TIPS Workshop for PBIS Team members (Newton, Todd, Algozzine, Horner, & Algozzine, 2009). The components of the TIPS model are common to most problem-solving models, but have been operationalized to be of particular use to PBIS Teams having access to SWIS ODR data. The TIPS model also includes procedures for establishing problem-solving "foundations" (or structural elements). Although this is not a problem-solving process per se, establishing these structural elements can be an important prerequisite in enabling team members to implement the problem-solving processes in an efficient and effective manner (McNamara, Rasheed, & Delamatre, 2008). The components of the TIPS model depicted in Figure 1 are:

[FIGURE 1 OMITTED]

1. Establish Problem-Solving Foundations

2. Identify Problems

3. Develop and Refine Hypotheses

4. Discuss and Select Solutions

5. Develop and Implement a Problem-Solving Action Plan

6. Evaluate and Revise the Problem-Solving Action Plan.

A purpose of the pilot test of the TIPS model was to determine whether PBIS Team members used the model's problem-solving processes with fidelity in meetings they held following participation in the TIPS Workshop. The issue of the fidelity with which problem-solving processes are implemented is an important one, because it is by adhering to these processes that team members select and implement solutions/interventions for addressing problems that have been identified and targeted for resolution (Burns, Peters, & Noell, 2008). Thus an important prerequisite to determining whether team members used the processes with fidelity--and a major purpose of this pilot test--was developing a data collection protocol and instrument that would allow us to assess fidelity.

Method

Participants and Settings

Two schools in North Carolina and two schools in Oregon were selected for participation in the pilot test based on (a) use of both School-wide Positive Behavior Interventions and Supports (Lewis & Sugai, 1999; Sugai & Horner, 2009a) and the School-wide Information System (SWIS: Irvin et al., 2006; May et al., 2003), (b) existence of a PBIS Team that held meetings at least once a month, and (c) consent of team members to participate in the pilot test.

The two North Carolina schools (Schools A and B) served students in grades K-5 in a large urban school system. School A was a Visual and Performing Arts Magnet School with 379 students and 28 instructional staff. Sixty-three percent of students qualified for free and reduced lunch, 11% were identified as having disabilities, and 5% were identified as LEP (Limited English Proficiency). School B was a partial magnet school for Learning Immersion (grades K-2) and Talent Development (grades 3-5), with 751 students and 46 instructional staff. Eight percent of students were identified as having disabilities and 32% were identified as LEP. Schools A and B were among 42 elementary schools participating in a district-wide Positive Behavior Interventions and Supports (PBIS) initiative (Algozzine et al., 2008). The PBIS Teams at the schools had received professional PBIS training and been assigned a PBIS Coach employed by the school district.

The two Oregon schools (Schools C and D) served students in grades K-5 in a suburban school district. School C was a combination of two schools in a single location, a traditional K-5 school and a Spanish Immersion K-5 school, with a combined enrollment of 485 students. The schools shared an administrator, office staff, supervisory staff, and other staff, with FTEs consisting of 1.0 administrator, 19.0 teachers, 5.2 educational assistants, and 6.4 other staff. School D was a traditional K-5 school with 376 students; faculty FTEs were 1.0 administrator, 20.3 certified teachers, 7.4 educational assistants, and 5.6 other staff. School C and School D were part of a school district in which SWPBIS had been operating since 1996 in conjunction with grant-related activities of University of Oregon research faculty (Lewis & Sugai, 1999).

The smallest PBIS Team had six team members, and the largest had 10 members. PBIS Team membership for the schools was similar. In each case a school Principal or Assistant Principal led a team made up of one or more general education teachers, special education teachers, front office staff members, student achievement coordinators, student services specialists, physical education teachers, social workers, educational assistants, and family/school advocates. Team meetings were held before or after school hours and were scheduled to last approximately one hour. School A and School B held meetings twice a month; School C and School D held meetings once a month.

Prior to providing the TIPS Workshop for the PBIS Team members, we attended at least four meetings at each school. Our informal assessment of the meetings confirmed that TIPS could be of use to team members given that meetings were often unfocused, extended past the scheduled end time, and involved unsystematic use of problem-solving processes (e.g., generating broad agenda items spontaneously at the start of a meeting; "reading" ODR report data without identifying specific student problems to address; discussing individual students' behavior problems, but without recommending solutions/interventions for implementation).

Procedure

Members of the four PBIS Teams participated in a one-day, 5-hour TIPS Workshop conducted by the researchers. Presentation content for the workshop sessions conducted in both the North Carolina and Oregon TIPS Workshops was drawn from the TIPS Training Manual (Newton et al., 2009), which is referenced to the six components of the TIPS model. Team members listened to the presentations, participated in activities that provided opportunities to practice some of the problem-solving processes, and were given related "homework assignments" to complete prior to their next team meeting. Each team member was given a notebook containing print-outs of the presentation slides, as well as copies of materials to be used during their subsequent team meetings.

Establish problem-solving foundations. The Establishing Problem-Solving Foundations workshop session focused on (a) construction of an agenda; (b) team meeting roles; and (c) preparing for, conducting, and following up on a meeting. Participants were shown a copy of the Meeting Minutes and Problem-Solving Action Plan form (Appendix A) contained in their notebooks, and told that they would be instructed in how to use the form to establish an agenda and create a record of decisions reached during a team meeting.

Identify problems. In this session, participants learned to define a student social behavior problem as (a) a discrepancy between a behavior's current status and the expected/desired status (e.g., Bransford & Stein, 1984) that (b) team members judged to be significant enough to address (e.g., Deno, 1989, 2005). The session presented three methods for using SWIS data to identify social behavior problems. First, following a presentation, participants engaged in the practice activity of comparing a hypothetical school's office discipline referrals (ODRs) per day per month (per 100 students) against the ODRs per day per month from the national SWIS database. Participants were also given the homework assignment of comparing their own school's ODR data against the data from the SWIS database in preparation for their next team meeting, at which they were to discuss (a) possible discrepancies between their school's data and the national data (e.g., whether their school's ODRs per day per month, per 100 students, exceeded the national average) and (b) whether they judged any existing discrepancies to be significant enough to address (i.e., whether they found the discrepancy to constitute a problem).

Participants then learned a second method for identifying problems by comparing SWIS graphic data showing both (a) a school's average ODRs per school day per month for each month of the current school year, against (b) the same school's average ODRs per school day per month for each month of the previous school year. Participants practiced this skill using data presented for a hypothetical school, and then they were given the homework assignment of using their school's SWIS database to produce this SWIS graphic report for their school in preparation for discussing any discrepancies between the school's ODRs for the current and previous school years at their next meeting. Finally, participants learned a third method for identifying problems: reviewing a SWIS graphic of the average ODRs per school day per month for the months of a school's current school year, and examining those month-by-month data to detect an undesirable trend (i.e., ODRs accelerating across successive months; or ODRs with zero celeration across months, but with a stability that demonstrated an undesirably high level of ODRs). Participants practiced this skill using data presented for a hypothetical school, and then were given the homework assignment of using their school's SWIS database to produce this SWIS graphic report in preparation for discussing any discrepancies between an undesirable and a desirable trend at their next meeting.
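
To make the arithmetic behind these three methods concrete, the sketch below computes the shared rate metric (ODRs per school day per month, per 100 students) and flags a discrepancy against a benchmark. The figures and the benchmark value are hypothetical illustrations, not actual SWIS data or national norms, and the function is not part of SWIS or the TIPS materials.

# Illustrative only: the rate metric underlying the three problem-identification
# methods. All numbers are hypothetical, not actual SWIS data or national norms.

def odr_rate_per_100(odr_count, school_days, enrollment):
    """Average ODRs per school day for the month, per 100 students."""
    return (odr_count / school_days) * (100 / enrollment)

# Hypothetical school: 412 students, 58 ODRs across 21 school days in October.
october_rate = odr_rate_per_100(odr_count=58, school_days=21, enrollment=412)

# Method 1: compare against a (hypothetical) national average for October.
national_october_rate = 0.45
if october_rate > national_october_rate:
    print(f"Possible problem: {october_rate:.2f} ODRs/day per 100 students "
          f"exceeds the national figure of {national_october_rate:.2f}")

# Methods 2 and 3 apply the same metric, comparing each month of the current
# year against the same month of the previous year, or examining the current
# year's month-by-month rates for an accelerating (or stably high) trend.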

Once participants had learned to broadly identify problems using these three methods, they were taught how to clarify those problems to achieve greater precision. In brief, participants were shown samples of four additional reports that could be produced from their school's SWIS database and that provided information about the "what, when, where, and who" of the broadly-identified problems. The SWIS reports that allow for this additional level of precision are:

* ODRs by type of problem behavior (what)

* ODRs by time of day (when)

* ODRs by school location (where)

* ODRs by student (who).

Participants viewed samples of these reports (which present data in both a graphic and tabular format) for a hypothetical school and practiced identifying the specific problem behaviors most frequently associated with ODRs (e.g., aggression, disrespect), the times of day when ODRs were most frequently occurring (e.g., between 11:00am and 12:00pm), the locations in which ODRs were most frequently occurring (e.g., playground, classroom), and the students who most frequently received ODRs. Participants used this information to practice writing a "precise problem statement," such as "Many students involved in aggression on playground between 11:00-11:30am." Then participants were given the homework assignment of (a) producing the four SWIS reports from their school's SWIS database; (b) using the reports, in conjunction with their previous broadly-identified problem, to write a precise problem statement; and (c) recording the precise problem statement on a copy of the Meeting Minutes and Problem-Solving Action Plan form provided in their workshop notebooks (Appendix A). Taken together, this session's presentation, practice activities, and homework assignments were designed to ensure that participants could engage in the behaviors associated with the Identify Problems component of the TIPS model.
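
As a rough illustration of how the four reports combine into a precise problem statement, the sketch below pulls the most frequent category from each hypothetical report summary; the counts, labels, and threshold are invented for illustration and are not part of SWIS or the TIPS materials.

# Hypothetical sketch: compose a precise problem statement from summaries of
# the four SWIS reports (what / when / where / who). All counts are invented.
from collections import Counter

odrs_by_behavior = Counter({"aggression": 34, "disrespect": 12, "defiance": 7})
odrs_by_time     = Counter({"11:00-11:30am": 29, "12:00-12:30pm": 10})
odrs_by_location = Counter({"playground": 31, "classroom": 14, "cafeteria": 8})
odrs_by_student  = Counter({"S014": 4, "S027": 3, "S101": 3})  # no single student dominates

what  = odrs_by_behavior.most_common(1)[0][0]   # most frequent problem behavior
when  = odrs_by_time.most_common(1)[0][0]       # most frequent time of day
where = odrs_by_location.most_common(1)[0][0]   # most frequent location
# When referrals are spread across many students (as here), the "who" is many students.
who = "Many students" if max(odrs_by_student.values()) < 10 else "A few students"

print(f"{who} involved in {what} on {where} between {when}.")
# -> Many students involved in aggression on playground between 11:00-11:30am.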

Develop and refine hypotheses; discuss and select solutions. During this session, which combined the Develop and Refine Hypotheses and Discuss and Select Solutions components of the TIPS model, participants were encouraged to think of the processes of (a) developing hypotheses about why a precisely-defined problem was occurring and (b) discussing/selecting solutions that might resolve the problem, as processes that are logically paired and best discussed in tandem. For example, if participants had written a precise problem statement that read "Many students are involved in aggression on the playground between 11:00am and 11:30am," they were taught to ask the logical "why" questions:

* Why does aggression account for a large majority of ODRs in our school?

* Why does aggression happen most often among the group of students who are on the playground between 11:00am and 11:30am?

Participants were encouraged to think of possible solutions in behavioral terms. They were taught how the following broad solution strategies could be applied in specific ways to address a precisely-defined problem:

* Prevent - Remove or alter the "trigger" for the problem behavior

* Define/Teach - Define students' behavioral expectations; provide demonstration/instruction in the expected behavior (i.e., the alternative to the problem behavior)

* Reward/reinforce - Prompt students to engage in the expected/alternative behavior if necessary; reward the expected/alternative behavior when it occurs

* Withhold reward/reinforcement - Withhold any inadvertent rewards for the problem behavior when it occurs, if possible (Extinction)

* Use corrective consequences - If problem behavior occurs, use corrective consequences

Participants were shown how to append their hypothesis as a "Why" statement accompanying the What, When, Where, and Who elements of the precise problem statement they had already recorded on the Meeting Minutes and Problem-Solving Action Plan, and then shown where to record their potential solutions to the problem in the column of the Problem-Solving Action Plan headed "Solution Actions."

Develop and implement a problem-solving action plan. This session was devoted to continuing the discussion of how to use the Meeting Minutes and Problem-Solving Action Plan form on which participants had previously recorded a precise problem statement and one or more solution actions to address the problem. Participants were shown how to use the form to ensure accountability for their proposed solution actions by recording both the person(s) who would be responsible for completing tasks ("Who"), as well as the timeline for completing the tasks ("By When"). Finally, participants learned to use the form to record an overall goal, timeline, and decision rule for revising their hypothesis and/or solution actions, if necessary (e.g., "Reduce ODRs on playground between 11-11:30am by 50% by 02/21/11; review hypothesis & solution actions if goal not met by target date").

Evaluate and revise the problem-solving action plan. In the discussion of this last component of the TIPS model, participants learned about the need to monitor implementation of their solution actions and to assess the extent to which implemented solution actions were producing the desired effect on problem behavior. Participants were advised to use a portion of their team meetings to (a) monitor the fidelity of implementation of solutions by reviewing the extent to which previously-assigned solution actions were being completed; (b) review current data from SWIS reports relative to any problems the team was currently addressing; and (c) use their previously-recorded goal, timeline, and decision rule to revise their hypothesis and/or solution actions as necessary, based on information provided by the SWIS reports.

Technical assistance. After PBIS Team members participated in the TIPS Workshop, a researcher attended at least four meetings of each team to provide technical assistance in implementation of the TIPS problem-solving processes. We anticipated that participants' attendance at the workshop might not be sufficient, in and of itself, to ensure their implementation of the TIPS processes with fidelity and that follow-up technical assistance visits were likely to become a standard component of any TIPS "treatment package." As described below, these technical assistance visits also played a crucial role in helping us refine the previously-developed data collection protocol and instrument used to measure the extent to which PBIS Team members implemented the TIPS problem-solving processes with fidelity during meetings when technical assistance was not provided.

Measurement and Data Collection

We determined whether PBIS Team members used the TIPS model's problem-solving processes with fidelity during meetings held following participation in the TIPS Workshop with the aid of the Decision Observation, Recording, and Analysis ("DORA") data collection protocol and instrument (Newton, Todd, Horner, Algozzine, & Algozzine, 2009), which appears as Appendix B. At a team meeting a research project member used DORA to record data concerning both the extent to which team members (a) established the TIPS foundations (see section of DORA labeled "Foundations of Efficient and Effective Team Problem-Solving" for foundation items) and (b) implemented the TIPS problem-solving processes (see section of DORA labeled "Team Problem-Solving Processes" for process items).

Data recorded on DORA were later used to produce fidelity "scores." A team's Foundations Score for a meeting was simply the percentage of DORA's Foundation items implemented by the team. Similarly, DORA produced scores concerning the fidelity with which team members implemented the problem-solving processes. Those scores were derived as follows (an illustrative computational sketch appears after the list):

1. Problem Precision (PP) Score - Average percentage of "precision" demonstrated across all of the problems identified by team members during a meeting. (See section of DORA titled "Description of Problem.") A problem statement that included information about the "what, when, where, and who" of the problem received a Problem Precision Score of 100%; a problem statement that included information about only one of the "four Ws" received a precision score of 25%, and so on.

2. Thoroughness (Th) Score - Average percentage of "thoroughness" of a team's problem solving across all problems identified during a meeting. To achieve a Thoroughness Score of 100%, a PBIS Team would have to, for each problem it identified, (1) state a hypothesis about why the problem is occurring; (2) review quantitative data (e.g., SWIS data) to identify, clarify, or monitor the problem; (3) discuss at least one possible solution for resolving the problem; (4) make a decision to implement at least one solution to resolve the problem; and (5) assign someone to implement, or coordinate implementation of, the solution by a specified date. A team earned one "point" for simply identifying a problem, and thus could earn a total of 6 points for each problem. For example, if a team identified a single problem during a meeting and clarified the problem by referring to SWIS data, but engaged in none of the other problem-solving processes, the team would be awarded 2 "points" and a Thoroughness Score of 33% (2/6 X 100%).

3. Solution (S) Score - Average percentage of problems for which team members made a decision to implement at least one solution to resolve the problem. (Although this score was also an element of the Thoroughness Score, we were particularly interested in highlighting a team's commitment to resolving an identified problem).
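
The following sketch illustrates how the three scores could be computed from per-problem records of a meeting. The records and field names are hypothetical, but the scoring rules follow the descriptions above (four "W"s for precision, six possible points for thoroughness, and a yes/no solution decision).

# Illustrative sketch of the three DORA process scores. The problem records
# below are hypothetical; the scoring rules follow the text above.

problems = [
    # What the team did for each problem it identified during a meeting.
    {"what": True, "when": True, "where": True, "who": False,    # 3 of 4 Ws
     "hypothesis": True, "data_reviewed": True, "solution_discussed": True,
     "solution_decided": True, "implementer_and_date_assigned": True},
    {"what": True, "when": False, "where": False, "who": False,  # 1 of 4 Ws
     "hypothesis": False, "data_reviewed": True, "solution_discussed": False,
     "solution_decided": False, "implementer_and_date_assigned": False},
]

def precision(p):
    # Percentage of the four Ws included in the problem statement.
    return 100 * sum(p[w] for w in ("what", "when", "where", "who")) / 4

def thoroughness(p):
    # One point for identifying the problem plus one point for each of the
    # five follow-through processes, out of six possible points.
    points = 1 + sum(p[k] for k in ("hypothesis", "data_reviewed",
                                    "solution_discussed", "solution_decided",
                                    "implementer_and_date_assigned"))
    return 100 * points / 6

pp_score = sum(precision(p) for p in problems) / len(problems)       # 50.0
th_score = sum(thoroughness(p) for p in problems) / len(problems)    # about 67
s_score  = 100 * sum(p["solution_decided"] for p in problems) / len(problems)  # 50.0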

During the final PBIS Team meeting attended by a research project member for the school year in which team members attended the TIPS Workshop, DORA data were collected and no technical assistance was provided. The DORA scores from this meeting reflected team members' independent performance at implementing the TIPS problem-solving processes. During the winter of the following school year, a research project member returned to three of the four schools and gathered DORA data to assess the stability of TIPS implementation. These DORA data were collected at three meetings of School A's PBIS Team and at one meeting each for the teams at School C and School D.

Results

The purpose of the pilot test was to determine whether PBIS Team members used the TIPS model's problem-solving processes with fidelity in meetings held following their participation in the TIPS Workshop. A prerequisite to making this determination was our development of the DORA data collection instrument and protocol. Using these materials we gathered problem-solving fidelity (DORA) data, and interobserver agreement data, at post-workshop team meetings.

DORA Data

A summary of DORA data collected during the final PBIS Team meeting at each of the four schools appears in Table 1. In general, team members at all four schools succeeded in implementing the TIPS model's foundation items for conducting effective and efficient team meetings, with an average F Score of 93% (range, 73% to 100%). Team members implemented the problem-solving processes of the TIPS model with varying degrees of success. Members of the PBIS Teams described problems with a high degree of precision (average PP Score of 88%; range, 50% to 100%), showed thoroughness in implementing the problem-solving processes (an average Th Score of 88%; range, 67% to 100%), and were effective in terms of the percentage of problems for which at least one solution action was selected for implementation to resolve a problem (an average S Score of 88%; range, 50% to 100%).
Table 1

DORA Scores for Four PBIS Teams

                    DORA Scores

School   F Score  PP Score  Th Score  S Score

A           100%       50%       67%      50%
B            73%      100%      100%     100%
C           100%      100%     91.5%     100%
D           100%      100%     91.5%     100%
Average      93%       88%       88%      88%

Note. F Score = Percentage of foundation items implemented; PP
Score = Average percentage of precision demonstrated in description
of problem; Th Score = Average percentage of thoroughness
demonstrated in addressing a problem; S Score = Average percentage
of problems for which team members selected at least one solution
for implementation.


Follow-Up Visits and Interobserver Agreement Data

During the following school year no additional technical assistance was provided to the PBIS Teams, but we conducted three follow-up visits at School A and a single follow-up visit at School C and at School D. During these visits a research project member used DORA to gather data allowing us to measure the extent to which PBIS Team members maintained use of the TIPS problem-solving processes in their meetings. In addition, interobserver agreement for DORA data was assessed at the three meetings of School A.

DORA scores. Results for DORA data gathered on follow-up visits appear in Table 2. School A's PBIS Team members achieved an average F Score of 76% across the three follow-up visits, an average PP Score of 66%, an average Th Score of 51%, and an average S Score of 83%. On average, these "maintenance scores" show a slight decline in the team's implementation of foundations and overall thoroughness of problem solving when compared against scores from the previous school year (Table 1), but improvements for both precision and solutions.
Table 2

Follow-up DORA Scores for Three PBIS Teams

                          DORA Scores

School       F Score  PP Score  Th Score  S Score

A (visit 1)      82%       80%       55%     100%
A (visit 2)      64%       60%       45%     100%
A (visit 3)      82%       57%       52%      50%
A (average)      76%       66%       51%      83%
C                82%       67%       58%      50%
D                82%       53%       50%      50%

Note. F Score = Percentage of foundation items implemented; PP
Score = Average percentage of precision demonstrated in description
of problem; Th Score = Average percentage of thoroughness
demonstrated in addressing a problem; S Score = Average percentage
of problems for which team members selected at least one solution
for implementation.


The maintenance scores for the PBIS Teams at School C and School D showed slight declines in their implementation of foundations, each achieving an F Score of 82% at the time of the follow-up visit, as compared against the score of 100% they each achieved during the previous school year. The teams' maintenance scores for precision, thoroughness, and solutions all showed declines from the scores of the previous school year. On average, precision scores declined by 40 points, thoroughness scores declined by 38 points, and solution scores declined by 50 points.

Interobserver agreement. Interobserver agreement was assessed for DORAs completed by two independent observers at the three follow-up visits to School A. In calculating interobserver agreement on the observers' foundations scoring for a team meeting, we examined the 11 foundation items on their DORAs and compared the scoring on a discrete trial (item-by-item) basis by (a) adding the number of items both observers agreed the team demonstrated at the meeting (e.g., agenda distributed) to the number of items both observers agreed the team did not demonstrate; (b) dividing that total by 11, and (c) multiplying the quotient by 100% (e.g., Cooper, Heron, & Heward, 2007; Page & Iwata, 1986). As indicated in Table 3, the average interobserver agreement for the Foundations score across the three visits was 85% (range, 73% to 100%).
Table 3

DORA Interobserver Agreement for School A at Follow-Up
Visits

             Variables Contributing to Thoroughness

Visit      F      PI   PP    PT    DU    S   AP   Th
Number

1         82%    0%   NA    NA    NA   NA   NA   NA
2         73%  100%  80%   83%   67%  89%  90%  82%
3        100%   50%  80%  100%  100%  67%  80%  87%
Average   85%   50%  80%   92%   84%  78%  85%  85%

Note. F = Foundation items implemented; PI = Problems identified;
PP = Precision demonstrated in description of problem; PT = Problem
type (old/new; group/individual; & social/academic); DU = Data were
used to identify, define/clarify, or monitor problem; S = At least
one solution action selected for implementation; AP = Action Plan
accountability demonstrated in planning implementation of solutions;
Th = Overall thoroughness in implementing problem-solving processes.

In calculating observers' agreement on problems identified by PBIS Team members, we examined the two DORAs to determine what descriptions of student problems (e.g., instances of aggression on the playground) the two observers wrote on their DORAs as a record of problems identified by team members during the meeting. We then (a) divided the number of problems that both observers agreed the team identified (i.e., identical problem descriptions); by (b) the number of problems that both observers agreed the team identified, plus the number of problems that only one observer indicated the team identified; and (c) multiplied the quotient by 100%. The average interobserver agreement for the problems-identified scoring across the three visits was 50% (range, 0% to 100%). (The single instance of a score of 0% agreement occurred when the Primary Observer's DORA noted that the PBIS Team identified one student problem during its meeting, while the Reliability Observer's DORA indicated that the team did not identify any student problems during the meeting. In such an instance, only an agreement score of either 0% or 100% was possible.)

Finally, to calculate observers' agreement on the remaining TIPS process variables presented in Table 3, we restricted the analysis to the problems that both observers agreed team members identified. (This resulted in DORA data from visit number one being eliminated from these analyses). We compared the two observers' scoring on the individual items associated with a TIPS process on a discrete trial (problem-by-problem) basis by (a) adding the number of items that both observers agreed the team achieved for an identified problem (e.g., the what, who, where, when, and why items associated with the precision of team members' description of a problem) to the number of items that both observers agreed the team did not achieve; (b) dividing that total by the number of items associated with the process (e.g., five); and (c) multiplying the quotient by 100%. The average interobserver agreement across the three visits for these TIPS process variables was as follows: Problem Precision, 80%; Problem Type, 92%; Data Use, 84%; Solutions, 78%; Action Plan, 85%; and overall Thoroughness, 85%.
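
The two agreement calculations described above reduce to simple formulas; the sketch below illustrates them with invented observer records (the item values and problem descriptions are hypothetical, not data from the study).

# Illustrative sketch of the two interobserver agreement calculations.
# Observer records are invented for illustration.

def item_by_item_agreement(obs1, obs2):
    """Point-by-point agreement: (agreements on occurrence + agreements on
    nonoccurrence) / total items x 100%. Used for the 11 foundation items
    and, problem by problem, for the TIPS process items."""
    assert len(obs1) == len(obs2)
    agreements = sum(a == b for a, b in zip(obs1, obs2))
    return 100 * agreements / len(obs1)

def problems_identified_agreement(problems_obs1, problems_obs2):
    """Problems recorded by both observers / (problems recorded by both +
    problems recorded by only one observer) x 100%."""
    both = problems_obs1 & problems_obs2
    either = problems_obs1 | problems_obs2
    return 100 * len(both) / len(either) if either else 100.0

# Hypothetical foundation scoring by two observers (True = item demonstrated).
primary     = [True, True, False, True, True, True, True, False, True, True, True]
reliability = [True, True, False, True, False, True, True, True, True, True, True]
print(item_by_item_agreement(primary, reliability))        # 81.8...

# Hypothetical problem lists: one observer recorded a problem, the other did not.
print(problems_identified_agreement(
    {"aggression on playground 11:00-11:30am"}, set()))    # 0.0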

Discussion

Consistent with the purpose of the pilot test, we have presented information about the extent to which PBIS Team members implemented the TIPS model's problem-solving processes with fidelity in meetings held following their participation in the TIPS Workshop. The data demonstrate that the teams were generally successful at implementing the TIPS model's foundational elements and its problem-solving processes with fidelity. For practitioners, the results are encouraging in that members of typical elementary school teams implemented problem-solving processes with fidelity when provided with focused training and follow-up technical assistance. This is an important consideration in light of some previous research demonstrating that school teams often do not succeed at achieving fidelity of implementation of problem-solving processes (e.g., Doll et al., 2005; Flugum & Reschly, 1994; Telzrow et al., 2000).

The pilot study's encouraging results are tempered by some limitations. There was a decline in the fidelity with which team members implemented the TIPS problem-solving processes in meetings held during the following school year. Although we cannot define factors contributing to these declines with certainty, several variables may have played a role, including turnover in the composition of PBIS Team membership, failure of administrative personnel to stress the importance of fidelity of implementation of the problem-solving process at the onset of a new school year, and team members' lack of access to local/regional personnel capable of providing TIPS technical assistance as needed. This suggests that to achieve sustained implementation of TIPS, it may be important to develop a process whereby TIPS training can be provided to local/regional school personnel (e.g., PBIS Coaches) who can then deliver follow-up technical assistance and "booster" training sessions near the beginning of a new school year. This may be particularly important in cases where there has been turnover in team membership. Ongoing access to follow-up technical assistance is likely to be particularly important, as highlighted in the work of Joyce and Showers (2002), who found that post-workshop coaching is integral to ensuring that teachers succeed at applying skills that are presented and demonstrated in a workshop context. This is consistent with recent findings of Coffey and Horner (in press), who note that sustained implementation of basic SWPBIS practices is associated with administrative support and feedback systems.

An additional limitation is that during the school year in which the TIPS Workshops were held, DORA data were presented only for the final meeting we attended. This was due to our correctly anticipating that participants would need some technical assistance in implementing TIPS following the workshop, and to our continuing to revise the DORA data collection protocol and instrument based in part on what we observed during these technical assistance visits. Finally, our convenience sample of PBIS Teams from four elementary schools precludes any claims about the external validity of the findings.

Since the completion of the pilot test, we have revised the DORA instrument and data collection protocol to make the target variables more focused, to achieve clearer operational definitions, and to improve interobserver agreement. We have also developed and tested a DORA Training Manual (Todd, Newton, Horner, Algozzine, & Algozzine, 2009) for providing more systematic instruction to DORA data collectors. However, in some respects the lower values of percent interobserver agreement obtained during the pilot test were a function of the small number of data-coding options associated with some DORA variables. For example, as previously noted, the single instance of a score of 0% agreement for identification of a problem by team members occurred when the Primary Observer's DORA noted that the PBIS Team identified one student problem during its meeting, while the Reliability Observer's DORA indicated that the team did not identify any student problems during the meeting. In such an instance, an agreement score of only 0% or 100% was possible. Nevertheless, even further refinements of the instrument, protocol, and training procedures may be forthcoming, given that "observing" the verbal behavior of team members during team meetings can present data collection challenges that are not inherent in the observation of motor behavior alone.

We believe the pilot study shows sufficient promise to spark researchers to extend this line of analysis. At a minimum, future research might focus on improving the fidelity with which teams implement problem-solving practices and on the measurement of implementation fidelity, while also discovering how best to monitor and measure the extent to which solutions (interventions) derived from the problem-solving process are implemented by teams with fidelity. Finally, to test the ultimate benefit of problem-solving processes, research must be conducted within the context of rigorous experimental designs to assess the possible functional relation between a team's implementation of problem-solving processes with fidelity (including the solutions/interventions derived from those processes) and resolution of the student problems at which the solutions are targeted (i.e., reductions in the rate at which a precisely-defined problem is occurring). These studies will likely involve the use of both single-subject experimental designs, in which the success of the problem-solving process at resolving individual problems can be visually analyzed, and large-scale randomized controlled trials, in which both the internal validity and external validity of the processes can be assessed.

References

Algozzine, B., Cooke, N., White, R., Helf, S., Algozzine, K., & McClanahan, T. (2008). The North Carolina Reading and Behavior Center's K-3 prevention model: Eastside Elementary School case study. In C. Greenwood, T. Kratochwill, & M. Clements (Eds.), Schoolwide prevention models: Lessons learned in elementary schools (pp. 173-214). New York: The Guilford Press.

Alonzo, J., Ketterlin-Geller, L. R., & Tindal, G. (2007). Curriculum-based measurement in reading and math: Providing rigorous outcomes to support learning. In L. Florian (Ed.), The SAGE handbook of special education (pp. 307-318). Thousand Oaks, CA: Sage.

Bahr, M. W., & Kovaleski, J. F. (2006). The need for problem-solving teams: Introduction to the special issue. Remedial and Special Education, 27, 2-5.

Batsche, G. M., Curtis, M. J., Dorman, C., Castillo, J. M., & Porter, L. J. (2007). The Florida problem-solving/response to intervention model: Implementing a statewide initiative. In S. R. Jimerson, M. K. Burns, & A. M. VanDerHeyden (Eds.), Handbook of response to intervention: The science and practice of assessment and intervention (pp. 378-395). New York: Springer Science.

Bransford, J. D., & Stein, B. S. (1984). The IDEAL problem solver: A guide for improving thinking, learning, and creativity. New York: W. H. Freeman and Company.

Brown-Chidsey, R., & Steege, M. W. (2005). Response to intervention: Principles and strategies for effective practice. New York: Guilford Press.

Burns, M. K., Peters, R., & Noell, G. H. (2008). Using performance feedback to enhance implementation fidelity of the problem-solving team process. Journal of School Psychology, 46, 537-550.

Burns, M. K., Vanderwood, M. L., & Ruby, S. (2005). Evaluating the readiness of pre-referral intervention teams for use in a problem solving model. School Psychology Quarterly, 20, 89-105.

Chalfant, J. C., Pysh, M. V., & Moultrie, R. (1979). Teacher assistance teams: A model for within-building problem solving. Learning Disability Quarterly, 2, 85-96.

Coffey, J., & Horner, R. (in press). The sustainability of school-wide positive behavioral interventions and supports. Exceptional Children.

Cooper, J. O., Heron, T. E., & Heward, W. L. (2007). Applied behavior analysis (2nd ed.). Upper Saddle River, NJ: Pearson.

Crone, D. A., & Horner, R. H. (2003). Building positive behavior support systems in schools: Functional behavioral assessment. New York: Guilford.

Deno, S. L. (1989). Curriculum-based measurement and alternative special education services: A fundamental and direct relationship. In M. R. Shinn (Ed.), Advanced applications of curriculum-based measurement (pp. 1-17). New York: Guilford.

Deno, S. L. (2005). Problem-solving assessment. In R. Brown-Chidsey (Ed.), Assessment for intervention: A problem-solving approach (pp. 10-40). New York: Guilford Press.

Doll, B., Haack, K., Kosse, S., Osterloh, M., & Siemers, E. (2005). The dilemma of pragmatics: Why schools don't use quality team consultation practices. Journal of Educational and Psychological Consultation, 16, 127-155.

Flugum, K. R., & Reschly, D. J. (1994). Prereferral interventions: Quality indices and outcomes. Journal of School Psychology, 32, 1-14.

Graden, J. L. (1985). Implementing a prereferral intervention system: Part I. The model. Exceptional Children, 51, 377-384.

Horner, R., Todd, A., Sampson, N., Dickey, C., May, S., Amedo, M., & Talmadge, N. (2008). School-wide Information System Facilitator Training Manual. Educational and Community Supports, University of Oregon, Eugene, Oregon.

Irvin, L. K., Horner, R. H., Ingram, K., Todd, A. W., Sugai, G., Sampson, N. K., & Boland, J. B. (2006). Using office discipline referral data for decision making about student behavior in elementary and middle schools: An empirical evaluation of validity. Journal of Positive Behavior Interventions, 8, 10-23.

Joyce, B. R., & Showers, B. (2002). Student achievement through staff development (3rd ed.). Alexandria, VA: Association for Supervision and Curriculum Development.

Kovaleski, J. F., Gickling, E. E., Morrow, H., & Swank, P. R. (1999). High versus low implementation of instructional support teams: A case for maintaining program fidelity. Remedial and Special Education, 20, 170-183.

Kovaleski, J. F., & Glew, M. C. (2006). Bringing instructional support teams to scale: Implications of the Pennsylvania experience. Remedial and Special Education, 27, 16-25.

Lewis, T. J., Jones, S. E. L., Horner, R. H., & Sugai, G. (2010). School-wide positive behavior support and students with emotional/behavioral disorders: Implications for prevention, identification, and intervention. Exceptionality, 18, 82-93.

Lewis, T. J., & Sugai, G. (1999). Effective behavior support: A systems approach to proactive school-wide management. Focus on Exceptional Children, 31(6), 1-24.

May, S., Ard Jr., W., Todd, A. W., Horner, R. H., Glasgow, A., & Sugai, G. (2003). Schoolwide information system. Educational and Community Supports, University of Oregon, Eugene, Oregon.

McNamara, K., Rasheed, H., & Delamatre, J. (2008). A statewide study of school-based intervention teams: Characteristics, member perceptions, and outcomes. Journal of Educational and Psychological Consultation, 18, 5-30.

Newton, J. S., Horner, R. H., Algozzine, R. F., Todd, A. W., & Algozzine, K. M. (2009). Using a problem-solving model to enhance data-based decision making in schools. In W. Sailor, G. Dunlap, G. Sugai, & R. Horner (Eds.), Handbook of positive behavior support (pp. 551-580). New York: Springer.

Newton, J. S., Todd, A. W., Algozzine, K. M., Horner, R. H., & Algozzine, B. (2009). Team-initiated problem solving training manual. Educational and Community Supports, University of Oregon, Eugene, Oregon.

Newton, J. S., Todd, A. W., Horner, R. H., Algozzine, B., & Algozzine, K. (2009). Decision observation, recording, and analysis: DORA. Unpublished instrument. Educational and Community Supports, University of Oregon, Eugene, Oregon.

O'Neill, R. E., Horner, R. H., Albin, R. W., Sprague, J. R., Storey, K., & Newton, J. S. (1997). Functional assessment of problem behavior: A practical handbook (2nd ed.). Pacific Grove, CA: Brooks/Cole.

Page, T. J., & Iwata, B. A. (1986). Interobserver agreement: History, theory, and current methods. In A. Poling & R. W. Fuqua (Eds.), Research methods in applied behavior analysis: Issues and advances (pp. 99-126). New York: Plenum Press.

Rosenfield, S. A., & Gravois, T. A. (1996). Instructional consultation teams: Collaborating for change. New York: Guilford Press.

Shinn, M. R. (1989). Curriculum-based measurement: Assessing special children. New York: The Guilford Press.

Sugai, G., & Horner, R. H. (2006). A promising approach for expanding and sustaining school-wide positive behavior support. School Psychology Review, 35, 245-259.

Sugai, G., & Horner, R. H. (2009a). Defining and describing school-wide positive behavior support. In W. Sailor, G. Dunlap, G. Sugai, & R. Horner (Eds.), Handbook of positive behavior support (pp. 307-326). New York: Springer.

Sugai, G., & Horner, R. H. (2009b). Responsiveness-to-intervention and school-wide positive behavior supports: Integration of multi-tiered system approaches. Exceptionality, 17, 223-237.

Telzrow, C. F., McNamara, K., & Hollinger, C. L. (2000). Fidelity of problem-solving implementation and relationship to student performance. School Psychology Review, 29, 443-461.

Todd, A. W., Newton, J. S., Horner, R. H., Algozzine, B., & Algozzine, K. M. (2009). Decision observation, recording, and analysis (DORA) training manual. Educational and Community Supports, University of Oregon, Eugene, Oregon.

Correspondence to Robert H. Horner, Educational and Community Supports, 1235 University of Oregon, Eugene, OR 97403-1235; e-mail: robh@uoregon.edu.

Preparation of this manuscript was supported by Grant R324A070226 from the Institute of Education Sciences, U. S. Department of Education. The opinions expressed herein are those of the authors, and no official endorsement should be inferred.

J. Stephen Newton, Robert H. Horner, and Anne W. Todd, University of Oregon

Robert F. Algozzine and Kate M. Algozzine, University of North Carolina at Charlotte