
Effective integration of instructional technologies (IT): evaluating professional development and instructional change.

It is well known that society is currently experiencing great technological momentum and that technology has found its way into our classrooms. Yet many teachers have not adopted these advances, and their instructional practices do not reflect the integration of instructional technologies. This article examines Trek 21, an innovative professional development program embracing the need for technology training, and its effects on teachers' ability to integrate instructional technologies into their educational practice. The study findings suggest that following the intense training provided during Trek 21, participants integrated significantly more instructional technologies into their practice, and an increase was also noted in active student engagement within these practices. Findings also indicate that although teachers changed their instruction to increase the number of instructional technologies and the level of active student engagement, the basic instructional design remained intact.


**********

Jl. of Technology and Teacher Education (2003) 11(3), 397-414

Research on schools and teaching has suggested for decades that student success and achievement are intricately associated with students' interactions with effective teachers (Darling-Hammond, 1997; McCaleb, 1994; Mizell, 2001). Recommendations in the National Plan for Improving Professional Development (Sparks & Hirsh, 2001) indicated that professional development has the power to change the culture of a school and inform educators of more effective methods of teaching. When provided with effective professional development, teachers can excel at helping all students reach the high levels of achievement they need to succeed (Sparks & Hirsh, 2001). To assist in efforts to create effective training, the National Staff Development Council has revised its professional standards and suggested that today's professional development must "... push the boundaries of normative staff development practice" (Mizell, 2001, p. 19) to improve the quality and results of public education (National Staff Development Council, 2002). It is evident that the professional development of our educators is an important factor in the academic success of our students.

Paralleling this emphasis on professional development are the technological changes that dominated the twentieth century. These changes have introduced new instructional technologies into our classrooms and pushed educators to make decisions about how to use these tools (Riel & Becker, 2000). Today, 98% of all schools and 77% of classrooms are connected to the Internet (National Center for Education Statistics, 2001). With all this connectivity in our schools, teachers must be trained to create intellectually powerful and technology-rich learning environments for students while maintaining sound pedagogical practices (Anderson & Becker, 2001). The U.S. Department of Education, through its Office of Educational Technology, acknowledged this need for training by designating that trainers should " ... increase the quantity, quality, and coherence of technology-focused activities aimed at the professional development of teachers" (Office of Educational Technology, 2000, p. 8) as a national goal in its National Technology Plan (Office of Educational Technology, 2000). Further, the National Center for Education Statistics (2001) found that only 33% of teachers feel prepared to use computer-related tools in their teaching.

To assist in the development of effective teacher training regarding technology integration, Congress created the Department of Education's Preparing Tomorrow's Teachers to Use Technology grant program (PT3). PT3 is built on the premise that educators must understand how to create and deliver high quality, technology-infused lessons that engage students and improve learning (http://www.pt3.org/index.html). As new social and cultural shifts due to technology occur, teachers must be adequately prepared in all aspects of their professional practice, allowing schools to become the leader, rather than merely a participant, in technology reform efforts. Children, schools, and society cannot afford for teachers to be anything less than critical players in the current efforts to prepare our classrooms for technology. If the goal is to create learner-centered classrooms where technology is embedded, technology must be embedded in the training of teachers. In a sense, technology becomes not just a tool in the transformation of schools, but the engine of change (Darling-Hammond, 1997).

If technology is the engine of change, then teachers must be the agents of that change in the classroom. Trek 21 embraced this idea of teachers as agents of change. Trek 21, a PT3 funded grant, was an innovative model of teacher preparation and professional development. This professional development model included K-12 teachers in West Virginia's Professional Development Schools (PDS), faculty from the College of Human Resources and Education at West Virginia University (WVU), and students enrolled in the five-year teacher preparation program at the university. In its simplest form, Trek 21 sought to effect sustained change in the culture of teacher practice, with participating educators demonstrating exemplary forms of effective integration of instructional technologies.

Trek 21 intended to prepare educators to use and integrate instructional technologies for teaching and learning through a professional development training program. To ensure that these practices were sustained, Trek 21 took the unique approach of creating a coherent and inclusive training program. This program targeted all parties needed to implement and sustain such an innovation; namely, the potential trainers (higher education faculty), current implementers (K-12 faculty), and future implementers (student interns). The Trek 21 model included K-12 teachers of West Virginia, faculty from West Virginia University, and student interns in their fifth year of the West Virginia University teacher preparation program. To accomplish effective understanding and integration of instructional technologies, Trek 21 encouraged participation in a series of professional development activities. These activities formed a year-long cycle of professional development. During the grant period, Trek 21 repeated the professional development cycle three times.

Preassessment. Teachers interested in participating in Trek 21 completed a preassessment to assist in the development of training activities and materials. The preassessment included submission of participant-developed lesson plans needed for the Trek 21 training and completion of a survey to assess computer skill levels.

Summer institutes. The three-week technology integration summer institutes for West Virginia K-12 teachers and the ten-day technology integration institute for university faculty continued the annual Trek 21 cycle. These institutes addressed genres of instructional technology applications, targeted technology training, preparation of instructional technology materials, and resources necessary for immediate integration into classroom instruction (Harris, 1998). Participants in the K-12 summer institutes developed a web-based unit based on lesson plans submitted before the institute and implemented their units in the fall. As the Trek 21 trainers developed the content and process of these institutes, special attention was given to Senge's Stages of Learning (1995): (a) novice; (b) advanced beginner; (c) competent; (d) proficient; and (e) expert. According to Senge (1995), without an understanding of the stages of learning, a trainer may feel that more than enough time and resources have been spent training and then is disappointed when the level of knowledge is less than desired. Or worse, the innovation is discarded as being without merit because results are expected too soon. The Stages of Learning proposed by Senge (1995) provided information regarding the type of training activity, the sequencing of activities, and the participants' level and degree of participation during the summer institutes. Table 1 provides examples of how this occurred.

Continuity meeting. Following the summer institutes, summer institute participants attended two continuity meetings, one each semester. At these meetings, participants had the opportunity to address issues specifically related to the implementation of their lessons and the integration of instructional technologies into their practice.

Site visits. In addition to the continuity meetings, follow-up visits were scheduled throughout the academic year with each K-12 participant at his/her school. Trainers used these visits to assist teachers in continuing to integrate instructional technologies into their practice as well as troubleshoot and remove barriers to that integration.

Intern component. Students in the five-year teacher preparation program received instruction from higher education faculty who had participated in Trek 21 and whose instructional methods and assignments showed evidence of technology integration (Katayama & Lemani, 2002). These students then completed their internship with a K-12 Trek 21 participant who modeled, encouraged, and required the effective integration of technology.

In summary, the goals and objectives of Trek 21 merged the participation of K-12 teachers, university faculty, preservice teachers, and educational organizations into a self-sustaining process of teacher preparation and professional development to promote the following: (a) the tools necessary to adjust current curriculum to reflect technological advances; (b) technical and pedagogical knowledge and skills necessary for effective integration of instructional technologies; (c) sustained support and resources requisite to the pervasive integration of instructional technologies into practice; (d) the means to change instructional procedures, evaluation of student success, and instructional methods that are fundamental to successful teaching and learning environments; and (e) a profession that shares a common pedagogical practice.

The purpose of this evaluation was to investigate the extent to which participation in Trek 21 produced instructional change in the K-12 teacher population following the second cycle of Trek 21. This evaluation was concerned with how these participants achieved the goals previously described and demonstrated this achievement through overt indicators. Evaluators identified the following components of instructional units as potential indicators of instructional change: (a) active student engagement, (b) instructional variables, and (c) instructional technologies. The specific questions guiding this investigation were:

1. To what extent was participation in Trek 21 associated with a shift in instructional design as measured by improvement in indicators of instructional change in the K-12 population?

2. To what extent was participation in Trek 21 associated with an increase in active student engagement in the K-12 population?

METHOD

Participants

Two preschool teachers, 17 elementary school teachers, 4 middle school teachers, and 4 high school teachers volunteered to participate in Trek 21 training. Table 2 displays the participant demographics. Two of the 27 teachers were males, and three were special educators. Most participants (n=21) came from professional development schools that are part of the Benedum Collaborative, a partnership that includes five West Virginia school districts and West Virginia University. Four participants were classified as "institute repeaters" since they returned for a second cycle of Trek 21 professional development activities. Participants completed the Self-Evaluation of Basic Computer Use Survey (EDMin, 2000; Johnson, 1977). This survey assesses the level of achievement users self-report according to a set of computer competencies. Teachers were assessed across three developmental levels of computer use: (a) basic computer use; (b) advanced computer use; and (c) Internet use. Teachers were asked to rate themselves under a series of categorical questions with options spanning four levels of increasing complexity of use. The assessment yielded a computer-use score identifying each participant's computer use ability as beginner (1), low-level (2), intermediate (3), or experienced user (4). The mean rating across participants on this assessment was 1.96, between beginner and low-level (beginner n = 5, low n = 10, intermediate n = 8, experienced n = 1; 3 participants did not report). Trek 21 trainers used this computer use score to design training materials and activities that addressed individual entry skill levels, encouraging efficient movement through Senge's Stages of Learning (Senge, 1995).

Instrumentation

Indicators of instructional change instrument--random lesson sweep and comprehensive evaluation. This 37-item instrument (see Appendix A) was designed to compare unit plans developed by the K-12 teachers and submitted prior to the institute with units created during the institute. The instrument was developed by the Trek 21 evaluation team and reviewed by the Trek 21 trainers and project director, whose recommendations were incorporated into the final instrument. The final instrument was piloted by the evaluation team on units developed by K-12 teachers during the first cycle of Trek 21. The first two authors independently evaluated 25% of the first cycle units and reached 100% agreement on coding the absence or presence of each variable as well as evidence of active student engagement. The purpose of the instrument was to assess indicators of instructional change including active student engagement, increase in instructional technologies, and the inclusion of instructional variables. The instrument organized the pre and post units by grade levels: preschool, elementary, middle, and high.

Each pre and post unit was reviewed to assess varied indicators of instructional change using the following coding scheme: 0 = absence of the variable; 1 = presence of the variable; 2 = assessment is linked to objectives/extension involves instructional technology (IT); 3 = each objective is assessed; + = active student engagement. Active student engagement was assessed for all items excluding the lesson objectives. Active student engagement was scored as + when students provided an overt response to an instructional prompt. Examples of active student engagement included: (a) sequencing cards, (b) responding verbally and in writing, (c) retrieving information from a web-site, and (d) participating in discussion. Non-examples included: (a) listening to lecture, (b) looking at a web site or PowerPoint presentation, and (c) silent reading. Participant name and curricular area were noted to assist in later analyses. Three main areas evaluated included: (a) instructional procedures; (b) instructional strategies; and (c) IT integrations. A description follows for each area.

Instructional procedures comprised seven items including a motivating introduction, a check for prerequisite skills or review, the presentation of new content, guided practice, independent practice, a closure activity, and extension activities. The template guiding the selection of these items was the effective instructional cycle as presented by Hunter (1982).

Instructional strategies comprised 13 items including advanced organizer, whole group instruction, peer-mediated instruction, group discussion, active responding, problem-solving, research, inquiry, hands-on/manipulatives, dramatic representation, journaling/writing, student presentation, and teacher demonstration. To identify these strategies, the researchers adapted a literature-based, field-tested scoring rubric (Hawthorne, Walls, & Wells, 1999). Adaptations were made following a review of the preinstitute unit plans to glean the pertinent instructional strategies for this evaluation. Although this is not an exhaustive list of instructional strategies, the selection was based on those strategies represented in the teacher units to facilitate manageability.

IT integrations comprised 13 items including Computer Aided Instruction (CAI)/drill and practice, simulation and educational games, word processing, information retrieval, internet access, e-mail, bulletin boards and listservs, authoring and multimedia development, desktop publishing, electronic presentations, video development, open lab access, and web-page development. Adaptations were made to the field-tested rubric (Hawthorne, Walls, & Wells, 1999) following review of all K-12 teacher-developed units. This review resulted in the deletion of one instructional technology (MUD/MOO) from the list due to its absence and the separation of information retrieval and Internet access to allow for the clarification of the two items.

Independent Variable

Trek 21 project. The independent variable in this study was the participants' engagement in the Trek 21 professional development activities. The K-12 teachers participated in the following professional development activities: (a) a three-week summer institute during which a technology-integrated instructional unit was designed, (b) two continuity meetings where K-12 participants addressed specific technology and instructionally related issues within individual units, and (c) site visits during which Trek 21 trainers visited the K-12 participants' schools to meet individually with teachers and address any implementation needs. Each K-12 participant was encouraged to attend all activities. The project director reported that absenteeism at all activities was less than 1% (J.G. Wells, personal communication, October 15, 2002).

Procedure

Each K-12 participant developed and submitted to Trek 21 a five-lesson instructional unit prior to the summer institute (pre unit). During the summer institute, the teachers applied the Trek 21 training to their units, allowing the units to be augmented with instructional technologies and new practices stemming from the application of the technologies (post unit). The 37-item Indicators of Instructional Change instrument previously described was applied, both pre and post, to one lesson in each unit (the random lesson sweep) and to all five lessons in a sample of units (the random comprehensive evaluation). The pre and post units were randomly assigned to an evaluator, and one lesson was randomly selected from each unit for the lesson sweep. Both evaluators independently coded seven sweep reviews, comprising 26% of the total, to obtain inter-rater agreement. Point-by-point inter-rater agreement, calculated on the presence or absence of each variable and evidence of student engagement, yielded 100% agreement. Each evaluator independently completed an additional ten sweep reviews to complete the review process. To ensure representation across grade levels during the comprehensive evaluation, units were stratified (preschool, elementary, middle, high) and randomly selected to generate at least a 25% representative sample. Because there were only two preschool offerings, one unit, representing 50% of that stratum, was evaluated. The comprehensive unit evaluations were completed by both evaluators, with any rating disagreements discussed and resolved at the time of the evaluation.
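The point-by-point agreement calculation described above can be sketched in a few lines: the two raters' codes are compared item by item, and agreement is the percentage of matching codes. This is a hypothetical illustration, not the study's actual tooling; the function name and example codes are invented.

```python
# Minimal sketch of point-by-point inter-rater agreement: compare two
# raters' codes item by item and report the percentage that match.

def point_by_point_agreement(rater_a, rater_b):
    """Percent agreement between two equal-length lists of codes."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must code the same number of items")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return 100.0 * matches / len(rater_a)

# Hypothetical example: presence (1) / absence (0) codes for seven items
rater_a = [1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 0, 1, 1, 0, 1, 1]
print(point_by_point_agreement(rater_a, rater_b))  # → 100.0
```

A single disagreement among the seven items above would drop agreement to roughly 85.7%, which is why the study's reported 100% across seven full sweep reviews is a strong result.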

For each lesson in the random sweep and each unit in the comprehensive evaluation, scores were obtained by summing the items in each of the following areas: objectives; assessment; total instructional procedures; total active instructional procedures; total instructional strategies; total active instructional strategies; total IT integrations; total active IT integrations; and total active across procedures, strategies, and IT integrations. These scores were entered into StatView for further analysis.

RESULTS

The purpose of this investigation was to examine the extent to which participation of K-12 educators in Trek 21 was associated with (a) a shift in instructional design as measured by improvement in indicators of instructional change and (b) an increase in active student engagement. Pre and post review scores on the random lesson sweep and random comprehensive evaluation were recorded and analyzed to document any change in (a) indicators of instructional change (total procedures, strategies, and IT integrations included in the lesson) and (b) active student engagement (total procedures, strategies, and IT integrations incorporating active student engagement). Table 3 shows mean pre and posttest scores obtained during the lesson sweep.

Effect sizes were calculated to provide an indication of the practical meaningfulness of the results (Kirk, 1996). The standard mean difference (SMD) effect provides an estimate of the magnitude of the result independent of n-size. The SMD effect size was calculated by dividing the mean difference for each variable by the pretest standard deviation. According to Cohen (1988), an effect size of .2 is small, .5 is medium, and .8 is large. Overall, total IT integrations, total active IT integrations, and total active across procedures, strategies, and IT integrations produced large effect sizes (2.74, 2.69, 1.10). Medium effect sizes were obtained for total active instructional strategies, total instructional strategies, and total instructional procedures (.60, .54, .51). Total active instructional procedures produced an effect size of .45.
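The SMD calculation and Cohen's benchmarks can be shown as a short sketch; the helper names are hypothetical, and the example values are taken from the total IT integrations row of Table 3 (pretest M = .36, SD = .70; posttest M = 2.28).

```python
# Sketch of the standard mean difference (SMD) effect size used above:
# the pre/post mean difference divided by the pretest standard deviation,
# interpreted against Cohen's (1988) benchmarks.

def smd_effect_size(pre_mean, post_mean, pre_sd):
    """SMD effect size: mean difference over pretest SD."""
    return (post_mean - pre_mean) / pre_sd

def cohen_label(es):
    """Cohen's (1988) rough benchmarks: .2 small, .5 medium, .8 large."""
    es = abs(es)
    if es >= 0.8:
        return "large"
    if es >= 0.5:
        return "medium"
    if es >= 0.2:
        return "small"
    return "negligible"

# Total IT integrations from the lesson sweep (Table 3)
es = smd_effect_size(0.36, 2.28, 0.70)
print(round(es, 2), cohen_label(es))  # → 2.74 large
```

Note that dividing by the pretest SD (rather than a pooled SD) treats the pretest spread as the baseline metric, so a small, homogeneous pretest distribution, as with IT integrations here, can yield very large effect sizes.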

Table 4 shows the t-value, degrees of freedom, and one-tail level of significance for mean differences on the pre and postlesson sweep. At the .05 level of significance, only objectives and assessment yielded non-significant p-values.

Table 5 shows mean pre and posttest scores obtained during the comprehensive evaluation of the teacher units. The standard mean difference (SMD) effect size was again calculated by dividing the mean difference for each variable by the pretest standard deviation. Overall, total IT integrations, total active IT integrations, and total active across procedures, strategies, and IT integrations produced large effect sizes (7.04, 6.82, 1.05). A medium effect size was obtained for total instructional procedures (.72) while total active instructional strategies produced an effect size of .46. Small effect sizes were noted for total instructional strategies, assessment, and total active instructional procedures (.43, .39, .23).

Table 6 shows the t-value, degrees of freedom, and one-tail level of significance for mean differences on the pre and post comprehensive evaluation. At the .05 level of significance, assessment, total instructional strategies, and total active instructional strategies yielded non-significant p-values. For all other variables, the difference between pre and posttest means is unlikely to have occurred by chance, assuming the null hypothesis is true, given repeated random sampling at that n-size.

DISCUSSION

The purpose of this evaluation was to investigate the extent to which K-12 teacher participation in the second cycle of Trek 21 was associated with an improvement in indicators of instructional change and an increase in active student participation in participant-generated instructional units. Overall, posttest scores were statistically significantly higher than pretest scores except in the areas of objectives and assessment. As should be expected, the results indicated that participants integrated significantly more instructional technologies in their units after completing the Trek 21 institute. The increase was both statistically and practically significant. Not surprisingly, the large effect size for overall procedures, strategies, and IT integrations was produced almost entirely by the large increase in the number of instructional technologies that participants integrated into their units. This is supported by the non-significant findings for total instructional strategies and total active instructional strategies in the comprehensive evaluation.

With regard to improvement in active student engagement, results also indicated a statistically and practically significant increase in this overall. However, this was again primarily due to the large increase in IT integrations that, almost invariably, involved active student engagement. These results indicate support for the overall goals of Trek 21, which included providing participants with the technical skills and pedagogical knowledge necessary to integrate instructional technologies meaningfully into their web-based units while increasing active student engagement.

These findings suggest that as teachers are trained to integrate more instructional technologies into their teaching, the focus of training can be on the methods and processes of integrating those technologies rather than on the procedures of instructional design. The findings also indicate that as educators increase their awareness, understanding, and use of instructional technologies, their instructional procedures improve with regard to effective design and student engagement alongside the increase in the use of instructional technologies.

The results also point to factors that may influence the effectiveness of the Trek 21 professional development cycle. Although not the focus of this study, several such factors emerged, and studying them may provide trainers with valuable information regarding the planning and delivery of effective training. These factors include: (a) the duration of training and support activities; (b) the planned movement of participants through Senge's Stages of Learning (1995); and (c) the immediate application of new knowledge to a practical product. The Trek 21 staff considered each of these factors as they planned and implemented each cycle of the Trek 21 project.

Since Congress recognized the need to create the PT3 grant program to ensure that educators understand how to create and deliver high quality, technology-infused lessons that engage students and improve learning, it is now time to go beyond training and ensure that this innovation is adopted and sustained by educators. Future research should examine the extent to which these changes in instruction: (a) are sustained by teachers; (b) are transferred to other curricular or content areas; and, perhaps most importantly, (c) improve outcomes for students.
Appendix A--Indicators of Instructional Change Instrument Random Lesson
Sweep and Comprehensive Evaluation

 Preschool Elementary

Participant Name Pre Post Pre Post

Curricular Area
Objectives (action verb/measurable)
Assessment (0,1,2,3)
Instructional Procedures
Motivating Introduction
Check for Prerequisite Skills (Review)
Present New Content
Guided practice
Independent Practice
Closure
Extensions (0,1,2)

Total Procedures
Total Active

Instructional Strategies
Advanced Organizer
Whole Group Instruction
Peer-Mediated Instruction
Group Discussion
Active Responding
Problem-Solving
Research
Inquiry
Hands-on/Manipulatives
Dramatic Representation
Journaling/Writing
Student Presentation
Teacher Demonstration

Total Strategies
Total Active

IT Integrations
CAI/Drill and Practice
Simulation/Educational Games
Word Processing
Information Retrieval
Internet Access
E-mail
Bulletin Boards/Listservs
Authoring/Multimedia Development
Desktop Publishing
Electronic Presentations
Video Development
Open Lab Access
Web-Page Development

Total ITs
Total Active

 Middle High

Participant Name Pre Post Pre Post

Curricular Area
Objectives (action verb/measurable)
Assessment (0,1,2,3)
Instructional Procedures
Motivating Introduction
Check for Prerequisite Skills (Review)
Present New Content
Guided practice
Independent Practice
Closure
Extensions (0,1,2)

Total Procedures
Total Active

Instructional Strategies
Advanced Organizer
Whole Group Instruction
Peer-Mediated Instruction
Group Discussion
Active Responding
Problem-Solving
Research
Inquiry
Hands-on/Manipulatives
Dramatic Representation
Journaling/Writing
Student Presentation
Teacher Demonstration

Total Strategies
Total Active

IT Integrations
CAI/Drill and Practice
Simulation/Educational Games
Word Processing
Information Retrieval
Internet Access
E-mail
Bulletin Boards/Listservs
Authoring/Multimedia Development
Desktop Publishing
Electronic Presentations
Video Development
Open Lab Access
Web-Page Development

Total ITs
Total Active

Key

0 = Absence of variable
1 = Presence of variable
2 = Assessment is linked to objectives/Extension involves IT
3 = Each objective is assessed
+ = Active student engagement

Indicators of Instructional Change Instrument--Random Comprehensive
Evaluation--Preimplementation

Participant Name Preschool Elementary Middle High

 Lesson 1 Lesson 2 Lesson 3

 Pre Post Pre Post Pre Post

Curricular Area

Objectives (action
verb/measurable)

Assessment (0,1,2,3)

Instructional Procedures

Motivating Introduction

Check for Prerequisite Skills
(Review)

Present New Content

Guided practice

Independent Practice

Closure

Extensions (0,1,2)

Total Procedures

Total Active

Instructional Strategies

Advanced Organizer

Whole Group Instruction

Peer-Mediated Instruction

Group Discussion

Active Responding

Problem-Solving

Research

Inquiry

Hands-on/ Manipulatives

Dramatic Representation

Journaling/Writing

Student Presentation

Teacher Demonstration

Total Strategies

Total Active

Participant Name Preschool Elementary Middle High

 Lesson 4 Lesson 5

 Pre Post Pre Post

Curricular Area

Objectives (action
verb/measurable)

Assessment (0,1,2,3)

Instructional Procedures

Motivating Introduction

Check for Prerequisite Skills
(Review)

Present New Content

Guided practice

Independent Practice

Closure

Extensions (0,1,2)

Total Procedures

Total Active

Instructional Strategies

Advanced Organizer

Whole Group Instruction

Peer-Mediated Instruction

Group Discussion

Active Responding

Problem-Solving

Research

Inquiry

Hands-on/ Manipulatives

Dramatic Representation

Journaling/Writing

Student Presentation

Teacher Demonstration

Total Strategies

Total Active

 Lesson 1 Lesson 2 Lesson 3

 Pre Post Pre Post Pre Post

IT Integrations

CAI/Drill and Practice

Simulation/Educational
Games

Word Processing

Information Retrieval

Internet Access

E-mail

Bulletin Boards/Listservs

Authoring/Multimedia
Development

Desktop Publishing

Electronic Presentations

Video Development

Open Lab Access

Web-Page Development

Total ITs

Total Active

 Lesson 4 Lesson 5

 Pre Post Pre Post

IT Integrations

CAI/Drill and Practice

Simulation/Educational
Games

Word Processing

Information Retrieval

Internet Access

E-mail

Bulletin Boards/Listservs

Authoring/Multimedia
Development

Desktop Publishing

Electronic Presentations

Video Development

Open Lab Access

Web-Page Development

Total ITs

Total Active

Key

0 = Absence of variable
1 = Presence of variable
2 = Assessment is linked to objectives/Extension involves IT
3 = Each objective is assessed
+ = Active student engagement

Table 1

Stages of Learning (Senge, 1995) and Trek 21 Activities

Stage of Learning Trek 21 Training Activities Addressing Each Stage

Novice * Trainers presented overview of Trek 21
 * Trainers demonstrated skills
 * Participants engaged in supervised, hands-on
 practice with structured activities

Advanced Beginner * Participants and trainers discussed how
 technology integration could augment individual
 lesson plans
 * Participants applied newly acquired skills to
 lesson plans with trainer support and
 assistance

Competent * Participants applied new skills to individual
 lesson plans with trainer support when solicited
 * Participants identified alternate applications of
 new skills to enhance original lesson plans

Proficient * Participants implemented lessons in individual
 classroom settings
 * Participants modified and adjusted lessons to
 fit classroom needs

Expert * Participants recruited as peer trainers.
 * Peer trainers collaborated with original Trek 21
 trainers to refine training for new participants
 * Peer trainers delivered training

Table 2

Participant Demographics

Gender: 25 female, 2 male
Grade level: 2 preschool, 17 elementary, 4 middle, 4 high
General/Special education: 24 general, 3 special
PDS/Non-PDS: 21 PDS, 6 non-PDS
Institute repeaters: 4
Technology Use Score: 5 beginner, 10 low, 8 intermediate, 1 expert,
3 did not report (Mean = 1.96)

Table 3

Mean Pre and Posttest Scores on Lesson Sweep

Variable                           Pretest       Posttest      S.M.D.
                                   M (SD)        M (SD)        Effect Size

Objectives                          .92 (.28)     .96 (.20)     .14
Assessment                         2.00 (.87)    2.20 (.91)     .23
Total instructional procedures     8.00 (2.29)   9.16 (1.99)    .51
Total active instructional         5.44 (1.61)   6.16 (1.43)    .45
  procedures
Total instructional strategies     4.28 (2.01)   5.36 (2.31)    .54
Total active instructional         4.12 (1.92)   5.28 (2.26)    .60
  strategies
Total IT integrations               .36 (.70)    2.28 (1.54)   2.74
Total active IT integrations        .36 (.70)    2.24 (1.59)   2.69
Total active across procedures,    9.92 (3.41)  13.68 (4.07)   1.10
  strategies, and IT integrations
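
The S.M.D. effect-size column can be reproduced directly from the reported means and standard deviations: the values in Table 3 are consistent with the posttest-pretest gain divided by the pretest standard deviation. A quick check on two rows of the table:

```python
# Reproducing two S.M.D. effect sizes from Table 3. The tabled values
# are consistent with (posttest mean - pretest mean) / pretest SD.
def smd(pre_mean, pre_sd, post_mean):
    return (post_mean - pre_mean) / pre_sd

# Total instructional procedures: 8.00 (2.29) -> 9.16 (1.99)
procedures = smd(8.0, 2.29, 9.16)

# Total IT integrations: .36 (.70) -> 2.28 (1.54)
it_integrations = smd(0.36, 0.70, 2.28)

print(round(procedures, 2), round(it_integrations, 2))  # 0.51 2.74
```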

Table 4

t-Value, Degrees of Freedom, One-Tail Level of Significance for Mean
Differences on Pre/Post Lesson Sweep

Variable t-value Degrees of Freedom

Objectives -.569 24

Assessment -.894 24

Total instructional procedures -.2.28 24

Total active instructional procedures -2.30 24

Total instructional strategies -3.04 24

Total active instructional strategies -3.01 24

Total IT integrations -6.29 24

Total active IT integrations -6.01 24

Total active across procedures, -5.30 24
strategies, and IT integrations

Variable p-value

Objectives .29

Assessment .38

Total instructional procedures .02

Total active instructional procedures .02

Total instructional strategies .003

Total active instructional strategies .003

Total IT integrations <.0001

Total active IT integrations <.0001

Total active across procedures, <.0001
strategies, and IT integrations
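
Table 4 reports paired (pre/post) t-tests with df = n - 1 = 24, i.e., 25 participants. A self-contained sketch of the underlying computation on invented data (the study's raw scores are not reproduced here; the negative signs in Table 4 simply reflect subtracting posttest from pretest):

```python
import math

def paired_t(pre, post):
    """Paired-samples t statistic and degrees of freedom."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the differences (n - 1 denominator).
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    t = mean_d / math.sqrt(var_d / n)
    return t, n - 1

# Invented pre/post IT-integration counts for 10 hypothetical lessons.
pre  = [0, 1, 0, 0, 1, 0, 2, 0, 0, 0]
post = [2, 3, 1, 4, 2, 3, 4, 1, 2, 2]
t, df = paired_t(pre, post)
print(round(t, 2), df)  # 6.71 9
```

A one-tail p-value then comes from the t distribution with these degrees of freedom (e.g., `scipy.stats.t.sf(t, df)` if SciPy is available).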


Table 5

Mean Pre and Posttest Scores on Comprehensive Evaluation

Variable                           Pretest        Posttest       S.M.D.
                                   M (SD)         M (SD)         Effect Size

Objectives                         4.80 (.45)     4.80 (.45)      0
Assessment                        10.80 (3.11)   12.00 (2.55)     .39
Total instructional procedures    40.00 (8.37)   46.00 (13.32)    .72
Total active instructional        27.80 (7.26)   29.60 (9.81)     .23
  procedures
Total instructional strategies    24.20 (12.11)  29.40 (12.76)    .43
Total active instructional        23.80 (12.28)  29.40 (12.76)    .46
  strategies
Total IT integrations               .80 (1.79)   13.40 (6.19)    7.04
Total active IT integrations        .80 (1.79)   13.00 (6.60)    6.82
Total active across procedures,   52.00 (19.48)  72.40 (24.38)   1.05
  strategies, and IT integrations

Table 6

t-Value, Degrees of Freedom, and One-Tail p-Value for Mean Differences
on Pre/Post Comprehensive Evaluation

Variable                               t-value   df   One-Tail p-value

Objectives                               *        4       *
Assessment                             -3.21      4      .02
Total instructional procedures         -1.83      4      .07
Total active instructional procedures   -.96      4      .20
Total instructional strategies         -2.56      4      .03
Total active instructional strategies  -2.71      4      .03
Total IT integrations                  -4.05      4      .008
Total active IT integrations           -3.69      4      .01
Total active across procedures,        -6.94      4      .001
  strategies, and IT integrations

* Pre/post objective measures were identical


References

Anderson, R., & Becker, J. (2001). School investments in instructional technology. Teaching, Learning, and Computing Report, Report 8. Retrieved October 14, 2002 from: http://www.crito.uci.edu/tlc/findings/report_8/startpage.htm

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum.

Darling-Hammond, L. (1997). Doing what matters most: Investing in quality teaching. New York: The National Commission on Teaching and America's Future.

EDMin. (2000). Building performance communities, technology use planning, TMM. Retrieved October 5, 2000 from: http://www.edmin.com/tp/tmm.cfm

Hawthorne, R., Walls, R., & Wells, J. (1999). 1999 Trek 21 Rubric. Unpublished scoring rubric, West Virginia University.

Harris, J. (1998). Curriculum-based telecommunication: Using activity structures to design student projects. Retrieved March 12, 2000 from: http://ccwf.cc.utexas.edu/~jbharris/Virtual-Architecture/Articles/Structures.pdf

Hunter, M. (1982). Mastery teaching. El Segundo, CA: TIP Publications.

Johnson, D. (1977). What does it look like? Part 1: The code 77 rubrics. Technology Connection. Retrieved October 9, 2002 from: http://www.doug-johnson.com/dougwri/Rubbeg.HTM

Katayama, A., & Lemani, C. (2002). [Trek 21 higher education evaluation report]. Unpublished raw data.

Kirk, R.E. (1996). Practical significance: A concept whose time has come. Educational and Psychological Measurement, 56, 746-759.

McCaleb, S.P. (1994). Building communities of learners. New York: St. Martin's Press.

Mizell, H. (2001, Summer). How to get there from here. Journal of Staff Development, 18-20.

National Center for Education Statistics. (2001). Internet access in U.S. public schools and classrooms: 1994-2000. Retrieved October 14, 2002 from: http://nces.ed.gov/pubs2001/200107.pdf

National Staff Development Council (2002). NSDC standards for staff development. Retrieved August 14, 2002 from: http://www.nsdc.org/list.htm

Office of Educational Technology. (2000). eLearning: Putting a world-class education at the fingertips of all children. Retrieved October 14, 2002 from: http://www.ed.gov/Technology/elearning/e-learning.pdf

Riel, M., & Becker, H. (2000, April). The beliefs, practices, and computer use of teacher leaders. Paper presented at the meeting of American Educational Research Association, New Orleans, LA. Retrieved October 14, 2002 from: http://www.crito.uci.edu/tlc/findings/aera/startpage.html

Senge, P. (1995). Organizational learning. Retrieved September 10, 2002 from: http://www.entarga.com/orglearning/index.htm

Sparks, D., & Hirsh, S. (2001). A national plan for improving professional development. National Staff Development Council. Retrieved October 16, 2002 from: http://www.nsdc.org/educatorindex.htm

KATHERINE MITCHEM, DEBORAH L. WELLS, AND JOHN G. WELLS

West Virginia University

USA

jgwells@mail.wvnet.edu
COPYRIGHT 2003 Association for the Advancement of Computing in Education (AACE)
