Beginning with a baseline: ensuring productive technology integration in teacher education.
THE CONTEXT AND CHALLENGE
The International Society for Technology in Education ([ISTE] Kelly, 2002) asserts that preservice teachers must complete a sequence of experiences that develop an in-depth understanding of how technology can be used as a tool in teaching and learning. In addition, teacher candidates must see technology modeled by faculty in their university classes and in field placements. However, research (Office of Technology Assessment, 1995; Fabry & Higgs, 1997) has found that most faculty lack the skills and knowledge to model technology use and/or teach their students how to effectively infuse technology into the learning environment. Colleges of education are faced with the challenge of providing programs that develop both faculty and students as effective technology integrators.
To facilitate this type of massive transition, faculty members must first catch a vision for the ways in which the incorporation of technologies can enhance and strengthen their teaching (Albion & Ertmer, 2002). These beliefs, coupled with a sense of self-efficacy, can encourage instructors to dedicate the time and energy required to revise their courses (Snider, 2002). These revisions result in increased modeling of technology integration strategies (Francis-Pelton, Farragher, & Riecken, 2000) and the provision of opportunities for students to use various technologies to increase their learning (Vannatta & Beyerbach, 2001). Strong support structures are necessary to accomplish this metamorphosis (Dusick, 1998), and supplemental grant funding is surely one method of providing resources to ensure success.
For the past four years, Bowling Green State University (BGSU) has implemented an extensive program to infuse teacher education with technology experiences that ensure that our teacher candidates are equipped to effectively model and integrate computer technologies in their future PK-12 classrooms. As the largest producer of P-12 teachers in the state of Ohio, BGSU graduates nearly 700 teacher candidates each year. These teacher candidates complete programs in early childhood (EC), middle childhood (MC), adolescent/young adult (AYA), special education (SE), or other specialty program areas including foreign language, music, art, physical education, business, and technology education. While these programs address their own unique goals and standards, all students complete similar coursework in the arts and sciences (general preparation) and educational foundations (professional preparation), culminating in their methods courses and student teaching in their final year (Morey, Bezuk, & Chiero, 1997).
Funded through the United States Department of Education's Preparing Tomorrow's Teachers to Use Technology (PT3) initiative (http://www.pt3.org), Project PICT (Preservice Infusion of Computer Technology) sought to restructure our teacher education programs by integrating technology at every level of a teacher candidate's preparation: general education curriculum, teacher education curriculum, and PK-12 field experiences. Project PICT implemented numerous activities to increase the technology experiences for teacher candidates throughout their university education. To facilitate technology use among freshman/sophomore teacher candidates, PICT provided mini-grants to arts and sciences faculty for the development of technology-rich curriculum. Technology infusion in teacher education curriculum was encouraged through multiple strategies: Program Curriculum Grants, extensive faculty training on technology applications and pedagogy, partnerships with K-12 schools, and increased technology equipment and support. Finally, technology-rich field experiences for both methods and student teaching were supported through University/School partnerships, extensive K-12 teacher (clinical faculty) training on technology applications and pedagogy, and increased technology equipment and support in the field. Figure 1 illustrates the progression of coursework and teacher candidate technology experiences that began during this time.
[FIGURE 1 OMITTED]
While PICT resulted in significant increases in technology proficiency and integration among participating faculty, K-12 teachers, and teacher candidates, faculty continued to struggle with meaningful integration of technology in their instruction. Faculty members were frustrated with the enormous range of technology skills among teacher candidates. While faculty initially sought to model technology integration strategies and require students to create some computer-generated artifacts (graphs, charts, digital images and video, presentations, etc.) related to their coursework, it became clear that students could not meet these expectations. As a result, faculty felt that skill development consumed too much instruction time, leaving little time for technology integration pedagogy. To provide adequate skill development among teacher candidates, faculty members were compelled to teach technology-related lessons in the computer lab, which was typically unavailable. Thus faculty often eliminated such lessons.
As these issues became apparent, faculty along with PICT staff began brainstorming ways in which teacher candidates could begin their professional preparation with a foundation of technology skills. One option was to offer the "technology course" during one's sophomore year rather than the junior or senior year. While most felt this to be the ideal solution, it was not realistic since the number of sophomore teacher education students was far greater than instructors and labs could accommodate. In addition, many felt that the technology course should remain with methods since it should emphasize planning and integration of technology in the classroom. Another idea was to add a freshman-level course that was skill-based only. Again, this was not feasible due to the number of students. Finally, PICT staff posed the idea of implementing an assessment that would require students to demonstrate a core of technology competencies (Deal, 2002; Gomm, 2003; Liu, Johnson, & Maddux, 2001a, 2001b; Strickland, Salzman, & Harris, 2000). Basically, our teacher education programs would expect incoming students to enter with a foundation of skills that can be used in general education and applied in the context of PK-12 education within one's professional preparation (Frieden & Scott, 2003). The following paragraphs discuss the broader context of performance assessment and accountability in P-16 education. After this brief overview, we present the Assessment of Technology Competencies (ATC), delineating its place and function within the teacher education programs, the administration and scoring process, and the results of its use to date.
PERFORMANCE ASSESSMENT AND TECHNOLOGY
In the United States, as well as other nations, educational institutions are being carefully scrutinized. Evidence of student progress is expected to be documented and distributed (Reeves, 2002; Whittaker & Young, 2002). The interest in accountability and continuous improvement has impacted assessment processes in P-16 education, increasing the use of standardized tests, as well as performance assessments (Bartlett, 2002; Brown, 2000; Gettinger, 2001; Kimball & Cone, 2002; Persichitte & Herring, 2002).
Performance assessments are characterized by focus on student products or artifacts that demonstrate certain skills or achievements that cannot be easily measured through traditional, standardized tests. Portfolio assessment, and more specifically, electronic portfolio development, have grown out of a need for students to collect and organize multiple performance assessment products (Holt, Claxton, & McAllister, 2001; Quatroche, Duarte, & Huffman-Joley, 2002). BGSU teacher education programs, as a part of meeting accreditation standards through the National Council for the Accreditation of Teacher Education (NCATE), have been developing key assessments (performance assessments) that provide evidence of teacher candidate competency throughout the core curricula. When the need for basic technology skills among teacher candidates was identified, developing a performance assessment to allow students to demonstrate their skills in this area seemed a reasonable solution. This assessment would then become a part of the key assessment documents that students would compile, in an electronic portfolio format, to document their professional development. Recognizing that most students would not enter the university setting with all identified skills, PICT staff identified and/or created several support mechanisms that would be independently accessed by such students.
The assessment, supported in part through PT3 grant monies, is now a baseline instrument used to ensure that the freshmen entering teacher education have a solid foundation of computer skills. This article describes the components of the Assessment of Technology Competencies (ATC), delineating its structure, implementation process, and evaluation. Data from the first-year pilot and its subsequent implementation are presented.
THE TECHNOLOGY COMPETENCIES ASSESSMENT
Currently, the assessment (Appendix A) is a five-page document that details the construction of four digital products to be completed by the student in a proctored, two-hour session in the college's computer lab. The products use word-processing, spreadsheet, presentation, and graphics software applications, and integrate Internet and file management expertise. These computer skills have been identified by the International Society for Technology in Education (www.iste.org), as well as by BGSU faculty, and are considered to be essential for first-year education students. ISTE has been developing an online assessment system, the Internet and Computing Core Certification (IC3) (Certiport, 2004), and regional professional development organizations have explored the option of using teacher technology self-assessments to target skill development (MyTarget, 2004), but these options would not provide our students with strong mechanisms to demonstrate their skills. Instead we chose to modify an existing instrument from our computer use course. This option allowed us more local control over content and security. A similar assessment had been used as a midterm examination in computer use courses at the freshman and junior levels at two Midwestern universities over the past five years. While it could be argued that some of the skills included in the assessment are arbitrarily selected, they represent a range of beginning competencies. In other words, if students can demonstrate these skills, they are also able to navigate the menus and applications well enough to produce quality word-processed, spreadsheet, and presentation digital documents.
The assessment was piloted (n=125) in one large lecture section of the college's Introduction to Education course (EDHD 201) in the spring of 2003 and was fully implemented for all students enrolled in this course during the 2003-2004 school year. Consequently, the assessment system described herein is impacting over 1,200 beginning education majors each year.
The infrastructure of the assessment system includes the scheduling of students' initial assessment-taking experience, the evaluation of the products submitted (four products from 600 students create 2,400 digital files for evaluation), the communication and record-keeping procedures for the enterprise, the support for students who do not pass the assessment on their first attempt, and the opportunity to retake the assessment to achieve a passing score. A description of these components is presented and specific implementation methods are discussed.
Targeted Student Group
The technology competency assessment is tied to the student's enrollment in EDHD 201, an introduction to education (IE) course that all entering education students at BGSU are required to take (Figure 2). Instructors for this course are not expected to teach any technology skills, nor oversee the testing periods. Students are given copies of the assessment upon their university enrollment, and encouraged to practice the targeted skills before they begin their coursework. Online written and video tutorials are provided through the university's website, and incoming students are given information on how to access these resources, with specific citations corresponding to the skills included in the assessment. During the first month of the semester, students are asked to sign up for an assessment time in the college's computer lab. The four products created in this two-hour, proctored exam are transferred to space on the college's dedicated server to be evaluated.
[FIGURE 2 OMITTED]
Evaluation of Products
Examining and scoring thousands of digital files each semester obviously requires a significant investment of time. To this end, a faculty member with quarter-time release and four 10-hour graduate assistants are given this responsibility in lieu of other teaching duties. Using the evaluation checklists (Appendix B) developed for the products, graduate assistants are able to score the files of at least 10 students in an hour. At this rate, scoring of the first round of the assessment was finished within two weeks of the students' completion of the ATC. Based on data gathered during the assessment pilot, we projected that 50-60% of the students would pass the exam on the initial attempt. This projection was significantly higher than the 21% who initially passed all four sections of the pilot, but it was thought that requiring all EDHD 201 students to participate and incorporating the ATC into the course grading plan would motivate more students to prepare. Those who did not pass were provided support to acquire the needed skills and were allowed to retake the assessment once more to achieve a passing score. The points given for the ATC represented 10% of the total points for the course, so failing the ATC could potentially impact the student's grade for IE.
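The scoring workload described above can be sanity-checked with simple arithmetic. The following sketch (written in Python; the variable names are ours, and the per-semester enrollment of 600 is the approximate figure quoted in the text) reproduces the reported numbers:

```python
# A quick arithmetic check of the ATC scoring workload figures.
# All input numbers come from the text; variable names are our own.

students = 600            # students assessed per semester (approximate)
products_per_student = 4  # word processing, spreadsheet, presentation, graphics
files_to_score = students * products_per_student   # 2,400 digital files

scoring_rate = 10                        # students scored per grader-hour
grader_hours = students / scoring_rate   # 60 grader-hours per round

# Four graduate assistants on 10-hour appointments supply 40 grader-hours
# per week, consistent with finishing the first round within two weeks.
weekly_capacity = 4 * 10
weeks_needed = grader_hours / weekly_capacity      # 1.5 weeks of GA time
```

At these rates, the graduate assistants alone can clear a full round of scoring in under two weeks, which matches the turnaround reported.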
Communication and Record-Keeping
In addition to evaluating the student-submitted products from the assessment, the supervising faculty member also distributed written notification to students and their IE instructors regarding the results of the examination. Results were also posted within an online gradebook section of the Blackboard shell for the course. Since the exam has four distinct sections, it is possible that students could pass some sections and not others. Students had the option of only retaking the sections that they did not pass initially. A searchable database was developed, detailing the times that assessments were taken and/or retaken, and the scores achieved.
Based on pilot data, we expected that approximately 40% of the freshmen taking the technology competency assessment would not pass all of the sections on their first attempt. However, the university, as well as the College of Education and Human Development, offered various support systems to help students master the required skills. Online written tutorials (www.bgsu.edu/offices/its/docs/) and video tutorials (edhd.bgsu.edu/atc/tutorials) are available on the university website and include step-by-step instructions for all the software applications related to the assessment. The university also provides a student technology support center (StudentTech) that offers individual tutoring in technology applications (www.bgsu.edu/offices/studenttech/). Finally, the College of Education and Human Development offered support hours in the college's Technology Resource Center (www.bgsu.edu/colleges/edhd/trc/) for students desiring additional help in preparing for the assessment.
To encourage students to use these support mechanisms in preparing for the initial assessment, a packet of information about the ATC was given to students once they had been accepted to BGSU and had indicated an interest in majoring in teacher education. This packet included a copy of the assessment and checklist, a description of the assessment process, and the support systems available for preparation.
The Assessment of Technology Competencies (ATC) was piloted in the spring semester of 2003 with 185 IE students. The majority of these students were freshmen. The course instructor agreed to use the assessment as a replacement for one of the regular course assignments. The authors met with these students to introduce the ATC during a class session in the second week of the semester. A 15-minute overview of the rationale and procedure for the assessment was presented; in addition, the authors handed out information sheets, copies of the assessment instrument, scoring checklists, and URLs for the online video tutorials that corresponded with the skills included on the ATC. Schedules were distributed, and students were asked to sign up for a time to take the assessment during the month of February. Since this was a pilot, course credit for the exam was limited to extra credit so that student failure or poor performance would not impact the course grade.
During the three-week exam period, 125 (69%) students attempted the assessment. Of the 125 students, only 27 (21%) passed all four sections, while 36 (28%) passed three sections. Survey data indicated that an overwhelming majority (90% or more) did not use external support mechanisms to prepare for the assessment; rather, they reviewed the practice test independently. Focus group data revealed student frustration and confusion regarding the assessment process and the specific skills being tested. The pilot results had numerous implications for fully implementing the assessment the following semester; specific modifications are delineated next.
The pilot described above informed the subsequent full implementation of the ATC for the fall of 2003. Issues concerning communication, lab accessibility, technology support, and clarity of exam elements were addressed. Lead faculty from the IE course met with the ATC staff and college technology staff to discuss pilot results and student input from surveys and focus groups. Strategies for making the ATC process more student-friendly and less labor-intensive for staff were explored. From this collaborative effort, the following modifications and procedures were established.
Assessment content. Several revisions were made within the ATC to more concisely identify the technology skills to be demonstrated. The spreadsheet portion was simplified by eliminating higher-end formulas and number formatting. The improved spreadsheet section focused on more authentic spreadsheet use and basic formulas. The language in the graphic illustration portion of the ATC was clarified and a likeness of the graphic to be constructed was included with the description. The emphasis was placed on replicating the example given. To solve security issues, e-mailing completed ATC files was discontinued and students were directed to copy their files to dedicated server space, instead. Scoring checklists were revised to better align with the specific skills targeted in the assessment, and model products were included on these score sheets. Finally, multiple versions of the ATC and corresponding checklists were created to thwart attempts at academic dishonesty.
Communication. Freshmen entering teacher education were given a packet of information about the ATC during the university's summer orientation sessions. In this way, students were aware of the ATC before they began their college experience. The packet included a descriptive overview of the ATC, a copy of the practice test and corresponding checklist, and a list of support resources. Students were encouraged to review the information, take the practice test, and take advantage of the online video tutorials to learn the skills.
During the second week of the semester, a member of the ATC staff gave a 10-minute presentation about the ATC to the IE students. This staff member reiterated the information from the orientation packet, and explained the procedure for scheduling and taking the ATC. Students were encouraged to communicate with ATC staff through a web-based portal that provided access to electronic copies of all the documents related to the ATC, links to support systems for students, and e-mail capabilities.
Student choice and support. Students were provided with a variety of days and times in which to take the ATC. They were also given a choice of platform on which to complete the assessment (Macintosh or Windows). Face-to-face support sessions were offered, as well as online support systems, including communication and document distribution through a dedicated Blackboard shell, and written and video tutorials.
Financial considerations. Implementation of the ATC does come at a price, primarily with respect to the labor necessary to supervise, communicate, proctor, and evaluate the ATC. At BGSU, four master's level 10-hour GAs are used to proctor and evaluate the ATC, at a cost of approximately $16,000. In addition, 30% of the Instruction Technology Coordinator's time is used to supervise the GAs, coordinate the testing process, and communicate testing expectations and results; including both salary and benefits, this approximates an additional cost of $16,000. While the total price tag of $32,000 may seem exorbitant, our college is committed to establishing the baseline of technology competencies. In addition, we continue to explore more efficient methods of staffing the ATC.
The results of the ATC can be divided into two discrete categories, presented in the following paragraphs. First, student achievement on the four portions of the ATC is summarized and discussed. Next, data from a survey about student preparation for the ATC are presented.
Student achievement. Students had two opportunities to complete and pass the ATC. Only failed sections needed to be retaken; however, some students chose to retake passed sections to improve their scores. Of the 568 participants, 532 students attempted all four sections in the first round of testing; the remaining 36 chose not to participate in the first round. Student achievement results for the first round are presented in Table 1. Passing rates for the word processing, spreadsheet, and graphic illustration sections exceeded 70% of the student participants. In contrast, the PowerPoint section showed the highest rate of failure, with only 42.3% passing. Only 28.6% of the students passed all four sections of the ATC during the first attempt. Nearly 8% failed all four sections of the ATC (Table 2). These first-round achievement results clearly emphasized a need for the authors to better direct students to available support mechanisms for ATC preparation.
The second round of testing provided students with an opportunity to retake previously failed sections of the ATC. Since many students also retook sections to increase test scores, student achievement data for this second round, presented in Tables 3 and 4, include all 568 participants. In all, 417 participants retook one or more sections of the ATC. Passing rates for the retake approximated 90% for each of the four sections of the ATC, with 74% of the students passing all four sections. All students who attempted the retake passed at least one section. The authors were pleased with the retake results in that an overwhelming majority of entering education students were able to demonstrate basic technology competency and would now be able to advance these skills in their future education courses. However, with nearly 26% of participating students still lacking competency in one or more technology skill areas, the authors were disappointed, but felt that, with time and better communication regarding support mechanisms and expectations, students would better prepare for the ATC and ultimately exhibit higher passing rates.
Finally, the authors investigated student achievement differences by platform and test version. A t-test of independent samples was conducted to examine achievement differences created by platform (Mac versus PC) for the initial and retake rounds. For the initial round, significant differences were found, as the Mac users (n=97, M=19.13, SD=13.21) had a significantly lower mean than the PC users (n=436, M=24.94, SD=12.34), t(531)=-4.134, p<.001. However, these differences did not continue into the retake, as means for the Mac users (n=62) and the PC users (n=260) were almost identical. This achievement difference by platform in the first round is most likely due to students "accidentally" signing up for a test session in the Mac lab. The majority of participants had used PCs in high school and were much more comfortable with that platform. As a result, students were very careful in signing up for the retake, making sure their preferred platform was selected.
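The reported statistic can be recovered from the summary statistics alone. The sketch below assumes a pooled-variance independent-samples t-test, which is consistent with the reported degrees of freedom (531 = 97 + 436 - 2); the small discrepancy from the published value reflects rounding in the reported means and standard deviations:

```python
import math

# Summary statistics for the initial round, as reported in the text.
n1, m1, s1 = 97, 19.13, 13.21    # Mac users
n2, m2, s2 = 436, 24.94, 12.34   # PC users

# Pooled-variance independent-samples t-test: df = n1 + n2 - 2 = 531.
df = n1 + n2 - 2
pooled_var = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df
se = math.sqrt(pooled_var * (1 / n1 + 1 / n2))
t = (m1 - m2) / se   # roughly -4.14; the article reports t(531) = -4.134
```

The negative sign confirms the direction reported: Mac users scored lower, on average, than PC users in the initial round.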
Student achievement differences created by test version were also examined using a one-way ANOVA. For the first round of testing, four versions were used. Results indicated no significant differences by test version, F(3, 529)=2.39, p=.07.
Student preparation for the ATC. Survey data (N=417) were collected at the beginning of the retake session to elicit information regarding test preparation and use of ATC information. Although participants had received a packet of information describing the ATC during summer orientation, surprisingly, 38% indicated that they did not read it. Only 16% of the participants read the packet and used it to prepare for the assessment. Students also received much information regarding the ATC during the first couple of weeks of class. This information detailed resources available for ATC preparation. On the survey, five options of support were listed: practice test, online tutorials, support sessions, StudentTech, and other. Of the 417 survey participants, 67% used the practice test to prepare, while only 3-5% used support sessions or StudentTech. Interestingly, 32% used the online tutorials, and 43% used "other" resources, which typically referred to a knowledgeable friend or classmate. The top resource used to prepare for the retake was "other" (57%); fifty percent (50%) of retake participants continued to use the practice test as well. A final aspect of the ATC process addressed in the survey was communication through the Blackboard site. The majority of students (80%) used the site to check assessment scores/results. Students also used the site to download the practice test (58%) and check the test schedule (51%). In contrast, few survey respondents used the site to read frequently asked questions (32%) or e-mail the ATC graduate assistants (6%). These results only reinforced the need to better communicate the expectations regarding the ATC, as well as the available support mechanisms for preparation.
Situating the ATC: Meeting the NETS-T. Some might argue that a skills-based technology assessment is inappropriate in teacher education and that curriculum integration strategies and higher-order thinking should be the foci. However, support for targeting skills first exists both in the current literature and in the ranks of teacher education faculty and administration. Early studies by the Apple Classrooms of Tomorrow (ACOT) researchers delineated five stages of teacher technology use. The ACOT studies identified these stages as entry, adoption, adaptation, appropriation, and invention (Sandholtz, Ringstaff, & Dwyer, 1997). For teachers to become effective technology integrators in their classrooms, they must first acquire the skills that would support the entry and adoption level work. As mentioned earlier, both IC3 (Certiport, 2004) and MyTarget (MyTarget, 2004) are examples of national and regional attempts to address teacher technology competencies. The North Central Regional Educational Laboratory's Digital-Age Literacy initiative (NCREL, 2004) lists "technological literacy" as one of the eight vital literacies that students must possess to thrive in the 21st century.
The first tier of the NETS-T, "Technology Operations and Concepts," specifically points to the development of technology skills for teachers in areas of word-processing, spreadsheet, presentation, Internet, and media (Kelly, 2002). Furthermore, development and demonstration of these skills are to be a part of the teacher candidates' General Preparation phase of their program. Requiring students to demonstrate basic technology skills, and giving them support systems to strengthen their development of these skills, ensures that they are poised to expand their use and understanding of educational technologies as they move into their Professional Preparation phase. Student survey data indicate that students realize the important role computer technologies will play in their future work as teachers, and were grateful for the initial prompting to master the basic skills they would need for their professional development.
Faculty from the College of Arts and Sciences, and from the College of Education and Human Development, have demanded that students be more prepared to use technology as a part of their coursework, without requiring the instructors to teach the technology. The ATC, coupled with its support systems, provides the faculty with an assurance that students have the skills necessary to complete assignments for courses, using various computer applications. Students can be expected to prepare a multimedia presentation for their history class, or create a chart for their statistics class. Students who indicate that they cannot do these things can be asked if they passed the ATC and referred to StudentTech for personal tutoring or to the online video and written tutorials. This type of response further stresses to the student that these basic skills are important and are a requirement of their profession.
The ATC has also been added as a component of the college's performance-based assessment initiative. National and state accreditation bodies are requiring evidence to demonstrate teacher candidate achievement of adopted standards. Since the ATC is an artifact related to teacher candidate achievement of the NETS-T 1, it is included in the list of 23 key assessments currently identified for inclusion in students' electronic portfolios. Because of this, even the small percentage of students who do not pass the ATC during their IE course will need to complete it successfully during their tenure at the university in order to include this documentation in their portfolios.
Student experience. Because the students experience the ATC as a part of their IE course, they are introduced to ISTE and the NETS-T standards during their freshman year. They become aware of the expectations for teachers regarding technology and can begin to consider their own expertise in this arena (Kemp, 2000). As indicated by the ATC scores from the initial testing to the retakes, students were able to identify their skill deficiencies and make improvements in these areas within a short amount of time (one month).
As a part of this process, students also developed strategies regarding their approach to learning new technology skills. They were able to determine how they might master these applications, choosing from an array of support structures available. While many used the campus labs and the practice exam to prepare for the ATC, a smaller number used tutorial services and online tutorials. It is probable that the strategy indicated in the "other" category of the student survey involved working with a technology-skilled friend. Identifying and practicing strategies for learning various computer applications on campus are foundational for teacher candidates' continued development in the area of educational technologies, and the ATC encouraged this practice.
Finally, implementing a technology skills assessment forced beginning students to locate and use the computer resources available on campus. Prior to the ATC, teacher education candidates sometimes entered their sophomore and junior years without knowing where the Technology Resource Center was located. Requiring students to visit this area to schedule their assessment appointment, as well as take the ATC, introduced them to a technology-rich environment specifically designed for them. In addition, realizing that another computer lab existed on campus (StudentTech) and was staffed to offer one-on-one tutoring for computer-related projects was an important awareness, oftentimes unattained by even upperclassmen. Because of the ATC experience, students also used the campus online courseware system (BlackBoard) to communicate with staff, download practice documents and test schedules, and check their scores. Becoming familiar with this interface prepared them to continue using the system throughout their coursework. These elements combined to initiate teacher candidates into the university's technological infrastructure. While other campuses will not have identical resources, each possesses various assets related to educational technologies that students need to use. The ATC is a vehicle to introduce students to these campus-specific resources.
Faculty reactions. The college's teacher education faculty has responded supportively to the ATC, using it as both a guide and a support for efforts in technology integration. Several faculty members have used the ATC as a benchmark for their own personal technological expertise, commenting that, if we are expecting students to have these skills, then we should make certain that we, as faculty, are able to demonstrate them. Building upon this rationale, instructors have begun to model the use of digital technologies within their courses, offering students practical and effective examples of technology-enhanced instruction (Wiburg, 1995-1996). In addition, faculty are requiring students to use computers to complete course assignments, expecting students to use the skills demonstrated on the ATC and to take advantage of the resources on campus. Because of these practices, students are moving into their junior-level computer integration course, and their methods and student teaching semesters, continuing to extend their abilities and understanding of computer technologies and meaningful classroom integration (Jonassen, Carr, & Yueh, 1998). Through this process, the teaching staff is gaining confidence in its ability to facilitate teacher candidates' achievement of the Performance Profiles outlined in the NETS-T.
The development and implementation of the ATC has heightened faculty awareness within our college concerning the importance of computer technology in the schools. Students are beginning to realize that their technology skills are important to BGSU teacher education faculty and that these skills will be integrated into their coursework throughout their time here. Students and faculty are having more conversations about computer applications, campus labs and services, and small things, like file transfer. We believe these conversations and interests will steadily be built upon in students' preparatory and professional coursework. We have already seen increased integration of computer technologies in student coursework. Students are producing electronic portfolios that contain digital images, presentations, webpages, digital video, digital charts and graphs, and advanced word-processed documents--all directly tied to their professional development standards.
Other institutions interested in implementing this type of skills assessment could modify our process, based on their resources and the size of their student body. Some might opt for using a vendor to provide this type of skills assessment. Carefully considering your goals and expectations before adopting an outside system solution is advisable. We rejected the possibility of using a prepackaged, automated system like the IC³ for several reasons. First, this particular system only assesses the Windows environment. Since we are committed to providing students with cross-platform access and experience, this limitation was a concern. Testing nearly 700 students each semester through Certiport also would be problematic. We could not afford to tie up our own labs to be a part of the Certiport network, but we also found it unreasonable to expect another Certiport site to accommodate our numbers. Finally, the automated testing environment is constraining, requiring students to complete certain tasks in only one manner, in order to pass. For example, in the spreadsheet portion of the IC³, students are asked to resize the column width in an Excel spreadsheet. They cannot do this by dragging the column bar, though this is a viable option in actual use of the program, because the system cannot accurately score such a response. For these reasons, we found our local approach to give students a more authentic experience. It is possible that smaller institutions would find the IC³ choice an acceptable one.
In the absence of graduate assistants to proctor and score this type of assessment, colleagues could consider tying this assessment to a particular course. In this scenario, the course instructor would schedule one class period in a computer lab and students would complete the assessment at that time. The instructor would then score the assessments for their particular class. Again, this would be feasible for smaller institutions or institutions with significant computer lab availability. Campuses with large numbers of teacher candidates would have to support faculty throughout this process, providing training, technical support, and additional incentives to achieve success.
BGSU is committed to graduating teacher candidates who are equipped to use computer technologies to powerfully impact teaching and learning in their classrooms. To make this vision a reality, we must begin to challenge students early in their university experience to gain and maintain basic computer skills. We are hopeful that these efforts of developing and piloting the technology competency assessment will yield a process that allows us to expand the horizons in our teacher education programs. If faculty can trust that all students have these basic skills, then activities and assignments throughout the remainder of our students' programs can incorporate these abilities and tackle the complexities of effective classroom technology integration. With a firm belief that such progress is within our grasp, we will continue to use the ATC as an instrument that supports this process.
Albion, P.R., & Ertmer, P.A. (2002). Beyond the foundations: The role of vision and belief in teachers' preparation for integration of technology. TechTrends, 46(5), 34-37.
Bartlett, A. (2002). Preparing preservice teachers to implement performance assessment and technology through electronic portfolios. Action in Teacher Education, 24(1), 90-97.
Brown, B.J. (2000). New assessment strategies to improve business teacher preparation. National Business Education Yearbook, 2000, 143-157.
Certiport. (2004). Internet and computing core certification. Retrieved July 8, 2004, from http://www.certiport.com/yourPersonalPath/ic3Certification
Deal, W.F. (2002). Making the connection: Technological literacy and technology assessment. The Technology Teacher, 61(7), 16-25.
Dusick, D.M. (1998). What social cognitive factors influence faculty members' use of computers for teaching? A literature review. Research on Computing in Education, 31(2), 123-137.
Fabry, D.L., & Higgs, J.R. (1997). Barriers to the effective use of technology in education: Current status. Journal of Educational Computing Research, 17(4), 385-395.
Francis-Pelton, L., Farragher, P., & Riecken, T. (2000). Content based technology: Learning by modeling. Journal of Technology and Teacher Education, 8(3), 177-186.
Frieden, B., & Scott, S. (2003). Ensuring all students gain technological fluency through online assessment. Paper presented at the Society for Information Technology in Teacher Education, Albuquerque, NM.
Gettinger, M. (2001). Development and implementation of a performance-monitoring system for early childhood education. Early Childhood Education Journal, 29(1), 9-15.
Gomm, S. (2003, March 25). Computer literacy defined and implemented: A university graduation requirement. Paper presented at the Society of Information Technology in Teacher Education, Albuquerque, NM.
Holt, D.M., Claxton, E., & McAllister, P. (2001). Technology 2000: Using electronic portfolios for the performance assessment of teaching and learning. Computers in the Schools, 18(4), 185-198.
Jonassen, D.H., Carr, C., & Yueh, S.P. (1998, March). Computers as mind-tools for engaging learners in critical thinking. TechTrends, 24-32.
Kelly, M.G. (Ed.). (2002). National educational technology standards for teachers: Preparing teachers to use technology (1st ed.). Eugene, OR: International Society for Technology in Education.
Kemp, L. (2000). Research in teacher education. Technology competencies in teacher education: An evaluation to guide implementation of beginning teacher technology competencies (A research report prepared for Minnesota State Colleges and Universities and The Council on Professional Education No. BBB36098). Mankato, MN: Minnesota State University, College of Education.
Kimball, C., & Cone, T. (2002). Performance assessment in real time. School Administrator, 59(4), 14-19.
Liu, L., Johnson, D.L., & Maddux, C. D. (2001a). Evaluation and assessment in educational information technology: Part I. Computers in the Schools, 18(3), 5-125.
Liu, L., Johnson, D.L., & Maddux, C.D. (2001b). Evaluation and assessment in educational information technology: Part II. Computers in the Schools, 18(4), 127-212.
Morey, A., Bezuk, N., & Chiero, R. (1997). Preservice teacher preparation in the United States. Peabody Journal of Education, 72(1), 4-24.
MyTarget. (2004). MyTarget: Web-based self-assessment tool. Retrieved July 10, 2004, from http://mytarget.iassessment.org
NCREL. (2004). enGauge 21st century skills: Literacy in the digital age. Retrieved July 11, 2004, from http://www.ncrel.org/engauge/skills/skills.htm
Office of Technology Assessment. (1995). Teachers and technology: Making the connection (No. OTA-EHR-616). Washington DC: Office of Technology Assessment U.S. Government Printing Office.
Persichitte, K.A., & Herring, M. (2002). Performance assessment and ECIT Program Review: Nuts and bolts. TechTrends, 46(6), 42-45.
Quatroche, D.J., Duarte, V., & Huffman-Joley, G. (2002, Spring). Redefining assessment of preservice teachers: Standards-based exit portfolios. The Teacher Educator, 37(4).
Reeves, D. (2002). Six principles of effective accountability. Harvard Education Letter, 18(2), 7-8.
Sandholtz, J.H., Ringstaff, C., & Dwyer, D.C. (1997). Teaching with technology: Creating student-centered classrooms. New York: Teachers College Press.
Snider, S.L. (2002). Exploring technology integration in a field-based teacher education program: Implementation effort and findings. Journal of Research on Technology in Education, 34(3), 230-249.
Strickland, J., Salzman, S., & Harris, L. (2000, February 26-29). Meeting the accountability mandate: Linking teacher technology competency to student learning. Paper presented at the Annual Meeting of the American Association of Colleges for Teacher Education, Chicago, IL.
Vannatta, R.A., & Beyerback, B. (2001). Facilitating a constructivist vision of technology integration among education faculty and preservice teachers. Journal of Research on Computing in Education, 33(2), 132-148.
Whittaker, A., & Young, V.M. (2002). Tensions in assessment design: Professional development under high-stakes accountability. Teacher Education Quarterly, 29(3), 43-60.
Wiburg, K. (1995-1996). Changing teaching with technology. Learning & Leading with Technology, 23(4), 46-48.
1. At the time of this writing, the ATC is being redesigned to require three digital products, and be completed in 90 minutes. The graphic illustration components are being assimilated into the word-processing and presentation sections. Thus, the same skills are being assessed, but in a more streamlined approach, requiring less time to complete.
EDHD ASSESSMENT OF TECHNOLOGY COMPETENCIES
This assessment is performance-based in that you will use technology to create four products that demonstrate your technology competency. These four products are:
Word Document that uses: a 2-column format; a picture (located during a web search); text formatted with two fonts, two sizes, and two styles; spell-check; an imported chart (Schoolnet Novice Performance Task #1)
Excel Spreadsheet that applies several formulas, number formats, and creation of charts. (Schoolnet Novice Performance Task #2)
PowerPoint Presentation of two slides that applies layout template, Clip Art image, transition, and animation.
Graphic Illustration using the paint/draw tools in Microsoft Word and at least one graphic object or Clip Art image. (Schoolnet Novice Performance Task #4)
This assessment is open book. You may use any books, handouts, notes, or other material you choose. You may NOT consult with or look at others' work as you complete the exam. You may not use previously created documents--all products must be created in the Technology Resource Center during your assessment timeslot. For each product/file created, it is essential that you save the files exactly as specified. There is no need to print the files, but you will forward them to the specified server location at the conclusion of the assessment period.
To begin the assessment, you should open/launch the following applications:
An Internet browser of your choice (Netscape or Explorer).
Product #1: Word Document
DIRECTIONS: Read each step first. Then complete each task in order. The disk icon is a reminder to save your work at that point. Raise your hand if you have a question.
In Word, create a new file and format it as a 2 column document. Save this file as usernameword.doc, where username is YOUR BGNET USER-NAME (for example, sbanistword.doc).
At the top of the page, include a large header using WordArt indicating your full name.
In the left hand column, write one or two paragraphs describing your favorite animal. Be sure to include characteristics of its look, habitat, and food. Left justify the text of your article and be sure to spell-check!
Create a title for your article. Place it above your article text in the left-hand column.
Format the article text and title so that you use 2 font types, 2 font sizes, and 2 font styles.
Go to your Internet browser, search for a picture that depicts your favorite animal. Insert this picture below your article, in the left hand column. You may need to resize the picture. You must include the web site address showing where the picture originated in the left hand column of your Word document below your picture.
In the right hand column, insert the title "My Semester Grades".
Save this document now.
Keep this document open, as you will need to add a chart in the right hand column soon.
Product #2: Excel Spreadsheet
DIRECTIONS: Read each step first. Then complete each task in order. Raise your hand if you have a question. Remember to save often!
1. Within Excel, create a new file and save this file as usernameexcel.xls, where username is YOUR BGNET USERNAME (for example, sbanistexcel.xls). Input the following data to construct a basic grade book.

Name            HW 1   HW 2   Test 1   Test 2   Term Paper   Final Exam   Total   Final Grade
Sue Jones       95     83     91       85       92           90           ***     &&&
Jane Doe        81     78     74       78       88           87           ***     &&&
John Smith      89     80     77       75       72           78           ***     &&&
Class Average   ###    ###    ###      ###      ###          ###          ###     ###

Use a bold font for column headers.
You will need to create formulas for any cells that have ###, ***, or &&&.
For Class Average, use the average function in the selected cells. (###)
For Total, use the sum function in the selected cells. (***)
For Final Grade, average six scores for each student. (&&&)
Format class average cells (###) to two decimal places, and final grade cells (&&&) to one decimal place.
Create a chart depicting Jane Doe's grades for the semester, excluding her total points and final grade.
Choose a columnar (vertical bar) chart format.
Title the chart, Jane Doe's Semester Grades
Title the categories on the x-axis, indicating assignment types. (HW 1, HW 2, Test 1, Test 2, Term Paper, Final Exam)
Save the spreadsheet and chart now.
Paste the chart into the right hand column of your Word document. You may need to resize your chart BEFORE copying and pasting it into the Word document. However, it is also possible to format the chart after it is in Word. Save the Word document again.
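For scorers checking student spreadsheets by hand, the gradebook arithmetic in Product #2 (sum for Total, mean of the six scores for Final Grade, column means for Class Average) can be reproduced outside Excel. The following Python sketch is illustrative only and is not part of the assessment; it simply recomputes the expected values from the sample data given above.

```python
# Illustrative check of the gradebook arithmetic from Product #2.
# Scores are the sample data from the assessment, in the order:
# HW 1, HW 2, Test 1, Test 2, Term Paper, Final Exam.
students = {
    "Sue Jones":  [95, 83, 91, 85, 92, 90],
    "Jane Doe":   [81, 78, 74, 78, 88, 87],
    "John Smith": [89, 80, 77, 75, 72, 78],
}

# Total (***): sum of the six scores for each student.
totals = {name: sum(s) for name, s in students.items()}

# Final Grade (&&&): mean of the six scores, one decimal place.
finals = {name: round(sum(s) / len(s), 1) for name, s in students.items()}

# Class Average (###): mean of each assignment column, two decimal places.
class_avg = [round(sum(col) / len(col), 2) for col in zip(*students.values())]

print(totals)
print(finals)
print(class_avg)
```

A student's spreadsheet formulas (SUM, AVERAGE, and the specified cell formatting) should produce the same figures.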
Product 3: PowerPoint Presentation
DIRECTIONS: Read each step first. Then complete each task in order. Raise your hand if you have a question. Remember to save often!
Create a new file in PowerPoint, select a slide design from the Format menu, and choose a layout. Save this file as usernamepres.ppt, where username is YOUR BGNET USERNAME (for example, sbanistpres.ppt).
Type in the following text on the first slide. Center text.
Tutoring in Technology for BGSU Students
(Your First Name and Last Name)
On the first slide, insert an image of your choice from Clip Art (in Insert pull-down menu).
Type in the text below on the second slide, with title and bullets.
Tutoring offered to students in:
Transfer all four files to the specified server, per instructions given during the assessment session.
© 2003 Savilla Banister
EDHD Assessment of Technology Competencies Scoring Checklists
SAVILLA BANISTER AND RACHEL VANNATTA
Bowling Green State University
Bowling Green, OH USA
Table 1
Number and Percent of Students Passing, Failing, or Not Attempting the Four Assessment Sections (N=532)

                        Pass           Fail
                        N      %       N      %
Word Processing         417    78.4    115    21.6
Spreadsheet             381    71.6    151    28.4
PowerPoint              225    42.3    307    57.7
Graphic Illustration    387    72.7    145    27.3

Table 2
Number of Assessment Sections Passed by Students (N=532)

Number of Sections Passed    N      %
0                            41     7.7
1                            54     10.1
2                            100    18.8
3                            185    34.8
4                            152    28.6

Table 3
Number and Percent of Students Passing and Failing the Four Assessment Sections After Retake (N=568)

                        Pass           Fail
                        N      %       N      %
Word Processing         529    93.1    60     6.9
Spreadsheet             516    90.1    73     9.9
PowerPoint              496    87.3    93     12.7
Graphic Illustration    510    89.7    79     10.3

Table 4
Number of Assessment Sections Passed by Students After Retake (N=568)

Number of Sections Passed    N      %
1                            21     3.7
2                            37     6.5
3                            89     15.7
4                            421    74.1
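As a guard against transcription errors, the percentages reported in Table 1 can be recomputed directly from the raw pass/fail counts. The short Python sketch below is illustrative only; the counts are copied from Table 1 (N = 532).

```python
# Recompute the Table 1 pass/fail percentages from the raw counts.
# Each tuple is (passed, failed), taken from Table 1 (N = 532).
sections = {
    "Word Processing":      (417, 115),
    "Spreadsheet":          (381, 151),
    "PowerPoint":           (225, 307),
    "Graphic Illustration": (387, 145),
}

rates = {}
for name, (passed, failed) in sections.items():
    n = passed + failed  # each section's counts sum to the full N of 532
    rates[name] = (round(100 * passed / n, 1), round(100 * failed / n, 1))

for name, (p, f) in rates.items():
    print(f"{name}: {p}% pass, {f}% fail")
```

The recomputed rates match the published figures to one decimal place.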
Publication: Journal of Technology and Teacher Education
Date: Mar 22, 2006