Formative evaluation of a CSCLIP lesson.

Abstract

Laboratory instruction at a distance using synchronous, remote group settings is the next generation of online learning. Computer-supported, collaborative learning requiring immersive presence (CSCLIP) relies on synchronous web-based instruction and computer-controlled or simulated activities to teach the psychomotor, cognitive, and affective skills that are required in hands-on laboratory instruction. This case study describes the formative evaluation of one CSCLIP lesson. The evaluation identified eight variables that impact such learning. The evaluation resulted in the team sharing ideas on all aspects of the course, brainstorming improvements, and planning activities for their implementation.

Introduction

The noteworthy growth of online learning in higher education will likely continue (Singh & Pan, 2004). Most distance instruction has concentrated on asynchronous learning and computer-based training. The next generation of online learning involves learning at a distance using synchronous, remote group settings. Indeed, telepresence systems can now immerse remote learners in an environment captured by video cameras and permit them to operate and control devices and processes while working at a distance (Ausburn & Ausburn, in press). This manuscript focuses on a specific instructional approach that includes content delivered via telepresence, CSCLIP (computer-supported, collaborative learning requiring immersive presence).

CSCLIP integrates synchronous, collaborative e-learning and computer-controlled or simulated activities to teach the psychomotor, cognitive, and affective skills that are required in hands-on laboratory instruction. Immersive presence is characterized by same-time, different-place interactions among students and instructors, and by the students' physical control over elements that are typical in laboratory and other situated environments. For example, Juanita participates in a university web-based CSCLIP course while seated at her office desk. She interacts virtually during pre-specified times with equipment, local students, other remote students, and the instructors. She engages in group work, completes laboratory exercises, manipulates equipment and materials, and receives feedback on her learning. Juanita engages in the active, real-time learning of laboratory skills without physically entering the laboratory. She visits the lab corridors, rooms, and other spaces virtually, and she operates and configures many real devices as if she were physically in the lab. She also interacts with other students and instructors in a live mode. Thus the CSCLIP lab experience is a hybrid of real and virtual interaction.

CSCLIP offers increased instructor capabilities. The instructor controls the instructional content, the technology that delivers the content, and CSCLIP's increased interaction capabilities that allow all students to interact with remote students, students in the physical laboratory, the instructor, and the laboratory equipment and materials. The instructor limits interaction as appropriate for the lesson. Consequently, the instructor simultaneously manages more phenomena-of-interest than are required by traditional instructional methods. The increased capabilities afforded by CSCLIP instruction call for evaluation research that informs educators about how they can shape it to maximize learning. In this manuscript, we provide as a case study the formative evaluation of one CSCLIP lesson. We first describe the CSCLIP instruction and then we provide the research methodology, results, and discussion.

The CSCLIP Instruction

CSCLIP was developed by a cross-disciplinary research team with faculty members from engineering, management, and telecommunications. A theoretical foundation for CSCLIP was developed based on scholarly literature in learning and online learning. The team then selected a traditional hands-on laboratory course for transformation to the CSCLIP format (See Scheets & Weiser, 2001). The selected course provided instruction on various aspects of voice, video, and data networking. Furthermore, it grounded the students' learning from several theory-based lecture courses by providing opportunities to apply basic concepts, gain familiarity with the telecommunications equipment, and work in groups to develop hands-on skills for processes that technicians use. The laboratory course is required for the Oklahoma State University Master of Science degree in Telecommunications Management (MSTM) and offered several times a year at two campuses.

The team transformed the MSTM laboratory course to a multi-lesson CSCLIP course that can be offered simultaneously to local students who receive instruction face-to-face and remote students who receive instruction online. In this CSCLIP implementation, the team captured the relevant laboratory activities and also implemented other facets of a typical laboratory experience. For example, they developed capabilities that allow remote students to stroll down the hall virtually, walk into an adjacent room, and interact with the people therein. They also developed a wireless system that enables remote students to virtually look over the instructor's shoulder and a virtual tour that allows students to walk through the laboratory and manipulate selected equipment. The faculty members then studied individual learning in teams at three levels of virtuality (local, remote, and a combination of both) (see Lucca, Romano, Sharda, & Weiser, 2004). They also identified evaluation as an area of needed research. Evaluation is included in most models of online instruction (see Driscoll, 1998; Sales, 2002). However, as Kilby (2001) noted, few projects actually include formal evaluation.

The team expanded to include two individuals with expertise in adult education and evaluation. The team revisited CSCLIP's theoretical foundations (see Sharda, Romano, Lucca, Weiser, Scheets, Chung, & Sleezer, 2004). Then the new team members agreed to formatively evaluate a lesson in the CSCLIP laboratory course and work with the team to identify instructional changes and to report the learning. Formative evaluation focuses on improving an entity (Russ-Eft & Preskill, 2001) and usually occurs during the start-up of a project (Patton, 1997).

The evaluators' disciplinary lens for viewing the lesson was instructional systems design (ISD), which is the systematic process for developing instruction. New online instructional technologies affect decisions that are made throughout the ISD process and create a unique learning environment (Kilby, 2001). Jacobs and Dempsey (2002) called for research on how to implement emerging instructional technologies. More recently, Schatz (2003) compared the design phase of ISD to a black box and stated:

It is by looking within the processes that we as designers of instruction go through when making crucial design decisions that we may develop the potential for expansion and evaluation of our field's practice. If we can investigate, discuss, and reflect upon these design decisions and evaluate decisions in light of results, we have a powerful tool for guiding and improving practice. (p. 60)

A tool that can facilitate studying the black box of lesson design and implementation is Swanson's (1996) Lesson Design form. It includes guiding questions for considering eight design variables that impact learning (i.e., objectives, trainee readiness, content structure, instructional sequence, rate of delivery, repetition and practice, knowledge of results, and reinforcement and rewards).

In summary, CSCLIP is an emergent technology that provides increased capabilities for interaction and for learning. Formative evaluation research that examines the design and implementation of CSCLIP lessons can provide insight for improving instructional theory and practice.
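Swanson's form is a paper instrument; as a rough illustration only, the eight design variables it covers can be thought of as a simple note-taking structure such as the Python sketch below. The field names, class name, and recording format are our own illustrative choices, not Swanson's.

# Illustrative sketch only: the eight design variables from Swanson's (1996)
# Lesson Design form, represented as a simple note-taking structure.
# The class name and layout are hypothetical, not Swanson's own format.
from dataclasses import dataclass, field

DESIGN_VARIABLES = [
    "objectives",
    "trainee readiness",
    "content structure",
    "instructional sequence",
    "rate of delivery",
    "repetition and practice",
    "knowledge of results",
    "reinforcement and rewards",
]

@dataclass
class LessonDesignNotes:
    """Evaluator notes keyed to the eight design variables."""
    lesson_title: str
    notes: dict = field(default_factory=lambda: {v: "" for v in DESIGN_VARIABLES})

    def record(self, variable: str, observation: str) -> None:
        # Reject anything outside the eight variables the form covers.
        if variable not in self.notes:
            raise ValueError(f"Unknown design variable: {variable}")
        self.notes[variable] += observation + "\n"

# Example use (observation wording drawn from the results reported below):
form = LessonDesignNotes("Installing a Local Area Network")
form.record("objectives", "Shared at the start; criteria and standards could be added.")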

Research Methodology

A case study focuses on a bounded system (Merriam, 2001). For this study, the bounded system was defined as the instructional aspects of the MSTM laboratory lesson titled Installing a Local Area Network. This CSCLIP lesson, which was selected through purposive sampling, was chosen because it is typical of the structure and strategies used in delivering a CSCLIP course and because the evaluators could learn the lesson content without being telecommunications experts. The research question was: How could formative evaluation be used in this case to improve the CSCLIP lesson and the team's interactions? The evaluation goal was three-fold:

a) to assess the lesson's learning activities from an instructional perspective and identify what worked and what could be improved, b) to share the evaluative information with the other team members and facilitate the identification of specific solutions for improvement, and c) to disseminate the knowledge that was gained through the formative evaluation process.

The evaluation plan called for the evaluators to collect data by actively participating in and, if possible, successfully completing the laboratory exercises. They would work in a combination group (one from the local laboratory and one from a remote location) to complete the exercises. Because the evaluators lacked expertise in the lesson's content, teaching assistants were available at each location to answer specific questions. The locations had computers, audio headsets, and Polycom video cameras that the students controlled via their computers. The evaluators agreed to take detailed notes of their experiences and impressions during the exercises and to use Swanson's (1996) Lesson Design Form in considering design variables. Data would be grouped by theme for reporting. At the completion of data collection and analysis, the evaluators would share their evaluation findings with the other team members and facilitate solution finding. The team also agreed to disseminate the evaluation findings at the conclusion of the process.

The evaluation was implemented as designed. The evaluators reviewed the lesson handout titled Peer-to-Peer 10BaseT LAN, which had sections titled Introduction, Objectives, Rules of Engagement, and Experiments. The Introduction described the purpose of the learning and its fit with other courses. The Objectives listed the following: 1) Become familiar with some of the capabilities of an Ethernet peer-to-peer LAN, 2) Become familiar with some of the duties of a peer-to-peer LAN Network Administrator, and 3) Develop basic LAN troubleshooting skills. The Rules of Engagement explained that team members received identical scores when the team successfully completed tasks and that the entire group had to complete the exercises on each of their computers before any group member could be signed off. Also, individuals could ask their team members, the teaching assistant, or the professor for help, but they were not allowed to ask other teams for help. The Experiments section first described the procedure for setting up a remote computer to communicate with the local computer. Then a two-column format was introduced: the left-hand column was titled Remote Student and the right-hand column was titled Local Student. Each column contained numbered procedures for completing two exercises, and the text layout was such that the numbered procedures in the two columns corresponded spatially. This layout allowed the remote and local students to know exactly which actions each team member should take at each step. The procedural steps built on each other: output produced on one team member's computer was keyed into the other team member's computer to move the experiment forward. Each column also provided brief explanations for certain actions and cues about checking with teammates.
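The handout's step-by-step procedures are not reproduced here. As a minimal sketch only, the following Python script (a language the lab did not necessarily use) suggests the kind of connectivity verification that peer-to-peer LAN troubleshooting involves; the peer address and the port number are assumptions for illustration.

# Illustrative sketch, not the actual lab procedure: check whether a LAN peer
# answers a ping and whether a given TCP port accepts connections.
import platform
import socket
import subprocess

def peer_responds_to_ping(peer_ip: str) -> bool:
    """Send a single ICMP echo request to the peer (ping flag differs by OS)."""
    count_flag = "-n" if platform.system().lower() == "windows" else "-c"
    result = subprocess.run(
        ["ping", count_flag, "1", peer_ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def peer_port_open(peer_ip: str, port: int, timeout: float = 2.0) -> bool:
    """Attempt a TCP connection to the peer; True if the port accepts it."""
    try:
        with socket.create_connection((peer_ip, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    peer = "192.168.1.20"  # hypothetical address of the teammate's machine
    print("ping ok:", peer_responds_to_ping(peer))
    print("port 445 open:", peer_port_open(peer, 445))  # 445: file sharing, as an example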

To complete the lesson, the evaluators actively engaged as a two-person combination team, with one evaluator participating locally and the other participating remotely. Each evaluator familiarized herself with the equipment at her learning site and then began the exercises. The teaching assistants answered the evaluators' questions, which focused on interpreting jargon in the handout. The course instructors, who are research team members, interacted with the evaluators during the lesson as they would with any students, including periodically checking on progress. While working through the experiments, the evaluators recognized the importance of staying on track and communicating with each other, because completing the experiments involved several instances of taking actions that transferred total control of the experiment to the other person and then watching her work.

Results and Discussion

Both evaluators found that completing the procedures was doable but required significant thinking because a) the lesson's content was new, b) each procedure required them to demonstrate a skill, and c) the skills they lacked had to be learned in a stimulus-rich environment. Each evaluator continually shifted attention among the computer screen, the handout, the remote camera, the instructor, the teaching assistant, and the other team member to receive visual, auditory, and tactile information. Both cheered when they successfully completed the experiments and could show the instructor the final product, their installed working LAN. Immediately after the lesson, the evaluators reviewed the instruction using their notes and Swanson's (1996) Lesson Design form as a guide. They obtained the following results.

Objectives were shared with the learners at the beginning of the lesson. The terminal objectives identified expected behavior but could be strengthened by adding criteria and standards.

Trainee readiness reflected learning from other courses that prepared students for the laboratory. The evaluators, who had little content knowledge, successfully completed the exercises with a few explanations of jargon, and the lesson handout states that such resources are available to all. However, the evaluators suggested using an icebreaker to ensure that all students were comfortable using the CSCLIP format for learning and to provide cyber rules that could help manage the environment's stimuli.

Content structure reflects abstract content that focuses on procedural learning. The evaluators suggested extending learning by creating a conceptual framework to share with students at the beginning of the lesson that graphically depicts how the procedural steps fit with various telecommunications theories and with the team learning.

Rate of delivery was controlled by the students. The experiments divided the work between the students so that each had sufficient work spread over time, and team interaction was required for success. Sometimes information overload slowed the learning. The evaluators identified the lack of cyber rules as problematic: for example, which information should the learner attend to first when new information arrives via the computer screen, the camera picture, and the headset?

Repetition and practice were integrated throughout the lesson. However, the evaluators suggested adding practice that focused on cognitive and affective skills (e.g., student discussion of the order of specific steps, the theories that actions reflected, and the teamwork issues that they experienced).

Knowledge of results was built into the experiments because each experiment worked only if certain actions were taken. Students knew immediately when steps needed to be repeated.

Reinforcement and rewards were built into the experiments because successfully completing each step of the experiment provided intrinsic rewards. Also, the instructors were complimentary throughout the exercise.

The evaluators, teaching assistants, and other research team members met to discuss the evaluation results. Discussion was characterized by a process of adaptation in which the team members learned the evaluation findings, interacted with the evaluators to understand why addressing each finding was important to learning, and discussed possible solutions and effects. For each evaluation result, specific actions were identified to improve the instruction, and specific team members took responsibility for implementing the improvements.

The evaluation served as a vehicle for the team to share ideas on all aspects of the course, brainstorm improvements, and take responsibility for their implementation. Focusing on the intended use of the evaluation required deliberate choices and mutual trust among team members. Differences in the evaluation process, data collection instruments, the context, or the CSCLIP team could produce different results. At the conclusion of the evaluation, the team agreed that the formative evaluation stimulated their thinking and facilitated improved decision making for the course. Furthermore, the process elucidated the team members' shared understandings of what they were trying to accomplish through various activities. Finally, the evaluation served a developmental role in integrating the evaluators into the team as collaborators with specific knowledge and roles.

Conclusion

New developments in online instruction invite evaluation research. The formative evaluation of a CSCLIP lesson resulted in shared understanding, identification of effective techniques and technologies, and agreement on improvements for the instruction. The collaborative process facilitated development of the research team. This manuscript contributes to the literature by describing the use of formative evaluation to assess collaborative, online instruction. Future formative evaluation research should focus on CSCLIP instruction for larger teams to create initial knowledge about CSCLIP instruction that can be used to generate research propositions. Given the growth of online learning in general and virtual laboratory instruction in particular, such evaluation research is warranted.

References

Ausburn, L. J. & Ausburn, F. B. (in press). Desktop virtual reality in industrial teacher education: New power technology for teaching and research. Journal of Industrial Teacher Education.

Driscoll, M. (1998). Web-based training: Using technology to design adult learning experiences. San Francisco: Jossey-Bass.

Jacobs, J. W., & Dempsey, J. V. (2002). Emerging Instructional Technologies: The Near Future. In A. Rossett (Ed.), The ASTD E-Learning Handbook. London: McGraw Hill.

Kilby, T. (2001). The direction of web-based training: A practitioner's view [Electronic version]. The Learning Organization, 8(5), 194.

Lucca, J., Romano, N. C., Sharda, R., & Weiser, M. (2004). An assessment of e-learning technologies to support telecommunications laboratory learning objectives. Paper presented at the 37th Hawaii International Conference on System Sciences, Waikoloa, HI.

Merriam, S. B. (2001). Qualitative research and case study applications in education. San Francisco: Jossey-Bass.

Patton, M. Q. (1997). Utilization-focused evaluation. Thousand Oaks, CA: Sage.

Russ-Eft, D., & Preskill, H. (2001). Evaluation in organizations: A systematic approach to enhancing learning, performance, and change. Cambridge, MA: Perseus.

Sales, G. C. (2002). A quick guide to e-learning. Andover, MN: Seward Learning Systems Inc.

Schatz, S. (2003). A matter of design: The proposal to encourage the evolution of design in instructional design. Performance Improvement Quarterly, 16(4), 59-75.

Scheets, G., & Weiser, M. (2001). Implementing a Remote and Collaborative 'Hands-On' Learning Environment. In C. M. Sleezer, T. L. Wentling & R. L. Cude (Eds.), Human Resource Development and Information Technology: Making Global Connections (pp. 211-230). Boston: Kluwer Academic Publishers.

Sharda, R., Romano, N. C., Lucca, J., Weiser, M., Scheets, G., Chung, J. M., & Sleezer, C. M. (2004). Foundation for the study of computer-supported collaborative learning requiring immersive presence. Journal of Management Information Systems, 20(4), 31-63.

Singh, P., & Pan, W. (2004). Factors affecting student adoption of online education [Electronic version]. Academic Exchange Quarterly, 8(1).

Swanson, R. A. (1996). TPS: Training for Performance System Field Handbook. St. Paul, MN: Swanson & Associates.

Catherine M. Sleezer, Oklahoma State University

Mary Anne Gularte, Oklahoma State University

George Scheets, Oklahoma State University

Mark Weiser, Oklahoma State University

Ramesh Sharda, Oklahoma State University

Sleezer, Ph.D., is Professor of Human Resource/Adult Education; Gularte, M.S., is a doctoral student in Human Resource/Adult Education; Scheets, Ph.D., is Associate Professor of Electrical and Computer Engineering; Weiser, Ph.D., is Director of the Master of Science in Telecommunications Management Program and Fleming Professor of Technology Management; and Sharda is Director of the Institute for Research in Information Systems, Conoco Phillips Chair of Management of Technology and Regents Professor of Management Science and Information Systems.