The design and sequencing of e-Learning interactions: a grounded approach.
Advances in telecommunication technologies, changing student demographics and the need for ongoing professional development have resulted in a proliferation of e-Learning opportunities (newpromises.com, maricopa.com, ecollege.com, phoenix.com, corporatetraining.com). The problem is that many e-Learning programs continue to mimic traditional correspondence mail models of distance education. They rely heavily on self-instructional text or lecture-based materials, failing to promote meaningful interactions among students, the instructor and content. Mioduser, Nachmias, Lahav, and Oren (2000) conducted an extensive analysis of 436 web sites designed to promote K-12 mathematics, science and technology education. They found that only 12% of the sites encouraged any form of collaboration among students as supplements to online work, and less than 3% supported online collaboration. The creation of modern e-Learning programs requires research and the development of new instructional strategies that utilize the capabilities of telecommunication technologies and the potential they afford for collaborative and independent learning (Bates, 1990; Mason & Kaye, 1990; Soby, 1990).
So, how does e-Learning differ from other modes of instruction? What are meaningful e-Learning interactions? How do you design and sequence meaningful e-Learning interactions?
Hirumi, Chandler and St. John (2001) found that asking subject matter experts to reason through these questions can increase their confidence and perceived ability to promote e-Learning. This article chronicles insights gained from working with faculty, staff and students to seek answers to these fundamental questions while developing Web-based graduate and undergraduate courses and degree programs (Hirumi, 2000; Freeman, Zundel, Singleton, Joyce & Hirumi, 2001; Hirumi, Willis, Frey, & Gause, 2001; Hirumi, Youngman, Gannon-Cook, & Haggerty, 2000; Bermudez & Hirumi, 2000; Hirumi & Bermudez, 1996).
HOW DOES e-Learning DIFFER?
With adequate time and incentives, educators and subject matter experts can codify their knowledge into readily accessible, aesthetically appealing, electronic text. It can be an arduous process, much like writing a textbook, but it can be done. Web course authoring applications, such as WebCT, Blackboard and CourseBuilder, also make it easier to generate and post course materials online. So, how does e-Learning differ from traditional classroom instruction? Why do experienced educators often find it difficult to transform effective instructor-led teaching materials into innovative e-Learning programs? In my experience, novice distance educators frequently struggle to plan and manage meaningful e-Learning interactions.
In traditional classroom settings, key interactions that affect learner attitudes and performance often occur spontaneously, in real-time. Good instructors interpret verbal and nonverbal cues, clarify expectations, facilitate activities, promote discussions, elaborate concepts, render guidance and provide timely and appropriate feedback as they present content in a clear and engaging manner. Good instructors can also make up for flaws in design by utilizing their charisma to gain and sustain learners' attention and their experience to shed light on complex or confusing content matter.
During e-Learning, communications are predominately asynchronous and mediated by technology. Opportunities to interact in "real-time" are relatively confined. Key interactions that occur spontaneously in traditional classroom environments must be carefully designed and sequenced as an integral part of e-Learning. Novice distance educators need help to visualize how emerging telecommunication technologies may be used to enhance learning and performance.
With insufficient time, tools or training, educators have little choice but to revert to what they know best -- teacher-directed, instructor-led methods. They post lecture notes, embed links to external sites and ask learners to complete assignments and take an exam to earn course credit. Some add a bulletin board discussion, use e-mail and schedule a few chat sessions to make their courses "interactive." Others fortunate enough to have access to additional resources may use streaming media and animated graphics to capture and sustain learners' attention. However, as most experienced distance educators and students would attest, the use of interactive technologies does not ensure that meaningful interactions will occur. Interactions must be carefully planned and managed to facilitate e-Learning.
WHAT ARE MEANINGFUL e-Learning INTERACTIONS?
Simply stated, interactivity may consist of a learner accessing a page of text via a web interface and reading some content (Carlson & Repman, 1999). A relatively complex interpretation requires the learner and learning system to respond dynamically to one another. For instance, Borsook (1991) suggests that to be interactive, programs should simulate seven characteristics of interpersonal communications -- immediacy of response, non-sequential access of information, adaptability, feedback, options, bi-directional communication and interruptability. Merrill, Li and Jones (1990) and Weller (1988) also emphasize the dynamic nature of the interaction, requiring the learner and the technology to adapt to each other. Others focus on the quality of the interaction rather than the number, type or modality of responses. Jonassen (1995), for example, depicts interactions as a function of the type of learner response, the meaningfulness of responses and the quality of the feedback provided. Apparently, "interactivity" can mean different things to different people. What is common is that interactivity is viewed as one of the defining characteristics of education that is vitally important in the design of distance education (DE) (Moore, 1989).
Kearsley (1997) suggests that the single most important element of successful distance learning is interactivity among participants. Interactions enable both the instructor and learners to communicate and respond to each other's needs and interests. Interactions may help reduce feelings of isolation and anonymity that can result in dissatisfaction, poor performance and dropouts among distance learners. Interactions are also seen as one of the keys to transforming traditional teacher-directed instructional methods to learner-centered approaches (Cuban, 1993). Booher and Seiler (1982) found that learners' avoidance of learner-instruction interactions can harm academic achievement. Without interactions, instruction may simply become "passing on content as if it were dogmatic truth, and the cycle of knowledge acquisition, critical evaluation and knowledge validation that is important for the development of higher-order thinking skills is nonexistent" (Shale & Garrison, 1990, p. 29). Even with its apparent need and purported benefits, some still question the significance of interactivity in distance education.
In a review of DE research, Simonson, Smaldino, Albright and Zvacek (2000) note studies indicating that different technologies allow for varying degrees of interaction. "However, similar to [media] comparison studies examining achievement, research comparing differing amounts of interaction showed that interaction had little effect on achievement (Beare, 1989; Souder, 1993)" (Simonson, Smaldino, Albright, & Zvacek, 2000, p. 61). It is important to note that, like media comparison research (Clark, 1994), these conclusions are based on investigations that compare the effects of interactivity across delivery systems (traditional vs. two-way audio and video vs. two-way audio). The effects of interactivity may be better ascertained by studying varying degrees or types of interactions within one, rather than across, delivery system. Further research is needed to support the intuitive sense that interaction is important and necessary (Moore, 1995), and effort must be made to synthesize what is known into useful guidelines for research and practice.
An array of taxonomies has been published for classifying interactions. Moore (1989) posits what may be the most widely known "communications-based" framework that specifies the sender and receiver of three key interactions (student--student, student--teacher and student--content). Student-student interactions occur "between one learner and another learner, alone or in group settings, with or without the real-time presence of an instructor" (Moore, 1989, p. 4). Student-teacher interactions attempt to motivate and stimulate the learner and allow for the clarification of misunderstanding by the learner in regard to the content. Student-content interactions are defined as a process of "intellectually interacting with content to bring about changes in the learner's understanding, perspective or cognitive structures" (Moore, 1989, p. 2).
With the increasing use of computer-based delivery systems, Hillman, Willis and Gunawardena (1994) argue convincingly for a fourth class of communication-based interactions -- student-interface. Such interactions allow learners to manipulate electronic tools to complete tasks and participate in other learning events. The interface acts as the point or means of interaction between the learner and the content, instructor, fellow learners or others. It includes learners' use of electronic tools and navigational aids as well as the layout of text and graphical elements.
A number of authors posit additional classes of communication-based interactions. For example, Carlson and Repman (1999) define learner-instructional interactions as those between the learner and the content that traditionally utilize strategies such as questioning, feedback and clarification, and control of lesson pace and sequence to facilitate learning. They further delineate social interactions as personal attempts to modify or enhance the quality of the instructional interaction by interpreting body language, promoting a sense of comfort and developing class management routines. In contrast, Mortera-Gutierrez and Murphy (2000) focus on the roles of the instructor, extending the basic categories to include instructor-facilitator, instructor-peers, instructor-support staff and technical personnel, and instructor-organization interactions.
Alternative taxonomies codify interactions by purpose. For instance, Hannifin (1989) posits a "purpose-based" framework that includes five basic functions for computer-based interactions: (a) confirmation, (b) pacing, (c) inquiry, (d) navigation, and (e) elaboration. With the emerging use of telecommunication technologies, Breakthebarriers.com (American Society for Training and Development, 2001) identifies nine key functions: (a) synchronous communication, (b) asynchronous communication, (c) browsing and clicking, (d) branching, (e) tracking, (f) help, (g) practice, (h) feedback, and (i) coaching. In comparison, to guide the selection of online instructional strategies and tactics, Northrup (2001) proposes five interaction attributes (or purposes): (a) to interact with content, (b) to collaborate, (c) to converse, (d) to help monitor and regulate learning (intrapersonal interaction), and (e) to support performance.
Bonk and King (1998) focus on the use of specific telecommunication tools, positing five levels of "tool-based" interactions: (a) e-mail and delayed messaging, (b) remote access and delayed collaboration, (c) real-time brainstorming and conversation, (d) real-time text collaboration, and (e) real-time multimedia and/or hypermedia collaboration. Still others, such as Bonk and Reynolds (1997) and Harris (1994a, 1994b, 1994c) describe different types of "activity-based" interactions or interactivities -- critical thinking, creative thinking, information searching, information sharing and collaborative problem solving.
Clearly a plethora of interactions may be used to promote e-Learning. It is also evident that e-Learning interactions may be interpreted in many ways. Definitions abound and frameworks for classifying interactions vary almost as much as the types of interactions reported in the literature. So what are meaningful e-Learning interactions? How do we make sense of the published literature? The most reasonable answer may be, "It depends." It may depend on the instructional goals and objectives, the instructional strategy, the teacher's beliefs and/or the number and nature of learners, among other factors. It may also depend on the overall sequencing of interactions throughout a lesson or course rather than the design of an individual activity or event. The descriptive frameworks give good insights into the nature and range of potential e-Learning interactions. However, they neither illustrate the relationship among, nor prescribe practical guidelines for planning and managing, a cohesive set of interactions that comprise an instructional lesson, module or unit. When do you ask learners to interact with other learners? When should learners interact with content? When are student-instructor interactions necessary? Some process is necessary to help educators answer these questions and to design and sequence e-Learning interactions in a meaningful fashion.
HOW DO YOU DESIGN AND SEQUENCE MEANINGFUL e-Learning INTERACTIONS?
Educators often fail to ground their designs in research and theory (Bonk & King, 1998; Bonk & Cunningham, 1998; Bednar, Cunningham, Duffy, and Perry, 1995). Without sufficient time, training or resources, educators have little choice but to base their designs on past practices -- namely, teacher-directed methods. It is the use of teacher-directed methods and materials that results in what are essentially correspondence mail models of DE. This is not to say that teacher-directed methods are inappropriate for all forms of distance education. Rather, teacher-directed methods may be insufficient and, thus, alternative approaches should be considered. It is believed that the application of theoretically grounded instructional strategies can help educators plan and manage meaningful e-Learning interactions.
Hannifin, Hannifin, Land and Oliver (1997) define "grounded design" as "the systematic implementation of processes and procedures that are rooted in established theory and research in human learning" (p. 102). A grounded approach uses theory and research as a basis for making design decisions to optimize learning. It does not subscribe to or advocate any particular epistemology, but rather promotes alignment between theory and practice.
So, what are some grounded instructional strategies? How can they be used to guide the design and sequencing of meaningful e-Learning interactions? Figure 1 outlines several strategies that are considered "grounded" because they are based on explicit learning theories and have empirical evidence to support their effectiveness under specified conditions.
Each of the events associated with an instructional strategy may be considered an interaction, a transaction that occurs between the learner and other human or non-human resources. Educators can then select an instructional strategy and use each of the events to guide the design and sequencing of e-Learning interactions. The application of a grounded instructional strategy gives educators a foundation for planning and managing e-Learning interactions based on a combination of research, theory and practical experience. Five iterative steps are posited to help educators synthesize and apply these concepts:
Step 1 -- Identify essential experiences that are necessary for learners to achieve specified goals and objectives (optional).
Step 2 -- Select a grounded instructional strategy based on specified objectives, learner characteristics, context and epistemological beliefs.
Step 3 -- Operationalize each event, embedding experiences identified in Step 1 and describing how the selected strategy will be applied during instruction.
Step 4 -- Define the type of interaction(s) that will be used to facilitate each event and analyze the quantity and quality of planned interactions.
Step 5 -- Select the telecommunication tool(s) -- chat, email, bulletin board system -- that will be used to facilitate each event based on the nature of the interaction.
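The five steps above can also be sketched as a simple data structure -- one record per instructional event, filled in as the designer works through Steps 3-5. The strategy, event names, interaction labels and tools below are illustrative placeholders chosen for this sketch, not part of the process itself:

```python
# Step 2: select a grounded strategy. Here, an abridged set of Gagne's
# nine events of instruction stands in for the chosen strategy.
strategy_events = ["Gain Attention", "Inform Learners of Objectives",
                   "Elicit Performance", "Provide Feedback"]

# One row of the treatment plan per event, to be completed in Steps 3-5.
treatment_plan = []
for event in strategy_events:
    treatment_plan.append({
        "event": event,
        "description": "",   # Step 3: operationalize the event
        "interactions": [],  # Step 4: e.g. "learner-instructor"
        "tools": [],         # Step 5: e.g. "bulletin board"
    })

# Steps 3-5 for the first event, as a designer might fill them in:
treatment_plan[0]["description"] = "Post a discrepant event to open discussion"
treatment_plan[0]["interactions"] = ["learner-instructor", "learner-learner"]
treatment_plan[0]["tools"] = ["bulletin board"]

print(len(treatment_plan))  # prints 4: one row per instructional event
```

Because the process is iterative, rows may be added, deleted or re-sequenced as later steps expose gaps in earlier ones.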
Step 1 is listed as optional because, with experience, designers tend to identify essential experiences during Step 3 as they operationalize instructional events. However, when working with relatively novice designers, it is useful to identify key learning experiences as an initial step for several reasons. Most educators can relate to learning experiences (Figure 2). It is a concept that is familiar to them, it incorporates and recognizes the value of their prior knowledge and experiences and serves as a good activity for stimulating discussion. It is also easier to introduce potentially novel e-Learning experiences (Bonk & Reynolds, 1998; Harris, 1994a, 1994b, 1994c) as a separate initial step, rather than during Steps 2 or 3 when they are trying to select an instructional strategy and operationalize instructional events that may also be new to them.
Figure 3 depicts the components of an instructional treatment plan that may be used as a template to complete Steps 3-5. In the first column, list the events associated with a selected instructional strategy. Consider several factors when selecting an instructional strategy, such as the learning goals and objectives, the learner, the context and your epistemological beliefs. To learn a relatively simple set of procedures and/or verbal information, teacher-directed strategies -- direct instruction, elements of lesson design, and nine events of instruction -- may serve as a useful foundation for generating effective self-instructional materials. For example, there is not much need for social interaction when teaching someone how to use a photocopying machine. In contrast, a complex, problem-solving goal that is open to multiple interpretations and alternative solutions (designing e-Learning) may be better addressed by strategies that promote exploration and social discourse -- inquiry learning, problem-based learning, experiential learning, student-centered learning and WebQuests. Furthermore, if you are a "constructivist," you may tend toward the inquiry-oriented strategies for most goals and objectives.
In the second column, describe how you would operationalize each event. How will you gain and sustain learners' attention? How will you present learners with the instructional objectives? How will you stimulate the recall of prior knowledge? As you design each event, keep in mind that the five steps are a part of an iterative process. You may find that you need to add, delete or re-sequence some events. In addition, as you work through one step, you may find that you need to revise the results of a preceding step. You can do the work now, or you can do it later. Eventually, you will have to write the actual text that will be read by learners. If you provide only a general summary of how you will design each event at this point, you will need to spend significant time writing text later during the development of your e-Learning program. If you take the time now to detail each event, less time will be required during the development phase. Also consider that, in cases where different people are tasked with designing and developing instruction, the more detail you put into the treatment plan, the less time will be required later to explain your designs to writers, programmers and/or other course developers.
In the third column, define the type of interaction(s) that will be used to facilitate each event. Does the event require learner-teacher interactions? Learner-learner interactions? Learner-content interactions? One event may require multiple interactions -- to elicit performance, the learner may have to interact with the content as well as other learners. This is a good time to analyze and reflect on the quantity and quality of your planned interactions to determine if you have included an appropriate combination. How many learner-instructor and learner-learner interactions are planned? Do students have sufficient opportunities to interact with one another and with the instructor? Do learners require access to others? Are there too many learner-instructor interactions, making it difficult or impossible for the instructor to manage all of the communications? Do students have access to sufficient content information? Do learners require access to manipulatives? If so, how are learners to acquire the manipulatives? Do students have sufficient opportunities to apply learned skills and knowledge? Are students given sufficient guidance and scaffolding to promote learning and self-regulation? Are the interactions designed and sequenced to facilitate, rather than inhibit, the achievement of targeted goals and objectives? You may find that you need to go back and revise your description of one or more events, again illustrating the iterative nature of the five-step process.
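The quantity check described above can be automated with a simple tally over the third column. The treatment plan below is hypothetical; the interaction labels follow the communication-based categories discussed earlier:

```python
from collections import Counter

# Hypothetical third column: interaction types planned for each of five events.
planned = [
    ["learner-content"],
    ["learner-content", "learner-learner"],
    ["learner-instructor"],
    ["learner-content"],
    ["learner-instructor", "learner-learner"],
]

# Count how often each interaction type appears across all events.
tally = Counter(label for event in planned for label in event)

print(tally["learner-content"])     # prints 3
print(tally["learner-instructor"])  # prints 2
print(tally["learner-learner"])     # prints 2
```

A designer reviewing such a tally might notice, for example, that no learner-learner interaction is planned until the second event, or that learner-instructor interactions exceed what one instructor can manage.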
In the fourth column, select and map the telecommunication tool(s) that will be used to facilitate each event. Although your primary delivery system has probably been selected, you still have many options to consider. Your task is to determine the appropriate tool(s) for facilitating each interaction (defined in Column 3) that also fall within the confines of available resources. Relevant questions to consider include: who are the primary senders and receivers of the communications? Do learners need audio, video, text and/or graphics? Are synchronous or asynchronous communications necessary? Are the communications one-to-one, one-to-some, or one-to-many? What kind of budget do you have? What kind of technologies and human resources are available? How much time do you have to prepare course materials? The resulting instructional treatment plan may then be used to generate flowcharts, storyboards and prototypes of your instruction. Limited space prohibits a continued discussion of how treatment plans may be used to develop e-Learning materials. However, a sample instructional treatment plan and resulting flowcharts and storyboards may be viewed at http://e-learning.inst.cl.uh.edu.
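As a rough illustration, the questions above can be thought of as a lookup from communication requirements to candidate tools. The rules and tool names here are assumptions for the sketch, not a prescribed mapping; real tool selection also weighs budget, staffing and preparation time:

```python
# Illustrative rule-of-thumb lookup: given two of the questions posed in
# the text (is the communication synchronous? is it one-to-many?), suggest
# candidate tools. Both the rules and the tool lists are assumptions.
def candidate_tools(synchronous: bool, one_to_many: bool) -> list[str]:
    if synchronous:
        if one_to_many:
            return ["streaming media", "chat"]
        return ["chat", "whiteboard"]
    if one_to_many:
        return ["bulletin board", "web page"]
    return ["e-mail"]

# e.g. asynchronous, one-to-one feedback on an assignment:
print(candidate_tools(synchronous=False, one_to_many=False))  # prints ['e-mail']
```

A designer would then narrow the candidates by the remaining questions -- media needs, budget, available technologies and preparation time.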
Various authors have proposed systematic models for designing training and instruction (Dick, Carey, & Carey, 2000; Smith & Ragan, 1999; Kemp, Morrison, & Ross, 1994). Common to these approaches are five basic phases: analysis, design, development, implementation and evaluation. The five-step process for designing and sequencing e-Learning interactions should be applied during the design phase of the systematic process - after the objectives and assessment method have been defined and during the development of an instructional strategy and the selection of media (Figure 3).
Figure 3 also depicts a series of tasks that are not normally associated with systematic design models. To select the appropriate telecommunication tools for facilitating planned interactions (integrate technology), it is important to have some understanding of the use, benefits and limitations associated with each tool. When working with educators new to e-Learning, it is useful to have them utilize the telecommunication tools that will be used to deliver the instruction to communicate and share ideas with each other during the design process (Hill, Williams, & Hirumi, 2001).
Application of systematic design tasks results in an instructional treatment plan (Table 1). The plan serves as a foundation for generating flowcharts, storyboards and prototypes of the instruction. Rather than expending the resources necessary to flowchart and storyboard an entire course or program, experience suggests that it is more efficient to flowchart and storyboard one unit as a basis for generating a vertical and a horizontal prototype (1). Similarly, rather than taking the time to generate an instructional treatment plan for all of the units that may comprise a course or training program, it is recommended that you generate one detailed treatment plan for a unit of instruction and then immediately create and test a vertical and horizontal prototype of your instruction. After testing and revising your prototypes, they may then be used as a template for developing additional instructional units. Further discussion of the development process goes beyond the scope of this article. For a more detailed treatment of tasks associated with the development of e-Learning materials, refer to Hirumi (2000); Freeman, Zundel, Singleton, Joyce and Hirumi (2001); Hirumi, Willis, Frey and Gause (2001); and Hirumi, Youngman, Gannon-Cook, and Haggerty (2000).
As in many applied fields, advances in emerging DE technologies are far outpacing research on their effectiveness. Too often, instructors and course developers fail to ground their designs in research and theory (Bonk & King, 1998; Bonk & Cunningham, 1998; Bednar, Cunningham, Duffy, & Perry, 1995). Advances in telecommunication technologies are increasing access to educational opportunities. However, they are not necessarily enhancing the educational experience (Hirumi & Bermudez, 1996).
Educators may neither have the time nor the inclination to complete all of the prescribed systematic design tasks. If one chooses to skip or gloss over some steps, it is important to remember that an instructor may not be readily available to address inadequacies in the instructional materials during e-Learning. Flaws in design are amplified online. Application of a well-defined systematic instructional design process is encouraged to help ensure the design and development of effective instructional materials.
During the development of e-Learning, it is particularly important to ground instructional design decisions on a combination of experience, research and theory. A good instructor can make up for poorly designed instructional materials by captivating learners with their charisma and by facilitating key interactions that may or may not have been designed as an integral part of instruction. Remove the instructor or place him or her at a distance and the ability to plan, stimulate and manage e-Learning interactions becomes essential. This article posits a five-step process for designing and sequencing e-Learning interactions that should be applied within the context of a systematic design model. By applying the process, it is hoped that educators will be better able to create effective e-Learning programs that promote interactivity and optimize the potential of telecommunications technologies to enhance both individual and collaborative learning.
Figure 1. Sample outlines of grounded instructional strategies
Nine Events of Instruction
1. Gain Attention
2. Inform Learner of Objective(s)
3. Stimulate Recall of Prior Knowledge
4. Present Stimulus Materials
5. Provide Learning Guidance
6. Elicit Performance
7. Provide Feedback
8. Assess Performance
9. Enhance Retention and Transfer
1. Set Learning Challenge
2. Negotiate Learning Goals and Objectives
3. Negotiate Learning Strategy
4. Construct Knowledge
5. Negotiate Performance Criteria
6. Assess Learning
7. Provide Feedback (Steps 1-6)
8. Communicate Results
1. Orientation to the Case
2. Identifying the Issues
3. Taking Positions
4. Exploring the Stance(s), Patterns of Argumentation
5. Refining and Qualifying the Positions
6. Testing Factual Assumptions Behind Qualified Positions
1. Orientation
1.1 Present topic of simulation
1.2 Explain simulation
1.3 Give overview
2. Participant Training
2.1 Set-up scenario
2.2 Assign roles
2.3 Hold abbreviated practice
3. Simulation Operations
3.1 Conduct activity
3.2 Feedback and evaluation
3.3 Clarify misconceptions
3.4 Continue simulation
4. Participant Debriefing
4.1 Summarize events
4.2 Summarize difficulties
4.3 Analyze process
4.4 Compare to the real world
5. Appraise and redesign the simulation
1. Orientation
1.1 Establish lesson content
1.2 Review previous learning
1.3 Establish lesson objectives
1.4 Establish lesson procedures
2. Presentation
2.1 Explain new concept or skill
2.2 Provide visual representation
2.3 Check for understanding
3. Structured Practice
3.1 Lead group through practice
3.2 Students respond
3.3 Provide corrective feedback
4. Guided Practice
4.1 Practice semi-independently
4.2 Circulate, monitor practice
4.3 Provide feedback
5. Independent Practice
5.1 Practice independently
5.2 Provide delayed feedback
1. Experience -- Immerse learner in "authentic" experience.
2. Publish -- Talking or writing about experience. Sharing thoughts and feelings.
3. Process -- Debrief: Interpret published information, defining patterns, discrepancies and overall dynamics.
4. Internalize -- Private process, learner reflects on lessons learned and requirements for future learning.
5. Generalize -- Develop hypotheses, form generalizations and reach conclusions.
6. Apply -- Use information and knowledge gained from lesson to make decisions and solve problems.
1. Confrontation with the Problem
1.1 Explain inquiry procedures
1.2 Present discrepant event
2. Data Gathering - Verification
2.1 Verify nature of objects and conditions
2.2 Verify the occurrence of the problem situation
3. Data Gathering - Experimentation
3.1 Isolate relevant variables
3.2 Hypothesize and test causal relationships
4. Organizing, Formulating and Explanation - Formulate rules or explanations
5. Analysis of inquiry process - Analyze inquiry strategy and develop more effective ones.
1. Concept Formation
1.1 Enumeration and listing
1.3 Labeling, Categorizing
2. Interpretation of Data
2.1 Identify critical relationships
2.2 Explore relationships
2.3 Make inferences
3. Application of Principles
3.1 Predicting consequences
3.2 Explaining predictions
3.3 Verifying predictions
1. Starting a New Problem
1.1 Set problem
1.2 Describe requirements
1.4 Assign tasks
1.5 Reason through the problem
1.6 Commitment to outcome
1.7 Shape issues and assignment
1.8 Identify resource
1.9 Schedule follow-up
2. Problem Follow-Up
2.1 Resources used
2.2 Reassess the problem
3. Performance Presentation(s)
4. After Conclusion of Problem
4.1 Knowledge abstraction and summary
Figure 2. Cursory listing of alternative educational experiences
* Read journal articles or textbooks.
* Listen to a lecture given by the instructor.
* Interact face-to-face with students in class.
* Interact via synchronous (real-time) audio, video and/or text with students in remote locations.
* Read and post messages on a listserv.
* Participate in question and answer session with instructor.
* Participate in class discussion.
* Search for and retrieve information via file transfer protocol (FTP).
* Write research, position or concept paper.
* Conduct observations.
* Engage in reflective thinking and writing.
* Create, distribute, compile and analyze surveys/questionnaires.
* Develop and/or analyze case studies.
* Solve problems individually.
* Solve problems collaboratively in groups.
* Simulate a real-world event (e.g., courtroom trial, elections, space launch).
* Interview others.
* Visit community resources.
* Examine on-line resources.
* Conduct library research.
* Examine and/or assess other students' work.
* Send and receive e-mail.
* Create and post a world-wide-web site.
* Search for and retrieve information via the World Wide Web.
* Communicate via asynchronous audio, video and/or text with students in remote locations.
* Read and post information to a newsgroup.
* Search for and retrieve information on gopher sites.
* Create and make a presentation.
* Chat with others in real-time on the Internet using inter-relay chat (IRC).
* Create and give multimedia presentation.
* Interact with Computer-Based Instruction.
* Interact with Laserdisc program.
* Interact with people in remote locations utilizing desktop video.
* Generate and manipulate a database.
* Generate and manipulate a spreadsheet.
* Watch an instructional video or film.
* Organize, analyze, synthesize and interpret information gathered from sources.
* Participate in a debate.
* Participate in a panel discussion.
* Attend guest lecture.
* Interact with computer simulation.
* Complete individual or group project.
Table 1. Components of instructional treatment plan

Event | Description
1. Gain Attention | Description of how instruction will gain learners' attention.
2. Inform Learners of Objectives | Description of how instruction will inform learners of objectives.
3. Stimulate Recall of Prior Knowledge | Description of how instruction will stimulate recall.
4. Present Stimulus | Description of how instruction will present stimulus information.
5. Provide Learning Guidance | Description of how instruction will provide learning guidance.
6. Elicit Performance | Description of how instruction will elicit learner performance.
7. Provide Feedback | Description of how instruction will provide feedback.
8. Assess Performance | Description of how instruction will assess learner performance.
9. Enhance Retention and Transfer | Description of how instruction will enhance retention and transfer.

Event | Interaction | Tool
1. Gain Attention | Learner-Instructor | BBS
2. Inform Learners of Objectives | Learner-Content | Web Page
3. Stimulate Recall of Prior Knowledge | Learner-Content, Learner-Learner | Web Page, BBS
4. Present Stimulus | Learner-Content | Web Page
5. Provide Learning Guidance | Learner-Instructor | Chat
6. Elicit Performance | Learner-Content, Learner-Learner | Web Page
7. Provide Feedback | Learner-Instructor | Whiteboard
8. Assess Performance | Learner-Content | Web Page
9. Enhance Retention and Transfer | Learner-Instructor, Learner-Learner | BBS
Note: (1) Horizontal prototypes present users with a wide range of program features that may not be fully functional. Vertical prototypes, in contrast, present users with relatively few program features that are fully functional.
American Society for Training and Development (2001). www.Breakthebarriers.com [Online]. Orinda, CA: Authors.
Bates, A. W. (1990). Third generation distance education: The challenge of new technology. Paper presented at the XV World Conference on Distance Education, Caracas, Venezuela. (ERIC Document Reproduction Service No. ED 332 688)
Beare, P. L. (1989). The comparative effectiveness of videotape, audiotape, and telelecture in delivering continuing teacher education. The American Journal of Distance Education, 3(2), 57-66.
Bednar, A., Cunningham, D. J., Duffy, T., & Perry, D. (1995). Theory in practice: How do we link? In G. Anglin (Ed.), Instructional technology: Past, present, and future (2nd ed., pp. 100-112). Englewood, CO: Libraries Unlimited.
Bermudez, A., & Hirumi, A. (2000). Examining the effectiveness of systematically designed web-based instruction. Interactive Learning Environments, 8(2), 1-12.
Bonk, C. J., & King, K. S. (1998). Computer conferencing and collaborative writing tools: Starting a dialogue about student dialogue. In C. J. Bonk & K. S. King (Eds.), Electronic collaborators: Learner-centered technologies for literacy, apprenticeship, and discourse (pp. 3-23). Mahwah, NJ: Lawrence Erlbaum Associates.
Bonk, C. J., & Cunningham, D. J. (1998). Searching for learner-centered, constructivist, and sociocultural components of collaborative educational learning tools. In C. J. Bonk & K. S. King (Eds.), Electronic collaborators: Learner-centered technologies for literacy, apprenticeship, and discourse (pp. 25-50). Mahwah, NJ: Lawrence Erlbaum Associates.
Bonk, C. J., & Reynolds, T. H. (1997). Learner Centered Web Instruction for Higher-order thinking, teamwork, and apprenticeship. In B. Khan (Ed.), Web-Based Instruction (pp. 167-178), Englewood Cliffs, NJ: Educational Technology Publications.
Booher, R. K., & Seiler, W. J. (1982). Speech communications anxiety: an impediment to academic achievement in the university classroom. Journal of Classroom Instruction, 18(1), 23-27.
Borsook, T. (1991). Harnessing the power of interactivity for instruction. In M.R. Simonson and C. Hargrave (Eds.), Proceedings of the 1991 Convention of the Association for Educational Communications and Technology (pp. 103-117). Orlando, FL: Association for Educational Communications and Technology.
Carlson, R. D., & Repman, J. (1999). Web-based interactivity. WebNet Journal, 1(2), 11-13.
Clark, R.E. (1994). Media and method. Educational Technology Research and Development, 42, 7-10.
Cuban, L. (1993). How teachers taught (2nd ed.). New York: Teachers College Press.
Dick, W., Carey, L., & Carey, J. O. (2000). The systematic design of instruction (5th ed.). New York: Addison-Wesley.
Freeman, V., Zundel, B., Singleton, C., Joyce, R., & Hirumi, A. (2001). Leadership and collaboration in the development of an online degree program. Concurrent session presented at the annual Texas Distance Learning Association conference. Houston, TX.
Hannafin, M. J. (1989). Interaction strategies and emerging instructional technologies: Psychological perspectives. Canadian Journal of Educational Communication, 18(3), 167-179.
Hannafin, M. J., Hannafin, K. M., Land, S. M., & Oliver, K. (1997). Grounded practice and the design of learning systems. Educational Technology Research and Development, 45(3), 101-117.
Harris, J. (1994a, February). People-to-people projects on the Internet. The Computing Teacher, 48-52.
Harris, J. (1994b, March). Information collection activities. The Computing Teacher, 32-36.
Harris, J. (1994c, April). Opportunities in work clothes: Online problem-solving project structures. The Computing Teacher, 52-55.
Hill, N., Williams, R., & Hirumi, A. (2001). Facilitating the development of e-Learning through a support site. Concurrent session presented at the annual Texas Distance Learning Association conference, Houston, TX.
Hillman, D. C., Willis, D. J., & Gunawardena, C. N. (1994). Learner-interface interaction in distance education: An extension of contemporary models and strategies for practitioners. The American Journal of Distance Education, 8(2), 30-42.
Hirumi, A. (2000). Chronicling the challenges of Web-basing a degree program: A systems perspective. The Quarterly Review of Distance Education, 1(2), 89-108.
Hirumi, A., Chandler, K. & St. John, C. (2001). Training faculty on the systematic design of e-Learning. Concurrent session presented at the annual Texas Distance Learning Association conference. Houston, TX.
Hirumi, A., & Bermudez, A. (1996). Interactivity, distance education, & instructional systems design converge on the super information highway. Journal of Research on Computing in Education, 24(1), 1-16.
Hirumi, A., Willis, J., Frey, J., & Gause, C. (2001). Online TEKS training: Educating Texas teachers on the TA of TEKS. Concurrent session presented at the annual Texas Distance Learning Association conference. Houston, TX.
Hirumi, A., Youngman, T., Gannon-Cook, R., & Haggerty, B. (2000, February). The systematic design, development, and implementation of online degree and certification programs. Symposium conducted at the annual Association for Educational Communication and Technology conference, Long Beach, CA.
Jonassen, D. H. (1995, June). An instructional design model for designing constructivist learning environments. World Conference on Educational Media, Graz, Austria.
Kearsley, G. (1997). A guide to on-line education [Online]. (May 25, 1997) Available: http://www.fcae.nova.edu/~kearsley/on-line.htm
Kemp, J. E., Morrison, G. R., & Ross, S. M. (1994). Designing effective instruction. NY: Macmillan College Publishing.
Mason, R., & Kaye, T. (1990). Toward a new paradigm for distance education. In L. M. Harasim (Ed.), Online education: Perspectives on a new environment (pp. 15-30). New York: Praeger.
Merrill, D., Li, Z., & Jones, M. K. (1990). Second generation instructional design. Educational Technology, 30(2), 7-15.
Mioduser, D., Nachmias, R., Lahav, O., & Oren, A. (2000). Web-based learning environments: Current pedagogical and technological state. Journal of Research on Computing in Education, 33, 55-79.
Moore, M. G. (1989). Editorial: Three types of interaction. The American Journal of Distance Education, 3(2), 1-6.
Moore, M. G. (1995). The 1995 distance education research symposium: A research agenda. The American Journal of Distance Education, 9(2), 1-6.
Mortera-Gutierrez, F., & Murphy, K. (2000). Instructor interactions in distance education environments: A case study. Concurrent session presented at the annual distance education conference sponsored by the Texas A&M Center for Distance Education, Austin, TX.
Northrup, P. (2001). A framework for designing interactivity in Web-based instruction. Educational Technology, 41(2), 31-39.
Shale, D., & Garrison, D. R. (1990). Education and communication. In D. R. Garrison & D. Shale (Eds.), Education at a distance (pp. 23-39). Malabar, FL: Robert E. Krieger.
Simonson, M., Smaldino, S., Albright, M., & Zvacek, S. (2000). Teaching and learning at a distance: Foundations of distance education. Upper Saddle River, NJ: Prentice Hall.
Smith, P. L., & Ragan T. J. (1999). Instructional design (2nd ed.). Upper Saddle River, NJ: Prentice-Hall.
Soby, M. (1990). Traversing distances in education: The PortaCOM Experiment. In A. W. Bates (Ed.), Media and technology in European distance education. Proceedings of the EADTU workshop on media, methods and technology (pp. 241-247). Milton Keynes, UK: Open University for EADTU.
Souder, W. E. (1993). The effectiveness of traditional vs. satellite delivery in three management of technology master's degree programs. The American Journal of Distance Education, 7(1), 37-53.
Weller, H. G. (1988). Interactivity in microcomputer-based instruction: Its essential components and how it can be enhanced. Journal of Educational Technology Systems, 28(2), 23-27.
ATSUSI HIRUMI, UNIVERSITY OF HOUSTON--CLEAR LAKE, HOUSTON, TX USA
Publication: International Journal on E-Learning, January 1, 2002.