Virtual Necessities: Assessing Online Course Design
A report from the Campus Computing Project noted that "As in the past five years, survey respondents across all sectors of higher education identify assisting faculty integrate technology into instruction as the single most important IT issue confronting their campuses" (Green, 2001, p. 2, italics added). Educause also conducted member surveys in 2000, 2001, and 2002 to identify the most pressing concerns regarding information technology in higher education (Gandel & The Educause Current Issues Committee, 2000; Lembke, Rudy, & The Educause Current Issues Committee, 2001; Kobulnicky, Rudy, & The Educause Current Issues Committee, 2002). All three surveys ranked faculty development, support, and training among the top three issues (Green; Gandel et al.; Lembke et al.; Kobulnicky et al.). These surveys included personnel from several institutions of higher education.
Moskal and Dziuban (2001) reported the results of a survey specific to 38 faculty who taught web-enhanced or web-based courses at a single university. Eighty-five percent of participants reported that teaching online requires additional investments of time. Faculty participants' advice to other faculty "included the importance of preparation (30%), technical support (16%), technology knowledge (16%), clearly defined course design (8%) and goals (8%), significant time demands, and commitment (8%)" (Moskal & Dziuban, p. 178). These responses affirm the findings of the Educause and Campus Computing Project surveys, which drew on a broader sample of higher education professionals, including administrators. Both faculty and administrators identify a need for support and preparation to implement online courses.
In response to this need, several organizations in higher education have published guidelines and benchmarks of quality for distance learning (American Council on Education, 2001; Higher Education Program and Policy Council of the American Federation of Teachers, 2000; Institute for Higher Education Policy [IHEP], 2000). These guidelines have focused on course development in relation to the organizational infrastructure. In a critique of these guidelines, the Pew Learning and Technology Program stated that "What is missing is a process of quality assurance aimed at the course level" (Twigg, 2001, p. 14). The Principles of Online Design and the Online Design Checklist were developed to focus on the course level. Referring to the IHEP principles, Twigg noted that "These principles of good practice are basically process-oriented and resemble current accreditation practices. How do we know the institutions and organizations in fact apply them?" (Twigg, p. 9). The development of principles aimed at the application of good practice in the design of online courses has been the focus of ongoing work at Florida Gulf Coast University (FGCU).
DEVELOPMENT OF THE PRINCIPLES OF ONLINE DESIGN
A multidisciplinary collaborative process was used to survey and articulate best practices in online course design at FGCU. Initially, a study team composed of faculty, instructional designers, and media support professionals reviewed the literature to identify guiding principles and best practices for developing courses for the Web. The Principles of Online Design described benchmarks of quality for the development and continuous improvement of online instruction. The first iteration of the principles yielded a lengthy and linear document that was published on the Web with links to websites that served as examples of the principles. Although this document was thorough, usability was a concern.
During 2000-2001, a new study group at FGCU determined that faculty might be more inclined to use the principles if they were presented in a more concise and interactive format. This study group concluded that the linear format of the principles did not foster user control and interactivity. Hence, a second iteration of the Principles of Online Design, shown in Figure 1, was developed.
Content was arranged into a two-column format with each principle in the left column and examples or links to resources in the right column. The original document used a numbered coding or indexing system to catalogue each principle. In the second iteration, this coding system was simplified. Internet links to examples within the principles were also repaired or removed in the second iteration.
[FIGURE 1 OMITTED]
Once the principles were streamlined, the study group turned its attention to assessing the implementation of the principles. In other words, what indicators would provide evidence that the principles were being used to design online courses? To answer this question, the second study group determined that a checklist, designed to correspond with the principles, would expedite their implementation. Hence, a primary goal of the checklist has been to help faculty and course developers review design components more quickly and efficiently. To facilitate usability, each indicator on the checklist was linked to the original text of the Principles of Online Design using the indexing or coding system. In this way, with the click of a mouse, faculty and online course designers could quickly access the more expansive information contained within the Principles of Online Design. This article describes the general design of the Online Design Checklist and the process used to develop it.
DEVELOPMENT OF THE ONLINE DESIGN CHECKLIST
The checklist was created by carefully paring down the Principles of Online Design through numerous collaborative meetings that took place over a period of five months. The checklist was numbered to correspond with the revised indexing system of the Principles of Online Design. Each indexed principle was restated as a concretely identifiable indicator of the principle.
For example, as shown in Figure 2, number 1.1.4 on the checklist states "Educational prerequisites are listed." This links to principle number 1.1.4, which states that "Audience analysis should determine the learners' personal characteristics, intellectual skills, subject knowledge level, and the purpose of taking the course" (McKnight, 2001). The following goals were articulated for the Online Design Checklist:
* to serve as a brief and efficient guide to help faculty incorporate pedagogically sound design principles into online courses under development;
* to be used as a tool to evaluate courses for continuous improvement and redesign; and
* to be used to identify exemplars of quality course design.
During summer 2001, the Online Design Checklist was field-tested with eight instructional technology students at FGCU. Students were asked to review an online course using the checklist. Although the checklist, the principles, and the course are available online, participants used a printed version of the principles and the checklist for the field test. This eliminated the need to toggle among multiple windows to access the principles website, the checklist website, and the course website.
Participants were asked to complete a survey that ranked their level of agreement regarding the usability of the checklist. The following criteria were used:
* ease of using the checklist,
* absence of jargon and ease of understanding,
* clarity of checklist indicators, and
* correspondence between checklist indicators and the principles.
Participants were asked to write their comments and thoughts in the margins as they used the checklist. They were also asked to answer the following questions about the checklist:
* What additional aspects of online design should be included in the checklist?
* How can the checklist be improved?
* As a course developer, would you use the design checklist? Why or why not?
* Will you refer to the Principles of Online Design and the Online Design Checklist as you develop online courses in the future?
Based upon feedback from these student field tests, in October 2001, minor modifications were made to the design checklist and approved by the Department of Course and Faculty Development Advisory Committee. During this same month, the FGCU Faculty Senate approved and recommended use of the Online Design Checklist for peer evaluation of distance learning courses.
In October 2001, the Online Design Checklist was presented during a half-day tutorial at the Association for the Advancement of Computing in Education's (AACE) WebNet 2001 Conference in Orlando, Florida. The checklist and principles were demonstrated to 11 participants, who completed the same surveys. Because conference participants did not have individual access to a computer and an online course to review, and because of time limitations, they observed a demonstration of the checklist rather than completing it themselves. In January 2002, the Board of Trustees of FGCU approved a recommendation from the Faculty Senate that the Online Design Checklist be used for peer evaluation of distance learning courses.
PARTICIPANT FEEDBACK ABOUT THE ONLINE DESIGN CHECKLIST
In this study, both survey data and open-ended questions were used to field-test the checklist. The use of "multiple outcome measures" has been encouraged and demonstrated to be an effective method of evaluation (Moskal & Dziuban, 2001). Seven students participated in a field test in summer 2001. All students were in the instructional technology curriculum at FGCU. Five students had taken at least 10 internet-based courses. All students had used computers for at least two years. Three participants were female and four were male. Student participants took between 50 and 80 minutes to complete a paper-and-pencil version of the checklist. All seven participants completed this part of the field test. Of those seven, five returned the Design Checklist Usability Feedback Survey. Table 1 summarizes these survey data.
All 11 participants in the conference tutorial field-test completed the Design Checklist Usability Feedback Survey. This international group of participants included an online learning systems administrator, a teacher, a human resources consultant, a user support consultant, a professor, a project director, a technology administrator, a technology trainer, and an instructional designer. Two participants did not provide this information. Table 2 summarizes these survey data.
Neutral responses to item four in the conference field test may reflect the fact that participants did not actually use the checklist but observed a demonstration of its use. Sixteen participants across both groups who completed the usability survey also answered the open-ended questions.
[FIGURE 2 OMITTED]
What additional aspects of online design should be included in the checklist? Responses to this question varied only slightly. Ten participants, three students and seven conference participants, reported that they would not add to the checklist. Two students noted that "uniformity in design must be stressed." One conference participant suggested that a "not applicable" choice be added to the checklist. Another conference participant suggested that the checklist alone is enough for users and that the Principles of Online Design are "too complete." This last comment seemed to suggest that use of the checklist without the principles would streamline the process.
How can the checklist be improved? This question yielded the following responses from students: "very thorough and easy to use"; "work on deleting jargon"; "downsize by examining redundant information"; "during a period of time"; and, "design it for resource pages only." Eight of the 11 conference participants did not offer suggestions to improve the checklist. One participant suggested making a "one-page printable version for use during development--very brief descriptors--large font." Another suggested leaving "a place for instructor/reviewer to indicate whether course is blended or stand-alone since data may reflect delivery and methodology." Another participant suggested adding a "hyperlink to examples of each of the quality standards to concrete examples." Although hyperlinks to examples are a part of the Principles of Online Design, apparently this tutorial participant was not aware of these examples or felt there should be more examples. Overall, we found that participants wanted the checklist to be short, concise, and to the point.
As a course developer, would you use the design checklist? Why or why not? To this question, four students responded affirmatively. The student who stated they would not use the checklist suggested that "a standard syllabus template" would eliminate "problems resulting from inconsistencies in online syllabi." All conference participants reported that they would use the design checklist. Comments included, "It helps to ensure a minimum level of usability and quality," "As a developmental tool, it provides a relatively structured approach to bringing educational materials into the online environment. I will use it as an assessment instrument," "Yes, I would use it as a development tool. The examples are very useful," "I will be using online courses in the future and this checklist will give me an outline--good background to make the course more efficient," "It provides a guideline for faculty to help them develop online courses," "very pragmatic," "quite comprehensive," "serves as guideline to ensure that critical components are included in design and that design does comply with educational standards" and "as a rank beginner, the checklist gives me some real hints on what should be used to build a better course online."
Will you refer to the Principles of Online Design and the Design Checklist as you develop online courses in the future? Fifteen participants in the student and conference field tests stated they would refer to the Principles of Online Design, with comments that included "and give to the other academic staff," "I will add them to the resources I have on the subject. I believe we are already using similar information," and "I see the relationship between the checklist and the principles to be the strongest feature of your work, particularly as a development tool. Thus, the online version of the checklist seems to be more useful."
Based upon feedback from these field tests, a revised, third iteration of the Online Design Checklist has been developed. The goal of these revisions was to allow users to view a website while having the checklist available on the same screen. This was accomplished using three frames on the same screen resulting in the design shown in Figure 3.
Frame two displays the checklist throughout the time a user is reviewing an online course and provides space for users to type in a course URL. Entering the URL into frame two brings the course website into frame three, the largest frame, in the center of the screen. This frame enables users to browse through an online course while simultaneously using the checklist. As shown in Figure 4, the ability to view an online course while using the checklist adds significantly to its usability.
Each of the principles in the scrolling Online Design Checklist is hyperlinked to more expansive content that is provided within the Principles of Online Design.
As shown in Figure 5, when these links are selected, information from the Principles of Online Design is displayed in a pop-up window. In addition, hyperlinks to examples are provided within the Principles of Online Design. Overall, this design simplifies the review of an online course and facilitates movement between the online course website, the checklist, and the principles.
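Although the article does not reproduce the FGCU markup, the three-frame design described above could be sketched as follows. The file names, frame names, anchor target, and pop-up dimensions here are illustrative assumptions, not the actual implementation:

```html
<!-- index.html: illustrative three-frame layout (frame and file names are
     hypothetical, not the actual FGCU markup). Frame one is a small header,
     frame two holds the scrolling checklist with a URL field, and frame
     three displays the course website under review. -->
<html>
<head><title>Online Design Checklist</title></head>
<frameset rows="60,*">
  <frame name="frameOne" src="header.html">
  <frameset cols="300,*">
    <frame name="frameTwo" src="checklist.html">
    <frame name="frameThree" src="about:blank">
  </frameset>
</frameset>
</html>

<!-- checklist.html (excerpt): submitting the form loads the typed URL into
     frame three, and each indicator opens the corresponding principle text
     in a pop-up window. -->
<form onsubmit="parent.frames['frameThree'].location.href = this.url.value;
                return false;">
  Course URL: <input type="text" name="url">
</form>
<a href="principles.html#p114"
   onclick="window.open(this.href, 'principle', 'width=450,height=350');
            return false;">
  1.1.4 Educational prerequisites are listed.
</a>
```

With this arrangement, the checklist in frame two stays visible while the course loads into frame three, which matches the behavior described for Figures 3 through 5.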
ASSESSING ONLINE COURSE DEVELOPMENT
Participants in these field tests provided validation of the checklist's utility and insights that guided its refinement. Overall, participants agreed or strongly agreed that the checklist was easy to use and easy to understand. During 2001 the Principles of Online Design were presented at five international conferences and the Online Design Checklist was presented at two international conferences. The reception for these faculty support and development tools has been further demonstrated through web usage tracking. The Principles of Online Design website continues to receive an increasing number of visits. Between October 2001 and October 2002, unique visits to this website grew from 1,200 to over 2,200 per month. E-mail correspondence from visitors is positive, and requests to link to these websites are frequent. Recently, the Principles of Online Design were peer reviewed at the Multimedia Educational Resource for Learning and Online Teaching (MERLOT) website (MERLOT, 2002), receiving an excellent overall rating. Feedback from this peer review has been incorporated into the design of the Online Design Checklist. In accordance with reviewer suggestions, a review of recent literature is underway to update the original bibliography.
[FIGURE 3 OMITTED]
[FIGURE 4 OMITTED]
This reception seems to indicate that the Principles of Online Design are a valuable tool for identifying benchmarks of quality aimed at the course level. The Online Design Checklist was developed to facilitate implementation of the principles by providing an interactive, abbreviated, and useful format. The Online Design Checklist can be found at http://www.fgcu.edu/onlinedesign/newchecklist/. The checklist provides concise indicators that link to the Principles of Online Design at http://www.fgcu.edu/onlinedesign/. In light of the broadly reported need to support faculty in the use of technology (Green, 2001; Gandel et al., 2000; Lembke et al., 2001; Kobulnicky et al., 2002; Moskal & Dziuban, 2001) and the need to focus on benchmarks of quality aimed at the course level (Twigg, 2001), the Online Design Checklist offers a means to assist faculty and developers in the design and evaluation of quality web-based and web-enhanced courses.
[FIGURE 5 OMITTED]
Table 1. Student Participant Responses to Design Checklist Usability Survey (N=5)

                                                   Strongly                             Strongly
                                                   Agree    Agree   Neutral  Disagree   Disagree
1. The design checklist was easy to use.             20%     80%
2. The design checklist was easy to
   understand (no or little jargon).                 20%     80%
3. Quality indicators were clear and
   easy to understand.                               40%     60%
4. The design checklist corresponded with
   the Principles of Online Design.                  40%     60%

Table 2. Conference Participant Responses to Design Checklist Usability Survey (N=11)

                                                   Strongly                             Strongly
                                                   Agree    Agree   Neutral  Disagree   Disagree
1. The design checklist was easy to use.                    100%
2. The design checklist was easy to
   understand (no or little jargon).                        100%
3. Quality indicators were clear and
   easy to understand.                               64%     36%
4. The design checklist corresponded with
   the Principles of Online Design.                  27%     36%     36%
American Council on Education. (2001). Distance learning evaluation guide. Retrieved on October 18, 2001 from: http://www.acenet.edu/calec/publications.cfm?pubID=109
Gandel, P., & The Educause Current Issues Committee. (2000). Top IT challenges for 2000. Educause Quarterly, 2, 11-16.
Green, K. (2001). The 2001 national survey of information technology in US higher education. Retrieved on July 10, 2002 from: http://www.campuscomputing.net/nav.html
Higher Education Program and Policy Council of the American Federation of Teachers (2000, May). Distance education: Guidelines for good practice. Washington, DC: Author.
Institute for Higher Education Policy (2000, April). Quality on the line: Benchmarks for success in internet-based distance education. Washington, DC: Author.
Kobulnicky, P., Rudy, J.A., & The Educause Current Issues Committee (2002). Third annual EDUCAUSE survey identifies current IT issues. Educause Quarterly, 2, 8-21.
Lembke, R.L., Rudy, J.A., & The Educause Current Issues Committee (2001). Top IT challenges for 2001. Educause Quarterly, 2, 4-19.
McKnight, R. (Ed.) (2001). Online design checklist. Retrieved on November 21, 2002 from: http://www.fgcu.edu/onlinedesign/newchecklist/
Multimedia Educational Resource for Learning and Online Teaching (MERLOT) (2002). Design principles for on-line instruction. Retrieved on November 21, 2002 from: http://www.merlot.org/artifact/ArtifactDetail.po?oid=1400000000000001409
Moskal, P.D., & Dziuban, C.D. (2001). Present and future directions for assessing cybereducation: The changing research paradigm. In L.R. Vandervert, L.V. Shavinina, & R.A. Cornell (Eds.), Cybereducation: The future of long-distance learning. Larchmont, NY: M.A. Liebert.
Nakatani, K., Edwards, N., & Zhu, E. (2001). Development of interactive internet-based learning environments: Just-in-time instruction and effective course management. Proceedings of the International Association for Computer Information Systems, 2, 323-329.
Rodriguez, W., Fornaciari, C.J., Wynkoop, J., Harrington, T., Nakatani, K., Ruiz-Torres, A., Johnson, D., Boggs, R., & Pendergast, M. (2001). Information technology strategies for internet-based education. Journal of Informatics Education and Research, 3(1), 27-42.
Twigg, C. (2001). Quality assurance for whom? Providers and consumers in today's distributed learning environment. Retrieved on November 21, 2002 from: http://www.center.rpi.edu/pewsym/mono3.html
Zhu, E., & McKnight, R. (Eds.) (2001). Principles of online design. Retrieved on November 21, 2002 from: http://www.fgcu.edu/onlinedesign/
ROBERTA MCKNIGHT, HEALTHCARE MULTIMEDIA DESIGN, USA
Publication: International Journal on E-Learning
Date: Jan 1, 2004