Impact of technology on the content and nature of teaching and learning.
Key Words Distance Learning--Learning Styles--Motivation--Online Education--Nursing Education Research
ALTHOUGH DISTANCE EDUCATION is increasingly important to graduate programs in nursing, little research has been conducted on the impact of technology on teaching and learning. In addition, few studies have examined the synergistic effects of more than one technology with particular student groups. THIS ARTICLE REPORTS ON A PILOT STUDY TO EXAMINE LEARNING STYLES AND MOTIVATION AND THEIR RELATION TO OUTCOMES IN TWO ONLINE COURSES IN A GRADUATE NURSING PROGRAM. THE RESEARCH WAS DESIGNED TO MEET THE FOLLOWING FOUR OBJECTIVES:
* Examine learner characteristics and student motivations in computer-mediated courses
* Identify the skills students need to use the technology effectively
* Describe the characteristics of adult learners in two online cohort courses
* Examine the relationships among learning style, motivation, use of technology, and course outcomes
Background Excelsior College, a pioneer in distance and lifelong learning for three decades, offers a web-based graduate nursing program with a focus on clinical systems management and health care informatics. A needs assessment conducted in 1997 identified the need for a distance-based master's program that included nursing informatics education (1). Since the college admitted the first students into the master's program in 1999, nearly 300 students have enrolled, indicating a significant level of interest at a time when traditional campus-based programs are experiencing declining enrollments in their nursing administration programs. The master's program has three phases: the nursing and informatics core component, the clinical systems management component, and a capstone.
A certificate program in health care informatics initiated by the college at the same time as the master's program consists of six online courses, four of which constitute the informatics core component of the graduate program. The certificate program is designed to provide students with the knowledge and skills necessary to assume leadership roles in the use of information management systems. Both programs incorporate educational strategies that include: 1) synchronous and asynchronous collaborative learning and communication, 2) project/activity-based learning in an information-rich environment, and 3) enhanced student-faculty and student-student interaction and feedback.
A system of program evaluation is in place to measure the effectiveness and efficiency of the current master's and certificate programs. This is a carefully designed and well-planned evaluation, established according to program goals and objectives, and capable of demonstrating and documenting measurable progress toward achieving the stated goals (2). The evaluation extends well beyond quantitative measures to capture the many diverse accomplishments of students and faculty and the programs as a whole. The faculty continually use evaluation findings from many sources to improve the curricula and the activities of these programs.
Literature Review Information that takes into account differences among students and how learning styles relate to the use of technology could influence both the choice of technology and how a course is designed. However, Phipps and Merisotis note that "despite the large volume of written material concentrating on distance learning, there is a relative paucity of true, original research dedicated to explaining or predicting phenomena related to distance learning" (3, p. 2). These authors point to factors that influence the characteristics of learners, including gender, age, educational experience, and motivation. They identify three broad measures of educational effectiveness: student outcomes, such as grades and test scores; student attitudes about learning through distance education; and overall student satisfaction with distance education.
Focus groups conducted by Lunyk-Child and colleagues (4) explored student and faculty perceptions of self-directed learning. The results reveal major themes, including the need to define self-directed learning, the struggle for consistency, and the need for ongoing faculty development.
Noting that the demographics for online students are different than for other students, Swan and colleagues (5) point to characteristics that may be prerequisites for success, including computer experience, motivation, and gender. "Gender affected course satisfaction and perceived learning. Women were more likely than men to be satisfied with the courses they took and to report higher levels of learning from them" (5, p. 9). Their research found three factors shown to influence satisfaction, learning, and success: consistency in course design, contact with course instructors, and active discussion.
Methodology for the Pilot Study Questionnaires, interviews, standardized scales, and course assessments were selected or developed as needed to collect the essential data for the pilot study. Only aggregate data were used, and results were available only to the researchers. The proposal was approved by the Excelsior College Institutional Review Board.
SUBJECTS Subjects were recruited among students registering for two online cohort courses in the core component of the program. "Professional Role Development and Ethics" (PRDE) focuses on the advanced role of the nurse and includes theoretical foundations of role and ethics, analysis of advanced nursing roles and competencies, and skills of ethical problem solving. "Informatics and the Health Care Delivery System" (IHCDS) focuses on the history of health care informatics and basic informatics concepts. Students learn about specific information management applications in health care administration, practice, education, and research. These are credit-bearing, academic courses delivered online through a web-based learning platform. Each uses additional technology: an information database or a subject-specific CD-ROM on end-of-life care.
Students received a letter describing the purpose and nature of the research, promising confidentiality, and asking for their cooperation. Those who agreed to participate completed a motivation and learning styles inventory before beginning the two courses and were interviewed at completion. Each course was offered at least twice a year, with enrollment of eight to 20 students each time. Subjects were included until at least 20 students had taken each course.
INSTRUMENTS The pilot study used existing data for comparisons: 1) a data sheet completed at program entry that includes demographic information and questions about motivation/support, use of technology, and writing skills; 2) results from a self-assessment of computer knowledge and skills completed prior to taking the first course in the program; 3) individual course evaluations, including information about satisfaction, support services, and the use of technology; and 4) the course grade, including both participation and specific course assessments.
Other instruments used for program evaluation were developed using or adapting items from the Flashlight Current Standard Inventory Version 1.0. The Western Cooperative for Educational Telecommunications developed and pretested these 500 survey items and interview questions at five partner institutions. The "inventory was subjected to content validity testing: an 18 month series of focus groups culminated in meetings with students, faculty and administrators" (6, p. xi). This led to specific items being included in Version 1.0.
Two instruments that are part of the overall program evaluation plan were used to collect student information: the data sheet completed at enrollment and the course evaluation form. The data sheet was found to be reliable, with Cronbach's alphas from .80 to .88 for subscales asking about demographics, motivation for choosing the program, self-assessment of technology skills, and writing skills. Along with one question on overall program satisfaction, the course evaluation form asks about satisfaction with the way electronic media enhance learning, support services such as online writing assistance and technology, and areas such as library and advisement. Students entering the program also complete an orientation to the learning platform that includes a self-assessment of computer knowledge and skills and exercises in the use of computers and course navigation using the learning platform.
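As an illustration of the reliability coefficients reported above, Cronbach's alpha for a subscale can be computed directly from an item-response matrix using the standard formula. The sketch below uses entirely hypothetical responses to a made-up four-item subscale; it is not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) response matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses (1-7 scale) from five respondents
# to a made-up four-item technology-skills subscale.
responses = [
    [6, 5, 6, 6],
    [4, 4, 5, 4],
    [7, 6, 7, 6],
    [3, 4, 3, 4],
    [5, 5, 6, 5],
]
print(round(cronbach_alpha(responses), 2))  # → 0.95
```

A subscale is conventionally considered acceptably reliable when alpha reaches about .70 or higher, so the .80 to .88 range reported for the data sheet indicates good internal consistency.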
After a review of the literature, discussion with experts, and an examination of a number of learning inventories and questionnaires, the Motivated Strategies for Learning Questionnaire (MSLQ) was selected for the pilot study (7). The MSLQ is a self-report instrument designed to measure student motivational beliefs and learning styles using a Likert-type scale ranging from 1 (not at all true of me) to 7 (very true of me).
The motivation section includes an expectancy component (self-efficacy for learning and performance and control of learning beliefs) and a value component (task value, extrinsic goal orientation, and intrinsic goal orientation). The learning style section includes cognitive and metacognitive strategies (rehearsal, elaboration, organization, critical thinking, and metacognitive self-regulation) and resource management strategies (time/study environment, effort regulation, peer learning, and help seeking).
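Scoring for Likert-type instruments of this kind is typically the mean of a subscale's item ratings, with negatively worded items reverse-scored (8 minus the rating on the 7-point scale). A minimal sketch, using a hypothetical item grouping rather than the actual MSLQ items:

```python
# Subscale score = mean of item ratings; negatively worded items are
# reverse-scored as (8 - rating) on the 1-7 scale.
def score_subscale(ratings, reverse_indices=()):
    recoded = [8 - r if i in reverse_indices else r
               for i, r in enumerate(ratings)]
    return sum(recoded) / len(recoded)

# One student's ratings on a hypothetical four-item effort-regulation
# subscale; items 1 and 3 are negatively worded (e.g., "I give up ...").
ratings = [6, 2, 5, 3]
print(score_subscale(ratings, reverse_indices={1, 3}))  # → 5.5
```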
To assist in discovering student perceptions about the technology, a course completion interview format was developed using a protocol from the Flashlight Project. "The open-ended format of the interview protocol allowed the student to offer information about 'unique uses' of technology that might not be revealed through the structured questionnaire format" (6, p. 3-58). A sample script was developed to elicit opinions about the strengths and weaknesses of the technology in relation to communication and completion of assignments. The script also asked, "What is the most important outcome of this course for you?"
Data Analysis Information from the program entry data sheet and course evaluations was entered into an SPSS database and merged with the MSLQ results, as well as course and computer self-assessment grades. The analysis strategy was specific to the data and the types of questions proposed, with sensitivity to the relatively small sample size. Each question outlined in the proposal was addressed, and the entire process was completed within two years. Interview data were examined separately, and content analysis was conducted to identify common themes.
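The merging step described above can be sketched with pandas in place of SPSS; the data frames and column names below are hypothetical stand-ins for the three data sources.

```python
import pandas as pd

# Hypothetical stand-ins for the entry data sheet, MSLQ results,
# and course grades (column names are illustrative only).
entry = pd.DataFrame({"student_id": [1, 2, 3],
                      "age_group": ["41-50", "51-60", "41-50"]})
mslq = pd.DataFrame({"student_id": [1, 2, 3],
                     "self_efficacy": [5.9, 6.2, 5.1]})
grades = pd.DataFrame({"student_id": [1, 2, 3],
                       "course_grade": ["A", "B", "A"]})

# Merge on the common key so each student contributes one row
# containing all variables from all three sources.
merged = entry.merge(mslq, on="student_id").merge(grades, on="student_id")
print(merged.shape)  # → (3, 4)
```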
Forty-nine students agreed to participate in the study--29 in PRDE and 20 in IHCDS. The majority were female (94 percent) and between 41 and 50 years of age (53 percent). The second largest group (31 percent) were students 51 to 60 years of age. Almost three-fourths of the students (71 percent) were taking one of the first courses in the program. One third listed their current position as manager/supervisor/coordinator and another third as staff nurse. The rest of the group listed informatician, infection control nurse, academic educator, and other. Scores on the computer self-assessment quiz were available for 33 of the participants and ranged from 40 to 92; the mean for all students was 69.2.
The MSLQ scores for individual items under motivation ranged from "I'm confident I can learn the basic concepts taught in this course" (M = 6.51) to "The most important thing for me right now is improving my overall GPA" (M = 3.79). Scores for the groupings under the expectancy (self-efficacy, control of learning beliefs) and value (task value, extrinsic goal orientation, intrinsic goal orientation) components are presented in the Table.
Scores for individual items under learning style ranged from "I go online for this course regularly" (M = 6.58) to "When course work is difficult, I either give up or only study the easy parts" (M = 1.72). The Table presents the scores for the groupings under the cognitive and metacognitive (rehearsal, elaboration, organization, critical thinking, self-regulation) and resource management strategy (study environment, help seeking, peer learning, effort regulation) components.
The final grades for each course were calculated numerically and then converted to letter grades (A, B, C, and F). The grade for PRDE included evaluation of discussion participation, a case study presentation, and a marketing plan team assignment. For IHCDS, evaluation of discussion participation and a final paper analyzing an information management application were combined for the final grade.
At the end of each course, students completed standard evaluations containing 26 statements grouped in five areas using a scale ranging from 1 (strongly disagree) to 5 (strongly agree). The results are as follows: course (M = 4.47, SD = .615); computer (M = 4.30, SD = .703); problem solving (M = 4.26, SD = .797); instructor (M = 4.15, SD = .935); and electronic communication (M = 3.65, SD = .517).
The content of the interviews was coded and the following themes emerged: strengths (flexibility of time and place, ease of use, and positive student interaction); weaknesses (communication issues related to problems with the Learning Space platform and the lack of face-to-face contact and responsiveness; no problems were identified with the assignments); and most important outcomes (content acquisition and job enhancement).
To examine whether learning styles and motivation related to course outcomes, a hierarchical linear regression was performed in which course grade was regressed onto each of the 14 components of the MSLQ entered in four different blocks. The blocks represented resource management strategies, cognitive and metacognitive strategies, value orientation, and expectancy. The MSLQ did not account for a significant portion of variance in course grades (F (14,20) = 1.37, p > .05).
A similar analysis in which computer self-assessment scores were regressed onto MSLQ subscales found similar results (F (14,20) = .527, p > .05). ANOVA models were used to look for a relationship between gender and age (measured on an ordinal scale) and the two criterion variables discussed previously; neither accounted for significant differences. Measures of satisfaction with the online learning experience did not relate to any other variables measured in the study (i.e., MSLQ subscales and experience with web-based instruction).
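The blockwise (hierarchical) regression described above amounts to an incremental F-test on the change in R-squared when a block of predictors is added to the model. The sketch below, using synthetic data and only two of the four blocks for brevity, is an illustration of the technique, not a reproduction of the study's SPSS analysis.

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an OLS fit with an intercept, via least squares."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / (((y - y.mean()) ** 2).sum())

def incremental_f(X_small, X_full, y):
    """F-test for the R^2 change when predictors are added to a nested model."""
    n, k_small, k_full = len(y), X_small.shape[1], X_full.shape[1]
    r2_s, r2_f = r_squared(X_small, y), r_squared(X_full, y)
    df1 = k_full - k_small          # number of predictors added
    df2 = n - k_full - 1            # residual df of the full model
    f = ((r2_f - r2_s) / df1) / ((1 - r2_f) / df2)
    return f, df1, df2

rng = np.random.default_rng(0)
n = 35                               # roughly the usable sample size
block1 = rng.normal(size=(n, 4))     # e.g., resource management subscales
block2 = rng.normal(size=(n, 5))     # e.g., cognitive/metacognitive subscales
grade = rng.normal(size=n)           # synthetic numeric course grade

f, df1, df2 = incremental_f(block1, np.hstack([block1, block2]), grade)
print(f"F({df1}, {df2}) = {f:.2f}")
```

A nonsignificant incremental F, as reported in the study, means the added block did not explain meaningfully more variance in the criterion than the blocks already entered.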
Implications The tools developed for and used in the study can be applied in all courses across a curriculum to enhance evaluation of program effectiveness and student learning. At Excelsior College, funding is being sought for a larger research study to evaluate all components of the curricula, including program outcomes. The results of this pilot study will be used to guide the development of the research protocol to attain better answers to the study questions and refine the data collection components. Additional research should focus on how individuals learn and which technologies might be better suited for specific learning activities. Other institutions engaged in online education are encouraged to use similar methodologies so that comparisons can be made on a broader scale.
It is hoped that this pilot study and future research will lead to an expanded body of knowledge about technologies best suited to specific tasks. It is also hoped that the results will help alter the faculty role to one that combines expertise in the content area with that of learning process designer and manager.
(1.) Nesler, M., Sopczyk, D., Cummings, K., & Fortunato, V. (1998). Nursing informatics needs assessment: Are distance programs needed? Nurse Educator, 23(5), 25-29.
(2.) Peinovich, P., & Nesler, M. (2001). Excelsior College outcomes assessment framework. Albany, NY: Excelsior College.
(3.) Phipps, R., & Merisotis, J. (1999). What's the difference? A review of contemporary research on the effectiveness of distance learning in higher education. Washington, DC: Institute for Higher Education Policy.
(4.) Lunyk-Child, O., Crooks, D., Ellis, R., Ofosu, C., O'Mara, L., & Rideout, E. (2001). Self-directed learning: Faculty and student perceptions. Journal of Nursing Education, 40(3), 116-123.
(5.) Swan, K., Shea, R., Frederickson, E., Pickett, A., Pelz, W., & Maher, G. (2000). Building knowledge-building communities: Consistency, contact and community in the virtual classroom. Albany, NY: SUNY Learning Network.
(6.) Ehrmann, S., & Zuniga, R. (1997). The Flashlight evaluation handbook. Washington, DC: American Association for Higher Education.
(7.) Pintrich, P., Smith, D., Garcia, T., & McKeachie, W. (1991). A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ). Ann Arbor, MI: National Center for Research to Improve Postsecondary Teaching and Learning, University of Michigan.
Patricia A. Edwards, EdD, RN, CNAA, is associate dean and director of the graduate program, School of Nursing, Excelsior College, Albany, New York. This research was supported by a 2001 National League for Nursing Research Grant. The author acknowledges co-investigators Linnea L. Jatulis, PhD, RN, and Mitchell S. Nesler, PhD, as well as research assistants Susan Ambrosy, MS, RN, and Annie Moore-Cox, MS, RN. For more information, contact Dr. Edwards at firstname.lastname@example.org.
Table 1. Motivation and Learning Style Scores

Motivation                     Mean (SD)
Self-efficacy                  5.78 (.859)
Control of learning beliefs    5.39 (1.049)
Task value                     6.10 (.813)
Extrinsic goal orientation     4.39 (1.374)
Intrinsic goal orientation     5.74 (.856)

Learning Style                 Mean (SD)
Rehearsal                      3.96 (1.346)
Elaboration                    5.59 (.867)
Organization                   4.76 (1.568)
Critical thinking              5.26 (1.064)
Self-regulation                4.77 (.717)
Time / study environment       5.12 (.368)
Help seeking                   4.55 (.988)
Peer learning                  4.44 (1.598)
Effort regulation              4.02 (.632)