
Undergraduate student online learning readiness.

INTRODUCTION

According to the "Learning on Demand: Online Education in the United States, 2009" survey completed by the Babson Research Group (Sloan Consortium, 2010), more than 4.6 million students were taking at least one online course during the fall 2008 term. This represents a growth rate of 17 percent for online enrollments, compared with a growth rate of just 1.2 percent for the overall higher education student population.

This higher growth rate is due in part to students wanting to save on fuel costs, the promotion of more online curricula, and working adults who cannot attend traditional university classes (Dosch, 2010; Li-An et al., 2010). While this growth is welcomed by most universities, it is a concern if these students are not prepared for online learning and are unable to progress in their curricula. Lack of student progression may result in lower graduation rates and negative student comments, which in turn could mean lower funding levels for universities.

Student online learning readiness also affects most institutions at every level, from curricular development and pedagogy to entire academic divisions dedicated to Web-specific delivery. Measuring student readiness, it follows, should be of great concern to institutions facing this challenge. Smith (2003, 2005) administered the McVay Online Readiness Survey (2000, 2001) to 314 Australian university students and used factor analysis to identify two primary factors and to assess the reliability and factorability of the instrument.

This study seeks to replicate the study by Smith (2005) using the same McVay questionnaire but with American students. The McVay survey could serve as a useful tool for determining the extent of student readiness for online learning and for learning in a collaborative environment (Smith, 2005).

LITERATURE REVIEW

According to Lever, Gerlich and Pearson (2006), there are no significant differences between traditional and online course offerings when conveying procedural knowledge. Other studies have shown marginal differences between the two modalities, but Anitsal, Anitsal, Barger, Fidan and Allen (2008) found that this may have more to do with differences in student personality profiles. Their findings indicated that students taking on-ground classes valued thinking less and therefore favored the instructor-led model over online formats because of the increased individual attention. Additionally, on-ground students tend to be extroverted and place a high value on face-to-face interaction. The assumption that placing learners in situations relying on less personal forms of communication, such as email, might adversely affect group-based outcomes, especially in online courses, seems to have merit. However, Santhanam (2001) found no negative correlation between such communications and group outcomes. Santhanam (2001) also concluded that "students have a strong preference for having face-to-face interactions" (p. 14).

Halawi, Pires, and McCarthy (2009) used an exploratory study to evaluate e-learning on the basis of Bloom's Taxonomy, a proven model that describes an individual's level within the cognitive domain. Halawi's study used two categories of factors: individual factors, such as gender, age, educational level, familiarity, time dedicated to study, and learning style; and instructional factors, including the effectiveness of the tools used, interaction with the professor, and ease of use of technology. The conclusion was that these factors, including gender, had no significant effect on e-learning. Further, Anakwe (2008) concluded that there were no significant differences in learner performance between online and paper testing when using gender as a variable.

Tanner et al. (2008) performed a study to determine faculty and student perceptions of online learning. They found that students believed online classes required more self-discipline, at a "significantly higher level of agreement than did faculty responses" (p. 34). Students also believed that online classes required them to be more actively involved and to "teach themselves" (p. 36).

When selecting a college, students may evaluate potential programs by mode of delivery as one search criterion, along with cost, location and employment prospects. Kulchitsky (2008) categorized students into one of two groups: risk-sensitive and cost-sensitive. The risk-sensitive student "appeared to be most concerned with issues of 'how' they will receive their education," whereas the cost-sensitive group is concerned almost entirely with cost (pp. 164-165). This research deals with the risk-sensitive question of how ready students are for an online class.

Online readiness was also examined by Ho et al. (2010), whose research examined e-learning system quality, e-learning readiness, e-learner competency and learning outcomes. Using structural equation models, Ho et al. (2010) showed that the perceived quality of an online learning system and e-learning readiness are positive and important factors in helping students achieve their goals, increase skill development, and produce higher learning satisfaction. Similar results were found by Park and Wentling (2007).

Smith (2003, 2005) administered the McVay Online Readiness Survey (2000, 2001) to 314 Australian university students and used factor analysis to identify two primary factors and to assess the reliability and factorability of the instrument. The instrument contains thirteen items, and participants recorded their responses on a 4-point Likert scale: Rarely (1), Sometimes (2), Most of the time (3), and All of the time (4); see the Appendix. This survey has been used in several studies (Smith, 2005; Smith, Murphy and Mahoney, 2003; Atkinson, 2008), and its reliability and validity were established by a study performed specifically on the instrument (Smith, Murphy, & Mahoney, 2003). Prior to this study, the McVay survey used by Smith (2003, 2005) had only been administered to Australian students. This study attempts to replicate the Smith study with American students to see whether the same measures of reliability can be achieved. Additional analysis examines whether there are differences between urban and rural online learners and between male and female online learners.

The two factors identified by Smith (2005) were self-management of learning and comfort with e-learning. The authors of this study propose that the second factor be renamed "comfort with non-face-to-face communication" because this better reflects the questions that loaded highly on it. Before classifying students as rural or urban based upon postal code, the authors felt the debate over what constitutes urban and rural areas was pertinent and had to be addressed. For the 2000 census, the Census Bureau defines urban as

"all territory, population, and housing units located within an urbanized area or an urban cluster. It delineates urban area and urban cluster boundaries to encompass densely settled territory, which consists of:

1. Core census block groups or blocks that have a population density of at least 1,000 people per square mile, and

2. Surrounding census blocks that have an overall density of at least 500 people per square mile".

The Census Bureau's classification of rural is "all territory, population, and housing units located outside of urban areas and urban clusters." Urban or rural therefore has less to do with how far from an urban center one lives and more to do with the density of the population in the immediate surrounding area. For purposes of comparison in this paper, we used the students' self-reported classifications. All zip codes collected from students would have indicated a rural classification had the Census Bureau's definition been applied.
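For illustration only (the study relied on students' self-reports rather than on this calculation), the quoted density thresholds can be expressed as a small classifier in Python; the function and its inputs are hypothetical, not part of the study.

# Hypothetical sketch of the Census Bureau density thresholds quoted above.
# The authors classified respondents by self-report, not by this rule.
def classify_area(core_density_per_sq_mile: float, surrounding_density_per_sq_mile: float) -> str:
    """Return 'urban' if the densities meet the 2000 census thresholds, else 'rural'."""
    if core_density_per_sq_mile >= 1000 and surrounding_density_per_sq_mile >= 500:
        return "urban"
    return "rural"

print(classify_area(1200, 650))  # urban
print(classify_area(300, 150))   # rural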

OBJECTIVES OF THE STUDY

This paper builds upon the prior work of Smith (2003, 2005) to determine whether the McVay Online Readiness Survey still produces the same factor loadings when administered four years later to a group of American students. The survey is also used to determine whether there are any distinctions between urban and rural students on the survey factors: self-management of learning and comfort with e-learning.

METHODOLOGY

Sample and Data Collection

The McVay Online Readiness Survey was administered to 146 U.S. undergraduate students at a mid-sized, public university in the southeastern United States. The instrument was previously used in 2000 and 2001, when Smith (2005) administered the survey to 314 Australian students. Question 4 on the instrument was changed from "I am willing to dedicate 8 - 10 hours per week for my studies" to "I am willing to dedicate the necessary time per week for my studies." The reason for this change was that a specified time range is questionable given the variability in learners' cognitive ability and course load. The designer of the instrument was contacted and gave full permission for the change. The questionnaire was developed electronically, and students were asked to go to the survey hosting website and complete it. Because one question was changed, the authors wanted to verify that the questions continued to load on the same factors and to see whether a different population of students would affect the factor loadings. Table 1 lists the factor loadings from the previous research and from this study.

THE RESULTS OF THIS STUDY

Factor Analysis

Since this study was meant to replicate Smith (2005), the same type of factor analysis was performed: principal components with varimax rotation. A factor-loading threshold of .4 was adopted because this threshold was used in the previous research; Smith (2005) noted that a cutoff of .4 is more stringent than that required by other researchers, who recommend a value of .32 or higher. If an item loaded highly on both factors, it was assigned to the factor on which it loaded highest. Using a scree test, a two-factor solution was selected because it accounted for 47.2% of the variance and is consistent with the findings of Smith (2003). The loadings are shown in Table 1.
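The article itself reports no analysis code. As a rough sketch of the procedure just described, the following Python snippet assumes the responses sit in a hypothetical CSV file with one column per questionnaire item (q1 through q13, coded 1 to 4) and uses the factor_analyzer package; file and column names are illustrative only, not the authors' own.

# Sketch of the factor-analytic procedure described above (not the authors' actual code).
# Assumes a hypothetical CSV with columns q1..q13 holding the 1-4 Likert responses.
import pandas as pd
from factor_analyzer import FactorAnalyzer

items = pd.read_csv("mcvay_responses.csv")[[f"q{i}" for i in range(1, 14)]].dropna()

# Principal-components extraction with varimax rotation; two factors per the scree test.
fa = FactorAnalyzer(n_factors=2, rotation="varimax", method="principal")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns, columns=["Factor 1", "Factor 2"])
print(loadings.round(2))

# Retain an item on a factor only if its loading is .4 or higher (Smith, 2005);
# an item loading highly on both factors is assigned to the factor it loads highest on.
assigned = loadings.abs().idxmax(axis=1).where(loadings.abs().max(axis=1) >= 0.4)
print(assigned)

# Proportion of variance accounted for by each rotated factor.
_, proportion, cumulative = fa.get_factor_variance()
print("Variance proportions:", proportion.round(3), "cumulative:", cumulative.round(3))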

Factor Interpretation

The following items loaded highly on Factor 1: questions 4, 6, 8, 9, 10, 11, and 13. This is slightly different from the previous study, in which items 4, 8, 9, 10, 11, 12, and 13 loaded highly. In the previous study question 6 did not meet the loading threshold of .4, while in this study its loading was .52. This may be due to cultural differences between Australian and United States students. The higher loading for this item may also indicate that students have become more comfortable with the self-management of their learning and with communication in a non-face-to-face environment, which would account for why the item loaded highly on both factors. Also, question 12 loaded highly on Factor 1 in the Australian study but did not meet the threshold for Factor 1 in this study; it did load highly on Factor 2 in this study. This may indicate a cultural tendency in American society to associate working independently with non-face-to-face communication. Since the questions that loaded highly on this first factor have remained largely the same, we believe the factor name proposed by Smith (2005) should remain "self-management of learning."

The following items loaded highly on Factor 2: questions 1, 2, 3, 5, 6, 7, and 12. In the previous study by Smith (2005) only items 1, 2, 3, and 5 loaded highly. Because item 6 loaded highly on both factors, it is difficult to say with which factor it should be associated. The significant difference in this study is the loadings of questions 7 and 12 on Factor 2, which were not very high in the previous study by Smith (2005). Question 7 deals with written communication, and its higher loading may be attributed to students' comfort with texting on their cell phones; the authors can only assume that cell phones and texting were not as prevalent when Smith (2005) conducted his study, which would explain the low factor loading for item 7. Since the questions that loaded highly on Factor 2 dealt more with communication and less with e-learning, the authors felt a more appropriate name for this factor was "comfort with non-face-to-face communication" rather than the "comfort with e-learning" proposed by Smith (2005).

Factor 1 accounted for 26.59% of the variance and Factor 2 accounted for an additional 20.58%, for a total of 47.17% of the variance. This is notably higher than the total variance of 42.2% accounted for in the Smith (2005) study.

Factor Comparisons

Once the factors were determined, a score for each factor was calculated for each respondent. Table 2 lists the means for these factors in this study.

The factor score was computed by averaging the student's responses to the questions that loaded highly on each factor. Factor 1 was composed of questions 4, 6, 8, 9, 10, 11, and 13. Factor 2 was composed of questions 1, 2, 3, 5, 6, 7, and 12.
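Continuing the same hypothetical data layout used in the earlier sketch, this scoring step could be written as follows; the file and column names are assumptions, not the authors' own.

# Factor scores as the mean response across the items assigned to each factor.
import pandas as pd

responses = pd.read_csv("mcvay_responses.csv")  # hypothetical file from the earlier sketch

factor1_items = ["q4", "q6", "q8", "q9", "q10", "q11", "q13"]  # self-management of learning
factor2_items = ["q1", "q2", "q3", "q5", "q6", "q7", "q12"]    # comfort with non-face-to-face communication

responses["f1_score"] = responses[factor1_items].mean(axis=1)
responses["f2_score"] = responses[factor2_items].mean(axis=1)

print(responses[["f1_score", "f2_score"]].describe())  # N, mean, std, min, max as summarized in Table 2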

A comparison between self-reported urban and rural students was made on each factor. Table 3 lists the descriptive statistics for these groups. A one-way Analysis of Variance (ANOVA) was performed, and no significant difference was found between the groups on either factor.

A comparison between male and female students was also made on each factor. Table 4 lists the descriptive statistics for these groups. A one-way ANOVA was performed, and no significant difference was found between the groups on either factor.
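Assuming the same hypothetical data file also records self-reported location (urban or rural) and sex, these group comparisons could be sketched with scipy's one-way ANOVA; again the column names are illustrative only.

# One-way ANOVA of each factor score across the self-reported groupings.
import pandas as pd
from scipy import stats

responses = pd.read_csv("mcvay_responses.csv")  # hypothetical file; see the earlier sketches
responses["f1_score"] = responses[["q4", "q6", "q8", "q9", "q10", "q11", "q13"]].mean(axis=1)
responses["f2_score"] = responses[["q1", "q2", "q3", "q5", "q6", "q7", "q12"]].mean(axis=1)

for score in ["f1_score", "f2_score"]:
    for group_col in ["location", "sex"]:  # assumed grouping columns (urban/rural, male/female)
        groups = [g[score].dropna() for _, g in responses.groupby(group_col)]
        f_stat, p_value = stats.f_oneway(*groups)
        print(f"{score} by {group_col}: F = {f_stat:.2f}, p = {p_value:.3f}")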

IMPLICATIONS

Comparing the factor loadings with previous research indicates that all of the questions have high factor loadings and that no questions need to be added to or deleted from the survey instrument. Thus the survey is thought to have high reliability and validity.

Based on the comparison between urban and rural students, there does not appear to be a difference between these groups on the survey factors. Rural students do not appear to be at a disadvantage from not having or using the same communication technologies, such as Internet access and texting, as urban students. There also appears to be no difference between male and female students on these factors. In the past, female students may have been at a disadvantage due to lower use of and comfort with technology. Since there are no differences between these groups on the Self-Management of Learning and Comfort with Non-Face-to-Face Communication factors, universities can rule out group differences should they begin using the McVay Online Readiness Survey to determine whether students are ready to take an online course.

DISCUSSION AND CONCLUSIONS

All of the survey questions had high factor loadings, so there is no need to rework the questions as Smith indicated. A second interesting finding is that question 6 now loads highly on both factors. Given the length of time students have used the Internet, they may now feel their background and experience help them with Factor 1 (self-management of learning); for the same reasons, students may feel that their background and experience benefit their studies because they are comfortable with non-face-to-face communication (Factor 2). This paper helps validate the accuracy of the McVay readiness survey; the next step is to determine whether the factors "self-management of learning" and "comfort with non-face-to-face communication" can be used as predictors of success in an online course.

Research is inconclusive at this point as to the critical success factors for effective student learning in online environments. Halawi, Pires, and McCarthy (2009) took a traditional and well-recognized approach to evaluating e-learning by using Bloom's taxonomy. Their methodology was to distribute surveys to undergraduate students in an attempt to ascertain whether individual and instructional factors influence the learning process. The questions they used were similar in nature to those in the McVay approach and were categorized into three indices. The first index, Individual Factors, covered time dedicated to study, knowledge of computers, and motivation. The second index, Instructional Factors, measured the effectiveness of the tools used, interaction with the professor, and ease of use of technology. The third index, Learning Through WebCT, consisted of nine items based on learning through WebCT; however, only three factors were found and considered for further analysis. They concluded that individual and instructional factors do not play a major role in students' learning process. Cao, Griffin, and Bai (2009) started with the assumption that a synchronous component is necessary in any distance learning course and concluded that improving students' satisfaction with the synchronous aspect of a course raised overall satisfaction with the course. One criticism of this work is that it relies on studies from the mid-1990s to support claims that direct interaction increases student satisfaction; this might be countered by the more recent innovations available to instructors that strengthen the constructivist approach to learning. When considered alongside the idea that learner motivation is relevant to learning effectiveness, the likelihood that a synchronous component is critical to satisfaction is diminished, because learners tend to equate the usefulness of the content to their success in the workplace (Womble, 2008).

LIMITATIONS AND FURTHER RESEARCH NEEDS

The items in the McVay Online Readiness Survey have been shown to load consistently and highly on two factors, and the survey should be used in conjunction with pre-test and post-test assessment instruments in online courses in an effort to further validate its effectiveness. Additional validation with a greater number of American students and with students from other countries is also needed to allow for further generalization. The next step is to see whether the factors "self-management of learning" and "comfort with non-face-to-face communication" are good predictors of student performance in an online course.

Appendix

Modified McVay Online Readiness Questionnaire

Rate your agreement with each statement by circling the number that best matches your feelings.

1 = Rarely, 2 = Sometimes, 3 = Most of the time, 4 = All of the time.

If you don't believe that you can categorize your response as a "3" or "4", you may wish to reconsider your decision to pursue an electronic course. Please discuss your responses to these questions with the contact person for the program in which you are interested.

The Readiness for Online Courses survey is taken with permission from How to be a successful distance learning student: Learning on the Internet written by Marguerita McVay.
1. I am able to easily access the Internet as needed for my studies.

2. I am comfortable communicating electronically.

3. I am willing to actively communicate with my classmates and instructors electronically.

4. I am willing to dedicate the necessary time per week for my studies.

5. I feel that online learning is of at least equal quality to traditional classroom learning.

6. I feel that my background and experience will be beneficial to my studies.

7. I am comfortable with written communication.

8. When it comes to learning and studying, I am a self-directed person.

9. I believe looking back on what I've learned in a course will help me to remember it better.

10. In my studies, I am self-disciplined and find it easy to set aside reading and homework time.

11. I am able to manage my study time effectively and easily complete assignments on time.

12. As a student, I enjoy working independently.

13. In my studies, I set goals and have a high degree of initiative.

Additional Questions:

If you do not use the Internet, please indicate the reason or reasons
why? --

If you use the Internet, for what purpose(s) do you primarily utilize
it? --

-- Email -- surf the Web -- buy/sell -- entertainment
-- Other (please specify)

Please discuss your experiences with using the Internet,
both positive and negative.--

Would access to broadband (high-speed) Internet affect your use of the
Internet?

Do you live within a city's incorporated boundaries or in the county?
City --  County --
Zip Code: --

Age range: Under 18    --
           18-24       --
           25-34       --
           35-44       --
           45-54       --
           55-64       --
           65 or over  --

Sex: M --  F --

Income Range:

Under $10,000 annually
$10,000 to $24,999
$25,000 to $49,999
$50,000 to $74,999
$75,000 to $99,999
Over $100,000 annually

Race: White --
Black --
American Indian --
Asian --
Other --


REFERENCES

Anitsal, M., Anitsal, I., Barger, B., Fidan, I., & Allen, M. (2008). Student evaluations of course attributes of online courses versus on-ground courses: Impact of student personality traits. Allied Academies International Conference. Academy of Marketing Studies. Proceedings, 13(1): 1-8.

Atkinson, J.K. (Spring 2008). Rural adult learning in the absence of broadband internet. The Journal of Learning in Higher Education, 4(1).

Cao, Q., Griffin, T.E., & Bai, X. (2009). The importance of synchronous interaction for student satisfaction with course web sites. Journal of Information Systems Education, 20(3): 331-338.

Dosch, J. (2010). Teaching management accounting online. Cost Management, 24(2): 44-48.

Halawi, L., Pires, S., & McCarthy, R. (2009). An evaluation of e-learning on the basis of Bloom's taxonomy: An exploratory study. Journal of Education for Business, 84(6): 374-380.

Ho, L.-A., Kuo, T.-H., & Lin, B. (2010). Influence of online learning skills in cyberspace. Internet Research, 20(1): 55-71.

Kulchitsky, J.D. (2008). High-tech versus high-touch education: Perceptions of risk in distance learning. The International Journal of Educational Management, 22(2): 151-167.

Lever, J., Gerlich, N., & Pearson, T. (2006). Market segmentation for online courses in the college of business. Academy of Marketing Studies Journal, 10(2): 95-105.

McVay, M. (2000). How to be a successful distance learning student (2nd ed.). Needham Heights, MA: Pearson Publishing.

McVay, M. (2001). How to be a successful distance education student: Learning on the Internet. New York, NY: Prentice Hall.

Park, J.H., & Wentling, T. (2007). Factors associated with transfer of training in workplace e-learning. Journal of Workplace Learning, 19(5): 311-329.

Santhanam, R. (2001). The effects of communication modality on outcomes of collaborative tasks. Information Technology and Management, 2(4): 377-394.

Sloan Consortium; New Study: Online Education up 17% to 4.6 Million. (2010, February). Education Letter, 12. Retrieved June 7, 2010, from Career and Technical Education (Document ID: 1953996241).

Smith, P. J. (2005). Learning preferences and readiness for online learning. Educational Psychology, 25(1): 3-12.

Smith, P.J., Murphy, K.L., & Mahoney, S.E. (2003). Identifying factors underlying readiness for online learning: An exploratory study. Distance Education, 24: 57-68.

Tanner, J., Noser, T., & Totaro, M. (2008). Business faculty and undergraduate students' perceptions of online learning: A comparative study. Journal of Information Systems Education, 20(1): 29-40.

Womble, J. (2008). The relationship among learner satisfaction, self-efficacy, and usefulness. The Business Review, Cambridge, 10(1): 182-188.

About the Authors:

Ray J. Blankenship (Ph.D. in Business Administration) is an Associate Professor of Computer Information Systems in the Gordon Ford College of Business at Western Kentucky University. Dr. Blankenship has been teaching and consulting in Information Technology for over 20 years and is the author or coauthor of numerous refereed articles related to the pedagogy of Information Systems.

J. Kirk Atkinson is an Assistant Professor in the Gordon Ford College of Business at Western Kentucky University and teaches undergraduate courses in Principles of Management Information Systems, Systems Management, and Oracle database courses. He holds a B.S. in Information Systems from Western Kentucky University, an M.B.A. from Murray State University, and an Ed.D. in Adult & Higher Education from Ball State University, and has more than 20 years of experience in the private sector. His research interests include adult technology learning methods, determinants of online learning readiness, and how various tools affect the learning effectiveness of college students.

Ray Blankenship

J. Kirk Atkinson

Western Kentucky University
Table 1
Factor Loadings of Questionnaire Items

                                                Factor 1           Factor 2
                                              2005     2009     2005     2009
                                              study    study    study    study

Eigenvalue                                    3.21     3.46     2.28     2.68
% of Variance                                 24.61%   26.59%   17.53%   20.58%

Questionnaire Item

1. I am able to easily access the
   Internet as needed for my studies.          .03      .07      .62      .49

2. I am comfortable communicating
   electronically.                             .02      .07      .87      .80

3. I am willing to actively communicate
   with my classmates and instructors
   electronically.                             .09      .26      .78      .66

4. I am willing to dedicate the necessary
   time per week for my studies.
   (Previously: I am willing to dedicate
   8-10 hours per week for my studies.)        .52      .67      .16      .18

5. I feel that online learning is of at
   least equal quality to traditional
   classroom learning.                         .07      .02      .56      .43

6. I feel that my background and
   experience will be beneficial to my
   studies.                                    .38      .52      .33      .54

7. I am comfortable with written
   communication.                              .36      .29      .25      .66

8. When it comes to learning and studying,
   I am a self-directed person.                .75      .64      .00      .12

9. I believe looking back on what I've
   learned in a course will help me to
   remember it later.                          .42      .52      .13      .14

10. In my studies, I am self-disciplined
    and find it easy to set aside reading
    and homework time.                         .78      .78     -.02      .26

11. I am able to manage my study time
    effectively and easily complete
    assignments on time.                       .68      .70      .09      .13

12. As a student, I enjoy working
    independently.                             .52      .39     -.03      .54

13. In my studies, I set goals and have a
    high degree of initiative.                 .75      .79      .02      .09

Table 2
Factor Descriptive Statistics

Factor                                    N     Minimum   Maximum   Mean     Std. Deviation

F1: Self-Management of Learning           151   1.50      4.00      3.0762   .50469
F2: Comfort with Non-Face-to-Face
    Communication                         154   1.80      4.00      3.1831   .46129
Valid N (listwise)                        146

Table 3
City and County Statistics by Factor

Factor                                    Location   N    Mean     Std. Deviation

F1: Self-Management of Learning           Urban      92   3.2261   .39693
                                          Rural      54   3.1296   .54966
F2: Comfort with Non-Face-to-Face         Urban      92   3.1214   .47147
    Communication                         Rural      54   3.0000   .57004

Table 4
Male and Female Statistics by Factor

Factor                                    Sex      N    Mean     Std. Deviation

F1: Self-Management of Learning           Male     72   3.0463   .53531
                                          Female   79   3.1034   .47689
F2: Comfort with Non-Face-to-Face         Male     72   3.1861   .40326
    Communication                         Female   82   3.1805   .50928