Emerging educational technology: assessing the factors that influence instructors' acceptance in information systems and other classrooms.

1. INTRODUCTION

Although emerging educational technology usage in the classroom, which is primarily led by information systems (IS) instructors, has increased in recent years, technology acceptance and usage by non-IS instructors continue to be problematic for educational institutions (Baylor & Ritchie, 2002; Gong, Xu, & Yu, 2004; Saunders & Klemming, 2003; Wozney, Venkatesh, & Abrami, 2006). In today's competitive educational environment, emerging educational technologies are required to provide competitive educational services to an increasingly demanding student body (Cheurprakobkit, 2000). Emerging educational technology can be used to provide more flexible approaches to teaching. However, evidence has shown that extensive lecturing continues to be the pedagogical method used most often in IS and other classrooms (Newman & Scurry, 2001). Although general technology usage has increased in the classroom, there is little evidence that these technologies are being integrated into instruction, particularly in the case of non-IS courses (Oncu, Delialioglu, & Brown, 2008).

Emerging educational technology refers to computers and other new electronic technologies that, when applied to educational settings, can be used to significantly change education (Nilson, 2005; Roblyer, 2006). Examples of such emerging educational technology include: a) tools to generate course materials; b) planning and organizational tools for concept mapping and lesson planning; c) electronic research and reference tools; d) tools to support specific content areas; as well as e) tools to record class lectures and notes, and others (Roblyer). The common thread among these emerging educational technologies is that they are now largely enabled by the Internet. Nilson suggested that integrating emerging educational technology into courses may provide new methods for teaching course content and designing educational experiences. It may also improve learning, provide ways of affirming diversity, and facilitate problem solving and creativity (Wozney et al., 2006). According to Hiltz and Turoff (2005), students generally rate courses that integrate emerging educational technology into traditional classroom settings as significant improvements in their educational experience. Neither students nor instructors see emerging educational technology use as automatically benefiting their education, however; it depends on how and why the emerging educational technology is being used within the curriculum (D'Angelo & Woosley, 2007). Although distance learning is very popular, Hiltz and Turoff stated that "research indicates that 10%-20% of students always prefer the face-to-face environment and believe they learn best in that environment" (p. 61).

Unfortunately, to enable higher education institutions to continue to compete, there has been a rush to implement educational technology and to bring courses online quickly; as a result, quality and educational effectiveness have often been of secondary concern (Lightfoot, 2005). Kingsley (2007) suggested that technology in the classroom often ends up being an obstacle, add-on or seemingly unrelated to the current lesson. According to Lightfoot, traditional curricula and emerging educational technology can be integrated successfully, as long as courses are developed with classic educational pedagogy in mind, and the pedagogy drives the choice of technology. Given the annual investment institutions make in emerging educational technology and the critical role instructors play in return on investment, additional research is necessary to more fully examine the factors involved in instructors' acceptance of emerging educational technology and its use in the classroom (Venkatesh, Speier, & Morris, 2002). This study attempted to address such issues by trying to uncover the factors influencing instructors' intention to use emerging educational technology in traditional classrooms.

2. LITERATURE REVIEW

2.1 Technology Acceptance

Extensive research has been conducted investigating the variables associated with technology acceptance in a wide variety of settings (Agarwal & Prasad, 1998; Dillon & Morris, 1996; Taylor & Todd, 1995b). As a result, several theoretical models have been developed to explain both users' intention to use technology, and actual technology use (Agarwal & Prasad; Venkatesh, Morris, Davis, & Davis, 2003). The Technology Acceptance Model (TAM), proposed by Davis (1989), is the classical IS model developed to explain computer-usage behavior and constructs associated with acceptance of technology. The TAM is based on the Theory of Reasoned Action (TRA), which posits that the most significant predictor of behavior is intention (Fishbein & Ajzen, 1975). The TRA is especially helpful regarding behavior, as it asserts that other factors that influence behavior do not do so directly, but only indirectly, through their effect on attitudes, subjective norms, and intentions (Davis, Bagozzi, & Warshaw, 1989). The TAM extends the TRA and suggests that perceived usefulness and perceived ease of use determine an individual's intention to use a system.

Some researchers believe that technology acceptance is more complex than originally thought, and have investigated other variables that influence acceptance (Taylor & Todd, 1995b; Thompson, Compeau, & Higgins, 2006). Although TAM and TRA have strong behavioral elements and predict intention well, they are limited in explanatory power and do not account for other factors that may influence technology acceptance (Sun & Zhang, 2006; Thompson et al.; Venkatesh & Davis, 1996). In a systematic analysis of technology acceptance studies, Sun and Zhang identified three main factors and 10 moderating factors that were associated with technology acceptance models in the literature. From these factors, Sun and Zhang developed an integrative model and corresponding propositions associated with each of the factors. According to Sun and Zhang, it appears that, even though technology acceptance models have received considerable empirical validation and confirmation, acceptance models still have room for improvement. One factor that has led to mixed and inconclusive outcomes in acceptance research is inadequate definition and measurement of constructs (Korukonda, 2006; Moore & Benbasat, 1991; Sun & Zhang).

According to Chau and Hu (2001), behavioral intention, or intention to use (IU) a technology, has long been used as a dependent variable rather than actual use. IU refers to whether one intends to use a technology (Levy & Green, in press). Following a meta-analysis of technology acceptance related studies, Legris, Ingham, and Collerette (2003) indicated that a majority of technology acceptance studies used IU as the dependent variable without measuring actual technology use. According to Venkatesh et al. (2003), as well as Simon and Paper (2007), IU has been found to be a valid predictor of actual technology use, especially when the use of the technology is voluntary. Moreover, Levy and Green indicated that in some contexts IU "appears to be more appropriate dependent variable to measure than actual system use, as system use measure for such system is challenging due to context of the system" (p. 3). Thus, in the context of this study, IU, or instructors' intention to use emerging educational technology, was used as a surrogate predictor of technology use without measuring actual technology use itself.

There are two main themes that are prominent in most technology acceptance models: parsimony and instrumental determinants (Thompson et al., 2006). According to Thompson et al., although these main themes have served the technology adoption stream well, perceived usefulness and perceived ease of use are not the only valid factors related to technology acceptance, especially with newer technologies. Further research into the generalizability of factors associated with technology acceptance and refinement of acceptance models has been recommended (Sun & Zhang, 2006; Thompson et al.). Thus, this work attempted to uncover factors other than perceived usefulness and perceived ease of use that are associated with instructors' acceptance of emerging educational technology in traditional classrooms.

2.2 Educational Technology

Roblyer (2006) defined educational technology as "a combination of the processes and tools involved in addressing educational needs and problems, with an emphasis on applying the most current tools: computers and other electronic technologies" (p. 9). Baek, Jung, and Kim (2006) described emerging educational technology as being simply the latest developments in educational tools, and one of the most exciting areas of change in education. Some of the emerging trends in educational technology include wireless connectivity, merged technologies, handheld devices, high-speed communications, artificial intelligence, and virtual systems (Roblyer). According to Roblyer, these trends represent major changes in the way education is provided. Kingsley (2007) suggested that integrating emerging educational technology into traditional learning environments may improve learning, provide ways of affirming diversity, and facilitate problem solving and creativity. Integrating educational technology, both established and emerging, has also enabled educational institutions to address many of the barriers encountered by those wishing to pursue higher education (Duhaney, 2005).

There are three main categories of technology usage in educational environments: (a) instructional, (b) productivity, and (c) administrative (Roblyer, 2006). Many of the emerging educational technology tools address functional areas such as drill and practice, tutorial, simulation, instructional games, and problem solving (Roblyer). Woods, Baker, and Hopper (2004) surveyed how instructors were using an online learning system (OLS) to supplement their face-to-face courses. Their results indicated that instructors primarily used the OLS system as a non-interactive course management and administrative tool to transact information. According to Bernard et al. (2004), more recent uses of emerging educational technology include supporting constructivist approaches to education and an increased use of collaborative learning. Debevec, Shih, and Kashyap (2006) suggested that usage of emerging educational technology has "dramatically increased to include emerging technology for visual presentation, simulation, accessing course materials and the World Wide Web resources, and interactivity" (p. 293). According to Hiltz and Turoff (2005), traditional face-to-face courses are being moved to online and hybrid courses that use emerging educational technology to deliver course content and support learning objectives. However, this transition has proven to be challenging and, according to Schmidt (2002), "effectively replacing the traditional classroom interaction is one of the greatest challenges in placing an entire course on the Internet" (p. 6). Schmidt suggested that it is emerging educational technology that can be used to bring online teaching and learning to a higher level and to ensure that online learning equals or surpasses the quality of education in traditional environments.

Along with the benefits that increasing technological options can provide, there are still many barriers to the successful integration and usage of emerging educational technology within educational environments (Levy, in press; Roblyer, 2006; Wenglinsky, 1998; Wozney et al., 2006). According to Levy (2006), there is also no consensus in research on the effectiveness of using emerging educational technology, so administrators and other policymakers are left wondering about how best to invest in technology infrastructure and training. The absence of systematic policies and institutional planning strategies hampers instructors' efforts to integrate emerging educational technology effectively into their courses (Wozney et al.).

2.3 Technology Acceptance in Education

According to the literature, there appears to be a consensus among researchers that additional research investigating the factors involved with instructors' decisions to integrate emerging educational technology in the classroom is necessary (Baek et al., 2006; Ngai, Poon, & Chan, 2007; Wozney et al., 2006). According to Baek et al., much of the research in educational settings has generally approached the topic from the perspective of how to make instructors technology professionals and how to integrate emerging educational technology into the curriculum, but has largely ignored the factors involved in influencing instructors to use emerging educational technology in the classroom.

Another limitation of prior technology acceptance research is that the majority of studies examine technology acceptance in business settings, although a considerable number of studies use students as participants, which may lead to different conclusions than in educational settings (Gong et al., 2004; Hu et al., 2003). Hu et al. suggested that, since instructors are more independent and have more autonomy over their work than many business technology users, research results in educational settings may differ from those in business settings. The characteristics of instructors may also differ from those of business users and may lead to different research results (Gong et al.). Thus, additional investigations on the factors influencing instructors' intention to use emerging educational technology are warranted.

2.4 Computer Self-Efficacy

Self-efficacy (SE), the belief that one has the capability to perform a particular behavior, has often been investigated as a construct in technology acceptance research (Compeau & Higgins, 1995). Bandura (1977) defined SE as people's beliefs about their capabilities to produce effects. Computer self-efficacy (CSE) refers to SE as it relates to computing behavior (Compeau & Higgins). According to Agarwal and Karahanna (2000), an individual's beliefs about or perceptions of IS have a significant influence on their usage behavior. According to Compeau and Higgins, researchers generally agree that a positive relationship exists between CSE and IS use, and that understanding CSE is important to the successful implementation of systems in organizations. In a study based on the work of Bandura, Compeau and Higgins developed a 10-item measure of CSE and empirically tested the measure on a group of managers and other professionals. Their results confirmed that CSE was a valid and reliable construct and that CSE is an important individual trait to organizations in the successful implementation of computer systems. In a further empirical validation of the CSE instrument developed by Compeau and Higgins, Compeau, Higgins, and Huff (1999) confirmed the findings of the prior work. The results of Compeau et al.'s study provided strong confirmation and evidence that CSE impacts an individual's affective and behavioral reactions to IS.

Following the classical work by Compeau and Higgins (1995), numerous studies in IS have investigated and validated CSE in various contexts (Levy & Green, in press). CSE has often been linked with other variables in technology acceptance research (Agarwal & Karahanna, 2000; Levy & Green). In their original study, Compeau and Higgins (1995) found significant relationships between CSE and outcome expectations, affect, anxiety, and use. CSE was also found to have a moderating influence on encouragement by others and support (Compeau & Higgins). Compeau et al. (1999) also found CSE to have a significant relationship with outcome expectations and affect. Moreover, Compeau et al. found a significant relationship between computer anxiety (CA) and system usage. In a study designed to investigate how a user's CSE is related to other technology acceptance constructs, Agarwal and Karahanna developed a model and empirically tested it with 186 university students. Their results indicated that CSE was a key antecedent of perceived ease of use, and was strongly influenced by personal innovativeness with IS. They also concluded that prior experience with the use of technology (EUT) had a significant effect on CSE. In an empirical study, Havelka (2003) surveyed 324 students and found that users with lower levels of CA had higher levels of CSE. Results also indicated a strong, positive relationship between EUT and CSE. Havelka found other significant differences in CSE among students with different majors and family income levels. Havelka suggested that additional research is warranted to clarify the details of the relationships between the constructs of CSE, CA, and EUT in the context of education. Additionally, Agarwal and Karahanna suggested that, although the results of their research supported the relationship between EUT and CSE, further research is necessary to test their proposed model in different contexts, with a wider variety of technologies.

2.5 Computer Anxiety

According to the literature, researchers generally agree that CA plays an important role in technology acceptance among instructors (Christensen, 2002; Korukonda, 2006; Venkatesh, 2000). However, research results are mixed, and there is no agreement on a specific definition of CA (Korukonda). According to Korukonda, the literature has generally defined and operationalized CA as being "synonymous with negative thoughts and attitudes about the use of computers" (p. 1921). According to Venkatesh, CA is a negative affective reaction toward computer use, and has a significant impact on attitudes toward computer use. Korukonda, however, suggested that CA is not simply a negative, short-term attitude toward computers that can be overcome by increasing EUT. In the context of this study, CA was defined as "the fear or apprehension felt by individuals when they used computers, or when they considered the possibility of computer utilization" (Simonson, Maurer, Montag-Torardi, & Whitaker, 1987, p. 238).

According to Yang et al. (1999), CA is not only a stumbling block for instructors in integrating emerging educational technology into educational programs, but is also one of the main reasons for instructors' limited technology acceptance. In an empirical study designed to investigate the effects of educational technology integration on the attitudes of instructors and students, Christensen (2002) found that instructors' CA tended to increase along with the level of technological skill of students. Results also suggested that greater levels of perceived importance of computers among students fostered higher levels of CA in instructors. Although a substantial amount of work has been done investigating the role of CA in technology acceptance, research results on CA have generally been mixed, and additional research related to acceptance of OLSs is warranted (Fuller, Vician, & Brown, 2006; Saade & Kira, 2006).

2.6 Experience with the Use of Technology

It appears from literature that there is a consensus among researchers that EUT plays a significant role in technology acceptance (Taylor & Todd, 1995a; Thompson et al., 2006; Venkatesh et al., 2003). The role of EUT has also been fairly consistent across acceptance models, with EUT playing both a direct role and a moderating role through its influence on other variables (Taylor & Todd; Venkatesh et al.). In a review of eight acceptance models, Venkatesh et al. found EUT to be a key moderator of other variables in the models. Additional evidence of the role of EUT was provided in Venkatesh et al.'s study, as EUT was found to have significant moderating influence and to be an integral feature of the Unified Theory of Acceptance and Use of Technology (UTAUT) model. Similarly, in an empirical study assessing the influence of EUT on IS usage, Taylor and Todd found that EUT influenced both the determinants of intention to use and actual IS usage.

In spite of these findings, it seems that there is little agreement in literature on a precise definition of EUT (Sun & Zhang, 2006; Thompson et al., 2006). Thompson et al. suggested that, although EUT influences other factors in technology acceptance models, previous research findings do not define EUT clearly. In their research, Thompson et al. defined an individual's EUT as a combination of "exposure to the tool" as well as "the skills and abilities that one gains through using a technology" (p. 43). However, Thompson et al. suggested that EUT may also entail habit, skill, and/or simply exposure. Sun and Zhang claimed that no specific definition of EUT has been provided to date, and stated, "considering the key role of experience in understanding the belief-intention-acceptance relationship, researchers might use more finely grained detail in its conceptualization of experience" (p. 69). Additional research clarifying the definition and role of EUT in technology acceptance has been recommended (Thompson et al.; Sun & Zhang). In the context of this study, EUT was defined as "the amount and type of computer skills a person acquires over time" (Smith, Caputi, Crittenden, Jayasuriya, & Rawstorne, 1999, p. 227).

3. RESEARCH MODEL AND RESEARCH QUESTIONS

The main goal of this study was to empirically investigate the contribution of instructors' CSE, CA, and EUT to their intention to use (i.e., IU) emerging educational technology in traditional classrooms. Because emerging educational technology is already being used in IS classes, this study proposed that it would be useful to look at IS and non-IS instructors combined to better understand the factors that may contribute to their overall acceptance of such technology. Moreover, comparing a small group of IS instructors with a considerably larger group of non-IS instructors would raise concerns about the external validity of any comparative conclusions; therefore, this study concentrated on a single larger sample that included both IS and non-IS instructors. Figure 1 presents the conceptual map for this study.

The four specific research questions that this study addressed were:

RQ1 To what extent does CSE contribute to instructors' intention to use emerging educational technology in traditional classrooms?

RQ2 To what extent does CA contribute to instructors' intention to use emerging educational technology in traditional classrooms?

RQ3 To what extent does EUT contribute to instructors' intention to use emerging educational technology in traditional classrooms?

RQ4 Which construct out of the three independent variables (CSE, CA, and EUT) provides the most significant contribution to instructors' intention to use (i.e., IU) emerging educational technology in traditional classrooms?

[FIGURE 1 OMITTED]

4. METHODOLOGY

4.1 Instrument Development

This study developed a survey instrument by using survey items from the following prior validated instruments: Compeau and Higgins (1995), Fuller et al. (2006), Cassidy and Eachus (2002), Igbaria and Iivari (1998), as well as Chen et al. (2007). The following four sections will outline the measures used to assess each of the constructs investigated in this study (CSE, CA, EUT, and IU).

4.1.1 Computer self-efficacy measure: CSE was measured using the 10-item CSE instrument developed by Compeau and Higgins (1995), who reported the instrument to be reliable with a Cronbach's Alpha of .80. The 10 items asked respondents how confident they were that they could complete a job using an unfamiliar software package under a variety of conditions. The original instrument developed by Compeau and Higgins was based on a 10-point Likert scale. Chu (2003), in research investigating the effects of Web page design instruction on improving the CSE of pre-service instructors, adapted the original 10-item instrument to a 5-point Likert scale. The 5-point scale was found to be both reliable and valid for measuring CSE, with Cronbach's Alphas of .79 in the pretest and .70 in the posttest. This study followed the method used by Chu and used a 5-point Likert scale for the 10-item CSE measure (see items CSE1 to CSE10 in Appendix A).

4.1.2 Computer anxiety measure: CA was measured using the 7-item instrument developed by Fuller et al. (2006). Fuller et al.'s instrument exhibited high reliability and validity, with a Cronbach's Alpha of .74. Participants self-reported their level of CA on a 5-point Likert scale (see items CA1 to CA7 in Appendix A).

4.1.3 Experience with the use of technology measure: EUT was measured following the approach used by Cassidy and Eachus (2002), as well as Igbaria and Iivari (1998). Cassidy and Eachus measured EUT using a single item on a 5-point Likert scale, where one indicated "None" and five indicated "Extensive." Igbaria and Iivari (1998) measured EUT by asking participants about the extent of their experience with six types of software, also using a 5-point Likert scale with the same anchors. This study adapted the items from Igbaria and Iivari, and Cronbach's Alpha was used to test the reliability of the measure. Following the scale used in prior literature, this study asked participants to indicate their degree of EUT on 7 items using a 5-point Likert scale, where one indicated "None" and five indicated "Extensive" (see items EUT1 to EUT7 in Appendix A).

4.1.4 Intention to use measure: Instructors' intention to use emerging educational technology was measured using the 2-item instrument developed by Chen et al. (2007). Participants indicated their level of intention to use an IS using two items on a 5-point Likert scale. According to Chen et al., the instrument exhibited high reliability and validity, with a Cronbach's Alpha of over .90. The wording of the two IU items was adapted to reflect the specific technology being investigated in the current study (see items BI1 and BI2 in Appendix A).

4.2 Sample and Data Collection

The sample population in this study included IS and non-IS instructors at a small private university in the southeastern United States. The total population consisted of 111 instructors teaching IS and non-IS courses. Demographic data were collected from the participants in order to determine whether the sample was representative of the population. After being exposed to the target software through an introductory training class, instructors were surveyed as to their intention to use a specific emerging educational technology in the classroom.

The emerging educational technology that provided the basis for this study was the Tegrity[R] Educational System. Figure 2 shows a screen capture from the Tegrity[R] Educational System. Tegrity[R] is an educational system that can be used in the classroom to capture class lectures and experiences for students to replay later at their convenience via the Web. In most cases, the captured content from Tegrity[R] is stored and provided to students via an OLS or an instructor's Website. According to Tegrity[R], over the last few years there has been a shift in the emphasis of emerging educational technology from use in online settings to supporting face-to-face and mixed delivery courses. Tegrity[R] supports multiple teaching approaches and does not force instructors to change the way they teach to deliver content via the Web. Tegrity[R] content also integrates with OLSs such as Blackboard[TM] and WebCT[TM]. To capture class sessions, instructors click a button to start a Tegrity[R] recording session at the beginning of class, and click another button to end the recording when done. The session is then transmitted either to the instructor's own Website or to the course site on the university's OLS for delivery to students. Tegrity[R] appears to support multiple student learning styles. Students benefit from Tegrity[R] as they can focus their attention on understanding the lecture topic and participating in class discussions, instead of trying to keep up with taking notes that they will have to decipher later during their study time. A lecture recorded via Tegrity[R] allows students to replay parts as often as needed to reinforce what they have learned or to help them better understand parts of the lecture they may not have completely understood in class.

[FIGURE 2 OMITTED]

5. ANALYSIS AND RESULTS

5.1 Demographics and Descriptive Statistics

In order to determine the representativeness of the sample, demographic data were requested from the survey participants. The population of all instructors at the university consisted of 111 instructors, approximately 59% male and 41% female. Responses from 58 instructors were received. Following a Mahalanobis Distance test for multivariate outliers, two responses were removed, resulting in 56 usable responses. The 56 responses included over 57% males and nearly 43% females, indicating a good gender representation of the population. About 84% of the population of all instructors at the university were 40 years of age or older, with 42% of the population between the ages of 50-59. Results of this study indicated that 89% of the respondents were 40 years of age or older, with 46% of the respondents between the ages of 50-59. Thus, the distribution of the data collected appears to be a good representation of the population of instructors at that university. Table 1 depicts the descriptive statistics and demographics of the study participants.
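
To illustrate the outlier screening step described above, the following is a minimal sketch in Python. It assumes the survey responses are held in a pandas DataFrame with one numeric column per item and uses the conventional chi-square cut-off at p < .001; both the data structure and the cut-off are illustrative assumptions rather than details reported in this study.

import numpy as np
import pandas as pd
from scipy.stats import chi2

def mahalanobis_outliers(responses: pd.DataFrame, alpha: float = 0.001) -> pd.Series:
    """Flag responses whose squared Mahalanobis distance exceeds the chi-square cut-off."""
    x = responses.to_numpy(dtype=float)
    diff = x - x.mean(axis=0)
    inv_cov = np.linalg.pinv(np.cov(x, rowvar=False))
    d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)   # squared distance per respondent
    cutoff = chi2.ppf(1 - alpha, df=x.shape[1])
    return pd.Series(d2 > cutoff, index=responses.index, name="outlier")

# Example usage (hypothetical DataFrame 'responses' of item scores):
# responses = responses[~mahalanobis_outliers(responses)]   # keep only non-outliers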

5.2 Data Analysis

This study examined the contribution of three independent variables (CSE, CA, and EUT) to the dependent variable (IU). Regression analysis was used to develop the prediction model. As the data collected were ordinal, Ordinal Logistic Regression (OLR) was used to empirically validate the predictive model.

5.2.1 Reliability analysis of constructs: Cronbach's Alpha reliability tests were conducted for the CSE, CA, EUT, and IU constructs to determine consistency across items for each scale. Before the analysis was conducted, the scores for the positively worded CA items were reverse-scored, following the example of Fuller et al. (2006). Table 2 depicts the results of the reliability analysis for the constructs in this study. Cronbach's Alphas for CSE, CA, EUT, and IU were all well above the desired minimum of .70, indicating high reliability for all constructs in this study (Sprinthall, 1997).
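
As an illustration of this reliability step, the following minimal Python sketch reverse-scores the positively worded items and computes Cronbach's Alpha. Which CA items are positively worded (CA1, CA5, CA6, and CA7) is inferred here from the item wording in Appendix A and should be read as an assumption for illustration, not as a detail reported by Fuller et al. (2006).

import pandas as pd

def reverse_score(items: pd.DataFrame, cols, scale_max: int = 5) -> pd.DataFrame:
    """Reverse-score the listed columns on a 1..scale_max Likert scale (1<->5, 2<->4, 3 unchanged)."""
    out = items.copy()
    out[cols] = (scale_max + 1) - out[cols]
    return out

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's Alpha: (k/(k-1)) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Example usage (hypothetical DataFrame 'ca' with columns CA1..CA7 on a 1-5 scale):
# ca = reverse_score(ca, ["CA1", "CA5", "CA6", "CA7"])
# print(cronbach_alpha(ca))   # compare against the .70 threshold (Sprinthall, 1997)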

5.2.2 Ordinal logistic regression analysis: An OLR model was developed to predict the probability of the dependent variable (IU) based on the three independent variables (CSE, CA, and EUT). Table 3 depicts the overall OLR model significance results. The overall model for predicting the probability of IU based on the three predictors (CSE, CA, and EUT) showed a significant fit, with -2 Log Likelihood = 96.117 and χ²(3) = 13.141, p < .005.
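
A minimal sketch of how such an OLR model and its overall fit statistics could be computed is shown below, in Python, assuming construct scores are held in a pandas DataFrame with an integer-coded ordinal IU column. The use of statsmodels' OrderedModel (a proportional-odds logit) is an illustrative assumption, not the software actually used in this study.

import numpy as np
import pandas as pd
from scipy.stats import chi2
from statsmodels.miscmodels.ordinal_model import OrderedModel

def fit_olr(scores: pd.DataFrame):
    """Fit IU ~ CSE + CA + EUT as an ordinal logit model and compute overall fit statistics."""
    res = OrderedModel(scores["IU"], scores[["CSE", "CA", "EUT"]],
                       distr="logit").fit(method="bfgs", disp=False)

    # Overall model test: likelihood ratio against a thresholds-only (null) model.
    # With no predictors, the maximum-likelihood category probabilities equal the
    # observed proportions, so the null log-likelihood follows directly from the counts.
    counts = scores["IU"].value_counts()
    ll_null = float((counts * np.log(counts / counts.sum())).sum())
    lr_stat = 2 * (res.llf - ll_null)            # compared against chi-square with 3 df
    p_value = chi2.sf(lr_stat, df=3)
    return res, -2 * res.llf, lr_stat, p_value   # -2 Log Likelihood, chi-square(3), p

# Example usage (hypothetical DataFrame 'scores' with columns IU, CSE, CA, EUT):
# res, neg2ll, lr, p = fit_olr(scores)
# print(res.summary())   # parameter estimates and p-values for CSE, CA, and EUT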

The results of the OLR analysis indicated that only one of the three individual predictors (CSE) was significant (p<.005), with a positive parameter estimate, indicating that IU increased as scores on CSE increased. The negative parameter estimates for CA and EUT indicated that higher scores on CA and on EUT were both associated with lower scores on IU; however, neither of these two independent variables was a significant predictor of IU.

5.3 Results

The first research question was: To what extent does CSE contribute to instructors' intention to use emerging educational technology in traditional classrooms? Evidence from the OLR analysis demonstrated that CSE was the only significant predictor of IU among the three independent variables investigated. The findings on CSE represented the main strength of this study and further validated the findings of other researchers, such as Compeau and Higgins (1995), Gong et al. (2004), Hu et al. (2003), Igbaria and Iivari (1995), and Levy and Green (in press), that CSE is an important contributing factor in predicting IU as it relates to technology acceptance and usage.

The second research question was: To what extent does CA contribute to instructors' intention to use emerging educational technology in traditional classrooms? Results from the OLR analysis demonstrated that CA was not a significant predictor of IU. These results were consistent with the research of Venkatesh (2000), who found that CA did not have a direct influence on technology acceptance, and with other researchers who suggested that CA generally acts as an antecedent to and a moderator of other variables rather than having a direct influence (Hackbarth et al., 2003; Igbaria & Iivari, 1995; Saade & Kira, 2006; Yang et al., 1999). For example, Venkatesh (2000) found CA to be an antecedent to perceived ease of use. Saade and Kira found CA to have a moderating influence on perceived ease of use and perceived usefulness. Moreover, Hackbarth et al. found that CA had a negative influence on perceived ease of use through direct system experience. Results from the OLR analysis further validated prior research and the call of others for additional research investigating CA and its role in technology acceptance (Korukonda, 2006).

The third research question was: To what extent does EUT contribute to instructors' intention to use emerging educational technology in traditional classrooms? Evidence from the OLR analysis demonstrated that EUT was not a significant predictor of IU among the three independent variables investigated. However, the OLR analysis demonstrated a negative relationship between EUT and IU, with higher levels of EUT associated with lower levels of IU. In the current study, 50% of the instructors with higher levels of EUT had also been teaching for over 10 years. These results were consistent with the findings of Baek et al. (2006), who found that instructors with more teaching experience generally decided to use technology involuntarily, in response to external forces, while instructors with less teaching experience were more likely to use technology of their own volition. The results further validated the recommendations of other researchers that more research is necessary regarding the construct of EUT and its role in technology acceptance (Sun & Zhang, 2006; Thompson et al., 2006).

The fourth research question was: Which construct out of the three independent variables (CSE, CA, or EUT) provides the most significant contribution to instructors' intention to use (i.e., IU) emerging educational technology in traditional classrooms? Evidence from the OLR analysis demonstrated that CSE provided the most significant contribution out of the three independent variables investigated when predicting the probability of instructors' intention to use (i.e., IU) emerging educational technology. This validated the results of other studies that identified the importance and role of CSE in technology acceptance models (Agarwal & Karahanna, 2000; Compeau et al., 1999; Igbaria & Iivari, 1995; Levy & Green, in press).

6. DISCUSSION AND CONCLUSIONS

6.1 Discussion

The main goal of this study was to empirically investigate the contribution of IS and non-IS instructors' CSE, CA, and EUT to their intention to use emerging educational technology in traditional classrooms. As emerging educational technology appears to be accepted by IS instructors more readily than by non-IS instructors, this study investigated all instructors, both IS and non-IS. Thus, the population of this study included all instructors at a single small, private university in southwest Florida. Responses included 56 usable records, indicating approximately a 53% response rate, with the sample appearing to be a good representation of the population.

6.2 Summary of Results

Evidence demonstrated that CSE was the only significant predictor of IU among the three independent variables investigated. Results demonstrated that CA was not a significant predictor of IU. These results may suggest that CA acts as an antecedent to or a moderator of other variables rather than having a direct influence on the overall acceptance of IS. Moreover, the results further validated the call of other researchers to investigate the role of CA in technology acceptance. Additionally, results demonstrated that EUT was also not a significant predictor of IU. Thus, the results provided additional evidence that more research is necessary regarding the construct of EUT and its role in technology acceptance (Sun & Zhang, 2006; Thompson et al., 2006).

6.3 Implications for IS Education

This investigation has several implications for the existing body of knowledge for research and practice, especially within IS education. This study developed a predictive model using the constructs of CSE, CA, and EUT in order to predict the probability of instructors' intention to use (i.e., IU) emerging educational technology in IS and non-IS traditional classrooms. The context was specifically that of instructors, and the study investigated instructors' intention to use emerging educational technology in traditional classrooms. The first implication for IS education research is that this study investigated factors that are known in the IS literature to contribute to users' acceptance of technology, but that do not appear to have been investigated in the context of emerging educational technology in traditional classrooms. The second implication for research is that this study investigated key constructs contributing to instructors' intention to use emerging educational technology in the classroom rather than in an online environment.

This investigation also contributed to practice in that it provided valuable information that can be used to increase intention to use, and actual usage of, the technology under investigation. It may also help administrators become aware of issues with CSE, particularly administrators of non-IS instructors, so they can better meet the needs of faculty members by targeting training and other initiatives that increase CSE and, ultimately, usage of emerging educational technology in the classroom.

Several specific recommendations for practice can be made to increase instructors' acceptance of emerging educational technology and its use in traditional classrooms. Institutions should make a systematic effort to provide instructors with training on how to use educational technology effectively. As CSE has been found to have a strong direct effect on intention to use IS, both in this study and in prior studies, training should be designed to increase instructors' CSE. Institutions should take advantage of those instructors who are early adopters of emerging educational technology and utilize them in assisting those who may not adopt emerging educational technology as quickly. Institutions must also ensure that instructors are properly informed as to the pedagogical benefits of using emerging educational technology in the classroom, and help educate instructors on how to make effective use of technology in supporting their educational objectives.

6.4 Limitations

There were several limitations to this study. The first limitation was that the current study was conducted at a single small, private university in southwest Florida and the sample combined both IS and non-IS instructors. The sample was relatively small, comprising only 56 IS and non-IS instructors. Concern about the external validity of comparing a smaller group of IS instructors with a relatively larger group of non-IS instructors in order to draw comparative implications led this study to investigate a larger group of both IS and non-IS instructors. Therefore, further research is needed in different types of institutions where a larger sample of IS instructors is available, comparing the results with those found in this study, to help better understand the implications of the factors investigated for acceptance of emerging educational technology in the classroom. A second limitation was that a single technology was investigated within the context of traditional classrooms. Therefore, IS researchers should be cautious when attempting to generalize the results found here to other technologies or teaching contexts (Healy, 1998). A third limitation stems from the self-report method of measuring EUT. Self-report measures of EUT are subjective and may be limited in their assessment of an individual's actual EUT. Moreover, although the finding that EUT made no significant contribution to IU was consistent with the results of Baek et al. (2006), it was not consistent with the findings of others (Igbaria & Iivari, 1998; Woods et al., 2004). As prior research results have been mixed, the results from this study further validated the call for additional research clarifying the construct of EUT and its role in technology acceptance (Thompson et al., 2006). A fourth limitation was that nearly 95% of the respondents had been using computers for 10 or more years, with 59% having used computers for 20 or more years. As the number of years using a computer does not necessarily equate to greater EUT, different results might be obtained among instructors who have not been using computers very long. A fifth limitation was that nearly 79% of instructors had been teaching for over six years, with 68% having more than 10 years' teaching experience. Different results might also be obtained among instructors who have not been teaching very long. The sixth limitation was that approximately 88% of instructors were over 40 years of age, and 67% were over 50. Different results might be obtained from younger instructors.

6.5 Suggestions for Future Research

Several areas for future research were identified. This study investigated factors associated with instructors' intention to use a single emerging educational technology in traditional classrooms. More work is needed in investigating other emerging educational technologies in other teaching contexts. For example, this study could be replicated in other environments, such as online class environments. Moreover, as this study was conducted at a small university in a single location, additional investigations are warranted at larger and geographically dispersed institutions to validate the results. Additionally, as CSE has been found here to have a strong direct effect on intention to use, additional work should investigate the constructs that may significantly predict changes in CSE.

As the literature generally reports mixed findings regarding CA and EUT, additional research investigating the definitions and roles of CA and EUT in technology acceptance, especially in educational environments, is warranted. Research identifying other factors associated with instructor technology acceptance should also be conducted. Moreover, all instructors were investigated together, without regard to academic rank, status, or demographics. Future research should investigate the influence of demographic characteristics such as age and years of teaching on instructors' acceptance levels. For example, additional research investigating whether there are differences between full-time and part-time instructors, or among instructors of different rank or demographics, might provide additional insight into the factors that influence instructors' technology acceptance. Moreover, due to the limited number of IS instructors in this study, a comparison between IS and non-IS instructors may be limited; however, additional research should investigate the differences between IS and non-IS instructors in their acceptance of emerging educational technology in the classroom.

Although numerous research studies have validated the link between intention to use and actual usage (Legris et al., 2003), additional research should investigate whether instructors' intentions to use emerging educational technology do indeed provide a strong prediction of actual use of such technology. Additional research on how to encourage instructors to use emerging educational technology in the classroom would also benefit both researchers and practitioners.

7. ACKNOWLEDGEMENTS

The authors would like to thank all the anonymous instructors who participated in this study, as well as Dr. Timothy J. Ellis and Dr. Marti M. Snyder for their constructive comments on an earlier version of this work. Moreover, the authors would like to thank the editor-in-chief, Dr. Albert L. Harris, and the anonymous referees for their careful review and valuable suggestions.
APPENDIX A

Please respond to the following statements from one to five, with one indicating "Strongly disagree" and five indicating "Strongly agree" (1 = Strongly disagree, 2 = Disagree, 3 = Neither Disagree nor Agree, 4 = Agree, 5 = Strongly agree).

CSE1. I could record a class lecture using the Tegrity[R] software system if there was no one around to tell me what to do as I go.

CSE2. I could record a class lecture using the Tegrity[R] software system if I had never used a package like it before.

CSE3. I could record a class lecture using the Tegrity[R] software system if I had only the software manuals for reference.

CSE4. I could record a class lecture using the Tegrity[R] software system if I had seen someone else using it before trying it myself.

CSE5. I could record a class lecture using the Tegrity[R] software system if I could call someone for help if I got stuck.

CSE6. I could record a class lecture using the Tegrity[R] software system if someone else had helped me get started.

CSE7. I could record a class lecture using the Tegrity[R] software system if I had a lot of time to complete the job for which the software was provided.

CSE8. I could record a class lecture using the Tegrity[R] software system if I had just the built-in help facility for assistance.

CSE9. I could record a class lecture using the Tegrity[R] software system if someone showed me how to do it first.

CSE10. I could record a class lecture using the Tegrity[R] software system if I had used similar packages before this one to do the same job.

Please respond to the following statements from one to five, with one indicating "Strongly disagree" and five indicating "Strongly agree" (1 = Strongly disagree, 2 = Disagree, 3 = Neither Disagree nor Agree, 4 = Agree, 5 = Strongly agree).

CA1. I am able to keep up with important technological advances in computers.

CA2. Computers make me feel uncomfortable.

CA3. I get a sinking feeling when I think of trying to use a computer.

CA4. Computers scare me.

CA5. I look forward to using a computer.

CA6. The challenge of learning about computers is exciting.

CA7. If given the opportunity, I would like to learn more about computers.

Please indicate your level of experience with the following technologies, from one to five, with one indicating "None" and five indicating "Extensive" (1 = None, 2 = Very Limited, 3 = Some Experience, 4 = Quite a Lot, 5 = Extensive).

EUT1. Email

EUT2. Internet and the World Wide Web

EUT3. Spreadsheets

EUT4. Word processors

EUT5. Presentation software

EUT6. Database software

EUT7. Blackboard[TM] online platform

Please respond to the following statements from one to five, with one indicating "Strongly disagree" and five indicating "Strongly agree" (1 = Strongly disagree, 2 = Disagree, 3 = Neither Disagree nor Agree, 4 = Agree, 5 = Strongly agree).

BI1. I intend to use Tegrity[TM] in my on-campus courses as soon as possible.

BI2. I will use Tegrity[TM] in my on-campus courses soon after it is launched.

Please provide the following information about you.

Number of years using a computer:

Gender: [] Male [] Female

Age: [] 20-29 [] 30-39 [] 40-49 [] 50-59 [] 60 and over

Number of years teaching: [] Less than 1 year [] 1-5 years [] 6-10 years [] 11-15 years [] 16-20 years [] Over 20 years

8. REFERENCES

Agarwal, Ritu and Karahanna, Elena (2000), "Time Flies When You're Having Fun: Cognitive Absorption and Beliefs about Information Technology Usage." MIS Quarterly, Vol. 24, No. 4, pp. 665-694.

Agarwal, Ritu and Prasad, Jayesh (1998), "A Conceptual and Operational Definition of Personal Innovativeness in the Domain of Information Technology." Information Systems Research, Vol. 9, No. 2, pp. 204-215.

Baek, Youngkyun, Jung, Jaeyeob and Kim, Baeyeob (2006), "What Makes Teachers Use Technology in the Classroom? Exploring the Factors Affecting Facilitation of Technology with a Korean Sample." Computers & Education, Vol. 50, No. 1, pp. 224-234.

Bandura, Albert (1977), "Self-Efficacy: Toward a Unifying Theory of Behavioral Change." Psychological Review, Vol. 84, No. 2, pp. 191-215.

Baylor, L. Amy and Ritchie, Donn (2002), "What Factors Facilitate Teacher Skill, Teacher Morale, and Perceived Student Learning in Technology-Using Classrooms?" Computers & Education, Vol. 39, No. 4, pp. 395-414.

Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., Wallet, P. A., Fiset, M., and Huang, B. (2004), "How Does Distance Education Compare With Classroom Instruction? A Meta-Analysis of the Empirical Literature." Review of Educational Research, Vol. 74, No. 3, pp. 379-439.

Cassidy, Simon and Eachus, Peter (2002), "Developing the Computer User Self-Efficacy (CUSE) Scale: Investigating the Relationship Between Computer Self-Efficacy, Gender and Experience with Computers." Journal of Educational Computing Research, Vol. 26, No. 2, pp. 169-189.

Chen, Chun-Der, Fan, Yi-Wen, and Farn, Cheng-Kiang (2007), "Predicting Electronic Toll Collection Service Adoption: An Integration of the Technology Acceptance Model and the Theory of Planned Behavior." Transportation Research Part C, Vol. 15, No. 5, pp. 300-311.

Cheurprakobkit, Sutham (2000), "Web-Based Criminology/Criminal Justice Programs in Texas Colleges and Universities." Journal of Criminal Justice Education, Vol. 11, No. 2, pp. 279-294.

Christensen, Rhonda (2002), "Effects of Technology Integration Education on the Attitudes of Teachers and Students." Journal of Research on Technology in Education, Vol. 34, No. 4, pp. 411-433.

Chau, Y. K. Patrick, and Hu, J. Paul (2001), "Information Technology Acceptance by Individual Professionals: A Model Comparison Approach." Decision Sciences, Vol. 32, No. 4, pp. 699-718.

Chu, Li-Li (2003), "The Effects of Web Page Design Instruction on Computer Self-Efficacy of Preservice Teachers and Correlates." Journal of Educational Computing Research, Vol. 28, No. 2, pp. 127-142.

Compeau, R. Deborah and Higgins, A. Christopher (1995), "Computer Self-Efficacy: Development of a Measure and Initial Test." MIS Quarterly, Vol. 19, No. 2, pp. 189-211.

Compeau, R. Deborah, Higgins, A. Christopher, and Huff, Sid (1999), "Social Cognitive Theory and Individual Reactions to Computing Technology: A Longitudinal Study." MIS Quarterly, Vol. 23, No. 2, pp. 145-158.

D'Angelo, M. Jill and Woosley, A. Sherry (2007), "Technology in the Classroom: Friend or Foe." Education, Vol. 127, No. 4, pp. 462-471.

Davis, D. Fred (1989), "Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology." MIS Quarterly, Vol. 13, No. 3, pp. 319-340.

Davis, D. Fred, Bagozzi, P. Richard, and Warshaw, R. Paul (1989), "User acceptance of Computer Technology: A Comparison of Two Theoretical Models." Management Science, Vol. 35, No. 8, pp. 982-1003.

Debevec, Kathleen, Shih, Mei-Yau, and Kashyap, Vishal (2006), "Learning Strategies and Performance in a Technology Integrated Classroom." Journal of Research on Technology in Education, Vol. 38, No. 3, pp. 293-307.

Dillon, Andrew and Morris, G. Michael (1996), User Acceptance of Information Technology--Theories and Models. In M. Williams (ed.) Annual Review of Information Science and Technology, Vol. 31, Medford, NJ: Information Today, pp. 3-32.

Duhaney, C. Devon (2005), "Technology and Higher Education: Challenges in the Halls of Academe." International Journal of Instructional Media, Vol. 32, No. 1, pp. 7-15.

Fishbein, Martin and Ajzen, Icek (1975), Belief, Attitude, Intention and Behavior: An Introduction to Theory and Research. Addison-Wesley, Reading, MA.

Fuller, M. Robert, Vician, Chelley, and Brown, A. Susan (2006), "E-learning and Individual Characteristics: The Role of Computer Anxiety and Communication Apprehension." Journal of Computer Information Systems, Vol. 46, No. 4, pp.103-115.

Garrison, D. Randy and Kanuka, Heather (2004), "Blended Learning: Uncovering its Transformative Potential in Higher Education." Internet and Higher Education, Vol. 7, No. 2, pp. 95-105.

Gong, Min, Xu, Yan, and Yu, Yuecheng (2004), "An Enhanced Technology Acceptance Model for Web-Based Learning." Journal of Information Systems Education, Vol. 15, No. 4, pp. 365-374.

Hackbarth, Gary, Grover, Varun, and Yi, Y. Mun (2003), "Computer Playfulness and Anxiety: Positive and Negative Mediators of the System Experience Effect on Perceived Ease of Use." Information & Management, Vol. 40, No. 3, pp. 221-232.

Havelka, Douglas (2003), "Predicting Software Self-Efficacy among Business Students: A Preliminary Assessment." Journal of Information Systems Education, Vol. 14, No. 2, pp. 145-152.

Healy, M. Jane (1998), Failure to connect. Simon & Schuster, New York.

Hiltz, R. Starr and Turoff, Murray (2005), "Education Goes Digital: The Evolution of Online Learning and the Revolution in Higher Education." Communications of the ACM, Vol. 48, No. 10, pp. 59-64.

Hu, J. Paul, Clark, H. K. Theodore, and Ma, W. Will (2003), "Examining Technology Acceptance by School Teachers: A Longitudinal Study." Information & Management, Vol. 41, No. 2, pp. 227-241.

Igbaria, Magid and Iivari, Juhani (1995), "The Effects of Self-Efficacy on Computer Usage." Omega, International Journal of Management Science, Vol. 23, No. 6, pp. 587-605.

Igbaria, Magid and Iivari, Juhani (1998), "Microcomputer Utilization Patterns among Managers and Professionals: The Case in Finland." Journal of Computer Information Systems, Vol. 38, No. 3, pp. 28-43.

Kingsley, V. Karla (2007), "Empower Diverse Learners with Educational Technology and Digital Media." Intervention in School and Clinic, Vol. 43, No. 1, pp. 52-56.

Korukonda, R. Appa (2006), "Differences That Do Matter: A Dialectic Analysis of Individual Characteristics and Personality Dimensions Contributing to Computer Anxiety." Computers in Human Behavior, Vol. 23, No. 4, pp. 1921-1942.

Legris, Paul, Ingham, John, and Collerette, Pierre (2003), "Why Do People Use Information Technology? A Critical Review of the Technology Acceptance Model." Information & Management, Vol. 40, pp. 191-204.

Levy, Yair (2006), Assessing the Value of E-Learning Systems. Information Science Publishing, Hershey, PA.

Levy, Yair (2008), "An Empirical Development of Critical Value Factors (CVF) of Online Learning Activities: An Application of Activity Theory and Cognitive Value Theory." Computers & Education, Vol. 51, No. 4, pp. 1664-1675.

Levy, Yair and Green, D. Bruce (in press), "An Empirical Study of Computer Self-Efficacy and the Technology Acceptance Model in the Military: A Case of a U.S. Navy Combat Information System." Journal of Organizational and End User Computing, Vol. 21, No. 2, pp. 1-23.

Lightfoot, M. Jay (2005), "Integrating Emerging Technologies into Traditional Classrooms: A Pedagogic Approach." International Journal of Instructional Media, Vol. 32, No. 3, pp. 209-224.

Moore, C. Gary and Benbasat, Izak (1991), "Development of an Instrument to Measure the Perceptions of Adopting an Information Technology Innovation." Information Systems Research, Vol. 2, No. 3, pp. 192-222.

Newman, Frank and Scurry, Jamie (2001), "Online Technology Pushes Pedagogy to the Forefront." The Chronicle of Higher Education, Vol. 47, No. 44, p. 7.

Ngai, E. W. T., Poon, J. K. L., and Chan, Y. H. C. (2007), "Empirical Examination of the Adoption of WebCT Using TAM." Computers & Education, Vol. 48, No. 2, pp. 250-267.

Nilson, Hallgeir and Purao, Sandeep (2005), "Balancing Objectivist and Constructivist Pedagogies for Teaching Emerging Technologies: Evidence From a Scandinavian Case Study." Journal of Information Systems Education, Vol. 16, No. 3, pp. 281-292.

Oncu, Semiral, Delialioglu, Omer, and Brown, A. Catherine (2008), "Critical Components for Technology Integration: How Do Instructors Make Decisions?" The Journal of Computers in Mathematics and Science Teaching, Vol. 27, No. 1, pp. 19-46.

Roblyer, D. Margaret (2006), Integrating Educational Technology into Teaching (4th ed.). Pearson Prentice Hall, NJ.

Saade, G. Raafat and Kira, Dennis (2006), "The Emotional State of Technology Acceptance." Issues in Informing Science and Information Technology, Vol. 3, pp. 529-539.

Sahin, Ismail and Thompson, Ann (2007), "Analysis of Predictive Factors that Influence Faculty Members' Technology Adoption Level." Journal of Technology and Teacher Education, Vol. 15, No. 2, pp. 167-190.

Saunders, Gunter and Klemming, Fredrik (2003), "Integrating Technology into a Traditional Learning Environment: Reasons for and Risks of Success." Active Learning in Higher Education, Vol. 4, No. 1, pp. 74-86.

Schmidt, Klaus (2002), "The Web-Enhanced Classroom." Journal of Industrial Technology, Vol. 18, No. 2, pp. 2-6.

Simon, J. Steven and Paper, David (2007), "User Acceptance of Voice Recognition Technology: An Empirical Extension of the Technology Acceptance Model." Journal of Organizational and End User Computing, Vol. 19, No. 1, pp. 24-50.

Simonson, M. M., Maurer, R. M., Montag-Torardi, M., and Whitaker, M. (1987), "Development of a Standardized Test of Computer Literacy and a Computer Anxiety Index." Journal of Educational Computing Research, Vol. 3, No. 2, pp. 231-247.

Smith, Brooke, Caputi, Peter, Crittenden, N., Jayasuriya, Rohan and Rawstorne, Patrick (1999), "A Review of the Construct of Computer Experience." Computers in Human Behavior, Vol. 15, No. 2, pp. 227-242.

Sprinthall, C. Richard (1997), Basic Statistical Analysis. Allyn and Bacon, Boston, MA.

Sun, Heshan and Zhang, Ping (2006), "The Role of Moderating Factors in User Technology Acceptance." International Journal of Human-Computer Studies, Vol. 64, No. 2, pp. 53-78.

Taylor, Shirley and Todd, Peter (1995a), "Assessing IT Usage: The Role of Prior Experience." MIS Quarterly, Vol. 19, No. 4, pp. 561-570.

Taylor, Shirley and Todd, Peter (1995b), "Understanding Information Technology Usage: A Test of Competing Models." Information Systems Research, Vol. 6, No. 2, pp. 144-176.

Tegrity[R] Software (2006), Tegrity Campus (Version 2.0). [Computer software]. http://www.tegrity.com/products.php

Thompson, Ron, Compeau, R. Deborah, and Higgins, Chris (2006), "Intentions to Use Information Technologies: An Integrative Model." Journal of Organizational and End User Computing, Vol. 18, No. 3, pp. 25-47.

Venkatesh, Viswanath (2000), "Determinants of Perceived Ease of Use: Integrating Control, Intrinsic Motivation and Emotion into the Technology Acceptance Model." Information Systems Research, Vol. 11, No. 4, pp. 342-365.

Venkatesh, Viswanath and Davis, D. Fred (1996), "A Model of the Antecedents of Perceived Ease of Use: Development and Test." Decision Sciences, Vol. 27, No. 3, pp. 451-481.

Venkatesh, Viswanath, Morris, G. Michael, Davis, B. Gordon, and Davis, D. Fred (2003), "User Acceptance of Information Technology: Toward a Unified View." MIS Quarterly, Vol. 27, No. 3, pp. 425-478.

Venkatesh, Viswanath, Speier, Cheri and Morris, G. Michael (2002), "User Acceptance Enablers in Individual Decision Making about Technology: Toward an Integrated Model." Decision Sciences, Vol. 33, No. 2, pp. 297-316.

Wenglinsky, Harold (1998), Does it Compute? The Relationship between Educational Technology and Student Achievement in Mathematics. Educational Testing Service, Princeton, NJ.

Woods, Robert, Baker, D. Jason, and Hopper, Dave (2004), "Hybrid Structures: Instructors Use and Perception of Web-Based Courseware as a Supplement to Face-To-Face Instruction." The Internet and Higher Education, Vol. 7, No. 4, pp. 281-297.

Wozney, Lori, Venkatesh, Vivek, and Abrami, C. Philip (2006), "Implementing Computer Technologies: Teachers' Perceptions and Practices." Journal of Technology and Teacher Education, Vol. 14, No. 1, pp. 173-207.

Yang, H. Harrison, Mohamed, Dominic, and Beyerbach, Barbara (1999), "An Investigation of Computer Anxiety among Vocational-Technical Teachers." Journal of Industrial Teacher Education, Vol. 37, No. 1, pp. 64-82.

AUTHOR BIOGRAPHIES

Diane Ball is an Associate Professor at the School of Technology at Hodges University in Naples, Florida. She earned her Master's degree in Business Education from Johnson and Wales University. She received her Ph.D. in Information Systems from Nova Southeastern University. Diane also serves as the Director of Institutional Effectiveness and Research at Hodges University and heads its planning and assessment functions. Her current research interests include the effectiveness of IS in educational environments.

Yair Levy is an Associate Professor at the Graduate School of Computer and Information Sciences at Nova Southeastern University. During the mid to late 1990s, he assisted NASA in developing e-learning systems. He earned his Bachelor's degree in Aerospace Engineering from the Technion (Israel Institute of Technology) and received his MBA with an MIS concentration and his Ph.D. in MIS from Florida International University. His current research interests include the cognitive value of IS, e-learning systems, the effectiveness of IS, cognitive aspects of IS, and security and ethical issues of e-learning systems. Yair is the author of the book "Assessing the Value of e-Learning Systems." His research publications appear in IS journals, conference proceedings, invited book chapters, and encyclopedias, and he has chaired and co-chaired multiple sessions and tracks at recognized conferences. Since early 2006, Yair has served as the Editor-in-Chief of the International Journal of Doctoral Studies (IJDS). He also serves as an associate editor for the International Journal of Web-based Learning and Teaching Technologies and as a member of the editorial review and advisory boards of several refereed journals. In addition, Yair has served as a reviewer for numerous national and international scientific journals and conference proceedings, as well as for MIS and Information Security textbooks. He is a frequent speaker at national and international meetings on MIS and online learning topics.

Diane M. Ball

Hodges University

2655 Northbrooke Drive

Naples, Florida 34119, USA

dball@hodges.edu

Yair Levy

Graduate School of Computer and Information Sciences

Nova Southeastern University

3301 College Avenue

Fort Lauderdale, Florida 33314, USA

levyy@nova.edu
Table 1: Descriptive Statistics and Demographics of the Study Participants (N=56)

Item                                    Frequency    Percentage
Gender
  Male                                      32          57.1%
  Female                                    24          42.9%
Age
  20-29                                      2           3.6%
  30-39                                      4           7.1%
  40-49                                     12          21.4%
  50-60                                     26          46.4%
  Over 60                                   12          21.4%
Number of Years Teaching Experience
  1-5                                       12          21.4%
  6-10                                       6          10.7%
  11-15                                     13          23.2%
  16-20                                      7          12.5%
  Over 20                                   18          32.1%

                                        Min    Max    Mean      SD
Number of Years Using a Computer          5     40    20.09   7.702

Table 2: Results of Reliability Analysis (N=56)

Construct    Cronbach's Alpha
CSE                .916
CA                 .870
EUT                .859
IU                 .943
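
The coefficients in Table 2 summarize the internal consistency of each construct's survey items. For readers who wish to see how such a reliability estimate is computed, the following minimal Python sketch calculates Cronbach's alpha from an item-response matrix. The data, item names, and construct block used here are purely hypothetical stand-ins and are not the study's survey responses.

import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for one construct: each column holds one Likert item."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()    # sum of the item variances
    scale_var = items.sum(axis=1).var(ddof=1)     # variance of the summed scale
    return (k / (k - 1)) * (1 - item_var / scale_var)

# Hypothetical example: 56 respondents answering four 7-point CSE items.
rng = np.random.default_rng(1)
base = rng.integers(1, 8, size=(56, 1))                           # latent response tendency
responses = np.clip(base + rng.integers(-1, 2, size=(56, 4)), 1, 7)
cse_items = pd.DataFrame(responses, columns=["CSE1", "CSE2", "CSE3", "CSE4"])
print(round(cronbach_alpha(cse_items), 3))

Applying the same function to the item blocks for the other constructs would produce the corresponding reliability estimates.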

Table 3: OLR Model Significance (N=56)

Model             -2 Log Likelihood    Chi-Square    df    Sig.
Intercept Only         109.258
Final                   96.117           13.141       3    .004
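
The model chi-square in Table 3 is the likelihood-ratio statistic comparing the intercept-only and final models, i.e., the difference of the two -2 log likelihood values (109.258 - 96.117 = 13.141), evaluated against a chi-square distribution with 3 degrees of freedom (one per predictor). A brief check of the reported values, using only the numbers in the table:

from scipy.stats import chi2

lr_statistic = 109.258 - 96.117          # difference in -2 log likelihood = 13.141
p_value = chi2.sf(lr_statistic, df=3)    # approximately .004, matching the reported Sig.
print(round(lr_statistic, 3), round(p_value, 3))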

Table 4: OLR Parameter Estimates (N=56)

                                                     95% Confidence Interval
         Estimate    Std. Error     Wald    Sig.       Lower       Upper
CSE         1.018        .320     10.101    .001        .390       1.645
CA          -.521        .457      1.300    .254      -1.416        .375
EUT         -.580        .408      2.019    .155      -1.379        .220
_cut1      -2.925       2.482
_cut2      -1.666       2.435
_cut3        .142       2.424
_cut4       2.388       2.446
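
Table 4 reports the ordinal logistic regression (OLR) estimates: one coefficient per predictor (CSE, CA, EUT) and four cut-points separating the five levels of the intention-to-use response. As an illustration of how such a model can be fit, the sketch below uses statsmodels' OrderedModel on simulated data; the variable names mirror the study's constructs, but the data and the resulting estimates are hypothetical and should not be read as the study's results.

import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Simulated stand-in data: 56 instructors with construct scores and an
# ordinal intention-to-use (IU) response on a five-point scale.
rng = np.random.default_rng(0)
n = 56
X = pd.DataFrame({
    "CSE": rng.normal(5.0, 1.5, n),   # computer self-efficacy
    "CA":  rng.normal(3.0, 1.0, n),   # computer anxiety
    "EUT": rng.normal(4.0, 1.2, n),   # experience with use of technology
})
latent = 1.0 * X["CSE"] - 0.5 * X["CA"] - 0.5 * X["EUT"] + rng.logistic(size=n)
iu = pd.Series(pd.Categorical(pd.qcut(latent, 5, labels=False), ordered=True), name="IU")

# Proportional-odds (ordinal logistic) model: coefficients for the three
# predictors plus four threshold (cut-point) parameters, as in Table 4.
model = OrderedModel(iu, X, distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())

The printed summary lists a coefficient for each predictor together with the estimated thresholds, analogous to _cut1 through _cut4 in Table 4.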